360° camera in three.js

Does anyone know how to create a 360° camera in three.js?
I'm trying to render the entire scene as a 360° panorama, like you would with a GoPro 360 rig.
I want to recreate the panorama by arranging several screens in a circle and stretching a three.js window across all of them.
For this I need a very wide window with a three.js camera that captures the entire scene in 360°.
Is this possible?

It is definitely possible, as it has already been implemented: https://github.com/spite/THREE.CubemapToEquirectangular
That library only exports snapshots as PNG, but looking at the code, it should be possible to integrate the same method into real-time rendering if you want to.
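If you go the library route, usage is roughly the following. This is a minimal sketch based on my reading of the repo's README; the constructor arguments and method names may have changed, so check the repository for the exact current API before relying on it.

```js
// Rough sketch: grabbing a 360° snapshot with THREE.CubemapToEquirectangular.
// Names/signatures below are assumptions from the repo's README, not verified.
import * as THREE from 'three';
import CubemapToEquirectangular from './CubemapToEquirectangular.js';

const renderer = new THREE.WebGLRenderer({ antialias: true });
// Second argument (assumed): let the helper manage its own internal CubeCamera.
const equi = new CubemapToEquirectangular(renderer, true);

// ... build your scene and camera as usual ...

// Whenever you want a 360° capture of the current frame:
equi.update(camera, scene); // renders the six cube faces, unwraps them, and downloads a PNG
```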

You can't represent a 360-degree view with a conventional view matrix. You need to render to a set of textures (e.g. the six faces of a cube) and then combine them into a 360° mapping such as an equirectangular projection.
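To do that yourself in real time, the approach looks roughly like this in three.js. This is a sketch, assuming a reasonably recent three.js release (with WebGLCubeRenderTarget) and an output canvas with a 2:1 aspect ratio; the longitude/latitude orientation convention in the shader is arbitrary and may need adjusting, and names like `renderPanorama` are mine.

```js
// Sketch: render the scene into a cube map each frame, then unwrap it to an
// equirectangular image on a full-screen quad.
import * as THREE from 'three';

// 1. Render the scene into the six faces of a cube map.
const cubeTarget = new THREE.WebGLCubeRenderTarget(1024);
const cubeCamera = new THREE.CubeCamera(0.1, 1000, cubeTarget);

// 2. Full-screen quad whose shader samples the cube map with a direction
//    derived from equirectangular (longitude/latitude) UVs.
const equiMaterial = new THREE.ShaderMaterial({
  uniforms: { envMap: { value: cubeTarget.texture } },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0); // pass-through full-screen quad
    }`,
  fragmentShader: /* glsl */ `
    varying vec2 vUv;
    uniform samplerCube envMap;
    void main() {
      float lon = (vUv.x * 2.0 - 1.0) * 3.141592653589793; // -PI .. PI
      float lat = (vUv.y - 0.5) * 3.141592653589793;       // -PI/2 .. PI/2
      vec3 dir = vec3(cos(lat) * sin(lon), sin(lat), -cos(lat) * cos(lon));
      gl_FragColor = textureCube(envMap, dir);
    }`,
});
const quadScene = new THREE.Scene();
quadScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), equiMaterial));
const quadCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

// Call this from your render loop instead of renderer.render(scene, camera).
function renderPanorama(renderer, scene, rigPosition) {
  cubeCamera.position.copy(rigPosition); // capture from the rig's viewpoint
  cubeCamera.update(renderer, scene);    // renders the 6 cube faces
  renderer.render(quadScene, quadCamera); // unwrap onto the wide canvas
}
```

You can then stretch the browser window (and the renderer's canvas) across all of your screens; each screen ends up showing a slice of the equirectangular panorama.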

Related

Camera Stacking in AR Game to Apply Overlay

I'm trying to create an AR game (similar to an FPS), which requires camera stacking so that I can have an AR base camera as well as an overlay camera for overlays such as health points, bullet count, etc. However, upon creating an empty GameObject and adding a Camera component, I realise that I do not see the Render Type option in the Inspector pane.
May I know what went wrong? Or how should I go about creating an AR game with a live AR camera feed and an overlay displaying health points, etc.?
Thanks and cheers!

Blender: render object from multiple angles

I have an object in Blender plus an HDRI background / environment map. I am using Cycles to render the object, and I have Blender 2.8.
I would like to take multiple pictures of the rendered object (with its background) so that I end up with multiple views of the object (say, about 5-10).
I have seen some posts out there, but they're not quite what I want because they only render in solid mode, whereas I actually want the full render.
I am a newbie with Blender and I don't even know where to start with this. Thank you.
You can render multiple pictures by rendering, saving the image, moving the camera, and repeating the process. Alternatively, you could render an animation with the camera moving to different angles and set the output to an image format.

Create scene in Blender

How can one create a scene/terrain like Clash Royale's game scene in Blender? The game scene looks like a top view, but is not exactly top-down. At what angle should the camera be placed to get that effect?
You are looking for isometric projection, or maybe 2.5D, also called 3/4 perspective. You can find several isometric tutorials on YouTube.
While I'm not sure there are any strict rules to follow unless you are making graphics that must fit an existing game, it is common to use an orthographic camera at about a 45-degree angle.
Clash Royale looks like it could be close to a 45-degree orthographic camera angle.

Replicating Preview image viewing with NSPageController

Thanks to Apple's PictureSwiper sample code and the very nice NSPageController tutorial from juniperi here on Stack Overflow, it's pretty easy to get tantalizingly close to the image-viewing capabilities of Preview. Specifically, I want to replicate the ability to swipe forwards/backwards between images/pages, use pinch-to-zoom to resize the images, use a gesture to rotate the images/pages, and support two-page mode.
But there are some hurdles that make me wonder if NSPageController is the right approach or if it is too limiting and a custom view controller is needed.
1) Images of varying sizes are simply displayed stacked, and if the top/upper-layer image is smaller, the underlying image(s) show through. With the same images, Preview hides the larger "underlying" images/pages and fades the underlying image in/out with the swipe transition. I could hide underlying images by linking the page controller to the view rather than the image cell (like PictureSwiper), but that causes the entire view to scale on pinch-to-zoom and overall looks clunky.
2) Is it possible to use NSPageController with more than one image cell, e.g. two-page mode?
3) Is page/image rotation possible with NSPageController?
4) Is it possible to lock the zoom level for all the images, so they are displayed at a uniform scale as you navigate?
My apologies if this is too general a question, but the gist is whether NSPageController is too limited and problematic to extend, which would necessitate building a custom controller from scratch.
Thanks.

iPad frame max size is not enough

I'm developing an iPad application for 2D drawing.
I need a UIView frame size of 4000x4000, but if I set a frame with size 4000x4000 the application crashes because I get a memory warning.
Right now I'm using a 1600x1000 frame size, and the user can add new objects (rectangles) to the frame. The user can also translate the frame along the x and y axes using a pan gesture in order to see or add new objects.
Do you have any suggestions? How can I tackle this problem?
Thanks
Well, I would suggest what has been used in video games for a long time: a tiled LOD mechanism, where tiles are rendered at increasing resolution only when you zoom in toward them, and at lower resolution when zoomed out.
If the drawing is based on shapes (rectangles, points, lines, or anything that can be represented by simple vector data), there is no reason to create a UIView the size of the entire drawing. Just redraw the currently visible view as the user pans across the drawing, using the stored vector data; there is no need for a persistent bitmapped representation of the drawing.
If you are using bitmap data for drawing (i.e., a Photoshop-type app), then you'll likely need a mechanism that caches off-screen data in secondary storage and loads it back onto the screen as the user pans across it. In either case, the UIView only needs to be as big as the physical screen.
Sorry, I don't have any iOS code examples for any of this; take it as a high-level approach and work from there.
Sounds like you want to be using UIScrollView.