three.js InstancedMesh with CylinderGeometry: different textures

I am using three.js's InstancedMesh to instantiate multiple cylinders from one CylinderGeometry. How can I apply different textures to the top and the side of the different cylinders?

Related

Texture and normal map on the same object in blender native renderer

I'm learning Blender, and I saw that you can have both a mapped texture and a normal map on the same object via a Mix node in the Cycles renderer. Can I do the same in Blender Render?
Yes, you can add several procedural or image textures if you wish (there is a limit of 18 textures per Blender Render material).
In the texture properties there is a list of available texture slots; just add a new one to an unused slot and adjust its properties.

When Are Texture Atlases Appropriate In SpriteKit?

I've seen that texture atlases are used for animations, but would it be appropriate to use them for storing UI elements or other items that are unrelated?
The short answer is yes, it can be appropriate. If the UI elements are related, you can put them together in a texture atlas.
From Apple's Documentation:
Using Texture Atlases to Collect Related Art Assets
Art assets stored in your app bundle aren’t always unrelated images. Sometimes they are collections of images that are being used together for the same sprite. For example, here are a few common collections of art assets:
Animation frames for a character
Terrain tiles used to create a game level or puzzle
Images used for user interface controls, such as buttons, switches, and sliders
If each texture is treated as a separate object, then Sprite Kit and the graphics hardware must work harder to render scenes—and your game’s performance might suffer. Specifically, Sprite Kit must make at least one drawing pass per texture. To avoid making multiple drawing passes, Sprite Kit uses texture atlases to collect related images together. You specify which assets should be collected together, and Xcode builds a texture atlas automatically. Then, when your game loads the texture atlas, Sprite Kit manages all the images inside the atlas as if they were a single texture. You continue to use SKTexture objects to access the elements contained in the atlas.

UV mapping in blender not showing properly when unwrapping

I am trying to UV map a cube in Blender 2.74, but even though all six faces are placed on the image on the left-hand side, only two of them actually show the texture on the cube on the right-hand side. I have tried unwrapping in different ways and moving the squares on the left-hand side around, but still only two sides show the image.
When I try more complicated shapes (a tree), none of the faces show the texture, no matter how I unwrap.
However, when I export as a .obj file and draw it with OpenGL, all sides are textured, with the texture coordinates in the places where I UV mapped them.
So my problem is that I don't know what the result will look like until I actually export the file.
How do I get all faces of an object to show their texture while I do the mapping?
A simple solution: duplicate a couple of those light sources (the black sphere outlined with a dashed line that appears in the default startup scene with the cube) and place them around your object. Also try switching between the viewport shading options (e.g. Solid vs. Textured).

wxWidgets multi layer transparency-enabled drawing

I am using wxWidgets to design a GUI that draws multiple layers with transparency on top of each other.
Therefore I have one method for each layer that draws with wxGraphicsContext onto the "shared" wxImage, which is then plotted to the wxWindow in the paintEvent method.
The layer data live in arrays with exactly the same dimensions as my wxImage, so of course I need to draw/manipulate the image pixel by pixel. Currently I am doing that with the drawRectangle routine, and my guess is that this is quite inefficient.
Is there a clever way to manipulate wxImage's pixel data directly, enabling me to still use transparency of each separate layer in the resulting image? Or is the 1x1 pixel drawing with drawRectangle sufficient?
Thanks for any thoughts on this!
You can efficiently manipulate wxImage pixels by just accessing them directly: the data are stored in two contiguous arrays, one RGB and one alpha (exposed via GetData() and GetAlpha()), which you can work with directly.
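As a rough sketch of that direct access, assuming the shared wxImage has an alpha channel (InitAlpha() was called) and a hypothetical layerRGBA buffer holding one layer as interleaved 8-bit RGBA:

    #include <algorithm>
    #include <wx/image.h>

    // Sketch: source-over blend of one RGBA layer into a shared wxImage.
    // `layerRGBA` is a hypothetical per-layer buffer of width*height
    // interleaved RGBA bytes; destination alpha handling is simplified.
    void BlendLayerIntoImage(wxImage& image, const unsigned char* layerRGBA)
    {
        const int n = image.GetWidth() * image.GetHeight();

        unsigned char* rgb   = image.GetData();   // contiguous RGBRGB... array
        unsigned char* alpha = image.GetAlpha();  // one alpha byte per pixel

        for (int i = 0; i < n; ++i)
        {
            const unsigned char a = layerRGBA[4 * i + 3];
            for (int c = 0; c < 3; ++c)
            {
                // Blend each colour channel of the layer over the existing pixel.
                rgb[3 * i + c] =
                    (layerRGBA[4 * i + c] * a + rgb[3 * i + c] * (255 - a)) / 255;
            }
            // Keep the more opaque of the two alpha values (one simple choice).
            alpha[i] = std::max<unsigned char>(alpha[i], a);
        }
    }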
The expensive operation is usually converting this wxImage to a wxBitmap that can be displayed; to avoid it, you can use raw bitmap access to manipulate the wxBitmap directly instead.
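And a sketch of the raw bitmap route, using wxAlphaPixelData from <wx/rawbmp.h> on a 32-bit-depth wxBitmap (note that on some platforms the bitmap stores premultiplied alpha, so real code may need to multiply the colour channels by the alpha):

    #include <wx/rawbmp.h>

    // Sketch: write a semi-transparent fill straight into a 32-bit wxBitmap,
    // skipping the wxImage -> wxBitmap conversion entirely.
    void FillBitmapDirect(wxBitmap& bmp)
    {
        wxAlphaPixelData data(bmp);
        if (!data)
            return; // raw access is not available for this bitmap

        wxAlphaPixelData::Iterator p(data);
        for (int y = 0; y < data.GetHeight(); ++y)
        {
            wxAlphaPixelData::Iterator rowStart = p;
            for (int x = 0; x < data.GetWidth(); ++x, ++p)
            {
                p.Red()   = 0;
                p.Green() = 128;
                p.Blue()  = 255;
                p.Alpha() = 128; // half-transparent layer pixel
            }
            p = rowStart;
            p.OffsetY(data, 1); // advance to the next row
        }
    }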

Disable mipmapping in OpenGL ES 2.0

I would like to draw several of the same figures (with the same texture) on screen (OpenGL ES 2.0). These figures will use different magnification and minification filters, and different mipmapping states.
The issue: once I use mipmapping to draw any figure (i.e. once I have called glGenerateMipmap()), I can't switch mipmapping off again.
Is it possible to switch mipmapping off after glGenerateMipmap() has been called at least once?
glGenerateMipmap only generates the smaller mipmap images (based on the top-level image). But those mipmaps are not used for filtering unless you select a proper mipmapping filter mode (through glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_..._MIPMAP_...)). So if you don't want your texture mipmap-filtered, just disable mipmapping for this particular texture by setting either GL_NEAREST or GL_LINEAR as the minification filter. Likewise, not calling glGenerateMipmap does not mean that no mipmapping is going on. A mipmapping filter mode (which is also the default for a newly created texture) will still be used; the mipmap images will just contain rubbish (or the texture is actually incomplete, resulting in implementation-defined behaviour, usually a black texture).
Likewise, you shouldn't call glGenerateMipmap each frame before rendering. Call it once, after setting the base image of the texture. As said, it generates the mipmap images, and those won't go away after they have been generated. What decides whether mipmapping is actually used is the texture object's minification filter mode.
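A minimal sketch of that pattern in OpenGL ES 2.0 (the RGBA format and the GL_LINEAR filter choices here are illustrative assumptions, not taken from the question):

    #include <GLES2/gl2.h>

    // Create a texture, upload the base image, and generate mipmaps ONCE.
    GLuint CreateTexture(const void* pixels, GLsizei w, GLsizei h)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glGenerateMipmap(GL_TEXTURE_2D); // once, after the base level is set
        return tex;
    }

    // Before drawing a figure: choose per texture whether mipmaps are sampled.
    // The mipmap images stay allocated either way; only the filter mode decides.
    void SetMipmappingEnabled(GLuint tex, bool enabled)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        enabled ? GL_LINEAR_MIPMAP_LINEAR : GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }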