We're trying to get tiling textures working for AR Quick Look (iOS, in Pixar's USDZ format), but we're running into a problem.
What we have:
A project in Blender where we scale the texture via a Mapping node (screenshot below), and everything looks fine there: the texture tiles properly.
When I export to glTF 2.0, you can see that the texture is not scaled (the scale should be 100, 100), which is why it looks bad. Using untiled textures for things like roads is a bad idea, so that's why I rely on tiling.
The same happens with USDZ, but I think that's because of the glTF format.
I'm not sure whether I need to do something differently when exporting from Blender to glTF.
This question might be better suited for Blender SE, not here.
The glTF exporter is looking for a shader node called "UV Map" instead of the "Texture Coordinate" node you have there. I realize the names are almost synonymous, but the "UV Map" node has a chooser for which UV map to use, and that's what the exporter wants to find. (For more detail, see the exporter's documentation.)
Also, I don't know whether glTF export supports that little splitter (reroute) node you have in your graph. Try drawing individual links from the Mapping node to each of the image textures.
My setup is modelling in Blender 2.8 and texturing in Substance Painter. Finished models are exported as glTF and loaded in A-Frame. My models look either washed out or oversaturated, depending on whether color management is on or off.
Any idea what's wrong with my workflow?
Blender 2.80b
A-Frame 0.9.0
Chrome 73.0.3683.86
MacOS 10.14.3
This is my typical node setup in Blender:
Here's an example with sRGB encoding in A-Frame:
And one with linear encoding:
Finally this is the preview in Substance Painter:
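For reference, color management in A-Frame is toggled on the scene's renderer component, not per model. Below is a minimal sketch of scene markup (not the poster's actual file; the renderer flags and model path are assumptions) showing the combination usually tested first with glTF/PBR assets: colorManagement on, with the base color texture encoded as sRGB.
<!-- Hypothetical scene: renderer flags and model.glb are placeholders. -->
<a-scene renderer="colorManagement: true; physicallyCorrectLights: true">
  <a-assets>
    <a-asset-item id="model" src="model.glb"></a-asset-item>
  </a-assets>
  <a-gltf-model src="#model"></a-gltf-model>
</a-scene>
If colors still look off with color management on, a common culprit is a texture saved with the wrong color space: the base color map should be sRGB, while normal/roughness/metallic maps should be Non-Color.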
I'm creating an animation with a simple rig in Blender and using it in A-Frame 0.7.0. I have a box with a lid that opens via an Euler rotation, but all children of the armature rotate with the lid even though they are not supposed to animate. Wiring up the flaps to their respective bones also distorts the geometry of the flaps, but I'm simplifying the problem to just the lid for now to try to understand what is happening.
Animation works fine in Blender and apparently works in UX3D.
Attempts to separate the mesh into pieces and de-parent them from the armature result in the de-parented meshes not rendering at all, despite exporting all objects.
I've tried Blender 2.78c and 2.79 and virtually all combinations of glTF export options with the latest Blender glTF 2.0 exporter from Khronos.
Blender Screenshot
A-Frame Demo
<a-gltf-model cursor-listener id="gift" src="#rigged-gift" animation-mixer=""></a-gltf-model>
Blender source included in CodePen link
Appreciate any direction I can get on this problem!
This is now resolved with Blender 2.8 and above.
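For completeness, once the box is re-exported from Blender 2.8, playback can be limited to the intended action through the same animation-mixer attribute (from aframe-extras) shown above. A hedged sketch, assuming the Blender action was exported under the name "LidOpen":
<!-- Assumes aframe-extras' animation-mixer and an exported action named "LidOpen" -->
<a-gltf-model cursor-listener id="gift" src="#rigged-gift" animation-mixer="clip: LidOpen; loop: once"></a-gltf-model>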
How can one bring Blender camera paths into Three.js to use as camera paths there?
Is there a way to import a Blender spline into Three.js for further use as a camera path?
Can Three.js extract the points of a Blender spline and create a path for its camera?
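glTF has no spline primitive, so the curve data usually has to be brought over separately (for example, sampled in Blender and written out as a list of points). Here is a minimal sketch of the Three.js side, with placeholder point values and assuming the Blender-to-Three.js axis conversion (Z-up to Y-up) has already been applied:
import * as THREE from 'three';

// Placeholder points; in practice these come from the sampled Blender spline.
const points = [
  new THREE.Vector3(0, 2, 10),
  new THREE.Vector3(5, 3, 5),
  new THREE.Vector3(0, 4, 0),
  new THREE.Vector3(-5, 3, -5),
];
const path = new THREE.CatmullRomCurve3(points);

// Call every frame to slide the camera along the path over `duration` seconds.
const duration = 10;
function updateCamera(camera, elapsedSeconds) {
  const t = (elapsedSeconds % duration) / duration; // normalized 0..1 position on the curve
  path.getPointAt(t, camera.position);              // writes the sampled point into camera.position
  camera.lookAt(0, 0, 0);                           // keep looking at the scene origin
}
Note that getPointAt samples the curve by arc length, so the camera moves at a constant speed regardless of how the control points are spaced.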
I am using the open-source structure-from-motion program insight3D to reconstruct a 3D model from a set of 2D images. I was able to export the resulting model as a VRML (.wrl) file, but the texture and texture coordinates were not included when I inspected the resulting file in Blender/MeshLab. Has anyone been successful in exporting a textured model with insight3D? Thanks!