QML: apply a texture to a Mesh

I am trying to apply an image texture to a Mesh in QML (Qt 5.6.2). I started from the "Shadow Map QML" sample and want to texture the GroundPlane. Material and Effect QML classes are applied to that GroundPlane mesh, but I can't see how to apply an image texture. In QML there are TextureImage and ShaderEffect, but nothing about how they can be applied to a Mesh.
Any ideas?
EDIT:
Qt 5.6.2 is not the right version to use for Qt3D, as the first "fully supported release of a stable Qt 3D module" came in Qt 5.7. So I'll have a look at Qt 5.7, or maybe 5.8 now. And at first glance, there are some texture properties for the mesh.
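For anyone else landing here, a minimal sketch of what that looks like with the Qt 5.7 C++ API, assuming Qt3DExtras' QDiffuseMapMaterial and Qt3DRender's QTextureLoader (the QML types PlaneMesh, DiffuseMapMaterial and TextureLoader mirror these classes; the texture path and plane size are placeholders):

#include <Qt3DCore/QEntity>
#include <Qt3DExtras/QPlaneMesh>
#include <Qt3DExtras/QDiffuseMapMaterial>
#include <Qt3DRender/QTextureLoader>
#include <QUrl>

Qt3DCore::QEntity *makeTexturedGround(Qt3DCore::QEntity *root)
{
    auto *ground = new Qt3DCore::QEntity(root);

    auto *mesh = new Qt3DExtras::QPlaneMesh;
    mesh->setWidth(50.0f);   // placeholder size
    mesh->setHeight(50.0f);

    // QTextureLoader loads the image file into a texture...
    auto *texture = new Qt3DRender::QTextureLoader(ground);
    texture->setSource(QUrl(QStringLiteral("qrc:/assets/ground.png"))); // placeholder path

    // ...and the material samples it as its diffuse map.
    auto *material = new Qt3DExtras::QDiffuseMapMaterial;
    material->setDiffuse(texture);

    ground->addComponent(mesh);
    ground->addComponent(material);
    return ground;
}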

There is a simple example for you:
https://github.com/tripolskypetr/simpleqml3d
Look at IronMan.qml.

Related

How to draw using Vulkan on gtk4 widget?

Is there an example of how to run Vulkan commands on a GTK4 widget?
There are some explanations saying it can be done by using the GTK OpenGL widget and then drawing onto a GL texture, but I am not sure how to do it.

Assimp FBX loader and PBR textures

I would like to know whether the Assimp FBX loader supports PBR materials.
I am currently using it with glTF/glb files, and it loads my PBR textures perfectly.
I am loading PBR textures via the "assimp/pbrmaterial.h" header file, but this file only defines glTF macros.
How can I load PBR textures when using the FBX file format with Assimp?
Regards.
I don't think it can. glTF 2.0 uses a single texture that contains metallic on the blue channel and roughness on the green. And from my own testing with Blender v2.93.3 (the latest right now), if you use its Shader Editor to split that single texture into separate RGB channels, the FBX won't get saved with any paths to it. Even when you import the FBX back into Blender, it will only have the base color and normal map applied, nothing else. So I don't think you can expect Assimp (v5.0.1) to work with it either. This might just be a bug in Blender, I'm not sure, because it seems that if metallic and roughness are individual textures, Blender can correctly import the FBX back.

It's a pretty big oversight that you can't export (as FBX) models that use multi-channel textures. Pretty much all PBR workflows involve having them merged into a single texture, to reduce texture lookups.
I don't know... it seems like glTF 2.0 is a much better format. It comes with a GPU-friendly binary (compared to something like Wavefront OBJ, which is very slow), and you can even keep the textures separate if you choose the "glTF Separate" format when you export. You automatically get a merged PNG with both metallic and roughness, as I said before.
If you really want to support FBX files (I know I do; it's a popular format), what you could do is have it correctly identify and load the base color and normal map, then manually load the "PBR" texture somewhere before the render loop starts, and manually bind it and send it as a uniform to the fragment shader before drawing. I tested this and it works:
glActiveTexture(GL_TEXTURE2); // Texture unit 2
glBindTexture(GL_TEXTURE_2D, *pMyMetRoughTexture);
pShader->SetInt("uMaterial.MetallicRoughnessTexture", 2); // Tell the sampler2D from the fragment shader to use texture unit 2
pMyModel->Draw(pShader);
But having two or three textures loaded automatically and one left up to you to handle manually, for every... single... model... is just... bleh. I'm sorry if this isn't a "proper" answer. I'm really disappointed by the lack of PBR support for something that's used so ubiquitously, in I think all AAA games of the last few years.
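For reference, a minimal sketch of the fragment-shader side that the snippet above feeds into, assuming plain uniform names rather than the uMaterial struct (GLSL embedded in a C++ raw string; the shading itself is a placeholder):

// Minimal sketch: where the packed glTF 2.0 channels live (roughness = G, metallic = B).
static const char *kFragmentShaderSrc = R"GLSL(
#version 330 core
in vec2 vTexCoords;
out vec4 FragColor;

uniform sampler2D uBaseColorTexture;         // texture unit 0 (assumed)
uniform sampler2D uMetallicRoughnessTexture; // texture unit 2, bound as shown above

void main()
{
    vec4 baseColor  = texture(uBaseColorTexture, vTexCoords);
    vec3 mr         = texture(uMetallicRoughnessTexture, vTexCoords).rgb;
    float roughness = mr.g; // green channel
    float metallic  = mr.b; // blue channel

    // Placeholder shading: a real renderer would feed roughness/metallic into its BRDF.
    FragColor = vec4(baseColor.rgb * (1.0 - 0.5 * metallic) * (1.0 - 0.25 * roughness), baseColor.a);
}
)GLSL";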

GLTF and USDZ repeat and scale texture problem (tiling texture)

We're trying to do tiling textures for AR Quick Look (iOS, in Pixar's USDZ format), but are running into a problem.
What we have:
A project in Blender where we scale the texture via a Mapping node (screenshot below), and everything looks fine, tiled properly.
When I export to glTF 2.0, you can see that the texture is not scaled (the scale should be 100, 100), and that is why it looks bad. Using untiled textures for things like roads is a bad idea, which is why I'm tiling them.
The same goes for USDZ, but I think that is because of the glTF format.
I'm not sure whether I should be doing something differently when exporting from Blender to glTF.
This question might be better suited for Blender SE, not here.
The glTF exporter is looking for a shader node called "UV Map" instead of that "Texture Coordinate" node you have there. I realize the names are almost synonymous, but the "UV Map" node has a chooser for which UV Map, and that's what the exporter wants to find. (For more detail, there is documentation.)
Also I don't know if glTF export supports that little splitter node you have in your graph there. Try drawing individual lines from the "mapping" box to each of the image textures.

A-Frame 0.70: Animated Blender-Exported glTF 2.0 w/ Simple Rig Doesn't Render Correctly

I'm creating an animation with a simple rig in Blender and using it in A-Frame 0.70. I have a box with a lid that opens via an Euler rotation, but all children of the armature rotate with the lid even though they are not supposed to animate. Wiring up the flaps to their respective bones also distorts the geometry of the flaps, but I am simplifying this problem to just the lid for now to try to understand what is happening.
Animation works fine in Blender and apparently works in UX3D.
Attempts to separate the mesh into pieces and de-parent them from the armature result in the de-parented meshes not rendering at all, despite exporting all objects.
I tried Blender 2.78c and 2.79 and virtually all combinations of glTF export options with the latest Blender glTF 2.0 exporter from Khronos.
Blender Screenshot
A-Frame Demo
<a-gltf-model cursor-listener id="gift" src="#rigged-gift" animation-mixer=""></a-gltf-model>
Blender source included in CodePen link
Appreciate any direction I can get on this problem!
This is now resolved with Blender 2.8 and above.

How to map kinect skeleton data to a model?

I have set up a Kinect device and written a simple program that reads the stream into a QImage using OpenNI 2.0. I have set up skeleton tracking with NiTE 2.0, so I have access to the coordinates of all 15 joints. I have also set up a simple scene using SceniX. The hand coordinates provided by the skeleton tracking are being used to draw two boxes representing the hands.
I would like to bind the whole skeleton to a (rigged) model, and can't seem to find any good tutorials. Does anyone have any idea how I should proceed?
Depending on your requirements, you could look at something like this for the Unity engine: https://www.assetstore.unity3d.com/en/#!/content/10693
There is also a plugin for Unreal Engine 4 called Kinect 4 Unreal, from Opaque Multimedia.
But if you have to write it all yourself, I have done something similar using OpenGL.
I used Assimp (http://assimp.sourceforge.net/) to load animated Collada models, and OpenNI with NiTE for skeletal tracking. I then took the rotation data from the NiTE skeleton and applied it to the corresponding bones of my rigged mesh, overwriting the rotation values of the animation (see the sketch below). Don't use positional data; it will stretch your bones and distort the mesh.
There are many sources of free 3D models, like TF3DM.com. I used a custom rig for my models so they would suit my code, so you might look into using Blender and how to rig a model.
Also remember that the NiTE skeleton has no joint for the pelvis, and that NiTE joints don't inherit their parent's rotation, unlike the bones in a rigged model.
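A rough sketch of that rotation overwrite, assuming Assimp and NiTE; ApplyJointRotation and the mapping from NiTE joints to bone nodes are placeholders of mine, not library API:

#include <assimp/scene.h>  // aiNode, aiMatrix4x4, aiQuaternion
#include <NiTE.h>          // nite::Quaternion

// Overwrite the local rotation of one bone node with a NiTE joint orientation.
void ApplyJointRotation(aiNode *boneNode, const nite::Quaternion &q)
{
    // Keep the bone's original translation and scale; replace only the rotation,
    // since positional data would stretch the bones and distort the mesh.
    aiVector3D scaling, translation;
    aiQuaternion oldRotation;
    boneNode->mTransformation.Decompose(scaling, oldRotation, translation);

    // NiTE orientations are absolute (joints don't inherit their parent's rotation),
    // so a full solution would first convert q into the bone's local space;
    // it is applied directly here for brevity.
    aiQuaternion rotation(q.w, q.x, q.y, q.z);

    aiMatrix4x4 t, s;
    aiMatrix4x4::Translation(translation, t);
    aiMatrix4x4::Scaling(scaling, s);
    boneNode->mTransformation = t * aiMatrix4x4(rotation.GetMatrix()) * s;
}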
I hope this gives you something to go on.
You can try DigitalRune; they have examples of binding a rigged model to joints: http://www.digitalrune.com/Support/Blog/tabid/719/EntryId/155/Research-Augmented-Reality-with-Microsoft-Kinect.aspx
You would also need to know how to animate a model in Blender and export it to XNA or to your working graphics framework, e.g. http://www.codeproject.com/Articles/230540/Animating-single-bones-in-a-Blender-3D-model-with#SkinningSampleProject132