Assimp FBX loader and PBR textures

I would like to know if the Assimp FBX loader supports PBR materials.
I am currently using it with glTF/glb files and it loads my PBR textures perfectly.
I am loading PBR textures via the "assimp/pbrmaterial.h" header file, but this file only defines glTF macros.
How can I load PBR textures when using the FBX file format with Assimp?
Regards.

I don't think it can. glTF 2.0 packs metallic and roughness into a single texture: metallic on the blue channel, roughness on the green. And from my own testing with Blender v2.93.3 (the latest right now), if you use its Shader Editor to split that single texture into separate RGB channels, the FBX won't get saved with any paths to it. Even when you import the FBX back into Blender, it will only have the base color and normal map applied, nothing else. So I don't think you can expect Assimp (v5.0.1) to work with it either... But this might just be a bug in Blender, I'm not sure, because it seems that if metallic and roughness are individual textures, Blender can correctly import the FBX back.
It's a pretty big oversight that you can't export (as FBX) models that use multi-channel textures. Pretty much all PBR workflows involve merging them into a single texture to reduce texture lookups.
I don't know... glTF 2.0 seems like a much better format. It comes with a GPU-friendly binary (compared to something like Wavefront OBJ, which is very slow to parse), and you can even keep the textures as separate files if you choose the "glTF Separate" format when you export. You automatically get a merged PNG with both metallic and roughness, like I said before.
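For reference, this is roughly how that merged metallic/roughness texture comes through Assimp's glTF path (a minimal sketch against Assimp 5.0.x; the wrapper function is my own, but the matkey macro is one of the glTF-only ones from "assimp/pbrmaterial.h"):

#include <assimp/material.h>
#include <assimp/pbrmaterial.h>
#include <string>

// Returns the path of the packed metallic(B)/roughness(G) texture,
// or "" if the material doesn't have one (e.g. anything loaded from FBX).
std::string GetMetallicRoughnessPath(const aiMaterial* material)
{
    aiString path;
    // The macro expands to (aiTextureType_UNKNOWN, 0); only the glTF importer fills it in.
    if (material->GetTexture(AI_MATKEY_GLTF_PBRMETALLICROUGHNESS_METALLICROUGHNESS_TEXTURE, &path) == aiReturn_SUCCESS)
        return path.C_Str();
    return "";
}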
If you really wanna have support for FBX files (I know I do; it's a popular format), what you could do is have Assimp correctly identify and load the base color and normal map, but then you have to manually load the "PBR" texture somewhere before the render loop starts (a sketch of that load step follows the snippet below), and then manually bind the texture and send it as a uniform to the fragment shader before drawing. I tested this and it works.
glActiveTexture(GL_TEXTURE2); // Texture unit 2
glBindTexture(GL_TEXTURE_2D, *pMyMetRoughTexture);
pShader->SetInt("uMaterial.MetallicRoughnessTexture", 2); // Tell the sampler2D from the fragment shader to use texture unit 2
pMyModel->Draw(pShader);
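And for that manual load step, here's a minimal sketch of creating the GL texture before the render loop (assuming stb_image for decoding; the helper name is made up, and pMyMetRoughTexture above would point at its result):

#include "stb_image.h"
// GL function pointers come from your loader of choice (glad, GLEW, ...)

// Hypothetical helper: loads the packed metallic(B)/roughness(G) image into a GL texture.
GLuint LoadMetallicRoughnessTexture(const char* filePath)
{
    int width, height, channels;
    unsigned char* pixels = stbi_load(filePath, &width, &height, &channels, 4);
    if (!pixels)
        return 0; // File missing or not decodable

    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glGenerateMipmap(GL_TEXTURE_2D);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    stbi_image_free(pixels);
    return texture;
}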
But having 2/3 textures loaded automatically and 1 left up to you to manually handle, for every... single... model... is just... bleh. I'm sorry if this isn't a "proper" answer. I'm really disappointed by the lack of PBR support for something that's used so ubiquitously in, I think, all AAA games of the last few years.

Related

How can I render text using font files on Direct3D 11?

I've read many questions about this, but they don't cover what I'm trying to do. I want to use a TTF file as the font for the text in my application. I thought of using DirectDraw, but the tutorial on the Microsoft website only explains how to use it with Direct2D. How am I supposed to load data from this file and render text for my Direct3D application using this file's font? I've also read about the AddFontResourceExA() function, but I couldn't find any examples of how to use it. I'm really lost here, so any help is appreciated.
There are basically two approaches for rendering text on a Direct3D 11 Render Target / Texture.
Rendering using a 'sprite sheet'. Here you capture the font at a particular resolution and generate a texture from it, then use that texture to render the glyphs as textured triangles. This is very fast and inexpensive to render, but it does not scale to arbitrary resolutions (you can capture the 'sprite sheet' at multiple point sizes to get some scaling) and does not work well with CJK languages due to the large size of those fonts. For an example of this, see SpriteFont in the DirectX Tool Kit. This is what the legacy D3DX9/D3DX10 libraries did as well.
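A minimal sketch of the sprite-sheet route with the DirectX Tool Kit (assumes you've already generated myfont.spritefont with the toolkit's MakeSpriteFont utility; the function names and globals are placeholders):

#include <SpriteBatch.h>
#include <SpriteFont.h>
#include <DirectXColors.h>
#include <memory>

using namespace DirectX;

std::unique_ptr<SpriteBatch> g_spriteBatch;
std::unique_ptr<SpriteFont>  g_spriteFont;

// One-time setup, once the D3D11 device and immediate context exist
void InitTextRendering(ID3D11Device* device, ID3D11DeviceContext* context)
{
    g_spriteBatch = std::make_unique<SpriteBatch>(context);
    g_spriteFont  = std::make_unique<SpriteFont>(device, L"myfont.spritefont");
}

// Per frame, after the 3D scene has been rendered
void DrawFrameText()
{
    g_spriteBatch->Begin();
    g_spriteFont->DrawString(g_spriteBatch.get(), L"Hello, Direct3D 11!",
                             XMFLOAT2(10.f, 10.f), Colors::White);
    g_spriteBatch->End();
}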
Rendering using vector fonts directly. Here you have some kind of library that generates triangles 'on-the-fly' from the TrueType vector font data. This is what Direct2D + DirectWrite is designed to do. You can use interop with Direct3D 11 surfaces, but essentially you are going DirectWrite -> Direct2D -> shared texture, and then you draw the shared texture with Direct3D as a 'sprite'. This is more complicated to set up, but it gives you arbitrary resolution scaling, support for fonts with large character sets, and handling of complex writing systems.
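A sketch of that second route, in the simple case where Direct2D draws straight onto the swap chain's back buffer instead of a shared texture (assumes the D3D11 device was created with the D3D11_CREATE_DEVICE_BGRA_SUPPORT flag; factories, names, and the font choice are placeholders):

#include <d2d1.h>
#include <dwrite.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <wchar.h>

using Microsoft::WRL::ComPtr;

// One-time setup: wrap the swap chain's back buffer in a D2D render target.
void CreateTextResources(IDXGISwapChain* swapChain, ID2D1Factory* d2dFactory,
                         IDWriteFactory* dwriteFactory,
                         ComPtr<ID2D1RenderTarget>& d2dRT,
                         ComPtr<IDWriteTextFormat>& textFormat,
                         ComPtr<ID2D1SolidColorBrush>& brush)
{
    ComPtr<IDXGISurface> surface;
    swapChain->GetBuffer(0, IID_PPV_ARGS(&surface));

    D2D1_RENDER_TARGET_PROPERTIES props = D2D1::RenderTargetProperties(
        D2D1_RENDER_TARGET_TYPE_DEFAULT,
        D2D1::PixelFormat(DXGI_FORMAT_UNKNOWN, D2D1_ALPHA_MODE_PREMULTIPLIED));
    d2dFactory->CreateDxgiSurfaceRenderTarget(surface.Get(), &props, &d2dRT);

    dwriteFactory->CreateTextFormat(L"Segoe UI", nullptr,
        DWRITE_FONT_WEIGHT_NORMAL, DWRITE_FONT_STYLE_NORMAL, DWRITE_FONT_STRETCH_NORMAL,
        24.0f, L"en-us", &textFormat);
    d2dRT->CreateSolidColorBrush(D2D1::ColorF(D2D1::ColorF::White), &brush);
}

// Per frame, after the 3D scene has been rendered
void DrawOverlayText(ID2D1RenderTarget* d2dRT, IDWriteTextFormat* format, ID2D1Brush* brush)
{
    const wchar_t* text = L"Hello, DirectWrite!";
    d2dRT->BeginDraw();
    d2dRT->DrawText(text, (UINT32)wcslen(text), format,
                    D2D1::RectF(10.f, 10.f, 400.f, 50.f), brush);
    d2dRT->EndDraw();
}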

Godot doesn't import glTF material shaders correctly

I have a model that I am using as my game world in Godot, and I am putting some textures on it in Blender. When I export the model to .glb, some of the materials don't seem to export correctly: the ones that use an image texture I scaled with a Mapping node.
It just skips the first two nodes. When I disconnect them, the material looks like it does in Godot.
What it should look like (what it looks like in Blender) vs. what it looks like in Godot: [screenshots omitted]
There are also some materials that use mix shaders to mix images together. These are also skipped, and Godot just uses one of the two images instead of mixing them.
What it should look like vs. what it looks like in Godot: [screenshots omitted]
The F is from an image with text that is overlaid on top of the other image.
I am not really sure whether these issues lie with the glTF format (maybe certain shaders just don't work with it) or whether Godot doesn't import them.
I hope someone can help me with this.
Thanks in advance and have a nice day,
Rover
In general, arbitrary node graphs cannot be exported from Blender — see https://blender.stackexchange.com/a/57541/43930 for the long explanation of that. To ensure that the material will export correctly, you'll need to refer to the documentation for the Blender glTF exporter and configure your material accordingly. For more complicated effects, you can also bake Blender nodes down to simpler textures.

GLTF and USDZ repeat and scale texture problem (tiling texture)

We're trying to make tiling textures for AR Quick Look (iOS, in Pixar's USDZ format), but we're running into a problem.
What we have:
A project in Blender where we scale the texture via a Mapping node (screenshot omitted), and everything looks fine: it is tiled properly.
When I export to glTF 2.0, you can see that the texture is not scaled (the scale should be 100, 100), and that is why it looks bad. Using untiled textures for things like roads is a bad idea, which is why I'm tiling them.
The same goes for USDZ, but I think that is because of the glTF format.
I'm not sure whether I should be doing something differently when exporting from Blender to glTF.
This question might be better suited for Blender SE, not here.
The glTF exporter looks for a shader node called "UV Map" instead of the "Texture Coordinate" node you have there. I realize the names are almost synonymous, but the "UV Map" node has a selector for which UV map to use, and that is what the exporter wants to find. (For more detail, see the exporter's documentation.)
Also, I don't know if glTF export supports that little splitter node you have in your graph. Try drawing individual lines from the Mapping node to each of the image textures.

Obj-C method to assign colours to pixels directly?

Currently I am using SpriteKit for all of the graphics in my programs. Recently I've been interested in drawing things like the Mandelbrot set, the bifurcation curve, etc.
To draw these on my screen, I use one node per pixel, which obviously means my program has very low performance with over 100,000 nodes on the screen.
I want to find a way of colouring in pixels directly, without drawing any nodes (but I want to stick to Obj-C and Xcode).
Is there some way of doing this through Core Graphics, or something similar?
Generally you would use OpenGL ES or Metal to do this.
Here is a tutorial that describes using OpenGL ES shaders with SpriteKit to draw the Mandelbrot set:
https://www.weheartswift.com/fractals-xcode-6/
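Since you mention Core Graphics: you can also fill a plain pixel buffer on the CPU and wrap it in a CGImage with a bitmap context, then show it through a single textured node instead of 100,000 individual ones. A minimal sketch using the C-level Core Graphics API (the gradient fill is just a placeholder for your fractal logic, and handing the image to SpriteKit via [SKTexture textureWithCGImage:] happens on the Obj-C side):

#include <CoreGraphics/CoreGraphics.h>
#include <cstdint>
#include <vector>

// Builds a CGImage from a raw RGBA pixel buffer filled on the CPU.
CGImageRef CreatePixelImage(size_t width, size_t height)
{
    std::vector<uint32_t> pixels(width * height);
    for (size_t y = 0; y < height; ++y)
        for (size_t x = 0; x < width; ++x)
        {
            // Placeholder colouring: swap in your Mandelbrot/bifurcation computation here.
            uint8_t r = (uint8_t)(255 * x / width);
            uint8_t g = (uint8_t)(255 * y / height);
            // A little-endian uint32_t lays out as R,G,B,A bytes in memory.
            pixels[y * width + x] = 0xFF000000u | ((uint32_t)g << 8) | r;
        }

    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels.data(), width, height,
        8, width * 4, space, kCGImageAlphaPremultipliedLast);
    CGImageRef image = CGBitmapContextCreateImage(ctx); // copies the pixels

    CGContextRelease(ctx);
    CGColorSpaceRelease(space);
    return image; // caller releases with CGImageRelease()
}

// Obj-C side, for reference: SKTexture *tex = [SKTexture textureWithCGImage:image];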

Using vectors in iOS

I'm working on a simple iOS game that always draws 5 to 10 layers of 32-bit PNG images, which requires enough memory to crash on the iPod touch 4G when Retina is enabled. On other devices it works just fine, and I'm not even getting memory warnings. So I tried lower-quality formats like RGB5_A1, but they look really bad because I need alpha transparency and lots of gradients.
Since all the images are exported from Illustrator, I was thinking that maybe I could just export a vector image and draw it on iOS. From what I've researched, hardly anyone has tried this, and the only option I've come across is implementing an SVG parser for Quartz.
Did I miss anything?
I'm also worried about performance, but I couldn't find any benchmarks.
Without knowing specifics of your game, I'm going to make a few assumptions based on normal use...
You are not going to want to use straight vector graphics for this. Stick with your raster graphics.
If you are talking about 32 bits per pixel for your PNG images, that is already the standard: iOS uses 32-bit images, 8 bits each for red, green, blue, and alpha (Photoshop labels this "PNG-24" because it counts only the 24 colour bits). If your exports carry more than 8 bits per channel, you have extra bytes for every pixel shown and should scale back.
If you are using Adobe products, import the Illustrator file into Photoshop and use the "Save for Web..." option. Choose PNG-24 and you'll be all set.