ARKit face tracking PBR - cocoa-touch

I am using the ARKit face tracking configuration and displaying the face mesh in real time. I can successfully add diffuse and normal maps to it and they display correctly, but I have had no luck with roughness or metalness: roughness has no effect, and metalness renders all black and dull/opaque. Things I have tried:
self.contentNode?.geometry?.firstMaterial?.roughness.contents = UIColor.black //UIColor.white
self.contentNode?.geometry?.firstMaterial?.metalness.contents = UIColor.black //UIColor.white
self.contentNode?.geometry?.firstMaterial?.roughness.contents = myMetallnessTextureImage
self.contentNode?.geometry?.firstMaterial?.metalness.contents = myRoughnessTextureImage
It is worth noting that most of the light management is done by the session; that is why the mesh responds to ambient and directional light. When there is only ambient light, the mesh looks dielectric/rough, and when there is a strong directional light it looks metallic/smooth. Is the session taking control of these parameters?
If I export the face mesh as an .obj and open it in Xcode, I can tweak the material properties and get a fully metallic, shiny face, but when I apply the same parameters to my mesh in real time they do not work.
In the end, what I want is to be able to make part of the face metallic by using a grayscale map.
I am also aware that the face tracking configuration does not allow an environment map (correct me if I am wrong), so a fully mirror-like texture will look quite unrealistic, but it should still work.
Thanks!

Solved: I was missing this parameter in my setup method:
sceneView.autoenablesDefaultLighting = true
This adds what seems to be an ambient light to the mesh's reflections, and roughness and metalness now take effect.
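For reference, a minimal sketch of the working setup (diffuseMap, normalMap, and metalnessMap are illustrative names, and setting the lighting model to .physicallyBased is an assumption the question does not state):

sceneView.autoenablesDefaultLighting = true      // the missing piece: gives PBR a light to respond to

if let material = self.contentNode?.geometry?.firstMaterial {
    material.lightingModel = .physicallyBased    // assumption: PBR shading model
    material.diffuse.contents = diffuseMap       // illustrative texture names
    material.normal.contents = normalMap
    material.metalness.contents = metalnessMap   // grayscale: white = metallic, black = dielectric
    material.roughness.contents = 0.3            // or a grayscale roughness map
}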

Related

Hide an object for a specific camera

I use Godot to create my 3D game. I ran into a problem while creating portals that use a camera rendering to a viewport texture. The problem is that the camera captures unnecessary objects that are behind the portal. I partially solved this by setting the camera's "near" parameter to the distance from the camera to the portal, but then the part behind the portal began to be cut off.
The question is, is it possible to hide objects for a particular camera so that other cameras can see them? Perhaps there is another way to do this, for example by creating a static clipping plane?
Proximity Fade
Probably not what you are looking for, but I'll mention it for completeness' sake.
The default material has proximity fade and distance fade, which you can use to make the material disappear if it is too close or too distant from the camera, respectively.
It is important to note that this is not a cull plane, and that the fading is gradual.
Thus, using proximity fade you can make objects near the camera appear semitransparent.
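For example, a minimal sketch of enabling proximity fade on a SpatialMaterial in Godot 3.x (the node path and fade distance are illustrative, and this assumes the first surface has a SpatialMaterial assigned):

var mat := $Portal.get_surface_material(0) as SpatialMaterial
if mat:
	mat.proximity_fade_enable = true
	mat.proximity_fade_distance = 1.5  # fade the surface out within 1.5 units of the camera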
Using Visibility layers and cull mask
is it possible to hide objects for a particular camera so that other cameras can see them?
Every VisualInstance (you know, all things that are visible in 3D) has layers. And every Camera has a cull_mask. If the cull_mask of the Camera does not include any of the layers of a VisualInstance, then the Camera does not see that VisualInstance.
A VisualInstance with no layers will not show on any Camera, even if the Camera has all the layers in its cull_mask (which is the default).
You can either edit the cull_mask of the camera to not include the layers of the VisualInstance, or edit the layers of the VisualInstance, or both.
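A rough sketch of that in GDScript for Godot 3.x (node paths are illustrative):

# Put the objects behind the portal on layer 2 only (layers are a bitmask).
var hidden := $HiddenBehindPortal as VisualInstance
hidden.layers = 1 << 1

# The portal camera ignores layer 2; the main camera keeps seeing it.
var portal_cam := $PortalViewport/Camera
portal_cam.cull_mask = portal_cam.cull_mask & ~(1 << 1)
var main_cam := $MainCamera
main_cam.cull_mask = main_cam.cull_mask | (1 << 1)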
Using a custom shader cull plane
Perhaps there is another way to do this, for example by creating a static clipping plane?
You can use a custom spatial shader to cut things out based on a plane.
You need to define the plane with uniforms. For this answer I'll use the point-normal definition of a plane:
n · (r - r_0) = 0
That is:
dot(plane_normal, world_position - plane_point) = 0.0
Thus, we define plane_normal and plane_point uniforms:
uniform vec3 plane_normal;
uniform vec3 plane_point;
The plane_normal gives us the orientation of the plane, while the plane_point is a point on the plane which allows us to position it.
And then use this logic:
vec3 world_position = (CAMERA_MATRIX * vec4(VERTEX, 1.0)).xyz;
ALPHA = clamp(sign(dot(plane_normal, world_position - plane_point)), 0.0, 1.0);
Here we convert the coordinates of the current fragment to world space, use the definition of the plane to figure out which side the point is on (using sign), and set ALPHA based on that, so that everything on one side of the plane becomes invisible.
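Putting the pieces together, a minimal complete shader might look like this (the default uniform values are placeholders):

shader_type spatial;

uniform vec3 plane_normal = vec3(0.0, 1.0, 0.0);
uniform vec3 plane_point = vec3(0.0, 0.0, 0.0);

void fragment() {
	// VERTEX is in view space inside fragment(); CAMERA_MATRIX takes view space to world space
	vec3 world_position = (CAMERA_MATRIX * vec4(VERTEX, 1.0)).xyz;
	// Fragments on the negative side of the plane become fully transparent
	ALPHA = clamp(sign(dot(plane_normal, world_position - plane_point)), 0.0, 1.0);
}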
Note: This is not the only way to define the plane. Another popular definition is a 4D vector, where the xyz components are the normal and the w component is the distance from the plane to the origin.
Sadly, I don't think there is a way to make this work with multiple material passes, because ALPHA controls the blending of the passes, and will not result in transparency. And no, using discard; does not solve it either, because the other passes can write the fragment regardless. Thus, you are going to need to modify your materials to include that.
Sadly, furthermore, Godot 3.x does not support global uniforms (see "Godot 4.0 gets global and per-instance shader uniforms"), which means you will have to set these parameters everywhere you need them.
Using Constructive Solid Geometry (CSG)
Add a CSGCombiner and build the geometry that needs to disappear from other CSG nodes added as its children.
Then you can, for example, add a CSGSphere with its operation set to "Subtraction" and move it with the Camera (for this purpose, I suggest adding a RemoteTransform node as a child of the Camera and setting its remote path to the CSGSphere).
Of course, it does not have to be a CSGSphere; you can use any CSG node for this purpose. For the portal, I imagine you could use a CSGBox and align it to the portal plane.
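A sketch of that setup in GDScript (node names and the scene layout are illustrative):

func _ready() -> void:
	# Expected scene layout:
	#   CSGCombiner
	#     CSGMesh        <- the geometry that needs to disappear
	#     CSGSphere      <- the "hole" that follows the camera
	#   Camera
	#     RemoteTransform
	var hole := $CSGCombiner/CSGSphere
	hole.operation = CSGShape.OPERATION_SUBTRACTION

	# The RemoteTransform keeps the subtraction sphere glued to the camera.
	var follower := $Camera/RemoteTransform
	follower.remote_path = follower.get_path_to(hole)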
Note: Currently on Godot 3.3 CSG nodes do not support baking lights. This is a regression. See: Unable to bake lightmap with CSG due to the lack of ability to generate UV2 for CSG nodes.
Portals, actually
Bartleby Lawnjelly has a portal (godot-lportal) module for Godot 3.x.
Being a module, it requires building Godot from source. See Compiling in the official Godot documentation. It is not that bad, I promise. Or use a build from godot-titan.
I have to explain that these portals are not portals in the sense of the Valve Portal video game series… The module lets you define areas as "rooms" and planes as "portals" that connect those rooms, in such a way that you can look from one into the other. The purpose of this is to cull entire rooms unless you are looking through one of the portals.
Hopefully that makes more sense with a video. This is a somewhat old one, but good for getting the idea across: Portal rendering module in Godot 3.2 - Improved performance. Seeing shadow popping in the video? Bartleby Lawnjelly also has a custom lightmapper.

Baked lighting seems to have no effect on Blender model

I'm having a terrible time trying to figure out what's going on with my baked lighting. It appears that only Realtime lights affect my model. I've attached two images to demonstrate the problem. I have several point lights in the interior of my model. If I set them to Realtime everything looks great. However, if I set them to Baked and change the GI accordingly, they don't seem to interact with the model at all. Oddly enough, the Directional Light on the exterior (you can see it poking through the hallway door) seems to display fine when set to Baked.
The model is generated in Blender and I do have the "Generate Lightmap UVs" import option selected. I've tried just about every combination of settings I can think of.
It turns out the interior lights were just a few pixels above the surface of my ceiling cube, causing the light to never reach the interior of the room :/

Refraction for object { mesh {...}} surface shows artifacts

We want to render a parametrized surface in front of a grid plane and observe the transformation of the grid due to refraction happening at the surface. In this simple example, our surface is a 2D normal distribution which we view directly from above, and the grid plane is placed below:
The surface is given as many triangle directives, which we put together in a mesh and use with
object {
  fovea
  scale <1,1,3>
  texture { pigment { color rgbt <0,0,1,0.5> } }
  interior { ior 1.4 }
}
The scale here is not necessary and is used only to amplify the artifacts. What you see in the image below is that the refraction does not happen smoothly, but creates sharp artifacts in the underlying grid pattern.
This image was created with POV-Ray 3.6.1 under Mac OS X 10.5.6 with the settings +Q9, +A and -J. Can anyone offer a hint? Thanks.
This was a stupid mistake. Since the surface looked really smooth in Mathematica, I assumed that it was made of a large number of triangle faces. This assumption was wrong. The rendering engine Mathematica uses seems to interpolate the normals given for each vertex, and therefore the surface only looks as if it had a high resolution.
A check of the underlying polygons reveals the truth:
Therefore, what looks like refraction artifacts in the rendered image above is actually correct behavior, because the face-normals of neighboring triangles really change that much.
Increasing the resolution of the surface grid solves the problem.

OpenglES - Transparent texture blocking objects behind

I have some quads that have a texture with transparency, and some objects behind these quads. However, the objects behind them don't seem to be shown. I know it's something about GL_BLEND, but I can't manage to make the objects behind show through.
I've tried with:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
but it is still not working. What I basically have is:
// I paint the object
draw_ac3d_file([actualObject getCurrentObject3d]);
// I paint the quad
paintQuadWithAlphaTexture();
There are two common scenarios that create this situation, and it is difficult to tell which one your program is hitting, if either.
Draw Order
First, make sure you are drawing your objects in the correct order. You must draw from back-to-front or else the models will not be blended properly.
http://www.opengl.org/wiki/Transparency_Sorting
Note: as Arne Bergene Fossaa pointed out, front-to-back is the proper way to render objects that are not transparent, from a performance standpoint. Because of this, most renderers first draw all the models that have no transparency front-to-back, and then go back and render all models that have transparency back-to-front. This is covered in most 3D graphics texts out there.
(Illustrations of back-to-front and front-to-back draw order; image credit: Geoff Leach, RMIT University.)
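Applied to the snippet in the question, the order would roughly be as follows (a sketch, assuming draw_ac3d_file renders the opaque model and paintQuadWithAlphaTexture renders the transparent quad):

// 1. Opaque geometry first, with blending off and depth writes on.
glDisable(GL_BLEND);
glDepthMask(GL_TRUE);
draw_ac3d_file([actualObject getCurrentObject3d]);

// 2. Transparent geometry last, sorted back-to-front, with blending on and
//    depth writes off so the quad does not occlude what is behind it.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);
paintQuadWithAlphaTexture();
glDepthMask(GL_TRUE);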
Lighting
The second most common issue is improper use of lighting. Normally in this case, if you are using the fixed-function pipeline, people would advise you to simply call glDisable(GL_LIGHTING);
Now this should work (if it is the cause at all) but what if you want lighting? Then you would either have to employ custom shaders or set up proper material settings for the models.
A discussion of using the material properties can be found at http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=285889
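For instance, a minimal sketch of both options in the fixed-function pipeline (the material values are placeholders):

// Option A: turn lighting off before drawing the textured quads.
glDisable(GL_LIGHTING);

// Option B: keep lighting, but give the geometry a material that is not
// shaded black (OpenGL ES 1.x only accepts GL_FRONT_AND_BACK here).
GLfloat diffuse[] = { 1.0f, 1.0f, 1.0f, 1.0f };
GLfloat ambient[] = { 0.2f, 0.2f, 0.2f, 1.0f };
glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE, diffuse);
glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT, ambient);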

In OpenGL ES 2.0, how can I draw a wireframe of triangles except for the lines on adjacent coplanar faces?

I vaguely remember seeing something in OpenGL (not ES, which was still at v1.0 on the iPhone when I came across this, which is why I never used it) that let me specify which edges of my polygons were considered outlines vs those that made up the interior of faces. As such, this isn't the same as the outline of the entire model (which I know how to do), but rather the outline of a planar face with all its tris basically blended into one poly. For instance, in a cube made up of tri's, each face is actually two tris. I want to render the outline of the square, but not the diagonal across the face. Same thing with a hexagon. That takes four tris, but just one outline for the face.
Now yes, I know I can simply test all the edges to see if they share coplanar faces, but I could have sworn I remember seeing somewhere when you're defining the tri mesh data where you could say 'this line outlines a face whereas this one is inside a face.' That way when rendering, you could set a flag that basically says 'Give me a wireframe, but only the wires around the edges of complete faces, not around the tris that make them up.'
BTW, my target is all platforms that support OpenGL ES 2.0, but my dev platform is iOS. Again, I'm pretty sure this was originally in OpenGL and may have been deprecated once shaders came on the scene, but I can't even find a reference to this feature to check if that's the case.
The only way I know now is to have one set of vertices, but two separate sets of indices... one for rendering tris, and another for rendering the wireframes of the faces. It's a real pain since I end up hand-coding a lot of this, which again, I'm 99% sure you can define when rendering the lines.
GL_QUADS, glEdgeFlag and glPolygonMode are not supported in OpenGL ES.
You could use GL_LINES to draw the wireframe. To get hidden-line removal, first draw black filled triangles (with depth testing on) and then draw the edges you are interested in with GL_LINES.
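A sketch of that two-pass, two-index-buffer approach in OpenGL ES 2.0 (u_color, the buffer names, and the counts are illustrative; a shader program and vertex attributes are assumed to be set up already):

// Pass 1: fill the faces in black; a small polygon offset pushes the fill
// back so the lines drawn afterwards do not z-fight with it.
glEnable(GL_DEPTH_TEST);
glEnable(GL_POLYGON_OFFSET_FILL);
glPolygonOffset(1.0f, 1.0f);
glUniform4f(u_color, 0.0f, 0.0f, 0.0f, 1.0f);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, triangleIndexBuffer);
glDrawElements(GL_TRIANGLES, triangleIndexCount, GL_UNSIGNED_SHORT, 0);
glDisable(GL_POLYGON_OFFSET_FILL);

// Pass 2: draw only the face-outline edges from the second index buffer.
glUniform4f(u_color, 1.0f, 1.0f, 1.0f, 1.0f);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, edgeIndexBuffer);
glDrawElements(GL_LINES, edgeIndexCount, GL_UNSIGNED_SHORT, 0);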