What is wrong with this Blender node setup?

When I connect the normal map texture node directly to the Principled BSDF node for my plane, I get a distorted reflection from the single light source (a point light). What am I doing wrong here? I've attached an image of my node setup.

I learned from an online tutorial that I need a Normal Map node between the normal.jpg texture node and the BSDF.normal socket. Once I put that in, everything worked as expected.
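For reference, the same fix can be scripted. Below is a minimal Python (bpy) sketch; the material name "Material" and the default node names are assumptions that need to match your own file:

import bpy

mat = bpy.data.materials["Material"]          # hypothetical material name
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bsdf = nodes["Principled BSDF"]               # default name of the Principled BSDF node
img_node = nodes.new("ShaderNodeTexImage")
img_node.image = bpy.data.images.load("//normal.jpg")
img_node.image.colorspace_settings.name = "Non-Color"   # normal maps must not be color-managed

# The crucial step: route the texture through a Normal Map node
normal_map = nodes.new("ShaderNodeNormalMap")
links.new(img_node.outputs["Color"], normal_map.inputs["Color"])
links.new(normal_map.outputs["Normal"], bsdf.inputs["Normal"])

Setting the image to Non-Color matters as much as adding the Normal Map node; leaving it at sRGB also distorts the shading.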

Related

Obj-C & Metal: computing a depth texture in cube form gives a strange depth map on the five faces other than +X

I am implementing dynamic point-light shadows using Metal in my project. Point-light shadows require a depth cube texture (depth rendered from 6 different directions) in order to compute shadow maps and perform depth tests from all directions.
I followed Apple's Reflection With Layer Selection sample code, but the depth cube texture does not look correct in the five directions other than +X.
[Sorry, I can't post images here. I uploaded them to GitHub instead, so please refer to the following links.]
In the Xcode debug view, step into Reflections Command buffer -> ReflectionPass -> DrawActors 1 to view the depth cube texture (the one selected in the right panel):
https://github.com/kiorisyshen/RepoForStackOverflow/blob/master/metalDepthCubeTexture_01.png
The +X direction seems correct:
https://github.com/kiorisyshen/RepoForStackOverflow/blob/master/metalDepthCubeTexture_02.png
But the other directions look strange (here I only uploaded the +Z direction):
https://github.com/kiorisyshen/RepoForStackOverflow/blob/master/metalDepthCubeTexture_03.png
Are there any settings I should check, or steps I should take, to get a correct depth cube texture? Any help is appreciated.

Light problems with nwjs and threejs

I'm having problems with the lighting in my project; I'm just using a normal direct light.
light = new THREE.PointLight( 0xfefffe);
The problem is that with version 0.12.3 of nwjs the objects in the scene are black (as if there were no lights), and sometimes they start flickering in red, black and green.
If I replace the original libEGL.dll and libGLESv2.dll with the ones from version 0.13.0 of nwjs it works fine, but only on some hardware... I don't know what's going on; what can I do to make everything work reliably?
Thanks
So it seems this is a hardware limitation. I used the PowerVR device driver *.dll files (EGL and OpenGL ES) as a solution. I'm curious whether this problem also happens on Ubuntu/Linux devices.
Since you are creating a material that takes the light vector as input, check with a basic material and see the result.
Also, try writing a custom material shader with ambient and specular components (diffuse optional) and then check the result on that machine.
Since those DLLs contain the GLES implementation on Windows machines, I believe you only see this issue on Windows itself.
Black output usually appears when the fragment shader requires a light vector that is not being passed in, so the texture2D result combined with an undefined light gives you a blackish output.

Cube won't render in Blender

I am using Blender to make a character, but every time I add a mesh cube it won't render. I've made several attempts to get it to work, but it just won't. I'm sure I'm not the only one with this problem.
If you can see it in the 3D viewport but not in the render, check the object list in the Outliner and make sure the little camera icon is enabled for that object.
If you can't see it in the 3D viewport but its origin point is there, try scaling it up with S (its scale may be too small for it to be visible).
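If you prefer to verify this from Blender's Python console, here is a minimal sketch; the object name "Cube" is an assumption:

import bpy

obj = bpy.data.objects["Cube"]   # hypothetical object name
print(obj.hide_render)           # True means the camera icon in the Outliner is toggled off
obj.hide_render = False          # make the object renderable again
print(obj.scale)                 # a near-zero scale would also make it effectively invisible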

Texture does not apply correctly on several faces

After rendering a simple object, some faces have a corrupted texture, even though all faces should use the same texture.
Did I miss something, or does anyone know a workaround for this?
Removing smooth shading shows the correct texture.
The issue is similar with other textures.
Removing and recreating faces does not solve the problem.
Below is a link to the Blender file.
Thank you very much.
Blender File example:
https://skydrive.live.com/redir?resid=9260E6210D9A2E5B!807&authkey=!AEKG2Qgg1spEHCc
Edit: I'm using Blender 2.68a, 64-bit Windows build, on Windows 8.
Under the Mapping section in the texture settings, change the texture coordinates from Normal to Generated.
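The same change can be made from the Python console in Blender 2.6x (Blender Internal); the material name and slot index below are assumptions:

import bpy

mat = bpy.data.materials["Material"]   # hypothetical material name
slot = mat.texture_slots[0]            # first texture slot
print(slot.texture_coords)             # 'NORMAL' corresponds to the problematic mapping
slot.texture_coords = 'ORCO'           # 'ORCO' is shown as "Generated" in the UI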

Refraction for object { mesh {...}} surface shows artifacts

We want to render a parametrized surface in front of a grid plane and observe how the grid is transformed by the refraction happening at the surface. In this simple example our surface is a 2D normal distribution, which we view directly from above, with the grid plane placed below it:
The surface is given as many triangle directives, which we put together in a mesh and used as follows:
object {
  fovea                                        // the mesh assembled from the triangle directives
  scale <1, 1, 3>
  texture { pigment { color rgbt <0, 0, 1, 0.5> } }
  interior { ior 1.4 }
}
The scale here is not necessary and is used only to amplify the artifacts. What you see in the image below is that the refraction does not appear to happen smoothly, but instead creates sharp artifacts in the underlying grid pattern.
This image was created with POV-Ray 3.6.1 under Mac OS X 10.5.6 with the settings +Q9, +A and -J. Can anyone give me a hint? Thanks.
This was a simple mistake on my part. Since the surface looked really smooth in Mathematica, I assumed it had been tessellated into a large number of triangle faces. This assumption was wrong. The rendering engine Mathematica uses seems to interpolate the normals given for each vertex, so the surface only looks as if it had a high resolution.
A check of the underlying polygons reveals the truth:
Therefore, what looks like refraction artifacts in the rendered image above is actually correct behavior, because the face-normals of neighboring triangles really change that much.
Increasing the resolution of the surface grid solves the problem.
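As an illustration of that fix, here is a short, hypothetical Python sketch (not the original Mathematica workflow) that tessellates the 2D normal distribution on a finer grid and writes POV-Ray triangle directives into an include file; raising n increases the mesh resolution and makes the refraction look smooth:

import math

def gaussian(x, y, sigma=1.0):
    # 2D normal distribution used as the height of the surface
    return math.exp(-(x * x + y * y) / (2.0 * sigma * sigma))

def write_mesh(filename="fovea.inc", n=200, extent=3.0):
    # n x n grid over [-extent, extent]^2, two triangles per grid cell
    step = 2.0 * extent / n
    point = lambda x, y: "<%.5f, %.5f, %.5f>" % (x, y, gaussian(x, y))
    with open(filename, "w") as f:
        f.write("#declare fovea = mesh {\n")
        for i in range(n):
            for j in range(n):
                x0, y0 = -extent + i * step, -extent + j * step
                x1, y1 = x0 + step, y0 + step
                f.write("  triangle { %s, %s, %s }\n" % (point(x0, y0), point(x1, y0), point(x1, y1)))
                f.write("  triangle { %s, %s, %s }\n" % (point(x0, y0), point(x1, y1), point(x0, y1)))
        f.write("}\n")

write_mesh()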