Are Blender's triangles in CW or CCW order by default?

I have a model exported from Blender.
I checked and found that each triangle is in clockwise order. Is this the default winding order for Blender?

You can see the generated triangles with Ctrl+T or Mesh > Faces > Triangulate Faces. That gives you a triangulated mesh whose triangle order will be the same as the exported one.
Basically, there is no inherent clockwise or counter-clockwise order. It all depends on your model's shape and on how you rotate or even mirror it. But by converting the quads to tris you can already see how your model will look after exporting, without importing it into another application (unless the application you import to changes the mesh).
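To make that concrete, here is a minimal sketch in plain Python (no Blender required). The cross product of two triangle edges gives the face normal; whether the same vertex order reads as CW or CCW depends on which side you view the triangle from, so mirroring a model flips the apparent winding:

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def winding(v0, v1, v2, view_dir):
    # Face normal from the vertex order (right-hand rule)
    normal = cross(sub(v1, v0), sub(v2, v0))
    # Normal pointing against the view direction = front face = CCW
    return "CCW" if dot(normal, view_dir) < 0 else "CW"

# Triangle in the XY plane, camera at +Z looking down -Z:
tri = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
print(winding(*tri, view_dir=(0, 0, -1)))   # CCW

# Mirror the model along X: the same vertex order now reads as CW
mirrored = tuple((-x, y, z) for (x, y, z) in tri)
print(winding(*mirrored, view_dir=(0, 0, -1)))   # CW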

Blender glTF mirrors armature after exporting model

I have a glTF model with a rig and animation sequences (with .bin), exported from the video game League of Legends. After import into Blender everything looks fine (almost, but that's not important for the rig), but after exporting it to another format (I export as .smd to use in Source Filmmaker, but I also tried other formats) and importing it back into Blender, the exported model's armature is mirrored along global Y. Here's how the animation is supposed to look:
And here is what comes out after export-import:
the imported mesh with armature, but without animation
and what happens after adding the animation
(for this, after the previous screenshot, I flipped the mesh along global Y in edit mode).
I tried mirroring both the armature and the mesh before/after export-import. Nothing has helped so far. On the last screenshot everything is flipped along global X. One day I fixed it, but her weapon (the circle thing) always ends up behind her after export-import when it should be in front of her.

Large (in meters) landscape mesh has artifacts on peaks only at certain scale

I made a mesh from a Digital Elevation Map that spanned a 1x1 degree box of geography, but when I scale the mesh up to 11139m in Blender I get these visible jagged shadows on the peaks of the mesh. I'd prefer not to scale everything down, but I suppose I can; it just seems like a strange issue that I want to understand better.
My goal is to use the landscape in a WebVR application, but when I put this mesh into an A-Frame scene it also has this issue. Thanks for any tips!
Quick answer:
I think this may be caused by the clipping start/end values, also called the near/far clipping planes. Adjusting them may fix the issue, but will also limit the rendering distance.
Longer explanation:
Take a look at this:
It's a simple grayscale image, but imagine it scaled across your entire scene depth (the Z depth buffer). The range of this buffer is set by the camera's start/end clipping (near/far) setting.
By default, Blender has its start/end (near/far) clipping set to 0.01 - 1000,
while A-Frame uses 0.005 - 10000. You can find more information here: A-Frame camera #properties
That means the renderer has to fit every single point in that range somewhere on the grayscale. That can cause overlapping or Z-fighting, because the buffer simply lacks the precision to distinguish the details. It is mostly visible at edges/peaks, because the polygons meet there at acute angles and the program has to round the Z values. That causes overlapping, visible as darker shadows (most likely the back side of the polygon behind).
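To put rough numbers on that, here is a back-of-the-envelope sketch in Python. It assumes a standard perspective projection with depth mapped to [0, 1] and a 24-bit depth buffer; the exact numbers differ per renderer, but the trend (pulling the near plane closer costs a lot of precision in the distance) holds:

def depth_step(z, near, far, bits=24):
    # Perspective depth d(z) = far * (z - near) / (z * (far - near));
    # its slope tells how much eye-space distance one buffer step covers.
    slope = (far * near) / (z * z * (far - near))
    return (1.0 / (1 << bits)) / slope

# Size of one depth-buffer step 500 m from the camera:
for near, far in [(0.01, 1000.0), (0.005, 10000.0), (1.0, 10000.0)]:
    print(f"near={near:<6} far={far:<8} -> ~{depth_step(500.0, near, far):.3f} m per step")

With A-Frame-like defaults (0.005/10000), two surfaces half a kilometre away can only be told apart if their depths differ by metres, which is exactly the kind of rounding that shows up as jagged shadows on the peaks.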
You may also want to read more about Z-fighting because it is somewhat related.
Example

Why did Blender change my dimensions?

I was working with a simple cube in Blender which I wanted to use in Unity. I was using Metric units, with a scale of X=1, Y=1, Z=1 and Dimensions of X=1, Y=1, Z=1. I pulled it into Unity and it was working fine!
I know I definitely saved the Blender file, because this has happened twice now.
When I came back to it later, the scale was the same, but the dimensions changed to X=1.058, Y=1.058, and Z=1.058. Why did this happen? Thankfully it's already working in Unity so I don't have to reimport, but it's a little weird that the dimensions changed.
While I would expect the object scale to be the culprit, you seem to have checked that. Also check any parent objects or armature bones: an object with a parent or an armature will show a scale of 1.0 but will still be altered by the scale of its parent. A lattice or mesh deform modifier can also alter the dimensions of an object without altering the scale. I am not sure that adjustments inherited from parent objects will export to Unity, but modifier deforms can alter dimensions if modifiers are applied during export.
You should also check that the scale is not keyframed; a coloured background on the field means it is keyed.
Some constraints can alter the dimensions of an object without appearing to alter the scale.
Check that your scene scale is 1.0.
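If you want to run these checks quickly, here is a sketch for Blender's Python console with the cube as the active object (property names per the 2.7x/2.8x Python API):

import bpy

obj = bpy.context.object

# Local scale, as shown in the N panel
print("local scale:", tuple(obj.scale))

# World-space scale, including anything inherited from a parent
# object or an armature
print("world scale:", tuple(obj.matrix_world.to_scale()))

# Dimensions as Blender reports them
print("dimensions:", tuple(obj.dimensions))

# Is the scale keyframed?
if obj.animation_data and obj.animation_data.action:
    for fc in obj.animation_data.action.fcurves:
        if fc.data_path == "scale":
            print("scale is keyframed (axis %d)" % fc.array_index)

# Scene-level unit scale
print("scene unit scale:", bpy.context.scene.unit_settings.scale_length)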
Another possibility is the exporter settings; I expect you would be using FBX for Unity -
If the scene and FBX scales are both 1.0, I would try exporting to Unity with the units set to None, Metric and also Imperial, and see if you get the same size each time. If there is a variation from changing the unit settings (allowing for the possible imperial-to-metric conversion), then you should report it as a bug.
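That test could be scripted along these lines (the file paths are placeholders, and the FBX operator's options vary a little between Blender versions):

import bpy

for system in ('NONE', 'METRIC', 'IMPERIAL'):
    bpy.context.scene.unit_settings.system = system
    bpy.ops.export_scene.fbx(
        filepath="/tmp/cube_%s.fbx" % system.lower(),
        apply_unit_scale=True,  # also worth comparing with False
        global_scale=1.0,
    )

Import the three files into Unity and compare the reported sizes; allowing for the imperial-to-metric conversion, they should match.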

Blender export / one-sided polygons

I'm really new to 3D modeling, Blender, etc.
I created a model with Blender (a room). Now I have exported it (as .obj) so that I can import it into CopperCube (a tool to create 3D scenes).
The problem is that the walls are only visible from the outside. Take a look at the pictures:
Blender:
http://imageshack.us/photo/my-images/341/blenderg.png/
CopperCube:
http://imageshack.us/photo/my-images/829/coppercube.png/
I asked on the CopperCube forum and they said that the model has only one-sided polygons (or flipped ones). Is there a way to change this? Sorry, but I am a total beginner with this...
Here's the answer from the CopperCube forum:
I don't know Blender, but are there any options you can change for exporting? It looks like your model just has one-sided polygons, or the normals are flipped for some of them.
Make sure you have the Normals checkbox checked in the OBJ export options (on the left side; it's off by default):
You will need to model your room with slim cubes instead of planes wherever the walls should be visible from both sides.
You can display the normals in Blender in Edit Mode: in the Properties panel (N), scroll down to Mesh Display and check the type of normals you want to see and their length.
To recalculate the normals or flip their direction, go to the Tool Shelf (T), Normals section.
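For reference, roughly the same fix can be scripted (operator names per the 2.7x API; the legacy OBJ exporter and its use_normals option were replaced in later Blender versions):

import bpy

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')

# Recalculate all normals to point outside (Ctrl+N in the viewport)
bpy.ops.mesh.normals_make_consistent(inside=False)
# ...or flip the selected faces instead:
# bpy.ops.mesh.flip_normals()

bpy.ops.object.mode_set(mode='OBJECT')

# Export with normals written into the .obj file
bpy.ops.export_scene.obj(filepath="room.obj", use_normals=True)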

Refraction for object { mesh {...}} surface shows artifacts

We want to render a parametrized surface in front of a grid plane and observe the transformation of the grid due to the refraction happening at the surface. In this simple example the surface is a 2D normal distribution, which we view directly from above; the grid plane is placed below:
The surface is given as many triangle directives, which we put together in a mesh and used with
object {
  fovea
  scale <1,1,3>
  texture { pigment { color rgbt <0,0,1,0.5> } }
  interior { ior 1.4 }
}
The scale here is not necessary and is used only to amplify the artifacts. What you see in the image below is that the refraction does not happen smoothly, but creates sharp artifacts in the underlying grid pattern.
This image was created with POV-Ray 3.6.1 under Mac OS X 10.5.6 with the settings +Q9, +A and -J. Can anyone point out a hint? Thanks.
This was a stupid mistake. Since the surface looked really smooth in Mathematica, I assumed that it was made of a large number of triangle faces. This assumption was wrong. The rendering engine Mathematica uses seems to interpolate the normals given for each vertex, and therefore the surface only looks as if it had a high resolution.
A check of the underlying polygons reveals the truth:
Therefore, what looks like refraction artifacts in the rendered image above is actually correct behavior, because the face normals of neighboring triangles really do change that much.
Increasing the resolution of the surface grid solves the problem.
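If you generate the mesh yourself instead of exporting it from Mathematica, the resolution is directly under your control. Here is a hypothetical Python sketch that writes the 2D Gaussian as POV-Ray triangle directives on an n x n grid (the mesh name fovea matches the object above; the Gaussian parameters and file name are placeholders):

import math

def gaussian(x, y, sigma=1.0):
    return math.exp(-(x * x + y * y) / (2.0 * sigma * sigma))

def write_mesh(path, n=200, extent=3.0):
    h = 2.0 * extent / n
    def p(i, j):
        x, y = -extent + i * h, -extent + j * h
        return "<%.5f,%.5f,%.5f>" % (x, y, gaussian(x, y))
    with open(path, "w") as f:
        f.write("#declare fovea = mesh {\n")
        for i in range(n):
            for j in range(n):
                # Split each grid cell into two triangles
                f.write("  triangle { %s, %s, %s }\n" % (p(i, j), p(i + 1, j), p(i + 1, j + 1)))
                f.write("  triangle { %s, %s, %s }\n" % (p(i, j), p(i + 1, j + 1), p(i, j + 1)))
        f.write("}\n")

write_mesh("fovea.inc", n=200)  # higher n -> smaller angles between face normals

With enough subdivisions the angle between neighboring face normals shrinks, and the refracted grid pattern smooths out accordingly.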