Why did Blender change my dimensions?

I was working with a simple cube in Blender which I wanted to use in Unity. I was using Metric units, with a scale of X=1, Y=1, Z=1 and Dimensions of X=1, Y=1, Z=1. I pulled it into Unity and it was working fine!
I know I definitely saved the Blender file, because this has happened twice now.
When I came back to it later, the scale was the same, but the dimensions changed to X=1.058, Y=1.058, and Z=1.058. Why did this happen? Thankfully it's already working in Unity so I don't have to reimport, but it's a little weird that the dimensions changed.

While I would expect the object scale to be the culprit, you seem to have checked that. Also check any parent objects or armature bones: an object with a parent or an armature will show a scale of 1.0 but will still be altered by the parent's scale. A lattice or mesh-deform modifier can also alter the dimensions of an object without altering the scale. I am not sure whether adjustments inherited from parent objects will export to Unity, but modifier deforms can alter dimensions if modifiers are applied during export.
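The way parent scales compound into an object's dimensions can be shown with plain Python (a hypothetical sketch, not the Blender API): the displayed dimensions are the local bounding-box size multiplied by every scale up the parent chain, even while the object's own Scale field still reads 1.0.

```python
# Sketch of how parent scales compound into an object's dimensions.
# The object's own Scale field can read 1.0 while a parent's scale
# still changes its world-space dimensions.

def world_dimensions(local_size, scale_chain):
    """Multiply a local bounding-box size by every scale in the
    chain (the object's own scale first, then each ancestor's)."""
    size = local_size
    for s in scale_chain:
        size *= s
    return size

# A 1 m cube with its own scale at 1.0, parented to something
# scaled to 1.058: the dimensions read 1.058 even though the
# cube's Scale field still shows 1.0.
print(world_dimensions(1.0, [1.0, 1.058]))  # 1.058
```

If something like this is going on, selecting the parent (or the bone) and checking its scale should reveal the 1.058 factor.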
You should also check that the scale is not keyframed; a coloured background on the value field means it is keyed.
Some constraints can alter the dimension of an object without appearing to alter the scale.
Check that your scene scale is 1.0.
Another possibility is the exporter settings; I expect you would be using FBX for Unity.
If the scene and FBX scales are both 1.0, I would try exporting to Unity with the units set to None, Metric, and Imperial, and see if you get the same size each time. If there is a variation from changing unit settings (allowing for the imperial-to-metric conversion), then you should report it as a bug.
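The size differences the unit settings can legitimately introduce are just fixed conversion factors, so they are easy to rule in or out. A quick comparison in plain Python (illustrative numbers; the helper name is hypothetical):

```python
# Illustrative check of how unit settings rescale exported sizes.
# FBX traditionally works in centimetres; Blender's metric scene
# unit is metres, and imperial uses feet (1 ft = 0.3048 m).

METERS_TO_CM = 100.0
FEET_TO_METERS = 0.3048

def exported_size_cm(size, unit):
    """Convert a 1-unit edge length to centimetres for a given
    scene-unit setting (hypothetical helper, for comparison only)."""
    if unit == "metric":        # 1 unit = 1 m
        return size * METERS_TO_CM
    if unit == "imperial":      # 1 unit = 1 ft
        return size * FEET_TO_METERS * METERS_TO_CM
    return size                 # "none": raw units pass through

print(exported_size_cm(1.0, "metric"))    # 100.0
print(exported_size_cm(1.0, "imperial"))  # ~30.48
```

Any variation that does not match these factors would point at the exporter rather than the unit settings.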

Related

Blender glTF mirrors armature after exporting model

I have a glTF model with a rig and animation sequences (with .bin), exported from the video game League of Legends. After importing into Blender, everything looks fine (almost, but that's not important for the rig). However, after exporting to another format (I export as .smd to use in Source Filmmaker, but I also tried other formats) and importing back into Blender, the exported model's armature mirrors along global Y. Here's how the animation should look
And here is what comes after export-import:
import mesh with armature, but without animation
and what comes with adding animation
(for this, after the previous screenshot, I flipped the mesh in Edit Mode along global Y).
I tried mirroring both the armature and the mesh before and after the export-import. Nothing has helped so far. In the last screenshot everything is flipped along global X. At one point I fixed it, but her weapon (the circle thing) always ends up behind her after export-import when it should be in front of her.

Blender .fbx -> Spark AR Studio scaling issue for skeleton animations

I'm trying to create a character with skeleton animation in Blender to bring into Spark AR Studio. In Spark I want to use the baked animation. The .fbx brings the model and skeleton into Spark's scene just fine, until a new animation controller is selected via the object's inspector window and the animation is selected for use.
At that point, the Empty object named "Armature" is scaled to 100 instead of 1 and cannot be changed.
As a workaround, the Skeleton child object named "skeleton" can be scaled to 0.01. In Blender, I tried changing the scene's units and made sure the object's scales were all applied. Nothing is scaled to 100, everything is scaled to 1.
Since the object from the .fbx imports into Spark with correct scaling, I expect the animation to maintain that, but once the animation is selected the scale jumps from 1 to 100.
Put your animated object inside a null object, and then scale down the null object rather than the animated object. Hope that is clear.
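This workaround works because scales multiply down the hierarchy, so a 0.01 on the null parent cancels the 100x that the animation forces onto the armature (a factor of 100 is typical of FBX's metres-to-centimetres unit conversion, though that is an assumption about Spark's importer). A minimal sketch in plain Python, not the Spark AR API:

```python
# Sketch of why wrapping the animated object in a null parent works:
# scales multiply down the hierarchy, so a 0.01 parent cancels the
# 100x scale the animation imposes on the armature.

def effective_scale(*scales):
    """Product of every scale from the root down to the object."""
    result = 1.0
    for s in scales:
        result *= s
    return result

# Null parent at 0.01, armature forced to 100 by the animation:
print(effective_scale(0.01, 100.0))  # ~1.0
```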

Blender: How to correctly export gltf/glb with armature applied

I am having a persistent issue when exporting glb and glTF models from Blender (2.79) that have an armature applied.
I am using this exporter.
Exporting the model with an armature NOT applied gives me the expected result (i.e. it appears as it does in Blender), but as soon as I apply an armature (before I have even added any animation) the model exports with all the shapes rotated.
There is at least some consistency to the problem in that it appears as though all the shapes are rotated through 90 degrees on the X axis (although some are positive and some are negative).
To compare, I have exported a glb with no armature, a glb with armature, and an obj with armature, just to check that there is no issue with my original file. (I have done the same with glTF just in case.) You can see a screenshot of that comparison below, brought into A-Frame.
Here is a side view for ease of comparison. You can see how the individual shapes are rotated 90 degrees on the X axis.
Here as well is a link to that aframe scene
And here is a link to a zip of the blender file
I have looked at many similar issues that suggest applying all rotation/scale, tidying up the model, and so on. I have done all that and tidied up my model as much as I know how. The model and armature seem to be fine, as the OBJ export works correctly.
I have seen similar questions, such as this one, but they are mostly to do with distortion of models once animated. I have worked backwards to find the root of the problem, and it does seem that simply applying the armature is what explodes the model in this way.
Am I doing something wrong? Is there an export setting I am missing?
Any help greatly appreciated as ever. And if any more info is needed, please let me know.
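For what it's worth, a consistent plus-or-minus 90-degree rotation on X is the classic symptom of a Z-up (Blender) to Y-up (glTF) axis conversion being applied to some transforms but not others, e.g. to the armature but not the individual shapes. A minimal sketch of that mapping in plain Python (the helper names are hypothetical):

```python
# Blender is Z-up; glTF is Y-up. Converting between them is a
# 90-degree rotation about X, which maps axes as:
#   (x, y, z)_blender -> (x, z, -y)_gltf
# If an exporter applies this to parent transforms but not to the
# child shapes (or vice versa), each shape ends up rotated
# +/-90 degrees about X, matching the symptom described above.

def z_up_to_y_up(v):
    x, y, z = v
    return (x, z, -y)

def y_up_to_z_up(v):
    x, y, z = v
    return (x, -z, y)

up_in_blender = (0.0, 0.0, 1.0)
print(z_up_to_y_up(up_in_blender))  # Y becomes the up axis in glTF
# Round-tripping returns the original vector:
print(y_up_to_z_up(z_up_to_y_up(up_in_blender)))
```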

Is Blender's triangle winding order CW or CCW by default?

I have a model exported from Blender.
I checked and found that each triangle is in clockwise order. Is this the default order for Blender?
You can see the generated triangles with Ctrl+T or Mesh > Faces > Triangulate Faces.
There you get a triangulated mesh that will match the exported order.
Basically, there is no fixed clockwise or counter-clockwise order; it all depends on your model's shape and how you rotate or even mirror it. But with the quads-to-tris conversion you can already see how your model will look after exporting, without importing it into another application (unless the application you import into changes the mesh).
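You can also check the winding yourself. Under the counter-clockwise front-face convention used by OpenGL (and assumed by Blender when computing normals from vertex order), a triangle's signed area is positive when viewed from the front; mirroring (e.g. a negative scale) flips the sign, which is one way a model can come out clockwise. A small sketch in plain Python, no Blender API:

```python
# Check the winding of a triangle as seen from the +Z side.
# By the right-hand rule, a counter-clockwise (CCW) triangle has a
# positive signed area; a mirror (negative scale) flips the sign,
# which is why a mirrored model can export with clockwise winding.

def signed_area_2d(a, b, c):
    """Twice the signed area of triangle abc in the XY plane.
    Positive = CCW when viewed from +Z, negative = CW."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(signed_area_2d(*tri))           # 1.0  -> CCW

mirrored = [(-x, y) for x, y in tri]  # mirror along X
print(signed_area_2d(*mirrored))      # -1.0 -> CW
```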

Java 3d: Unable to get Shape3D to be affected by lights

I am attempting to get a custom Shape3D to be affected by a DirectionalLight in Java 3D, but nothing I do seems to work.
The Shape has a geometry that is an IndexedQuadArray with the NORMALS flag set, and the normal vectors are assigned to the correct vertices via indices.
I have given the Appearance a Material (both with specified colors and shininess, and without)
I have also put the light on the same BranchGroup as the Shape, but it still does not work.
In fact, when I add in the normals to the shape, the object appears to disappear - without them, it's flat shaded, so that all faces are the same shade.
I can only think that I am forgetting to include something ridiculously simple, or have done something wrong.
To test that the lights were actually working, I put a Sphere beside the Shape; the sphere was affected and lit correctly, but the shape still wasn't. Both were on the same BranchGroup.
(A small oddity too: if I translate the sphere, it vanishes if I move it more than 31 units in any direction. My view is set about 700 back, as I'm dealing with objects of sizes up to 600 in width.)
Edit: I found this in the official tutorials, which is probably related:
A visual object properly specified for shading (i.e., one with a Material object) in a live scene graph but outside the influencing bounds of all light source objects renders black.
The light's setInfluencingBounds() was not set correctly, so the shapes in the scene were not included in the bounds.
This was corrected by creating a BoundingBox that encompasses the entire area and assigning it as the light's influencing bounds.