I made a plane model in Blender and exported it to Substance Painter, but when I bake it in Painter I get these weird map errors, mainly in the ambient occlusion and curvature maps. Does anybody have an idea what might cause this or how to fix it?
Picture of the plane with the weird "artifacts" in Painter
Can you upload a screenshot of the export window from Blender? You should check what might be wrong there; also, when you create a new file in Substance Painter, check the import options.
I have a glTF model with a rig and animation sequences (with .bin), exported from the video game League of Legends. After importing it into Blender, everything looks fine (almost, but that's not important for the rig), but after exporting it to another format (I export as .smd to use in Source Filmmaker, but I have also tried other formats) and re-importing into Blender, the exported model's armature is mirrored along global Y. Here's how the animation is supposed to look:
And here is what comes after export-import:
the mesh imported with its armature, but without animation
and what happens after adding the animation
(for this, after the previous screenshot, I flipped the mesh along global Y in Edit Mode).
I have tried mirroring both the armature and the mesh, before and after the export-import; nothing has helped so far. In the last screenshot everything is flipped along global X. At one point I fixed it, but her weapon (the circle thing) always ends up behind her after the export-import when it should be in front of her.
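One way to experiment with the flip from Blender's Python console is to mirror both objects along global Y and then apply the transform, so the exporter bakes the correction into the mesh data instead of writing a negative object scale. This is only a sketch; "Armature" and "Mesh" are hypothetical object names, and applying a negative scale to an armature can flip bone rolls, so check the result:

import bpy

# Hypothetical object names; replace with the names in your scene.
for name in ("Armature", "Mesh"):
    obj = bpy.data.objects[name]
    obj.scale.y *= -1.0  # mirror along global Y
    # Apply the transform so exporters see corrected data,
    # not a negative object scale.
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)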
I am using the ARKit face tracking configuration and displaying the face mesh in real time. I can successfully add diffuse and normal maps to it and they display correctly, but I've had no luck with roughness or metalness: roughness has no effect, and metalness renders all black and dull/opaque. Things I have tried:
self.contentNode?.geometry?.firstMaterial?.roughness.contents = UIColor.black // also tried UIColor.white
self.contentNode?.geometry?.firstMaterial?.metalness.contents = UIColor.black // also tried UIColor.white
self.contentNode?.geometry?.firstMaterial?.roughness.contents = myRoughnessTextureImage
self.contentNode?.geometry?.firstMaterial?.metalness.contents = myMetallnessTextureImage
It is worth noting that most of the light management is done by the session; that's why the mesh responds to ambient and directional light. When there is only ambient light, the mesh looks dielectric/rough, and when there is a strong directional light, it looks metallic/smooth. Is it the session taking control of these parameters?
If I export the face mesh as an .obj and open it in Xcode, I can tweak the material properties and get a fully metallic, shiny face, but when I apply the same parameters to my mesh in real time, they do not work.
In the end, what I want is to make some parts of the face metallic by using a grayscale map.
Also, I am aware that the face tracking configuration does not allow an environment map (correct me if I'm wrong), so a fully mirror-like texture will look quite unrealistic, but it should still work.
Thanks!
Solved: I was missing this parameter in my setup method:
sceneView.autoenablesDefaultLighting = true
This adds what seems to be an ambient light that contributes to the mesh's reflections, and roughness and metalness now take effect.
I'm having a terrible time trying to figure out what's going on with my baked lighting. It appears that only Realtime lights affect my model. I've attached two images to demonstrate the problem. I have several point lights in the interior of my model. If I set them to Realtime, everything looks great. However, if I set them to Baked and change the GI settings accordingly, they don't seem to interact with the model at all. Oddly enough, the Directional Light on the exterior (you can see it poking through the hallway door) seems to display fine when set to Baked.
The model is generated in Blender, and I do have the "Generate Lightmap UVs" import option selected. I've tried just about every combination of settings I can think of.
It turns out the interior lights were positioned just a few pixels above the surface of my ceiling cube, so the baked light never reached the interior of the room :/
I have a model that I created in Blender. I then created a bow and arrow and parented it to the hand bone of the model so that it moves with the hand. When I use the .blend file in Unity, however, the bow and arrow shifts to some other position, away from where it is supposed to be. I'm not entirely sure how Unity's and Blender's coordinate systems differ, so it might be that, but I haven't had this problem with other models before. Any help would be appreciated.
Edit: OK, so I've figured out what the problem is, but I have no idea how to fix it (apologies in advance for my poor modelling practices; I'm fairly new to this).
This is my model in pose position:
This is my model in rest position:
I connected the bow to the skeleton by clicking on the bow rig > Shift-clicking on the hand bone > Ctrl+P > Bone. This works fine: the bow now moves with the skeleton, and I can do whatever I need in the NLA editor.
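For reference, the same parenting step can be scripted from Blender's Python console; a minimal sketch, with "BowRig", "Armature", and "Hand" as hypothetical names for the objects and bone in your scene:

import bpy

# Hypothetical names; replace with your own objects and bone.
bow_rig = bpy.data.objects["BowRig"]
armature = bpy.data.objects["Armature"]

# Equivalent of: click bow rig > Shift-click the hand bone > Ctrl+P > Bone.
bpy.ops.object.select_all(action='DESELECT')
bow_rig.select_set(True)
armature.select_set(True)
bpy.context.view_layer.objects.active = armature
armature.data.bones.active = armature.data.bones["Hand"]
bpy.ops.object.parent_set(type='BONE')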
Now, the issue is that when I use the .blend file in Unity, the bow is in the rest position of my model even though the skeleton is in pose position and performing the actions (so the bow is floating off to the side).
I've tried connecting it differently. If I connect the bow itself, instead of the bow rig, to the model, then it is in the correct position in Unity, but the bow rig detaches and the bow animations no longer play.
I've also thought the problem would be solved if I made the current pose position my rest position, but when I do that, the mesh reverts to the old rest position and moves very strangely with the skeleton. Here is that pic:
I would really, really appreciate any help with this, as it's been hindering my progress for the past few days.
This method defers the hierarchical assignment of a handheld object until after the character and animations have been exported to Unity. The weapon is imported separately in Unity and then assigned as a child of the relevant bone at that point.
Assigning a handheld object to a Blender-generated character.
I'm really new to 3D modeling, Blender, etc.
I created a model (a room) with Blender. I then exported it as .obj so that I can import it into CopperCube (a tool for creating 3D scenes).
The problem is that the walls are only visible from the outside. Take a look at the pictures:
Blender:
http://imageshack.us/photo/my-images/341/blenderg.png/
CopperCube:
http://imageshack.us/photo/my-images/829/coppercube.png/
I asked on the CopperCube forum and they said that the polygons are one-sided (or flipped). Is there a way to change this? Sorry, but I am a total beginner at this...
Here's the answer from the CopperCube forum:
I don't know Blender, but are there any options you can change when exporting? It looks like your model just has one-sided polygons, or the normals are flipped for some of them.
Make sure you have the "Normals" checkbox checked in the OBJ export options (on the left side; it's off by default):
You will need to model your room with slim cubes instead of planes wherever the walls should be visible from both sides.
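If you'd rather not remodel by hand, a Solidify modifier achieves the same effect by giving each plane real thickness. A minimal sketch, assuming your room object is named "Room" (a hypothetical name):

import bpy

# Hypothetical object name; replace with your room mesh.
obj = bpy.data.objects["Room"]

# Give the planes real thickness so they render from both sides.
mod = obj.modifiers.new(name="Solidify", type='SOLIDIFY')
mod.thickness = 0.05  # adjust to taste, in scene units

# Optionally apply the modifier (the OBJ exporter can also
# apply modifiers on export).
bpy.context.view_layer.objects.active = obj
bpy.ops.object.modifier_apply(modifier=mod.name)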
You can display the normals in Blender in Edit Mode: in the Properties panel (N), scroll down to Mesh Display and choose which type of normals you want to see and their length.
To recalculate the normals or flip their direction, go to the Normals section of the Tool Shelf (T).
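The same recalculation can also be scripted; a minimal sketch, again assuming a hypothetical object name "Room":

import bpy

# Hypothetical object name; replace with your room mesh.
obj = bpy.data.objects["Room"]
bpy.context.view_layer.objects.active = obj

# Scripted equivalent of Recalculate Outside in Edit Mode:
# make all normals consistent and pointing outward.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.object.mode_set(mode='OBJECT')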