Output vertex stream with Blender

I'm looking to output a stream of the vertices, UVs and normals of an animated object (with clothing/softbody physics) to a file.
Is this possible with Blender? If not, is there another modelling application which can do that?
If it is possible, what is such exporting of vertices called?

You can write a script with the Blender Python API (https://www.blender.org/api/blender_python_api_2_76_2/), though writing a custom exporter takes considerable time.
If you want a common format, you can export your mesh with the export scripts bundled with Blender (to .obj, to .md5, etc.).

How you export will depend mostly on where you want the data to go.
.obj is a simple static mesh file, while .mdd stores an animated mesh: it essentially records the vertex positions for each frame.
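The .mdd export can also be driven from a script. A minimal sketch, assuming the bundled "NewTek MDD format" add-on is enabled (it registers the export_shape.mdd operator; the file path and frame range here are placeholders):
import bpy

# Requires the "NewTek MDD format" add-on to be enabled.
# Writes one vertex position per vertex per frame to the .mdd file.
bpy.ops.export_shape.mdd(
    filepath="/tmp/cloth_anim.mdd",  # hypothetical output path
    frame_start=1,
    frame_end=250,
    fps=24,
)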
If you need to export in a custom format instead, it isn't hard to get at the mesh data. obj.to_mesh() provides a copy of the mesh data with all modifiers and simulations applied.
import bpy

scn = bpy.context.scene
obj = bpy.context.active_object

# Get a copy of the mesh with modifiers and simulations applied,
# using the render-quality modifier settings (2.7x API).
me = obj.to_mesh(scn, True, 'RENDER')

print('Vertices--')
for v in me.vertices:
    print(v.index, ':', end=' ')
    for l in v.co:
        print(l, end=',')
    print()

print('Edges--')
for e in me.edges:
    print(e.index, ':', end=' ')
    for v in e.vertices:
        print(v, end=',')
    print()

print('Faces--')
for f in me.polygons:
    print(f.index, ':', end=' ')
    for v in f.vertices:
        print(v, end=',')
    print()

# Free the temporary mesh copy.
del me
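Since the question also asks for UVs and normals, both are available on the same evaluated mesh copy. A sketch continuing from the variables above, under the same 2.7x API (the UV lookup assumes the mesh has an active UV map):
me = obj.to_mesh(scn, True, 'RENDER')

print('Normals--')
for v in me.vertices:
    print(v.index, ':', tuple(v.normal))

# UVs are stored per loop (per face corner), not per vertex.
if me.uv_layers.active is not None:
    uv_data = me.uv_layers.active.data
    print('UVs--')
    for poly in me.polygons:
        for li in poly.loop_indices:
            vi = me.loops[li].vertex_index
            print(vi, ':', tuple(uv_data[li].uv))

del me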
You can get Blender-specific help with Python scripts at blender.stackexchange.com.

Related

Exporting OBJ file from Blender, why are normals for each face vertex the same?

For example, I created a sphere in Blender and exported the .obj file. I noticed some issues in my shader because all the vertex normals for a face are the same. You can see it directly in the .obj file itself, such as in the portion below: for the first face, all the vertices use normal index 1. I would expect them all to be different, since it is a sphere and the surface is curving. Thanks. Also, this is in Blender 3.1.
f 10/10/1 9/9/1 22/22/1 23/23/1
f 6/6/2 5/5/2 15/15/2 16/16/2
f 480/526/3 10/10/3 23/23/3 24/24/3
f 7/7/4 6/6/4 16/16/4 17/17/4
f 481/527/5 480/526/5 24/24/5 25/25/5
I found the issue: I needed to add smooth shading to the object in Blender. Enter Edit Mode, select all vertices, and select smooth shading.
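For a scripted pipeline, the same fix can be applied from Python with the standard shade_smooth operator; a minimal sketch:
import bpy

# Smooth-shade the active object so the exporter writes per-vertex
# (averaged) normals instead of one flat normal per face.
bpy.ops.object.shade_smooth()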

Is there any way to access blender defined vertex group in Godot?

I'm trying to automate SoftBody creation in Godot. Everything is working perfectly except for the part where I'm supposed to supply pinned points for the mesh.
var node = SoftBody.new()
node.pinned_points = [] # This is the array where I'm supposed to supply vertex ids, like: [1, 45, 1142]
Those are the vertex points that hold the mesh in position. Since I'll use the script for multiple models, I can't use the Godot editor for this. So I thought I'd make an additional vertex group in Blender and access those vertices in Godot, but I don't know the HOW part. Is there any alternative way? I'm open to ideas. Thanks!
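One possible approach is to dump the vertex indices of the Blender vertex group to a JSON file at export time, then read that file from the Godot script. A minimal sketch of the Blender side, where the group name and output path are placeholders, and which assumes Godot's importer keeps Blender's vertex order (worth verifying for your pipeline):
import bpy, json

obj = bpy.context.active_object
group = obj.vertex_groups['pinned']  # hypothetical vertex group name

# Collect the indices of all vertices that belong to the group.
pinned = [v.index for v in obj.data.vertices
          if group.index in [g.group for g in v.groups]]

with open('/tmp/pinned_points.json', 'w') as f:  # hypothetical path
    json.dump(pinned, f)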

Procedural mesh creation with a Blender script

What is a clean way to create a procedural mesh in Blender with a script? Ideally I'd like to create an empty mesh and start adding vertices and triangles, like this:
mesh = create_mesh()
a = mesh.add_vertex([0, 0, 0])
b = mesh.add_vertex([1, 0, 0])
c = mesh.add_vertex([1, 1, 0])
t = mesh.add_triangle(a, b, c)
Blender's bmesh module is designed for creating and editing mesh data, although you can also turn plain Python lists into mesh data; see the sketch below.
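A minimal bmesh sketch close to the pseudocode above (the object and mesh names are illustrative, and the scene-link call uses the 2.7x API):
import bpy
import bmesh

# Build the geometry in a bmesh: three vertices and one triangle.
bm = bmesh.new()
a = bm.verts.new((0, 0, 0))
b = bm.verts.new((1, 0, 0))
c = bm.verts.new((1, 1, 0))
bm.faces.new((a, b, c))

# Write the bmesh into a real Mesh datablock and put it in the scene.
mesh = bpy.data.meshes.new("ProceduralMesh")
bm.to_mesh(mesh)
bm.free()

obj = bpy.data.objects.new("ProceduralObject", mesh)
bpy.context.scene.objects.link(obj)  # 2.8+: bpy.context.collection.objects.link(obj)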
You can find examples included with Blender: all add-ons are written in Python, and the add_mesh_extra_objects add-on in particular has several examples of creating different types of objects.
There is a Blender-specific Stack Exchange site where you will get more help with Blender scripting; you should also find existing examples there.

MeshLab: how to transfer UVs from source .objs onto a Poisson reconstruction model

I've been struggling for some time to find a way in MeshLab to include or transfer UVs onto a Poisson model from source meshes. I will try to explain more of what I'm trying to accomplish below.
My source meshes have UVs along with texture data. I need to build a fused model and include the texture data. This is for facial-expression scan-data reconstruction in a production pipeline that ultimately builds a facial rig for animation. Our source scan data includes marker information, which we use to register and build a fused scan model, which in turn is used to generate a retopologized mesh for blendshapes.
Previously, we were using David3D. http://www.david-3d.com/en/support/downloads
David3D used Poisson surface reconstruction to create a fused model. The fused model it created brought along the UVs and optimized the source textures into one UV tile. I'll post a picture below of the result I'm looking to recreate in MeshLab.
My need to find this solution in MeshLab is to build tools that help automate this process. David3D version 5 does not have a development kit to program around.
Is it possible in MeshLab to apply the UVs from the regions of the source mesh onto the Poisson model? Could I use a filter to transfer them? Reproject them?
Or is there another reconstruction method/process within MeshLab that will keep the UVs?
Here is an image of what the resulting UV parameterization looks like from David; the UVs are white on the left half of the image. (Image: David3D UV Layout Result)
Thank you,
Dan
No, in MeshLab there is no direct way to transfer UV mapping between two layers.
This is because UV transfer is not, in the general case, a trivial task. It is not simply a matter of assigning the "closest" UV of the original mesh to the new surface: that would fail at UV discontinuities, which are present in the example you linked. Additionally, the two meshes would have to be almost coincident; otherwise you would also have problems defining the "closest" UV.
There are a couple of ways to do it, but they require manual work and a re-sampling of the texture:
create a UV mapping of the re-meshed model using whatever tool you may have, then resample the existing texture onto the new parameterization using "Transfer: Vertex Attributes to Texture (1 or 2 meshes)", with the texture color as source
load the original mesh and, using the screenshot function, create "virtual" photos of the model (turn off illumination and do NOT use ortho views), adding them as raster layers until the model surface is fully covered. Then load the new model, which should be in the same space, and texture-map it with "parametrization + texturing" using those registered images
In MeshLab it is also possible to create a new texture from the original images, if you have a way to import the registered cameras...
TL;DR: UV coords to color channels → Vertex Attribute Transfer → Color channels back to UV coords
I have had very good results kludging it through the color channels, like this (say you are transferring from layer A to layer B):
Make sure A and B are roughly aligned with each other (you can use the ICP filter if needed).
Select layer A, then:
Texture → Convert Per Wedge UV to Per Vertex UV (if you've got wedge coords)
Color Creation → Per Vertex Color Function, and transfer the tex coords to the color channels (assuming UV range 0-1; you'll want to tweak these if your range is larger):
func r = 255.0 * vtu
func g = 255.0 * vtv
func b = 0
Sampling → Vertex Attribute Transfer, and use this to transfer the vertex colors (which now hold texture coordinates) from layer A to layer B.
source mesh = layer A
target mesh = layer B
check Transfer Color
set distance large enough to not miss any spots
Now select layer B, which contains the mapped vertex colors, and reverse what you did for A:
Texture → Per Vertex Texture Function
func u = r / 255.0
func v = g / 255.0
Texture → Convert Per Vertex UV to Per Wedge UV
And that's it.
The results aren't going to be perfect, but in practice I often find them sufficient. In particular:
If the texture is not continuously mapped to layer A (e.g. you've got patches of image mapped to certain areas), it's very possible for the attribute transfer to B (especially when upsampling) to interpolate some vertices across patch boundaries, which will probably produce visual artifacts along them.
UV coords may be quantized by the conversion to a color channel and back; see the sketch after this list. (You could maybe eliminate this by stretching U out over all three color channels, then transferring U, then repeating for V -- never tried it though.)
That said, there's a lot of cases it works in.
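To make the quantization point concrete, here is a tiny sketch of the worst-case error from a single 8-bit round trip (this assumes the conversion rounds to the nearest channel value; if it truncates, double the error):
u = 0.3137                # an arbitrary UV coordinate in [0, 1]
r = round(255.0 * u)      # encode into an 8-bit color channel
u_back = r / 255.0        # decode back to a UV coordinate
print(abs(u - u_back))    # at most 0.5/255, roughly 0.002
# On a 4096-pixel texture that is about 8 pixels of drift.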
I may or may not add images / video to this post another day.
PS: MeshLab is pretty straightforward to build from source; it might be possible to add a UV-coordinate option to the Vertex Attribute Transfer filter. But to make it more useful, you'd want to make sure you didn't interpolate across boundary edges in the mapped UV projection. Definitely a project I'd like to work on some day... in theory. If that ever happens I'll post a link here.

How do I modify mesh attributes to send custom information in Blender?

I have a mesh in 3DS format. I imported this mesh into Blender, and now I want to export it back to 3DS, but I want to associate a number (say, an id) with each vertex of the mesh. I only need the x, y and z coordinates of the newly exported 3DS; I don't really care about the normals or the texture coordinates.
So one way of keeping the IDs intact could be to stash the number in an otherwise unneeded attribute, say the x coordinate of each vertex normal or the first texture coordinate of each vertex.
Here's what I tried with normals:
import bpy
import bmesh

object_reference = bpy.context.active_object

# Load the mesh into a bmesh so its vertices can be edited.
bm = bmesh.new()
bm.from_mesh(object_reference.data)

# Abuse the x component of each vertex normal to store the vertex index.
for vert in bm.verts:
    vert.normal[0] = vert.index

# Write the edited data back into the mesh.
bm.to_mesh(object_reference.data)
But the normals reverted back to their defaults on export. So, how do I do this?
I couldn't figure out a way to set the texture coordinates; how can I do that? If I can't, how can I make the vertex-normal hack work? Is there a less hacky way of doing this?
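For the texture-coordinate route, one possible sketch is to write each vertex index into a dedicated UV layer, since UVs (unlike normals) are not recalculated by Blender on export. The layer name is a placeholder, the calls are the 2.7x API, and whether the .3ds exporter preserves these values exactly is something to verify:
import bpy

obj = bpy.context.active_object
me = obj.data

# Create a UV layer to carry the ids (2.7x API; in 2.8+ use me.uv_layers.new()).
me.uv_textures.new("vertex_ids")
uv_data = me.uv_layers["vertex_ids"].data

# UVs live on loops (face corners), so write each corner's vertex index.
for poly in me.polygons:
    for li in poly.loop_indices:
        vi = me.loops[li].vertex_index
        uv_data[li].uv = (float(vi), 0.0)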