Bake Alembic and export as glTF or FBX with animation

I have an Alembic animation of a flower growing. I am attempting to export it as glTF or FBX with the animation intact, but without success. My end goal is to be able to use the animated file in the browser, or in Spark AR.
I have tried using NitroBake, but the resulting object does not animate. I have not managed to find other approaches to try, and the more experienced friends I have consulted also seem at a loss.
I have also looked at https://vimeo.com/78684377, a video about baking Alembic to point caches, but I do not believe it is what I need - or I simply did not understand it properly.
I am using C4D R21, but I am happy to try Blender too, or other software that could do the trick.
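In case the Blender route is worth a try, here is a rough sketch of the relevant operators (Blender 2.8+; the file names are placeholders). One caveat: Blender imports Alembic as a Mesh Sequence Cache modifier, and glTF 2.0 has no vertex-cache animation, so the cache would still need to be baked down to shape keys or an armature before the export carries any motion.

    import bpy

    # Import the Alembic cache; Blender attaches it to the mesh as a
    # Mesh Sequence Cache modifier rather than as keyframed data.
    bpy.ops.wm.alembic_import(filepath="flower_growth.abc")

    # Export to glTF; without baking the cache to shape keys or an
    # armature first, the growth animation will not come across.
    bpy.ops.export_scene.gltf(filepath="flower_growth.glb")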

Related

Blender texturing doesn't show up correctly after repeated bakes even though the UV mapping fits perfectly

The problem occurs when I repeatedly bake lighting & reflections via the Principled BSDF (Cycles) onto an "Image Texture" node. The first few times I get the expected results, and then suddenly the mesh seems to be broken, as it keeps showing subsequent bakes incorrectly (image below).
Also, when I move an island in the UV map, nothing seems to change on the mesh in the 3D Viewport. The UV texture looks unchanged no matter what I do, as if it has frozen.
My Blender version is 2.92. I get the same problem with 2.83.
I keep getting this problem over and over and I just can't find a solution. Even if I export the mesh into another project, it just "infects" that project and I get the same problem there.
I can only repair it by starting over completely.
Please help me. I'm really frustrated with this. It has defeated my Blender project for about the fourth time now... :/
> Screenshot example here <
It appears as if the generated texture coordinates are being used for some reason instead of the UV map coordinates. If the Vector socket of the Image Texture node is unconnected, it should use the currently selected UV map.
This may actually be a bug if it's happening after multiple uses of the baking tool.
You should first try connecting the image texture's Vector input to the UV output of a Texture Coordinate node to see if it has any effect. Alternatively, try connecting a UV Map node.
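If you'd rather do that re-link from the Python console, a minimal sketch (the material and node names here are placeholders for your own):

    import bpy

    # Grab the material's node tree ("Material" and "Image Texture"
    # are placeholder names - use the ones from your own file).
    mat = bpy.data.materials["Material"]
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    # Add a UV Map node pointing at the layer the bake should use,
    # and wire it into the Image Texture node's Vector input.
    uv_node = nodes.new("ShaderNodeUVMap")
    uv_node.uv_map = "UVMap"
    links.new(uv_node.outputs["UV"], nodes["Image Texture"].inputs["Vector"])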

Importing New Animations to an Existing Character from Blender to Unreal Engine

So I'm building a game and I've run into this problem.
I built a character from scratch in Blender with 5 animations.
I successfully exported the .fbx file and implemented it in Unreal engine.
There are no errors and the game works fine, all 5 animations working.
I now need to add more animations to the character. It's here I'm a bit confused.
I built 5 additional animations in Blender and saved the .fbx file, but when I try to import it into Unreal I get a message saying the reimport was successful, yet the new animations don't show up in the Unreal Engine file browser.
I am trying to avoid having to rebuild the animation blueprint from scratch. Is there any way I can add animations to Unreal and use them with the same character?
I've looked up animation retargeting, but it doesn't seem to work in my case. I have not tried everything, but I've been looking for solutions online and keep running into dead ends. Can anyone please help or point me in the right direction? I will be grateful.
The entire project was built in blueprints. There is no Custom C++ Code.
When you import the animation .fbx files, there should be a dialog box asking which skeleton you want to apply the animation to; in most cases this is enough to target the animation to the specified skeleton. You just want to make sure all of your skeletal meshes target the same skeleton.
To answer my own question: simply ensure the root bone in Blender is named something other than "Armature", export it as an .fbx file, and import it into Unreal using your existing Skeletal Mesh.
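A rough sketch of that fix as a script, in case it helps (the object and file names are placeholders):

    import bpy

    # Rename the armature object so Unreal doesn't add an extra
    # "Armature" root bone on import.
    arm = bpy.data.objects["Armature"]
    arm.name = "root"

    # Export with animations baked; leaf bones are extra end bones
    # Unreal doesn't need.
    bpy.ops.export_scene.fbx(
        filepath="character_animations.fbx",
        bake_anim=True,
        add_leaf_bones=False,
    )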

Lighting (adding lights) in glTF 2.0

I'm using Blender 2.79 and I've been exporting with the addon provided by Khronos. Is it possible to light the scene along with the model, and export that?
My model's textures appear too dark, and Facebook posts appear darker still.
In the Babylon sandbox the environment helps light the model a little, but there's no real light-source setup. How do I export my model with lights?
Also: I'm using very high-value spotlights and point lights, none of which appear in the exported (.glb) file.
Please help.
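For what it's worth: the built-in glTF exporter in later Blender versions (2.80+) can write point/spot/sun lights through the KHR_lights_punctual extension, provided the viewer supports it. A minimal sketch (the path is a placeholder):

    import bpy

    # Export a binary glTF with punctual lights included
    # (KHR_lights_punctual); the viewer must support the extension.
    bpy.ops.export_scene.gltf(
        filepath="scene.glb",
        export_format="GLB",
        export_lights=True,
    )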

Correct way to export Ogre from Blender into jMonkeyEngine?

I'm learning jMonkeyEngine and I'm still at about the same stage as in this question, where I ask about loading models:
Enabling materials and textures for OGre 3D model in jmonkeyengine?
Now I have looked more at Blender, and at least I can get the basic use case to work: export to Ogre 3D from Blender and then load it in jMonkeyEngine. But for more advanced models with textures, it won't work.
I'm trying to load an Ogre 3D model into jMonkeyEngine, but I think the conversion to the Ogre format is not working. I can open the model in Blender, but when I try to export it, all I get is a .scene file and no .mesh.xml.
Could you tell me what I'm doing wrong?
For instance, opening this model in Blender and exporting it to Ogre doesn't work for me.
For what it's worth: after hours of trying to figure out why JME wouldn't locate the materials, especially when using submeshes, it turned out you need to rename your material file to .material. If you use more than one material, append all the material files into that one file.

3D animation programmatically rendered in Blender

I have a project in which I would like to programmatically create and render a 3D animation based upon input. I originally asked on Stack Overflow whether Blender was right for the job, and the response was yes, but upon looking at the API, it says this:
Python was embedded in Blender, so to access BPython modules you need to run scripts from the program itself: you can't import the Blender module into an external Python interpreter.
I want to be able to create and render this scene without having to ever open another program like Blender. Is this possible, and is Blender still the right choice?
Thanks in advance!
At work, a colleague and I worked on a project that rendered 3D scenes altered externally. We used Python to modify/create the scenes, and did the rendering on a server through the command-line interface (no GUI).
You can pass a Python script as an argument to Blender in the command-line options to generate your scene objects and do the rendering.
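As a rough sketch of that setup (the paths and frame range are placeholders): save something like the following as render_scene.py and run it with blender --background --python render_scene.py.

    import bpy

    # Set up the output prefix and frame range, then render the
    # whole animation headlessly (no GUI).
    scene = bpy.context.scene
    scene.frame_start = 1
    scene.frame_end = 48
    scene.render.filepath = "/tmp/frames/frame_"

    bpy.ops.render.render(animation=True)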
I don't see how you can render in Blender without using Blender.
You can use Blender if you want, but obviously it is not your only option.
If you need to "create and render a 3d animation based upon input", you can go as simple or as complex as you'd like.
You can use OpenGL in your language of choice (C++, Java, Python, etc.) and display the animation (with or without fancy renderings). It depends on what 'render' means in your context.
If you need some nice shading (lights, soft shadows, reflections, etc. - ray tracers, basically), you can still show an interactive preview to your users and generate the scene for a third-party renderer (like YafaRay, Sunflow, LuxRender, etc. - I've put together a short list of free renderers), and show the progress to the users after they've chosen the external-render option.
On a similar note, have a look at joons.
HTH
Cart by Suomi - Yafaray Gallery image
Julia quaternion fractal - Sunflow Gallery image
Klein Bottle - LuxRender Gallery image