I'm not very good at 3D modeling, so I hope you can answer my question in a simple way.
I'm trying to give a building, or its walls, an effect / animation / cute trick where they can become partially transparent.
Think of Diablo 3 in a dungeon: when the character goes behind a wall, the wall is cut in the middle and the top part is hidden temporarily.
The problem is that I want the building to look perfectly intact while in Phase 1.
Phase 2 should be activatable from Unity, so that if a ray cast toward a character behind the building hits the building, Phase 2 is activated, and it is deactivated again when the character has moved past that part.
That's why I'm thinking an animation is needed, but what do you think?
How would I do this in Blender?
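One way to set this up on the Blender side, if the effect only needs to hide the upper part of a wall, is to split each wall into a bottom object and a top object along a horizontal cut; the building still looks intact in Phase 1, and Unity can simply disable the top object's renderer (or swap in a transparent material) when your ray check triggers Phase 2, with no animation required. A minimal bpy sketch, assuming a hypothetical object named "Wall", a cut height of 2 m, and the Blender 2.8+ Python API:

```python
import bpy

def split_wall(obj_name="Wall", cut_z=2.0):
    """Duplicate the wall twice and keep only the geometry below/above cut_z.
    The object name and cut height are assumptions -- adjust to your scene.
    Assumes the wall sits at the world origin with no rotation, so local
    and world Z coincide for the cut plane."""
    src = bpy.data.objects[obj_name]
    for suffix, keep_top in (("_Bottom", False), ("_Top", True)):
        dup = src.copy()
        dup.data = src.data.copy()
        dup.name = obj_name + suffix
        bpy.context.collection.objects.link(dup)

        bpy.ops.object.select_all(action='DESELECT')
        dup.select_set(True)
        bpy.context.view_layer.objects.active = dup

        bpy.ops.object.mode_set(mode='EDIT')
        bpy.ops.mesh.select_all(action='SELECT')
        # Cut along the horizontal plane z = cut_z, delete one side and fill
        # the cut so each half still looks solid on its own.
        bpy.ops.mesh.bisect(plane_co=(0.0, 0.0, cut_z),
                            plane_no=(0.0, 0.0, 1.0),
                            use_fill=True,
                            clear_inner=keep_top,       # drop the lower half for "_Top"
                            clear_outer=not keep_top)   # drop the upper half for "_Bottom"
        bpy.ops.object.mode_set(mode='OBJECT')

split_wall()
```

On the Unity side the toggle is then just enabling or disabling the renderer on the "_Top" object from whatever script performs the ray check.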
I apologize if this has been asked before. I tried searching for this first and nothing came up. I'm pretty new to Unreal Engine 5.1, so this might be something I'm doing wrong as well.
I've been exploring animation retargeting in Unreal and have tried following the steps we learned in class, using one of the models from mixamo.com. Everything appears to work fine at the start, and I can get the actual IK and IKR objects working just fine. However, when I try to export the animations, either from the IKR object or by right-clicking the ABP object for the source mesh, one and only one of the animation sequences rotates 90°. It is always the same animation (the Land animation sequence), and I'm not sure how to go about fixing it.
I also tried looking on Google and turned up nothing.
I'm hoping this is some stupid newb mistake that is easy to fix, or maybe there's something I'm overlooking. Any help is greatly appreciated; I will continue trying to fix the problem myself as well and will post an update if I fix it.
I tried retargeting using the following steps:
1. Create an IK_Object for the model you wish to project your animations onto, with chains for each limb. Mine looked like the following:
IK_Remy
2. Repeat step 1 for the model you wish to source animations from. Mine looked like the following:
IK_Manny
3. Create an IKR_Object linking the two together. Here's what mine looked like:
IKR_Remy
4. Find the ABP for your source model, right-click it, and select "Retarget Animation Assets->Duplicate and Retarget Animation Assets". Here's what I'm selecting for that:
Retarget Dialog
When I follow these steps, most of the animation sequences for "Manny" export just fine. However, the "Land" animation flips for some reason (see image below).
Exported Animation Images
Even stranger, when I preview the MM_Land animation in my IKR object it looks fine, i.e. not rotated. However, if I try to export the animation from the IKR object, the same thing happens, i.e. it rotates 90°. I would expect this to be a case of WYSIWYG, where if it works in the preview it exports correctly, but that apparently is not the case.
I also tried modifying the animation sequence manually, but it won't let me: if I rotate the model in the animation sequence and save it, the rotation comes back once I close the sequence and the changes do not persist.
I can export the sequence as a new sequence, modify it, save it, and then rename it as my exported "Land" animation to force the issue, and it at least looks normal. However, when I actually play the game and jump, the Land animation still flips sideways and in addition causes the character to scale and warp for a second, which makes me think there's something going on here that I don't know enough to fix. I'm really hoping someone with more experience in Unreal Engine can help.
EDIT: Fixed Image Descriptions
I can confirm that this is an issue - I'm seeing the same behaviour. I haven't managed to fix it yet, but my suspicion is that it's due to scaling - in my instance, I have had to scale up my custom character by around 2.5 times to replicate the scale of the default mannequin. Did you scale your custom character at all?
I'm having a terrible time trying to figure out what's going on with my baked lighting. It appears that only Realtime lights affect my model. I've attached 2 images to demonstrate the problem. I have several point lights in the interior of my model. If I set them to Realtime, everything looks great. However, if I set them to Baked and change the GI accordingly, they don't seem to interact with the model at all. Oddly enough, the Directional Light on the exterior (you can see it poking through the hallway door) seems to display fine when set to Baked.
The model is generated in Blender and I do have the "Generate Lightmap UVs" import option selected. I've tried just about every combination of settings I can think of.
It turns out the interior lights were just a few pixels above the surface of my ceiling cube, causing the light to never reach the interior of the room :/
I have a model that I created in Blender. I then created a bow and arrow and parented it to the hand bone of the model so that it moves with the hand. When I use the .blend file in Unity, however, the bow and arrow shifts to some other position away from where it is supposed to be. I'm not entirely sure how Unity's and Blender's coordinate systems differ, so it might be that, but I haven't really had this problem with other models before. Any help would be appreciated.
Edit: OK, so I've figured out what the problem is, but I have no idea how to fix it (apologies for my poor modelling practices in advance; I'm fairly new to this).
This is my model in pose position:
This is my model in rest position:
I connected the bow to the skeleton by clicking on the bow rig > Shift-clicking on the hand bone > Ctrl+P > To Bone. This works fine, as the bow now moves with the skeleton and I can do whatever I need in the NLA editor.
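(For reference, that Ctrl+P > To Bone step is roughly equivalent to the script below; every object and bone name in it is hypothetical, and it assumes the Blender 2.8+ Python API. The last two lines are the "keep transform" compensation, which, as far as I know, is also a detail that game engines sometimes handle differently on import.)

```python
import bpy
from mathutils import Matrix

# Hypothetical names -- adjust to your scene.
arm = bpy.data.objects["Armature"]
bow = bpy.data.objects["BowRig"]
bone = arm.pose.bones["hand.R"]

# Equivalent of selecting the bow, Shift-selecting the hand bone and
# pressing Ctrl+P > To Bone.
bow.parent = arm
bow.parent_type = 'BONE'
bow.parent_bone = bone.name

# Bone parenting attaches the child to the bone's *tail*, so when the
# properties are set directly like this the bow jumps. Baking the inverse
# of the tail's world matrix into matrix_parent_inverse keeps the bow where
# it currently sits (assuming it had no parent before).
tail_world = arm.matrix_world @ bone.matrix @ Matrix.Translation((0.0, bone.length, 0.0))
bow.matrix_parent_inverse = tail_world.inverted()
```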
Now, the issue is, when I use the .blend file in Unity, the bow is in the rest position of my model even though the skeleton is in pose position and performing the actions (so the bow is floating off to the side).
I've tried connecting it differently. If I connect the bow instead of the bow rig to the model, then it is in the correct position in Unity, but then the bow rig detaches and so the bow animations don't play.
I also thought the problem would be solved if I made the current pose position my rest position, but when I do that, the mesh reverts to the old rest position and moves very weirdly with the skeleton. Here is that pic:
I would really, really appreciate any help with this, as it's been hindering my progress for the past few days.
This method defers the hierarchical assignment of a handheld object until after the character and animations are exported to Unity. The weapon is imported separately in Unity and then assigned as a child of the relevant bone at that time.
Assigning a handheld object to a Blender-generated character.
Now I could be wrong about this but after testing it all day, I have discovered...
When adding a widget and setting the z-index, the value "0" seems to be the magic depth.
If a widget's Z is at 0, it will be drawn on top of everything that's not at 0, Z-wise.
It doesn't matter if a widget has a z-index of 99, -999, 10, -2 or whatever... it will not appear on top of a widget whose z-index is set to 0.
It gets more strange though...
Any index less than -2 or greater than 2 seems to create an "index out of range" error. Funny thing is...when I was working with a background and sprite widget, the background's Z was set to 999 and no errors. When I added another sprite widget, that's when the -2 to 2 z-index limitation appeared.
Yeah I know...sounds whacked!
My question is, am I right about "0" being the magic Z value?
If so, creating a simple 2.5D effect, like making a sprite move behind a big rock, will take some unwanted code.
Since you can only set Z when adding a widget, one must remove a widget and immediately add it back with the new Z value.
You'll have to do this with the moving sprite and the overlapping object in question. Hell, I already have that code practically written, but I want to find out from the Kivy pros: is there a way to set the z-index without removing and re-adding a widget?
If not, I'll have to settle for the painful way.
My version of Kivy is 1.9.0
What do you mean by z-order? Drawing order is determined entirely by the order in which widgets are added to the parent, and the index argument to add_widget is just the list index at which the widget will be inserted. The correct way to change drawing order among widgets is to remove and re-add them (actually you can mess with the canvases manually, but that is the same thing at a lower level, and not a better idea).
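A minimal sketch of that remove-and-re-add approach; the helper names are mine, and the parents are assumed to be plain Kivy widgets:

```python
def bring_to_front(widget):
    """Redraw `widget` above its siblings by re-adding it to its parent."""
    parent = widget.parent
    if parent is not None:
        parent.remove_widget(widget)
        parent.add_widget(widget)  # widgets added later are drawn on top

def restack(parent, bottom_to_top):
    """Re-add a group of children so they draw back-to-front in the given order."""
    for w in bottom_to_top:
        parent.remove_widget(w)
    for w in bottom_to_top:
        parent.add_widget(w)  # each add draws above the previous one
```

For the rock example, that means something like restack(scene, [rock, sprite]) while the sprite should pass in front of the rock, and restack(scene, [sprite, rock]) while it should pass behind.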
I found a working solution using basic logic, based on the fact that widgets have to be removed and added again in order to control depth/draw order.
I knew the Main Character widget had to be removed along with the object in question... so I created a Main Character Parent widget, which defines and controls the Main Character, apart from its Graphic widget.
My test involves the Main Character walking in front of a large rock, then behind it... creating a 2.5D effect.
I simply used the y-position ordering idea along with widget attach and detach code to create the desired effect (sketched below).
The only thing that caught me off guard was the fact that my Graphic widget for my Actor was loading textures. That was a big no-no because the FPS died.
Simple fix: I moved the texture loading to the Main Character Parent widget, so the loading is done once, for all time.
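Here is a rough sketch of that y-based attach/detach logic; the names (scene, the y attribute) are illustrative, not from any particular codebase:

```python
def y_sort(scene):
    """Re-add the scene's children so widgets lower on the screen (smaller y,
    i.e. nearer the viewer in this kind of 2.5D view) end up drawn on top.
    Assumes every child of `scene` takes part in the depth sort."""
    back_to_front = sorted(scene.children, key=lambda w: w.y, reverse=True)
    for w in back_to_front:
        scene.remove_widget(w)
    for w in back_to_front:
        scene.add_widget(w)  # later adds draw on top, so the smallest y ends up in front

# Called from the movement code, e.g.:
#     actor.y -= step   # walking "down" the screen brings the actor forward
#     y_sort(scene)
# Note: keep texture loading out of this path (load once in the parent widget,
# as described above), otherwise the FPS will suffer.
```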
PS: if anyone knows how to hide the scrollbars and wishes to share that knowledge, it'll be much appreciated. I haven't looked for an API solution for it yet, but I will soon.
Right now I'm just trying to make sure I can do the basic operations necessary for creating a commercial 2.5D game (handhelds).
I'm a graphic artist and web developer, so coming up with lovely visuals won't be an issue. I'm more concerned with what'll be "under the hood", so to speak. Hopefully enough, lol.
I've seen quite a few questions about how to draw isometric tiles, and almost all point at drawing back to front, top down. However, I'm trying to find a way to prevent clipping with a single isometric image.
While normally drawing a sprite on top of a single image would not prevent overdrawing on walls and such, I split the image into 3 layers: a floor, a lower wall, and a top wall. The player checks the floor for collision, is always drawn in front of the lower wall, and is always drawn behind the top wall. The result looks like the following:
While this seems to work decently well, I'd like to know the most efficient way to draw these sorts of isometric scenes. I've considered tiles, however that raises the question of how to draw multi-tiled buildings and such. If tiling turns out to be the better option, I will create a new question about those issues. For now, let's assume I'm using a single image broken into layers.
This is somewhat easier for my artist, however: they can draw a single scene in isometric and split it up into layers, eliminating the need for a map creator, and I can then use pixel collision to get precise collision with the environment.
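A framework-agnostic sketch of that per-pixel floor check (the mask would be built from the floor layer's pixel data in whatever loading code MonoGame or Processing ends up using; all names here are illustrative):

```python
def build_walkable_mask(pixels, width, height, is_floor):
    """Turn the floor layer's pixel data into a 2D boolean mask.
    `pixels` is any indexable of (r, g, b, a) tuples in row-major order and
    `is_floor` decides what counts as walkable (e.g. alpha > 0)."""
    return [[is_floor(pixels[y * width + x]) for x in range(width)]
            for y in range(height)]

def is_walkable(mask, x, y):
    """True if the point (x, y) lies on the floor layer."""
    xi, yi = int(x), int(y)
    return 0 <= yi < len(mask) and 0 <= xi < len(mask[0]) and mask[yi][xi]

def can_stand(mask, foot_points):
    """Check every 'foot' pixel of the sprite (e.g. along the bottom edge of
    its bounding box) against the floor mask before committing a move."""
    return all(is_walkable(mask, x, y) for x, y in foot_points)
```

The per-frame draw order then stays fixed, exactly as described above: floor, lower wall, player, top wall.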
Is using a multi-layered scene even a good approach for this? My biggest concern is preventing overdraw and not breaking perspective. I've also seen many good examples of drawing everything using tiles, but then I'm limited to a certain scale, and that raises even more questions. Do you know of the best way to approach this? Should I use tiles instead of a single image split into layers?
I plan to code this in either MonoGame or Processing.
(I would have posted this on gamedev but I can not post images there)