So I have no idea what happened with last week's post, and unfortunately it didn't save anywhere else for me to repost, so I'm just gonna continue with this one...
With my previous Unity projects, I've shied away from using character animations, purely because it seemed so intimidating to use premade animations on characters I would have to create from scratch...
This time, I knew I would have to learn how to do it.
For some reason Adobe Fuse doesn't play well with Unity 2019, so that was out. BUT, if I use the old (pre-Adobe) Fuse application through Steam, then upload that .obj file to Mixamo's auto-rigger website, I can import the generated .fbx file into Unity perfectly, with all the materials intact. Perfect!
BUT, the trouble was finding animations that would work with the cutscenes in the game's script. After messing about for a little while with all the animations Mixamo offers, I looked into different ways of making custom animations without having a panic attack over Blender's terrifying UI. (All joking aside though, to be fair, Blender 2.8 is pretty awesome!)
I had an Xbox 360 Kinect sensor sitting in a closet for literally years, and would never have imagined that I would dig it out for any reason, especially after the 360 itself stopped working. But then I found the CinemaMocap Unity Asset.
This asset is incredible. After picking up a USB adapter for the sensor, the asset allows me to record animations (raw files that I can apply to any rigged humanoid character in any software) using really affordable motion capture!
This is amazing because now I can have my voice actors record dialogue AND act at the same time, for super realistic motion--it matches the dialogue perfectly because the actors are essentially in the game! (For lack of a better explanation.)
So, it seems like this is the way we are going to be recording our cutscenes from now on. This super realistic human motion also adds to the immersion in VR--when characters with generic animations are "interacting" with the player in VR, it can be extremely unsettling.
In addition to all this, I'm still messing around with trying to get my Mixamo Fuse characters working with facial blendshapes for lip sync, but that's still up in the air.
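For anyone curious what driving those blendshapes might look like once they're working: here's a minimal sketch of amplitude-based lip sync in Unity, mapping the loudness of a dialogue AudioSource to a single "mouth open" blendshape. This is just one rough approach, not what I've actually landed on yet--and the blendshape name `MouthOpen` is a placeholder, since the real names depend on what Fuse exports (check the character's SkinnedMeshRenderer in the Inspector).

```csharp
using UnityEngine;

// Rough amplitude-driven lip sync sketch: reads the most recent audio
// samples from the dialogue AudioSource, computes their RMS loudness,
// and maps that to a single "mouth open" blendshape weight.
[RequireComponent(typeof(AudioSource))]
public class SimpleLipSync : MonoBehaviour
{
    public SkinnedMeshRenderer face;            // head mesh that has the blendshapes
    public string mouthOpenShape = "MouthOpen"; // placeholder name -- match your mesh
    public float gain = 400f;                   // scales RMS up into the 0-100 range

    AudioSource voice;
    int shapeIndex;
    readonly float[] samples = new float[256];

    void Start()
    {
        voice = GetComponent<AudioSource>();
        shapeIndex = face.sharedMesh.GetBlendShapeIndex(mouthOpenShape);
    }

    void Update()
    {
        if (shapeIndex < 0) return;        // blendshape name not found on this mesh

        voice.GetOutputData(samples, 0);   // most recently played samples, channel 0
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Blendshape weights run 0-100 in Unity.
        face.SetBlendShapeWeight(shapeIndex, Mathf.Clamp(rms * gain, 0f, 100f));
    }
}
```

Something phoneme-based would obviously look better than pure amplitude, but even a jaw flap synced to the actual dialogue audio beats a static face.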
More to come!
Thanks for reading!