Fuck yes, a topic that I am useful in. Maybe.
Animations in Unity (because I'm not experienced with Unreal)

Unity uses Mecanim, a really, really convenient animation-handling system. Yes, it's designed for animators rather than programmers, because the vast majority of games don't need script-driven animations. It DOES support inverse kinematics (allowing hands to be accurately placed on ledges, feet to land realistically on slopes, etc.), and it lets developers construct elaborately blended, variable-driven systems through blend trees.
For example, a character could have an idle animation and a walk-forward animation that are blended together based on a script that sets the current forward velocity, and then that blended animation could be blended with two MORE animations to turn left or right depending on the character's change in rotation. It's not particularly script-heavy, because having to recreate all of that every time you started a new project would suck (you build it once and recycle it across projects).
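The scripting side of that setup is tiny. A minimal sketch, assuming a blend tree with float parameters named "Speed" and "Turn" (hypothetical names; they just have to match whatever the blend tree actually uses):

```csharp
using UnityEngine;

// Feeds movement data into the Animator each frame; the blend tree
// does all the actual animation mixing.
public class LocomotionDriver : MonoBehaviour
{
    private Animator animator;
    private Vector3 lastPosition;
    private float lastYaw;

    void Start()
    {
        animator = GetComponent<Animator>();
        lastPosition = transform.position;
        lastYaw = transform.eulerAngles.y;
    }

    void Update()
    {
        // Forward velocity drives the idle <-> walk blend.
        Vector3 delta = transform.position - lastPosition;
        float forwardSpeed = Vector3.Dot(delta, transform.forward) / Time.deltaTime;
        animator.SetFloat("Speed", forwardSpeed);

        // Change in rotation drives the turn-left <-> turn-right blend.
        float yaw = transform.eulerAngles.y;
        animator.SetFloat("Turn", Mathf.DeltaAngle(lastYaw, yaw) / Time.deltaTime);

        lastPosition = transform.position;
        lastYaw = yaw;
    }
}
```

That's the whole "script-driven" part: set a couple of floats, and the blend tree you built in the editor handles the rest.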
I'll get to script-driven animations in a bit.
Creating models and animations

Game engines aren't built for animating, and they really aren't built for modelling. That should all be done in a variety of other programs (Maya, Blender, Mudbox, ZBrush, xNormal, etc.). A typical workflow might look something like this:
- Simple mesh created in Autodesk Maya
- Mesh gets UV mapped for textures
- Mesh gets subdivided and sculpted in Mudbox
- Sculpted mesh gets painted
- Painted mesh gets exported in two forms: a high-res version for later use and a low-res version for retopologizing
- Low-res mesh is brought back into Maya and has a new, simplified mesh projected over it (this is called retopologizing)
- Retop-ed mesh gets UV mapped again
- Retop-ed mesh gets exported into xNormal along with the high-res mesh from earlier, and texture, normal, and ambient occlusion maps are baked from the two
- Retop-ed mesh gets put into Unity along with atlased texture and normal maps
...In other words, a whole lot of steps, because the world is complicated and that's just how it works.
If you need the model animated, you have to add rigging and animation, which are both a whole 'nother can of worms. It's all just skills (lots of skills, unfortunately); there's no way around it. If you want to download models instead of making them from scratch, look for .fbx files for Unity: you can literally drag-and-drop them into a scene. USING the models is dead simple.
Using models from Source Engine games or Skyrim

As it happens, I've done both of those. Skyrim models were easier: there's a program that reads the .bsa files from Elder Scrolls games and lets you extract files. You can open up a .bsa, copy models and relevant texture files to your computer, run them through another program, and get a usable file out of them (can't remember if it's .obj or .fbx or something else entirely, though). Plug that into Maya, Blender, or 3ds Max and export it from there with the correct settings (don't try to put them directly into the engine, because it almost certainly won't work).
Source models... I know GCFScape is the way to open up .vpk files. I somehow got a Spy model from TF2 into Unity, complete with textures and animations, but that was nearly two years ago. I remember using Blender, but I don't know if I had to find a plugin or if the files came in an importable format right off the bat.
I remember it being a pain in the ass, though. Once they were in Unity, it was all sunshine and rainbows. Getting them there... not so much.
Script-driven animations for VR

When I heard about ControlVR (that VR glove/arm thingy), I wanted to make a very specific game. You would control a wizard and duel other players by casting spells with gestures. I wrote a system to control the hand animations, relying on the fact that Unity handles each bone as a separate transform, 'transform' meaning a combination of position, rotation, and scale (in a hierarchy, though the script didn't care about that). The script had a series of variables that tracked each bone (hand, index_1, index_2, index_3, middle_1, middle_2, etc.) and adjusted the rotations based on arbitrary inputs (I left it fairly open because I didn't know exactly how the inputs would work).
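A rough reconstruction of the idea (not the original script): Unity exposes every bone in a rigged model as a Transform, so you can grab them by name and write rotations directly. The bone names and joint angles here are hypothetical.

```csharp
using UnityEngine;

// Drives one finger chain's bone rotations directly from an input value.
// Repeat the same pattern for the other fingers.
public class HandPoseDriver : MonoBehaviour
{
    private Transform index1, index2, index3;

    void Start()
    {
        // Find the finger bones under the hand in the model's hierarchy
        // (hypothetical bone names; real rigs name them differently).
        index1 = transform.Find("hand/index_1");
        index2 = index1.Find("index_2");
        index3 = index2.Find("index_3");
    }

    // 'curl' in [0, 1] from whatever input device is feeding the script.
    public void SetIndexCurl(float curl)
    {
        // Bend each joint proportionally; the 90/60/45 degree limits are arbitrary.
        index1.localRotation = Quaternion.Euler(curl * 90f, 0f, 0f);
        index2.localRotation = Quaternion.Euler(curl * 60f, 0f, 0f);
        index3.localRotation = Quaternion.Euler(curl * 45f, 0f, 0f);
    }
}
```

Because the script writes the bones' local rotations every frame, no animation clips are involved at all; the hand pose is entirely input-driven.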
In practice, it made for a pretty cool system. I could cast fireballs and create shields based on the script reading the gestures (specifically, comparing the rotations of each bone against predefined spells).
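The matching step can be sketched like this (hedged: the pose data, tolerance, and class names are all made up for illustration, not the original code):

```csharp
using UnityEngine;

// One predefined "spell" pose: a target rotation for each tracked bone.
public class SpellGesture
{
    public string name;
    public float[] jointAngles; // target X-rotation per bone, in degrees

    // True if every tracked bone is within tolerance of the target pose.
    public bool Matches(float[] currentAngles, float toleranceDegrees)
    {
        for (int i = 0; i < jointAngles.Length; i++)
        {
            if (Mathf.Abs(Mathf.DeltaAngle(currentAngles[i], jointAngles[i])) > toleranceDegrees)
                return false;
        }
        return true;
    }
}
```

Each frame you'd read the tracked bones' current rotations into an array and test it against every spell; the first one that matches fires.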
I then proceeded to completely forget about it once the initial wow-factor of VR faded from my mind.
To sum up this entire rambling essay of dumb bullshit:

Everything requires complicated skills, but they're not that bad. Unity has a very effective, logical animation system once you get into it. You can do anything you want if you're willing to script it, but it takes effort. If you want the easy way, use the built-in systems... which require learning skills to use effectively.