Saturday, July 22, 2006

The other part of my job

All the stuff I mentioned in the previous post is part of the job I've kind of had to take on: handling the animation pipeline. There's just no one else to handle it.

Those familiar with Motion Builder might wonder why we're exporting skeletons. The answer is simple: it's a safety check. The director has signed off on the performance in Motion Builder. Keeping the MB skeleton gives us a visual cross-reference: if there's no divergence between the two skeletons, it's proof that the performance of the Control/IK version of the character is a 1:1 match with the approved performance, allowing for small spatial differences between the dissimilar rigs used in the mocap process. Even so, when there's time, the most recent version of the character is "characterized" in MB, so the match is pretty close to start with.

But here's an example of how far astray this process can go: We had a scene mocapped with an early version of the character, and that pipeline skipped Motion Builder entirely, so all the hand animation done on top of the mocap was done in Maya. The system I came up with handles that as well, even though in the latest version of this particular character the rotations on the bones have been zeroed and realigned, the control structure is entirely different, and even the locations of the joints have changed. So part of the job of my conversion system is to take legacy animation and load the most recent version of a character with that legacy animation. It's a really useful hat trick.

OK, that brings me to the job I was actually hired to do: rigging. Lately I've been working on a facial rig architecture that doesn't depend on blend shapes. I had to try a lot of things and burn them to the ground, because every one of them had some annoying pathology. Maya 6.5 doesn't have the ability to paint weights on wrap or wire deformers, but influence objects give me a little more control over weight assignment, at least for the whole curve I'm using for deformation. It was pretty hard to figure out how to get the rotational information of the curves picked up by the object when the character is reoriented, but it was only hard because I had no idea what I was doing at first ;)

But there are lots of other little issues that need solving. Things that should be cloth simmed won't be, like a character's leather jacket. Leather isn't like other kinds of cloth, and it presents special challenges. Fortunately, I have about 10 leather jackets to use as reference.

Another is the typical problem of the shoulder area. It isn't hard to defeat some of the problems, but some of the actors who were used in the mocap sessions are *limber*. Human limitations weren't a concern for them, and I have to accommodate the exceptional flexibility of a contortionist. As my supervisor says, "Fun, fun, fun". I'll probably be using influence objects a lot to create a selective "muscle system" where these problems occur. I've also got overlapping joints to accommodate controlled twist, a finger control system that accepts mocapped finger data but lets you animate on top of it, and a lot of little tweak controls that will let an animator or TD solve all sorts of problems with the characters, most of whom are not humanly proportioned.

One of the things I have to deal with is that, with all these multiple characters, I can't use scriptJobs to pull off some of the character TD's magic tricks. I'm working on ways around that. But at least I've now boiled the rigging process down to a "one button" operation for the main controls, connections, constraints and expressions. The face and assorted compensation controls are the only parts that have to be done by hand.

It's all getting pretty exciting. At least in that way that a TD's job can be exciting.

Finally getting down to brass tacks...

I'm finally starting to get some real scenes to work with. And my animation tools are getting a trial by fire.

What I found out: Reality is always different from the test cases.

First, the skeleton fbx files I'm getting have joints named differently from the standard, with two characters in the scene. One character will have joints named with a suffix like ...J_1 instead of ..._J. The tools were set up to look for _J, and when the joints are named differently, that's a problem.

Second, those suffixes weren't always consistent within a character. The ball of the foot and the toe would sometimes be just _J where the rest of the character had J_1. So with two characters in the scene, one would have J_1 for everything and one would have J_2, but their toes and the balls of their feet would be _J and J_1 respectively. Pain in the butt.
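The workaround for that naming mess can be sketched in a few lines. This is a hypothetical stand-in for the real in-Maya tooling: a regular expression that strips any of the _J / J_1 / J_2 style suffixes, so joints from both characters reduce to the same base name before anything tries to match them.

```python
import re

# Hypothetical sketch (the real tools run inside Maya): strip any of the
# inconsistent suffixes -- _J, J_1, J_2, and so on -- so that joints from
# both characters reduce to the same base name.
SUFFIX = re.compile(r'_?J(?:_\d+)?$')

def base_name(joint):
    """Return the joint name with its _J / J_1 / J_2 style suffix removed."""
    return SUFFIX.sub('', joint)
```

With that, base_name("LeftToeBase_J") and base_name("LeftToeBaseJ_1") both come back as "LeftToeBase", which is all the matcher needs.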

Third, my method for loading an animation file saved under one character's name onto another character, which might or might not be referenced and might have dissimilarly named joints or controls, required hand-matching each name used in the file to a joint or control on the target character. There are a lot of controls and joints in these characters. That would take a long time in practice, even though I'd only have to do it once per character, and it turns out there's no luxury of time unless I want to put in a lot of OT; I sometimes have minutes instead of hours to get things done.

Fourth, one method I had for converting mocap data onto a fully rigged IK character worked, but the referenced file would crash Maya as soon as I'd go to save the scene.

Fifth, that method required separating the converted fbx scene into individual characters, which also turned out to be time-consuming.

So I needed to make the tools a one-stop affair.

I created a new tool which saves the exact spatial location, regardless of hierarchy, of the IK character's driver transforms, along with their rotation orders. This can be used to convert the data from the mocap characters, even if they were based on a different skeletal setup and hierarchy. It fits into my .sganm file format, which currently has three automatically detected versions: ANIMATION, which saves all of the keyframe data including tangents, tangent type, tangent weight, and infinity, and is hierarchy dependent; MOCAP, which is just the raw keyframe number and keyframe value for each channel and is still hierarchical; and a spatial location format that puts everything exactly where it is in the animation file in world space, which is non-hierarchical but has to be applied in hierarchical order or it results in a transporter accident. In other words, I already had a tool that could be deployed to store the data from the new rig. I save this in the master folder that contains the character's old mocap rig, new mocap rig, and the IK character that has all the controls installed.
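That "hierarchical order" rule on the world-space format can be sketched outside Maya. Assuming a simple parent map (my own data model, not the real file layout), the idea is just to apply saved world transforms parents first, so each node's parent is already in place before the node itself is set; doing children first is exactly what produces the transporter accident.

```python
# Sketch of the hierarchical-order rule, with an assumed data model: a dict
# mapping each node to its parent (roots map to None).  Applying saved
# world-space transforms in this order guarantees every node's parent has
# already been placed when the node itself is set.
def apply_order(nodes, parent_of):
    """Return the nodes sorted so that parents always precede children."""
    def depth(node):
        d = 0
        while parent_of.get(node) is not None:
            node = parent_of[node]
            d += 1
        return d
    return sorted(nodes, key=depth)
```

Walking the saved transforms in that order, each child's local values get recomputed against an already-updated parent, and the pose lands exactly where it was saved.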

In order for all this to work, the two skeletons have to be aligned on frame zero in the hero pose. The same code that's used for the translation from one older state to a newer one, or from joint-based animation to control-based IK animation, is also used to create a master "zero frame" hero pose file. This is also stored in the master directory.

I wrote a tool for automatic creation of a "map" file. It uses TD-defined wildcards to match similarly named joints and controls to the ones used in the animation file, whether that file is a "zero frame" hero file or a control location file. This worked great until I found the joint naming inconsistency.

So I need to create a "descendants" based system that looks for matches regardless of the added delimiters and indexing, based on the selected character name. I have bits and pieces of the code in use for other purposes, so it shouldn't be too difficult to implement.
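The planned matcher might look something like this sketch, with a plain dict of child lists standing in for Maya's hierarchy (which would really come from listRelatives). Two nodes pair up when their names agree after the _J / J_1 / J_2 style suffixes are stripped, so the mismatched indexing between characters stops mattering.

```python
import re

# Sketch of the planned "descendants" matcher.  The hierarchy is modeled
# here as a dict of {node: [children]}; nodes pair up when their names
# match once the _J / J_1 / J_2 style suffixes are stripped.
STRIP = re.compile(r'_?J(?:_\d+)?$')

def build_map(src_root, dst_root, src_children, dst_children):
    """Return {source node: matching destination node}, walking both trees."""
    mapping = {src_root: dst_root}
    dst_by_base = {STRIP.sub('', d): d for d in dst_children.get(dst_root, [])}
    for child in src_children.get(src_root, []):
        base = STRIP.sub('', child)
        if base in dst_by_base:
            mapping.update(build_map(child, dst_by_base[base],
                                     src_children, dst_children))
    return mapping
```

Because the walk descends both hierarchies in parallel from the selected roots, a toe named LeftFoot_J on one character still finds LeftFootJ_1 on the other.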

On top of all that, even if I get it all working smoothly, there's still the problem of manually handling the various steps in the conversion process. What's needed is yet another file format, a "Rosetta Stone" of sorts for all the characters that already have a control space file and "zero frame" file saved in the master directory. That way I could load an fbx scene, select its characters from a list, and the tool would automatically set a zero frame, load the appropriate reference file for each character, and map the animation onto the character with controls with one button push. It could then be used by anyone, rather than having to be done by a TD.
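That "Rosetta Stone" could be as simple as one record per character pointing at the files already sitting in the master directory. A hypothetical sketch, with field names and the json carrier invented here purely for illustration (the real format is undecided):

```python
import json

# Hypothetical manifest sketch: one entry per character, pointing at the
# zero-frame pose, control-space file, and rig reference already saved in
# that character's master directory.  All field names are invented.
def character_entry(manifest_text, character):
    """Look up one character's file set in the manifest."""
    manifest = json.loads(manifest_text)
    if character not in manifest:
        raise KeyError("no master-directory entry for %r" % character)
    return manifest[character]
```

Given a manifest like that, the one-button tool just iterates over the characters the user picked from the list and pulls each one's three files without any hand intervention.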

There are a lot of details to making a system like that work. But I figure it could take what is now a half hour of hand massaging and turn it into maybe five minutes. Which would be good, since there are 1400 shots.

The reason all this is necessary is that Motion Builder doesn't handle the complex rigs with all their constraints and esoteric expressions, at least not in the versions we're using in production. The rigs can also layer hand animation over the mocap and have space switching capabilities; MB can't handle those either, and you can't apply the motion directly to the joints because of the IK. So conversion is an absolute necessity.