Tuesday, May 27, 2014

Animation blending

In EMI, just like in many other 3D games, the motion of a 3D model displayed on screen may be a combination of several different keyframe animations playing simultaneously. For example, the animators may have supplied a 'walk' animation and a 'hold object' animation for a character in the game. The game engine may produce a 'walk while holding an object' animation by combining the lower body part of the 'walk' animation with the upper body part of the 'hold object' animation. This can be achieved with the animation priorities I described in the previous post. In addition, the engine can animate the transition between different animation states such as 'walk' and 'stand' by interpolating between them. This interpolation is what I refer to as animation blending.

With animation blending, the final animation that is displayed on the screen is the weighted sum of all simultaneously active animations. The weight of an animation (the "blend weight") determines how much the animation contributes to the final result. The total sum of weights should equal 1. For example, we could animate the transition from 'stand' to 'walk' by linearly interpolating a value α from 0 to 1 over time, setting the blend weight of 'walk' to α and the blend weight of 'stand' to (1 - α) at each step. The interpolated transition may not look completely realistic, but in most cases it looks much better than instantly snapping to another animation.
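To make this concrete, here is a minimal C++ sketch of blending two poses with a single weight. The Pose and BoneTransform types are hypothetical stand-ins rather than ResidualVM's actual data structures; during a stand-to-walk transition the function would be called each frame with t = α.

// A minimal sketch of blending two poses with a weight, assuming a simple
// hypothetical Pose type (ResidualVM's actual skeleton code differs).
#include <cmath>
#include <vector>

struct BoneTransform {
    float pos[3];   // bone translation
    float rot[4];   // bone rotation as a quaternion (x, y, z, w)
};

typedef std::vector<BoneTransform> Pose;

// Linearly interpolate two quaternions and renormalize (nlerp), a common
// cheap approximation of slerp for small angle differences.
static void nlerp(const float a[4], const float b[4], float t, float out[4]) {
    float dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
    float sign = dot < 0.f ? -1.f : 1.f;   // take the shorter arc
    float len = 0.f;
    for (int i = 0; i < 4; ++i) {
        out[i] = (1.f - t) * a[i] + t * sign * b[i];
        len += out[i] * out[i];
    }
    len = std::sqrt(len);
    for (int i = 0; i < 4; ++i)
        out[i] /= len;
}

// Blend 'from' and 'to' with weight t in [0, 1]:
// t = 0 gives 'from', t = 1 gives 'to'.
Pose blendPoses(const Pose &from, const Pose &to, float t) {
    Pose result(from.size());
    for (size_t i = 0; i < from.size(); ++i) {
        for (int k = 0; k < 3; ++k)
            result[i].pos[k] = (1.f - t) * from[i].pos[k] + t * to[i].pos[k];
        nlerp(from[i].rot, to[i].rot, t, result[i].rot);
    }
    return result;
}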

In EMI, the game may request certain animations to be faded in or faded out. To support this in ResidualVM, I store a 'fade' value for each active animation. When requested, the value is linearly interpolated between 0 and 1. If all animations had equal priority, we could assign weight = fade to each animation and simply divide these intermediate weights by their total sum to get the final normalized blend weights. However, with prioritized animations this changes a bit.
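As a sketch of this equal-priority case (with illustrative names, not ResidualVM's actual API), the normalization could look like this:

// Each active animation has a 'fade' value in [0, 1]; the final blend weight
// is the fade divided by the sum of all fades, so the weights add up to 1.
#include <vector>

struct ActiveAnimation {
    float fade;     // interpolated between 0 and 1 while fading in/out
    float weight;   // resulting normalized blend weight
};

void computeEqualPriorityWeights(std::vector<ActiveAnimation> &anims) {
    float totalFade = 0.f;
    for (size_t i = 0; i < anims.size(); ++i)
        totalFade += anims[i].fade;
    if (totalFade <= 0.f)
        return;   // nothing is visible
    for (size_t i = 0; i < anims.size(); ++i)
        anims[i].weight = anims[i].fade / totalFade;
}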

We want to assign more weight to higher priority animations than to lower priority ones. An animation with a weight of 100% will always completely override anything with a lower priority. If an animation has a weight of 60%, lower priority animations only get to distribute the remaining 40% of the weight among themselves.

How is this implemented? The way I'm doing it now is to first collect the contribution of each priority level into animation "layers". Each animation layer contains the accumulated contribution of the animations with a certain priority. For example, layer 0 contains the contribution of animations with priority 0, layer 1 contains the contribution of animations with priority 1, and so on. Within a layer we can assign weights in the simple fashion described before, with the exception that we only divide the weights by their total sum if the sum exceeds 1. I also assign a fade value to each animation layer, which is simply the sum of its weights.
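A rough sketch of this layer collection step, again with hypothetical types and tracking only the weight math (not the accumulated bone transforms), might look as follows:

// Group active animations into per-priority layers. Within a layer,
// weight = fade, and the weights are only normalized when their sum exceeds 1;
// the layer's own fade is the sum of the (possibly normalized) weights.
#include <algorithm>
#include <map>
#include <vector>

struct ActiveAnimation {
    int priority;
    float fade;   // 0..1
};

struct Layer {
    std::vector<float> weights;   // one weight per animation in this layer
    float fade;                   // sum of the weights, at most 1
};

std::map<int, Layer> buildLayers(const std::vector<ActiveAnimation> &anims) {
    std::map<int, Layer> layers;
    // Collect the raw fades by priority.
    for (size_t i = 0; i < anims.size(); ++i)
        layers[anims[i].priority].weights.push_back(anims[i].fade);
    // Normalize within each layer only if the total exceeds 1.
    for (std::map<int, Layer>::iterator it = layers.begin(); it != layers.end(); ++it) {
        Layer &layer = it->second;
        float sum = 0.f;
        for (size_t i = 0; i < layer.weights.size(); ++i)
            sum += layer.weights[i];
        if (sum > 1.f) {
            for (size_t i = 0; i < layer.weights.size(); ++i)
                layer.weights[i] /= sum;
        }
        layer.fade = std::min(sum, 1.f);
    }
    return layers;
}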

Once the animation contribution is split into layers, we get the final result by blending the layers together. The blend weights are assigned so that the highest priority layer contributes the most, and the remaining weight is distributed to the lower priority layers. To be exact, for n layers the weights are calculated as follows:

weight_n     = fade_n
weight_(n-1) = fade_(n-1) * (1 - fade_n)
weight_(n-2) = fade_(n-2) * (1 - fade_(n-1)) * (1 - fade_n)
...
weight_1     = fade_1 * (1 - fade_2) * ... * (1 - fade_(n-1)) * (1 - fade_n)
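Assuming the per-layer fades are stored lowest priority first, the formula above can be computed with a single backwards pass: each layer takes its fade times whatever weight the higher priority layers left unclaimed. A small sketch:

// Turn per-layer fades into final layer weights. fades[0] is the lowest
// priority layer, fades.back() the highest.
#include <vector>

std::vector<float> computeLayerWeights(const std::vector<float> &fades) {
    std::vector<float> weights(fades.size(), 0.f);
    float remaining = 1.f;   // weight not yet claimed by higher priority layers
    for (size_t i = fades.size(); i-- > 0; ) {
        weights[i] = fades[i] * remaining;
        remaining *= (1.f - fades[i]);
    }
    return weights;
}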

The end result is a system where each animation can be independently faded in and out, while still respecting animation priorities.

Friday, May 23, 2014

Animation progress

The Summer of Code is finally here, and the first week is already almost over! I'm sorry that I haven't updated the blog more, but I've been busy working on EMI's animations this week and I've made some excellent progress.

Some background: right now there is a basic implementation of EMI animations in ResidualVM, but the implementation is still far from perfect. I'm currently focusing on two major issues that I've identified. Firstly, the current implementation prioritizes animations in the order in which they were started, so the animation that is started last always overrides any previously applied animation (although there are specialized workarounds for some cases). Secondly, unlike in the original game, all animation transitions are instant due to the lack of animation blending. In this post I'm focusing mainly on the prioritization of animations.

If play order shouldn't matter, then how do we decide which animation takes precedence over another? We can get some ideas by looking at Grim Fandango, since it is based on an older version of the same engine. In Grim, each keyframe animation contains a priority field (actually two of them, but let's keep it simple) that controls the order in which animations are applied. A higher priority animation always takes precedence over a lower priority one.

One might ask what happens if two animations with the same priority play at the same time. The answer is that they are blended together, and the result is an average of the two animations. Again, the result is the same regardless of the order in which the animations were applied. I'll describe blending in more detail in a later post.

Knowing how the prioritization of animations was done in Grim, the first thing I did of course was to try to find a similar priority field in EMI's .animb animation data format. Perhaps unsurprisingly, it turns out there is one! However, the catch is that the priority field in EMI is bone-specific. In other words, an animation may specify a different priority value for each bone that the animation affects. (Note: EMI uses skeletal animation.)

For example, we could have an animation of Guybrush waving both of his arms. For the left hand the priority value could be 1, and for the right hand it could be 3. In addition, we could have a standing idle animation with a constant priority of 2 for all bones. Now, if both of these animations were applied at the same time, the result would be that Guybrush would wave his right arm, but the rest of his body would follow the standing idle animation.

Typical real priority values I've seen so far are 0 for base poses like standing idle, 1 for walk and run, 2 for holding an object in hands, and so on.

Without animation blending, adding support for the priority value is fairly straightforward. When applying the animation to the bones of the skeleton, we can keep track of the currently applied priority for each bone. If the animation's priority is higher than the bone's current priority, the animation replaces the current bone transform completely. Otherwise the animation is skipped. Using this method we can apply the animations in arbitrary order and the highest priority animation is always displayed. Of course this simplistic approach is still dependent on the order in which the animations are applied in the case where animation priorities are equal.
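A sketch of this bookkeeping, with illustrative types rather than ResidualVM's actual skeleton code:

// Blending-free case: apply animations in any order and keep, per bone, only
// the transform of the highest priority animation seen so far this frame.
#include <climits>
#include <vector>

struct BoneTransform {
    float pos[3];
    float rot[4];   // quaternion
};

struct Skeleton {
    std::vector<BoneTransform> bones;
    std::vector<int> appliedPriority;   // per-bone priority applied this frame

    void resetFrame() {
        appliedPriority.assign(bones.size(), INT_MIN);
    }

    // 'priorities' holds the animation's per-bone priority values (EMI's
    // .animb data is bone-specific); bones the animation does not touch
    // would simply be skipped by the caller.
    void applyAnimation(const std::vector<BoneTransform> &animPose,
                        const std::vector<int> &priorities) {
        for (size_t i = 0; i < bones.size(); ++i) {
            if (priorities[i] > appliedPriority[i]) {
                bones[i] = animPose[i];   // higher priority replaces the transform
                appliedPriority[i] = priorities[i];
            }
        }
    }
};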

Things start to get more complicated once animation blending is added into the mix, though. With blending, a higher priority animation may not completely replace a lower priority one if it has a blend weight lower than 100%. In that case some of the lower priority animation will "show through". In order to implement this properly, animations can no longer be applied in arbitrary order. Instead, we need to calculate the transformation for each bone by applying the animations in descending priority order, taking the blend weights into account at each step. This is further complicated by the fact that, since priorities are bone-specific, the order in which animations should be applied may be different for each bone.

I'll go into details on how I solved this in the next post. In the meantime you can check out my progress at https://github.com/Akz-/residual/commits/animations.

Surprise GIF