Animating a three-dimensional avatar in an interactive setting requires motions and behaviours to be generated in real time.
In settings where only a limited number of behaviours are required, such as in some video games, this can be done adequately by replaying pre-prepared animation clips, either hand-composed or obtained from motion capture. In more open-ended settings, such as sign language or richly interactive virtual worlds, this becomes impractical, especially since each avatar, having its own geometry, would require its own version of every animation. A method of automatically synthesizing animations from an avatar-independent description of the actions to be performed can simplify the process enormously.

Synthesis of naturalistic animation, whether in real time or not, has been approached by various researchers through simulation of physical systems, either biological creatures or robots. Some simulations model the underlying biological control systems, while others fit parameterised curves to samples of human motion in order to generate variations.

The problem divides into two parts: movements must be specified in an avatar-independent way, and such specifications must be translated into animations for any given avatar in real time. We are currently studying these problems in the ViSiCAST and eSIGN projects, whose goal is the provision of deaf sign language support on the web and in other interactive multimedia environments. We have used the HamNoSys notation for signing gestures, which was originally developed as a way for sign language researchers to record signs in writing. Software written for the ViSiCAST project combines these sign descriptions with a description of the avatar's geometry to generate animation data. Current research focuses on extending the repertoire of gestures beyond signing to general movement, re-examining the gesture notation in the process to determine the requirements for a revised notation designed specifically for computer animation.
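The key idea of avatar independence can be illustrated with a minimal sketch. This is a hypothetical simplification, not the ViSiCAST software or the HamNoSys notation itself: a gesture description names a body location symbolically (here `"chest"`), and a small table of fractional offsets is resolved against each avatar's own measurements, so a single description yields correctly scaled coordinates for every avatar. All names (`AvatarGeometry`, `resolve_location`, the offset values) are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class AvatarGeometry:
    """Per-avatar measurements, in metres (illustrative only)."""
    shoulder_height: float
    shoulder_width: float
    arm_length: float

# Symbolic body locations, loosely in the spirit of sign-notation location
# symbols (the real notation is far richer). Offsets are fractions of the
# avatar's own proportions: (x right of body midline, y relative to the
# shoulder line, z forward of the body).
LOCATION_OFFSETS = {
    "chest":    (0.0, -0.2, 0.25),
    "chin":     (0.0, 0.15, 0.20),
    "shoulder": (0.5, 0.0, 0.0),
}

def resolve_location(symbol, avatar):
    """Resolve an avatar-independent location symbol to coordinates
    for one specific avatar, scaling by that avatar's geometry."""
    fx, fy, fz = LOCATION_OFFSETS[symbol]
    return (fx * avatar.shoulder_width,
            avatar.shoulder_height + fy * avatar.arm_length,
            fz * avatar.arm_length)
```

The same symbolic description then produces different, but proportionally consistent, target coordinates for a tall avatar and a short one, which is the property that lets one gesture repertoire drive many avatars.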