ToonFace

ToonFace - Simple & Expressive Realtime Animation Specification

Kris Thorisson, Catherine Pelachaud, Zsofia Ruttkay, Kristleifur Dadason

ToonFace is an evolving specification of an animation system for algorithmic/parametric generation of facial movement and speech. ToonFace provides a useful representation of muscles and motion control for systems that generate movement in real time and in which such movement can be interrupted at human perception-action loop timescales (around 100 msec). While everyone else seems to be striving for the most realistic faces, the goals of the ToonFace specification are:

- A "simplest possible" representation, with expressive powers like that of great cartoon characters
- Easily mapping onto any underlying A.I. machinery and being able to reflect its states at runtime
- Low enough CPU requirements to leave something for the rest of a character's "brain", or to run a ToonFace face on a handheld

There is no concept of "frames" in ToonFace; instead, instructions move control points to particular end-points over a specified time. Since frames are a special case of this mechanism, ToonFace is a higher-level interface than that provided by e.g. MPEG-4.
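To make the control-point model concrete, the following is a minimal sketch in Python of how such an instruction could be interpreted. The names (MoveInstruction, ControlPoint, issue, update) are hypothetical and not taken from the ToonFace 1.0 draft: each instruction names an end-point and a duration, and issuing a new instruction interrupts whatever movement is in progress.

```python
# Hypothetical sketch of the "no frames" idea: a control point is driven
# toward an end-point over a given duration, and a new instruction may
# interrupt the one in progress at any update tick. Names and structure
# are illustrative only, not part of the ToonFace specification.
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class MoveInstruction:
    target: float    # end-point for the control point (e.g. brow raise, 0..1)
    duration: float  # seconds in which to reach the target

class ControlPoint:
    def __init__(self, value: float = 0.0):
        self.value = value
        self._start = value
        self._instr: Optional[MoveInstruction] = None
        self._t0 = 0.0

    def issue(self, instr: MoveInstruction) -> None:
        """Start a new move; interrupts any move currently in progress."""
        self._start = self.value
        self._instr = instr
        self._t0 = time.monotonic()

    def update(self) -> float:
        """Advance the control point toward its target; call every tick."""
        if self._instr is not None:
            d = self._instr.duration
            frac = 1.0 if d <= 0 else min(1.0, (time.monotonic() - self._t0) / d)
            self.value = self._start + frac * (self._instr.target - self._start)
            if frac >= 1.0:
                self._instr = None
        return self.value

# Usage: raise a brow over 300 ms, then interrupt it halfway with a new target.
brow = ControlPoint()
brow.issue(MoveInstruction(target=1.0, duration=0.3))
time.sleep(0.15)
brow.update()
brow.issue(MoveInstruction(target=0.2, duration=0.1))  # interruption
```

A frame-based player can still be layered on top of such a model simply by sampling update() at a fixed rate, which is the sense in which frames are a special case of this mechanism.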

ToonFace is the product of the ToonFace Working Group, which holds on-line email discussions at Yahoo! Groups; if you are interested in participating, please send mail to [kris at media dot mit dot edu].


Project Items (downloadables)

| Name | Category | Sector | Description | Size | Date |
|------|----------|--------|-------------|------|------|
| ToonFace 1.0 Draft | Consumables | N/A | ToonFace, Complexity Layer 1 draft | 400 KBytes | 2004-06-05 |

Project created: 2005-11-16