06 July 2022

Changing the face of storytelling in digital animation

Our research pushed the boundaries of emotive storytelling in cinema and gaming with facial motion capture technology, resulting in several high-profile awards.

Impact highlights

  • The collaborative team taught computers to recognise and track faces.
  • Detailed tracking of faces allowed translation of live performances onto 3D models.
  • This work underpins some of the most celebrated facial animations in cinema and gaming.

The challenge of creating realistic digital faces

Computer-generated animation has traditionally used motion capture techniques to transfer live performances onto 3D character models. Facial motion capture, however, remains the greatest technical challenge: effectively translating every emotive element from camera to 3D model means tracking the subtlest changes in expression. Without it, significant time and artistic skill are required to recreate lost nuance in post-production.

Teaching computers to recognise facial features

Our researchers developed statistical models of faces, capturing the variation in shape and photo-realistic appearance of facial features. They also developed an algorithm that used the models to successfully locate and track facial features in videos.

The models 'learnt' by analysing a large collection of facial imagery. Initially, this involved manually marking up multiple equivalent points, such as the centres of the eyes, in each image, but this was limited to images of people facing the camera.
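The learning step described above can be sketched with a toy example: given a set of manually marked landmark points, principal component analysis yields a mean shape plus a small number of modes of variation. The data, dimensions and 95% variance threshold below are invented for illustration only; the original research also aligned shapes and modelled photo-realistic appearance, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 50 faces, each annotated with 20 (x, y)
# landmark points (e.g. the centres of the eyes), flattened to vectors.
n_faces, n_points = 50, 20
base_face = rng.normal(size=n_points * 2)
training = base_face + 0.1 * rng.normal(size=(n_faces, n_points * 2))

# Learn the mean shape and the main modes of variation with PCA.
mean_shape = training.mean(axis=0)
centred = training - mean_shape
_, s, vt = np.linalg.svd(centred, full_matrices=False)

# Keep the modes that explain most of the shape variation (95% here).
variance = s**2 / (n_faces - 1)
k = int(np.searchsorted(np.cumsum(variance) / variance.sum(), 0.95)) + 1
modes = vt[:k]  # each row is one mode of variation

# Any plausible face shape is then the mean plus a weighted sum of
# modes: shape = mean_shape + b @ modes, with small coefficients b.
b = np.zeros(k)
b[0] = 1.0
new_shape = mean_shape + b @ modes  # 20 reconstructed (x, y) points
```

The key property is the last line: a whole face shape is compressed into a handful of coefficients, which is what makes locating and tracking features tractable.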

Later work automated this process and removed the restriction to frontal views. The team improved the way the model captured and used facial appearance statistics to track features, leading to increased accuracy.
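One way such a model can make tracking robust is by constraining what the tracker reports: candidate landmark positions found in a video frame are projected into the model's low-dimensional space and their coefficients clamped, so the tracked shape always stays face-like even when the raw detection is noisy. The function below is an illustrative sketch of that idea, not the team's actual algorithm; the matrices and the 3-standard-deviation limit are assumptions.

```python
import numpy as np

def constrain_to_model(candidate, mean_shape, modes, stdevs):
    """Project a noisy candidate shape onto a learnt face model.

    candidate, mean_shape: flattened landmark vectors, shape (2 * n_points,)
    modes: orthonormal principal modes, shape (k, 2 * n_points)
    stdevs: per-mode standard deviations over the training set, shape (k,)
    """
    # Express the candidate as coefficients of the modes...
    b = modes @ (candidate - mean_shape)
    # ...and clamp each coefficient to +/- 3 standard deviations so the
    # result is always a plausible face, however noisy the detector was.
    b = np.clip(b, -3 * stdevs, 3 * stdevs)
    return mean_shape + b @ modes

# Tiny worked example with a 2-point shape model (4 coordinates).
mean = np.zeros(4)
modes = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
stdevs = np.array([1.0, 1.0])
noisy = np.array([10.0, 0.5, 0.2, -0.1])  # detector overshoots badly
result = constrain_to_model(noisy, mean, modes, stdevs)
# The overshoot is clamped and off-model noise is discarded.
```

Components of the candidate outside the model's span are discarded along with the clamped overshoot, which is precisely the regularising effect that keeps per-frame tracking stable.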

The research laid the foundations for high-fidelity motion tracking (reproduction with little distortion), which can capture nuances in performance. This made robust face-tracking possible for augmented reality (AR) applications, and reduced the time and cost for animators developing fully realised character models with detailed, fluid and life-like motion.

Award-winning animation design

Gareth Edwards and Kevin Walker, two members of the research team, then started Image Metrics – building on the initial research to develop facial motion capture software for use in film and high-end video game production. Their technology was used in the production of hundreds of titles, including seven of the top ten AAA (triple-A) games from 2016 to 2020 and Academy Award-winning feature films. This included the digitally animated reverse-ageing of Brad Pitt in David Fincher’s The Curious Case of Benjamin Button, which won the 2009 Academy Award for Best Visual Effects.

In 2012, Image Metrics became a world-leading provider of AR filters, including the Die Hard Fan face-painting app, which allowed fans to apply face-paint filters to their selfies and was adopted for the 2016 Summer Olympics in Rio de Janeiro.

Before this, in 2009, Edwards also founded Cubic Motion – a leading provider of automated performance-driven facial animation for computer games. The company has provided facial animation for ten BAFTA-winning titles, most notably God of War and Hellblade: Senua’s Sacrifice, which both won the Award for Performer, recognising the best performance featured in a game from voice artistry through to motion capture.

The overall body of research was awarded the IEEE Face and Gesture Conference Test of Time Award in 2015. The nomination described the work as having “inspired many (if not all) subsequent work on deformable models for face analysis”.
