

How motion capture technology shaped characters in ‘Avengers: Infinity War’

(Cody Wilson/Daily Bruin)

By Kristin Snyder

Feb. 21, 2019 9:57 p.m.

Thanos’ destructive snap might have only taken a second, but creating his hands took much longer.

Alumnus Kelly Port, along with a team of other visual effects specialists at Digital Domain, brought Thanos to life in “Avengers: Infinity War,” leading to an Academy Award nomination for best achievement in visual effects.

As the digital effects supervisor for Digital Domain, Port said he and his team were responsible for over 500 of the approximately 2,700 shots that comprised the film, with several other companies handling the rest. Much of their work focused on creating Thanos, using motion capture technology to translate Josh Brolin’s performance from the set to the screen, Port said.

“I think a big part of it is capturing the detailed aspects of Brolin’s performance because we’re all so used to seeing human faces … so there’s a lot of critical analysis that happens in our brains at the subconscious level,” Port said. “When you capture that performance very tightly and accurately, that’s a big part of (making the character realistic).”

[RELATED: Movie review: ‘Avengers: Infinity War’]

To convey such nuances, Port said Brolin wore a helmet camera, which tracked his facial movements and fed the information to a machine learning system. During postproduction, visual effects specialists worked with animators to apply the data to Thanos, comparing Brolin’s on-set performance to the animated character. If Thanos’ emotional expression did not match Brolin’s, Port said they would model the character by hand on a computer to bridge the gap.

“Technology for visual effects improves year to year. … In terms of ability to do facial capture, this is really the pinnacle of being able to capture facial performance,” Port said. “We haven’t been able to capture it this well before.”

For Terry Notary, who played Thanos’ henchman Cull Obsidian and served as the on-set motion capture actor for Groot, such technology helps attribute human emotions to nonhuman characters. As he embodied both the alien henchman and the sentient tree, Notary said even the slightest nuances – an eye flickering or a lip quivering – are recorded and carried through to the final version of each character.

“It’s just like acting in a costume, but the animators put the costume on after your performance,” Notary said. “Everything you do in your performance is pretty much verbatim on the character.”

Actors whose characters’ costumes are created digitally are often shown mock-ups of what their final appearance will be after special effects are added. While working on set, actors are typically able to see their 3D characters mimicking their movements in real time as they act, which helps them find the movements that best exemplify their characters, Notary said. Brolin, for example, made sure his movements were powerful and heavy to match Thanos’ bulky frame, Notary said.

“You have to play this nonhuman character without any costume to hide underneath. You’ve got a motion capture suit, which is this skin-tight, Velcro suit,” he said. “There’s no hiding at all.”

However, actors unfamiliar with motion capture technology often make the mistake of overemphasizing their physical movements, he said. The more subtle movements, Notary said, are what actually help create a poignant, memorable scene. When working as Groot, he said he emphasized the teenage character’s vulnerable emotional state through inward-focused, slouching positions. But when Groot sacrifices his arm to help Thor, Notary wanted to convey the emotional shift the character experienced, using the motion capture technology to reflect his more confident movements, he said.

Diane Villaroman, who works as a programmer analyst for UCLA’s Virtual Reality Motion Capture (VR-MoCap) laboratory, said motion capture technology is useful outside of entertainment as well. For the lab’s research on spatial memory, Villaroman said researchers use small reflective markers to capture subtle facial movements and larger markers on the rest of the body to track broader physical movements – the same process that helped create Thanos.

[RELATED: Screening Science: How we’re comparing to the technology of the ‘Avengers’]

But the nuances of Brolin’s performance were still challenging to translate to the screen, Port said. While action scenes with swift movements and short shots are more forgiving, Port said the difficulty in creating Thanos lay in the multitude of long shots fixed solely on the character. The scene in which Thanos sees young Gamora was especially challenging, as it had to convey the moment of reflection Thanos experienced while considering the cost of what he had done.

“We had a lot of intimate framing – things like this were highly dramatic and subtle performance had to come through,” Port said. “The fact that (Thanos) was really the lead character of the film, with over an hour of screen time, made it critical that Thanos worked for the film.”

Kristin Snyder | Alumna
Snyder was previously the 2019-2020 Arts editor as well as the 2018-2019 Theater | Film | Television editor.