By Zach Johnson
When Thanos waged war on Wakanda at the end of Marvel Studios’ Avengers: Infinity War, it was Mark Ruffalo’s Bruce Banner—and not The Hulk—who took a stand against the Mad Titan. It was hard to blame “the big green guy” for sitting that fight out, considering his surprising defeat at the beginning of the movie. But in Avengers: Endgame, the epic conclusion to the Infinity Saga, Banner and The Hulk had fused into a character the filmmakers dubbed Smart Hulk—and when the surviving Avengers came calling, he played a key role in their master plan.
Animating the new character proved quite a feat. “One of the biggest challenges of Avengers: Endgame was the creation of the new Hulk,” Dan DeLeeuw, the film’s visual effects supervisor, recalls. “In this incarnation, Banner has reconciled with the Hulk and has become a merged character. He is the best of both worlds, with the intelligence of Banner and the brawn of Hulk.”
It’s a concept Smart Hulk explains over pancakes when Steve Rogers/Captain America (Chris Evans), Natasha Romanoff/Black Widow (Scarlett Johansson), and Scott Lang/Ant-Man (Paul Rudd) seek his expertise. “For years, I’d been treating the Hulk like he’s some kind of disease, something to get rid of,” Banner tells his friends. “But then I started looking at him as the cure.”
The technology required to bring Smart Hulk to life onscreen would no doubt impress a scientist like Banner. Under the watchful eye of visual effects supervisor Russell Earl, Industrial Light & Magic (ILM) used Anyma, a technology DisneyResearch|Studios (DRS) created in 2015 and has continued to develop ever since, to do just that. (No gamma radiation required.) Unlike its predecessor, the award-winning Medusa Performance Capture System, Anyma “is a fully artificial word,” explains principal research scientist Thabo Beeler. “The word contains ‘animation,’ but ‘anima’ in Latin also means ‘soul,’ and that’s what we wanted to get to: We wanted to capture the soul of the actor.” Anyma is revolutionizing digital character animation, adds senior research scientist Derek Bradley: “It’s not just helmet camera capture technology; it’s defining the next generation of performance capture that people hadn’t thought of before.”
DeLeeuw says Anyma created a richer environment for both Ruffalo and the filmmakers. “Having the actors perform on set with our motion capture characters was our primary goal,” he says. “As a result, the Anyma software needed to be adapted to work from helmet-cam footage. This gave us the freedom to photograph scenes organically and without restrictions.”
Filming was in full swing when DRS joined the production. “We did the test and it looked, obviously, fantastic,” Beeler recalls. Part of Anyma’s appeal is its ease of use—a necessary progression, as Ruffalo’s performance was “particularly emotive,” DeLeeuw explains, “and would require new techniques to capture the subtleties of his facial movement.”
Ruffalo “has very distinct facial movements and a subtle asymmetry that make him unique,” Earl echoes, praising the actor’s “subtle and nuanced” performance. “This in turn required a higher level of fidelity in our facial solves.” ILM already had a robust facial system in place, but after seeing the initial Anyma results, Earl says, the team realized “it could be instrumental in achieving this higher level of fidelity and provide more control over the ultimate solved capture data…”
It was the perfect time to use Anyma, which up until this movie had not been seen onscreen. “You can set up cameras that are farther away, so the actors can actually move around and they can actually perform,” Beeler explains. “It’s very flexible, because it leverages the shapes you acquire with Medusa, and through this, it builds a very specific digital puppet of the actor.”
Of course, none of this would have been possible without the groundbreaking technologies that preceded Anyma. “Typically, such innovations build on top of each other. They’re built on experience, and one idea leads to the next one,” says Markus Gross, vice president of Research. “All the experience and the expertise Thabo, Derek, and their teams collected during the designing and the conceiving of Medusa inspired them to create Anyma.”
ILM is now integrating Anyma into other Disney productions, and the technology will only lead to further advancements. “I think what’s really important to emphasize is that this requires a long-term strategy,” Gross says. “This is what DRS stands for as an institution, as an organization within the company. Our aim is to push the forefront of technological innovations in service of our film production business.” He credits David Taritero, senior vice president of Visual Effects and Production for The Walt Disney Studios, for being the team’s early champion.
Taritero says DRS did “amazing” work, all made possible through Anyma technology: “The finished product is consistently better utilizing DRS technology than it would’ve been without.”
Because of the DRS and ILM teams, Hulk lives up to his incredible name.