MPC on its digital make-up advancements in ‘The Mummy’

Previously on vfxblog, The Mummy’s overall visual effects supervisor Erik Nash weighed in on crafting the 5 stages of transformation of Princess Ahmanet. Key to bringing her to life in closer-to-human form was augmenting real photography of Sofia Boutella, taking away portions of her face or body to show the regeneration process.

MPC was behind that work, and the studio has in fact been pioneering augmentation and ‘digital make-up’ work lately on films such as Ghost in the Shell and Pirates of the Caribbean: Dead Men Tell No Tales. MPC visual effects supervisor Greg Butler outlines how it was accomplished in The Mummy in this overview for vfxblog.

Starting with a physical performer

Greg Butler: Starting with a real actor did a couple of things at once. It kept the movie visually grounded the entire time. There was obviously the occasional exception for a difficult stunt shot, or for something that came up too late or that we had to invent in post, but for most of the footage – whether it was the Ahmanet work we did, the undead attacking in the church, the undead attacking in the tunnels, or the undead swimming after Tom in the water – there was always a physical performer. We at least generally kept their torsos and their costumes.

If you look at a shot, the first things your eyes go to subconsciously are questions like, “Is the cloth moving right? Is the character moving with weight?” Your brain immediately answers all of those with, “Yep, everything’s completely real,” because those essential aspects of the characters were real.

So anything we did to them at that point, as long as we did a decent job of marrying the material and integrating it, we always had a solid foundation; we would never stray far at all. There was no Uncanny Valley to anything, and it was such a strong platform that even when we had to insert a shot here and there where the characters were fully digital, you’re so solidly in the sequence that you don’t even notice, because the real material is carrying you along. That’s far better than having the fully digital thing carry the whole sequence all the time, where all it takes is one slip-up in a cloth sim with not quite the right look and you reject the whole thing: “Oh, that was all fake.”

I don’t know that there really are moments in the movie where that would happen to the audience, because so much of it was grounded by the real stuff. That was a decision made early on in the movie, and we even had to keep fighting against the occasional desire, from whatever quarter, of, “Oh, can’t we just replace that guy and go digital?” Well, yes, but think about what that will do to the whole thing. It also kept people’s focus on the shoot: “Is this really how you want the guy to move? Because this is it, he’s not being replaced, he’s not being painted out.” Of course, we never said, “No,” if there was a particular shot that didn’t work, and there were a few, so we did replace some people. But really the bulk of the choreography was what the director and the stunt coordinator rehearsed and did on set.

Augmentation techniques

We didn’t necessarily reinvent the wheel for this – we used some traditional, known techniques, refined the process, and glued some bits together, one of the new bits being the optical flow approach in NUKE, and it meant that we were able to get to a usable result much quicker. (MPC is presenting on the approach at SIGGRAPH this year, with a Talk by Curtis Andrus.)
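The core of the optical flow idea Butler describes is that every pixel of a plate gets a per-frame motion vector, which can then carry points on the face along the shot. As a rough illustration only – MPC’s actual setup ran inside NUKE, and the filenames and seed points below are hypothetical – here is what that pixel-tracking step could look like with OpenCV’s Farneback flow in Python:

```python
# Illustrative sketch, not MPC's pipeline: propagate 2D face points
# across a shot by integrating dense optical flow frame to frame.
import cv2
import numpy as np

def track_points_with_flow(frame_paths, seed_points):
    """frame_paths: ordered list of image files for the shot.
    seed_points: (N, 2) float array of x,y positions on the first frame.
    Returns a list of (N, 2) arrays, one per frame."""
    prev = cv2.cvtColor(cv2.imread(frame_paths[0]), cv2.COLOR_BGR2GRAY)
    pts = seed_points.astype(np.float32)
    tracks = [pts.copy()]
    for path in frame_paths[1:]:
        cur = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        # Dense flow: one motion vector per pixel of the previous frame.
        flow = cv2.calcOpticalFlowFarneback(
            prev, cur, None,
            pyr_scale=0.5, levels=4, winsize=21,
            iterations=3, poly_n=7, poly_sigma=1.5, flags=0)
        # Sample the flow field at each tracked point and advect it.
        xi = np.clip(pts[:, 0].round().astype(int), 0, flow.shape[1] - 1)
        yi = np.clip(pts[:, 1].round().astype(int), 0, flow.shape[0] - 1)
        pts = pts + flow[yi, xi]
        tracks.append(pts.copy())
        prev = cur
    return tracks
```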

I dealt with way more tracking – roto-animation, body tracking, skull tracking and facial deformation tracking – than on any project I’ve ever dealt with. Hundreds and hundreds and hundreds of shots. And if we hadn’t come up with ways to create efficiencies and get good at repeating a particular technique and process – even the quality control aspect of reviewing the work – we would never have made it through.

One of the key things that really made it possible was that, early on, we knew that for the character of Ahmanet in stages 3 and 4 – when we’d be replacing parts of her face, replacing her eyes, adding on digital hair – everything was dependent on getting a solid track as soon as possible so that all the other departments’ work could make it into the shot.

We continued to try to get shots ready to produce without a locked-down design and therefore a locked-down asset. But we had at least one hard and fast rule: we would never replace or augment what we call her performance zone – her lips, her eyelids, the small facial details that really represent the minute muscle twitches of a particular actor speaking lines and emoting.

So even though in many shots we took over, let’s say, 70% of her face, that remaining 30% was all her, and it meant that we didn’t have to move into facial motion capture or post-production methodologies to do a digital face – which would have increased the time, the complexity and the scrutiny to a whole other level. As long as we could stick with the broader areas of the face, then after a bunch of testing we found that we could use NUKE’s optical flow tools to work off of a plate, get the pixels all tracking, and then convert those pixels into a 3D model of the facial movement and render that just as if it had come out of motion capture or a full 3D pipeline.
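The step Butler mentions – converting tracked pixels into 3D facial movement – can be sketched under a deliberately simple assumption that is not necessarily what MPC’s pipeline did: keep each vertex’s camera-space depth from the roto-animated base mesh, and re-solve only its screen-plane position from the 2D track. The intrinsic matrix K and the arrays here are hypothetical:

```python
# Minimal sketch, under a stated simplification: each vertex keeps its
# camera-space depth from the body track; only x,y are re-solved from
# the tracked pixel position via pinhole unprojection.
import numpy as np

def deform_vertices_from_tracks(verts_cam, tracks_px, K):
    """verts_cam: (N, 3) vertices in camera space (z > 0, from the base track).
    tracks_px: (N, 2) tracked pixel positions for this frame.
    K: 3x3 camera intrinsic matrix (hypothetical values).
    Returns updated (N, 3) camera-space vertices."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    z = verts_cam[:, 2]                     # keep depth from the base track
    x = (tracks_px[:, 0] - cx) * z / fx     # unproject x at that depth
    y = (tracks_px[:, 1] - cy) * z / fy     # unproject y at that depth
    return np.stack([x, y, z], axis=1)
```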

I think about her eyes a lot, because we replaced her eyes in almost every shot in the movie. What made those shots work so quickly for us is that I always tried to use the plate for the most intricate parts of the eye – the caruncle, the meniscus, the eyelashes – and to use the least amount of pixels possible while still providing what’s necessary to show that she has these strange, alluring, monstrous double pupils that represent that she’s not completely human. That was a technique we spent a long time on up front, but by the time we were finishing the movie we could turn those shots around in very little time, because we would take the plate, we knew what we needed to put CG into, and we would just keep knocking them out.
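To illustrate the “least amount of pixels possible” idea in hedged form: the CG double-pupil render gets composited over the plate only inside a small, soft-edged matte around the iris, so the plate’s eyelashes, caruncle and tear-line stay untouched. Everything here – the arrays, the circular matte, the function name – is an illustrative assumption, not MPC’s compositing script:

```python
# Hedged illustration: composite a CG eye render over the plate only
# inside a soft disc matte, preserving the plate outside the iris.
import numpy as np

def comp_eye(plate, cg_eye, iris_center, iris_radius, softness=4.0):
    """plate, cg_eye: (H, W, 3) float images; returns the composited frame."""
    h, w = plate.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    dist = np.hypot(xx - iris_center[0], yy - iris_center[1])
    # Soft-edged disc matte: 1 inside the iris, feathered over `softness` px.
    matte = np.clip((iris_radius - dist) / softness, 0.0, 1.0)[..., None]
    return cg_eye * matte + plate * (1.0 - matte)
```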

MPC’s run of digital make-up projects

One of the interesting things is that we’re doing so many movies at the same time, and although all the supervisors don’t talk on any regular basis, we and our teams are aware enough of the fact that something else is going on. In this case it was the latest Pirates of the Caribbean movie – they were dealing with lots of roto-animation and lots of facial tracking, and they were starting to experiment with NUKE. We were just behind them in the schedule, so when they started to experiment, we said, “Hey, I think that’s what we need.”

Captain Salazar from Pirates 5 involved both meticulous face tracking and hair simulation by MPC.

We took their experiment and turned it into a pipeline, and Ghost in the Shell was just ahead of us, so all of these projects worldwide were kind of all going and all looking at each other, thinking, “Am I missing something? Did they just figure out something that I could benefit from?”

It can be tough to maintain that number of films in production across the company, but there’s a huge advantage in the shared knowledge. And when somebody figures out a good way to do something, it immediately rolls into whatever project isn’t done yet.

See vfxblog’s Breaking down Princess Ahmanet’s 5 stages in ‘The Mummy’
