The visual effects of A Nightmare on Elm Street
Director Samuel Bayer’s 2010 re-imagining of A Nightmare on Elm Street, arriving on DVD and Blu-ray in October, sees serial killer Freddy Krueger once again haunting and killing people in their dreams. Freddy’s famously gruesome face was the combined effort of practical prosthetics and digital augmentation from Method Studios. We talk to Method visual effects supervisor Sean Faden about the work.
What were some of the main challenges of the digital Freddy face shots?
Faden: Well, the original plan with Freddy’s face was to get the most we could from prosthetics with a modest budget. We tried to keep the digital effects work to certain regions of his face, but at some point in production they asked to go a lot further with it. We ended up having to put the CG treatment over a larger portion of his face. I think it was around 70 or 80 face shots. Every one of them required a full 3D track of the face and a matchmove of Freddy’s performance, plus full 3D rendering and compositing of Freddy’s face. So instead of being one patch of his cheek, it was the entire left side of his face and nose and chin.
Freddy wasn’t very stoic, either. He was moving around a lot. I had actually just come off work on the Marcus shots for Terminator: Salvation at Asylum. The good thing about Marcus was that he didn’t move his face very much – he was pretty stoic. As hard as it was to track things onto him, it was ten times harder to track things onto Freddy. We were involved in a re-design of the Freddy face and one of our lead compositing supes here at Method, Olivier Dumont, who’s also a 3D guy, put together two test shots from some concept art and we floored the producer and director with those. We compared everything we did from that point on with those two shots. Sometimes it’s good just to get two shots out there to get a buyoff on and move on from them.
So how did you take the Freddy face effects further?
Faden: Freddy was conceived to be mostly prosthetic with a few choice holes on his left cheek. The idea was to take the prosthetic, which was really nice work by Andrew Clement of Creative Character Engineering, and add some CG to it so the audience wouldn’t know how it was done. We wanted to blur the lines a little bit. We had certain regions of green marked out on his face that we knew we had to augment. In the end, things changed and we had to do a re-design. So it probably turned out to be 50 per cent real and 50 per cent CG. We had a scan of Jackie Earle Haley as Freddy in his makeup and took that model into ZBrush and basically sculpted in 3D. It’s really just like working with clay – you can adjust your tools and pressure, and it’s great for creating realistic displacement maps because you’re actually pulling and pushing geometry.
Our lead modeler, Masa Narita, is a Freddy maniac. He sculpted the model and ended up with a great design. He textured it as well.
It was animated and matchmoved with Maya and PFTrack. It was lit and rendered in RenderMan through Houdini and composited in Nuke. We had about ten layers of mattes for the compositors in order to give them a lot of flexibility. The beauty of Nuke was that they were able to do a lot of 3D projections. They were able to import the animating head, then a compositor could draw a roto-matte in the space of the head. So if they drew a matte around the mouth, they could draw it on one frame and because the track lined up perfectly, that roto would then automatically track with the head. So if they had to make an adjustment they could just tweak one frame. One of the hardest things about doing this work is the blending between the real and the CG. If a shot took two weeks, ten days might be just working on those blend points. Nuke was a huge advantage in getting through all that stuff and we could set up templates to help us get some consistency across shots.
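The roto trick Faden describes – draw the matte once, let the 3D track carry it – can be illustrated outside of Nuke. The sketch below is a hypothetical illustration, not Method’s pipeline: roto points are stored in the head’s local space and re-projected each frame through the matchmoved transform, so a shape drawn on one frame lands correctly on every other (the `project` function and simple pinhole camera are assumptions for the example).

```python
import numpy as np

def project(points_obj, head_matrix, focal=50.0, width=1920, height=1080):
    """Project object-space roto points to screen space for one frame.

    points_obj  : (N, 3) roto points drawn in the head's local space
    head_matrix : 4x4 object-to-camera transform from the matchmove
    """
    pts = np.hstack([points_obj, np.ones((len(points_obj), 1))])
    cam = (head_matrix @ pts.T).T[:, :3]             # into camera space
    x = focal * cam[:, 0] / -cam[:, 2] + width / 2   # simple pinhole
    y = focal * cam[:, 1] / -cam[:, 2] + height / 2
    return np.stack([x, y], axis=1)

# A roto point drawn around the mouth on one frame...
roto = np.array([[0.1, -0.2, 0.05]])

# ...follows the head automatically as the tracked matrix changes.
frame1 = np.eye(4); frame1[2, 3] = -5.0     # head 5 units from camera
frame2 = frame1.copy(); frame2[0, 3] = 0.5  # head has moved right

print(project(roto, frame1))
print(project(roto, frame2))
```

The point is the one Faden makes: an adjustment is a tweak to the object-space shape on a single frame, and the track does the rest.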
How did you go about matching the prosthetic or skin to your CG face?
Faden: One of the things about the prosthetic was that it didn’t always behave like real skin. So sometimes you’d have the prosthetic right up against the CG and the CG would be doing all the proper subsurface scattering and everything which would be different to the prosthetic. A lot of times we had to dull down the prosthetic in areas where it came up against the CG work, just to help bridge the two. Also, sometimes the prosthetic wiggled and moved in different ways to real skin as well, because it was stuck onto something on top of his skin. So it made it tricky to stick things around the areas that moved the most, like the chin. One of our artists, Chris Bankhoff, created a really cool 2D magnet system within Nuke where you could set up a tracker and control the fall-off. You could create as many as you wanted and literally stick it on. If the CG was in the right place, you could lock it onto a CG tracked point based on the fall-off. We had something similar to that on Terminator: Salvation but we did it as a 3D thing. This worked really well in Nuke and the 3D capabilities of Nuke allowed it to be pretty solid. Plus there were a bunch of compositing tricks we added to help sell the shots.
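Bankhoff’s magnet tool isn’t public, but the core idea as described – pinning a patch to a tracked point with a controllable fall-off – can be sketched in a few lines. This is a minimal guess at the mechanism, with hypothetical names and an assumed smoothstep fall-off, not the actual Nuke gizmo:

```python
import numpy as np

def magnet_offset(points, anchor, delta, radius):
    """Displace 2D points toward a moved tracker with smooth fall-off.

    points : (N, 2) positions to displace (e.g. corners of a patch)
    anchor : tracker position on the reference frame
    delta  : how far the tracker has moved on the current frame
    radius : fall-off radius; influence fades to zero at this distance
    """
    dist = np.linalg.norm(points - anchor, axis=1)
    t = np.clip(1.0 - dist / radius, 0.0, 1.0)
    weight = t * t * (3.0 - 2.0 * t)        # smoothstep fall-off
    return points + weight[:, None] * delta

corners = np.array([[100.0, 100.0], [100.0, 300.0]])
anchor = np.array([100.0, 100.0])           # tracker on the reference frame
delta = np.array([10.0, 0.0])               # tracker moved 10 px right

moved = magnet_offset(corners, anchor, delta, radius=250.0)
print(moved)   # the corner on the tracker follows fully, the far one barely
```

Stacking several of these, each with its own tracker and radius, gives the “create as many as you wanted and literally stick it on” behaviour Faden mentions.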
What were some of the other effects shots Method worked on for Nightmare?
Faden: We also did what was referred to as the nightmare effects. This was mostly 2D work to add to the feel of the ‘micronaps’ – when you’ve been awake for 72 hours or more, your brain involuntarily shuts down and you don’t even realise you’ve taken a dip into sleep dream time. We did a lot of Flame work and Nuke compositing for that. It involved what Sam Bayer called a ‘squishy’ lens, which is actually a real lens you can put on a camera that gives a subtle warping effect and makes the lights bleed. I didn’t want them to shoot the Freddy shots with a squishy lens because that would have made our effects work rather difficult, so we emulated that and applied it to a bunch of shots. It adds just enough distortion to the scenes to tell you that something’s not quite right. And then we also re-did the famous stretchy wall from the original movie. Back then they had Freddy pushing through a piece of latex, and that actually was very effective for 1984. It was scary and pretty cool. But for this one it was a challenge to try and bring that into the 21st century.
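A digital stand-in for a squishy lens comes down to gently warping the sampling coordinates of the plate. The sketch below is an assumption about the general approach – a small radial (barrel-style) distortion term – not Method’s Flame or Nuke setup; the light bleed would be a separate glow pass:

```python
import numpy as np

def squishy_coords(width, height, k=0.08):
    """Return warped sampling coordinates for a subtle radial distortion.

    k > 0 pushes samples outward with the square of the radius, so the
    centre of frame is untouched and the edges 'squish'. Resampling the
    plate at these coordinates produces the warp.
    """
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    # normalize to [-1, 1] around the frame centre
    nx = (xs - width / 2) / (width / 2)
    ny = (ys - height / 2) / (height / 2)
    r2 = nx * nx + ny * ny
    scale = 1.0 + k * r2                     # distortion grows with radius
    wx = nx * scale * (width / 2) + width / 2
    wy = ny * scale * (height / 2) + height / 2
    return wx, wy

wx, wy = squishy_coords(640, 480)
# the centre sample barely moves; the corners are pushed furthest
```

Keeping `k` small matches the intent Faden describes: just enough distortion to unsettle the frame without announcing itself.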
We did a couple of CG walls with our Freddy character driving a cloth simulation, which then drove additional detail work in Houdini to finesse the shaping and wrinkling around Freddy. Then there were a bunch of one-off effects shots for the Freddy kills, especially wire removal.
Later in the movie when Nancy is in the pharmacy and she’s getting really tired and having micronaps, they wanted her to be stumbling down this aisle with the camera moving and dipping in and out of Freddy’s world. It had to be choreographed quite closely, so when we’re in the pharmacy we had to see her fighting against nothing, and when she’s in Freddy’s world, which is a boiler room, there’s a narrow hallway with pipes to match the scale of the pharmacy. We set up the sequence and Halon previz’d it.
I took the previz and cut up the shots for Sam to sell the idea to the producers. We had a moco rig that barely fit in the aisles of the pharmacy set, and then two weeks later shot Nancy in a real location, which was an old paint factory with a fake wall on one side. We had mixers there to make sure we were roughly matching the earlier performance. There’s a couple of shots where we did a flickering effect in post, but most of it was done as quick cuts. It’s fun when it falls on your shoulders to figure that kind of stuff out. I love it. And I think if we had done a lot of digital effects work there to peel layers away, say, it could have turned out cheesy, so I’m glad it was more of a ‘blink of an eye’ thing.
Interview by Ian Failes