The visual effects of A Nightmare on Elm Street


Director Samuel Bayer’s 2010 re-imagining of A Nightmare on Elm Street, due on DVD and Blu-ray in October, sees serial killer Freddy Krueger once again haunting and killing people in their dreams. Freddy’s famously gruesome face was realised through a combination of practical prosthetics and digital augmentation from Method Studios. We talk to Method visual effects supervisor Sean Faden about the work.



What were some of the main challenges of the digital Freddy face shots?



Faden: Well, the original plan with Freddy’s face was to get the most we could from prosthetics with a modest budget. We tried to keep the digital effects work to certain regions of his face, but at some point in production they asked to go a lot further with it. We ended up having to put the CG treatment over a larger portion of his face. I think it was around 70 or 80 face shots. Every one of them required a full 3D track of the face and a matchmove of Freddy’s performance, plus full 3D rendering and compositing of Freddy’s face. So instead of being one patch of his cheek, it was the entire left side of his face and nose and chin.

Freddy wasn’t very stoic, either. He was moving around a lot. I had actually just come off work on the Marcus shots for Terminator: Salvation at Asylum. The good thing about Marcus was that he didn’t move his face very much – he was pretty stoic. As hard as it was to track things onto him, it was ten times harder to track things onto Freddy. We were involved in a re-design of the Freddy face, and one of our lead compositing supes here at Method, Olivier Dumont, who’s also a 3D guy, put together two test shots from some concept art, and we floored the producer and director with those. We compared everything we did from that point on with those two shots. Sometimes it’s good just to get two shots out there, get buy-off on them and move on.



So how did you take the Freddy face effects further?

Faden: Freddy was conceived to be mostly prosthetic with a few choice holes on his left cheek. The idea was to take the prosthetic, which was really nice work by Andrew Clement of Creative Character Engineering, and add some CG to it so the audience wouldn’t know how it was done. We wanted to blur the lines a little bit. We had certain regions of green marked out on his face that we knew we had to augment. In the end, things changed and we had to do a re-design. So it probably turned out to be 50 per cent real and 50 per cent CG. We had a scan of Jackie Earle Haley as Freddy in his makeup and took that model into ZBrush and basically sculpted in 3D. It’s really just like working with clay: you can adjust your tools and pressure, and it’s great for creating realistic displacement maps because you’re actually pulling and pushing geometry.
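
A minimal sketch of the idea Faden describes (hypothetical NumPy code, not part of Method’s pipeline): a displacement map stores how far the sculpted detail sits above or below the smooth base mesh along the surface normal, and the renderer pushes the base geometry back out by that amount.

import numpy as np

def displacement_value(base_point, base_normal, sculpt_point):
    # Signed distance from the smooth base surface to the sculpted
    # surface, measured along the base normal. This is the value
    # that gets baked into the displacement map.
    return float(np.dot(sculpt_point - base_point, base_normal))

def displace(base_point, base_normal, disp, scale=1.0):
    # At render time, push the base surface back out by the stored value.
    return base_point + base_normal * disp * scale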

Our lead modeler, Masa Narita, is a Freddy maniac. He sculpted the model and ended up with a great design. He textured it as well.

It was animated and matchmoved with Maya and PFTrack. It was lit and rendered in RenderMan through Houdini and composited in Nuke. We had about ten layers of mattes for the compositors in order to give them a lot of flexibility. The beauty of Nuke was that they were able to do a lot of 3D projections. They were able to import the animating head, then a compositor could draw a roto-matte in the space of the head. So if they drew a matte around the mouth, they could draw it on one frame and because the track lined up perfectly, that roto would then automatically track with the head. So if they had to make an adjustment they could just tweak one frame. One of the hardest things about doing this work is the blending between the real and the CG. If a shot took two weeks, ten days might be just working on those blend points. Nuke was a huge advantage in getting through all that stuff and we could set up templates to help us get some consistency across shots.
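
To illustrate the projection workflow Faden describes, here is a rough sketch using Nuke’s Python API: the plate is projected through the matchmoved camera onto the animated head geometry, so a roto matte drawn in that space automatically tracks with the face. The file paths and names below are hypothetical, and node class names can vary slightly between Nuke versions.

import nuke

# Filmed plate and the matchmoved shot camera (e.g. exported from PFTrack)
plate = nuke.nodes.Read(file='plate/freddy_plate.%04d.exr')
cam = nuke.nodes.Camera2(read_from_file=True, file='matchmove/shot_cam.chan')

# Animated head geometry from the facial matchmove
head = nuke.nodes.ReadGeo2(file='geo/freddy_head.%04d.obj')

# Project the plate through the camera and apply it as a material,
# so anything drawn in this projected space sticks to the moving face
proj = nuke.nodes.Project3D()
proj.setInput(0, plate)
proj.setInput(1, cam)

mat = nuke.nodes.ApplyMaterial()
mat.setInput(0, head)
mat.setInput(1, proj)

# Render the textured head back through the same camera
render = nuke.nodes.ScanlineRender()
render.setInput(1, mat)   # input 1: obj/scene
render.setInput(2, cam)   # input 2: camera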



How did you go about matching the prosthetic or skin to your CG face?



Faden: One of the things about the prosthetic was that it didn’t always behave like real skin. So sometimes you’d have the prosthetic right up against the CG, and the CG would be doing all the proper subsurface scattering and everything, which would be different to the prosthetic. A lot of times we had to dull down the prosthetic in areas where it came up against the CG work, just to help bridge the two. The prosthetic also wiggled and moved in different ways to real skin, because it was stuck onto something on top of his skin. So it made it tricky to stick things around the areas that moved the most, like the chin. One of our artists, Chris Bankhoff, created a really cool 2D magnet system within Nuke where you could set up a tracker and control the fall-off. You could create as many as you wanted and literally stick them on. If the CG was in the right place, you could lock them onto a CG tracked point based on the fall-off. We had something similar to that on Terminator: Salvation, but we did it as a 3D thing. This worked really well in Nuke, and the 3D capabilities of Nuke made it pretty solid. Plus there were a bunch of compositing tricks we added to help sell the shots.
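
The magnet tool itself is proprietary, so the sketch below is only an assumption about how such a system might work, not Chris Bankhoff’s implementation: each magnet drags nearby pixels toward a target position, with influence fading linearly to zero at a user-set fall-off radius.

import numpy as np

def magnet_offsets(height, width, magnets):
    # magnets: list of ((src_x, src_y), (dst_x, dst_y), radius) tuples.
    # Returns a per-pixel (dy, dx) offset field that a warper can apply.
    ys, xs = np.mgrid[0:height, 0:width].astype(np.float64)
    offsets = np.zeros((height, width, 2))
    for (sx, sy), (dx, dy), radius in magnets:
        dist = np.hypot(xs - sx, ys - sy)
        weight = np.clip(1.0 - dist / radius, 0.0, 1.0)  # linear fall-off
        offsets[..., 0] += weight * (dy - sy)
        offsets[..., 1] += weight * (dx - sx)
    return offsets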



What were some of the other effects shots Method worked on for Nightmare?



Faden: We also did what was referred to as the nightmare effects. This was mostly 2D work to add to the feel of the ‘micronaps’ – when you’ve been awake for 72 hours or more, your brain involuntarily shuts down and you don’t even realise you’ve taken a dip into sleep dream time. We did a lot of Flame work and Nuke compositing for that. It involved what Sam Bayer called a ‘squishy’ lens, which was actually a real lens they can put on a camera that gives a subtle warping effect and makes the lights bleed. I didn’t want them to shoot the Freddy shots with a squishy lens because that would have made our effects work rather difficult, so we emulated it and applied it to a bunch of shots. It adds just enough distortion to the scenes to tell you something’s not quite right. And then we also re-did the famous stretchy wall from the original movie. Back then they had Freddy pushing through a piece of latex, and that actually was very effective for 1984. It was scary and pretty cool. But for this one it was a challenge to try and bring that into the 21st century.
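
Faden doesn’t detail how the squishy lens was emulated, so the following is a generic radial warp that gives a feel for the kind of distortion involved, not Method’s actual Flame or Nuke setup. Resampling the frame at these coordinates makes straight lines bow subtly, and animating k over time gives a ‘breathing’ quality.

import numpy as np

def squishy_coords(height, width, k=0.08):
    # Returns (src_y, src_x) sample coordinates for a mild radial
    # distortion; warp the frame by sampling the source image there.
    ys, xs = np.mgrid[0:height, 0:width].astype(np.float64)
    ny = (ys - height / 2.0) / (height / 2.0)   # normalised, centred coords
    nx = (xs - width / 2.0) / (width / 2.0)
    r2 = nx * nx + ny * ny
    scale = 1.0 + k * r2                        # distortion grows toward edges
    src_y = ny * scale * (height / 2.0) + height / 2.0
    src_x = nx * scale * (width / 2.0) + width / 2.0
    return src_y, src_x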

We did a couple of CG walls with our Freddy character driving a cloth simulation, which in turn drove additional detail work in Houdini to finesse the shaping and wrinkling around Freddy. Then there were a bunch of one-off effects shots for the Freddy kills, especially wire removal.
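
As a conceptual stand-in for the cloth simulation and Houdini detail passes (hypothetical NumPy code, not Method’s setup), the sketch below bulges flat wall vertices toward the camera in proportion to how close the pushing head geometry sits behind them, the way latex stretches over a face.

import numpy as np

def stretch_wall(wall_verts, head_verts, max_push=0.3, radius=0.5):
    # wall_verts: (N, 3) points on a flat wall in the z = 0 plane.
    # head_verts: (M, 3) points on the head pressing from behind.
    # Returns wall vertices displaced toward the camera.
    out = wall_verts.copy()
    for i, v in enumerate(wall_verts):
        # in-plane distance to the nearest point of the head geometry
        d = np.min(np.linalg.norm(head_verts[:, :2] - v[:2], axis=1))
        w = np.clip(1.0 - d / radius, 0.0, 1.0)   # fall-off with distance
        out[i, 2] += max_push * w * w             # smooth bulge outward
    return out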

Later in the movie, when Nancy is in the pharmacy getting really tired and having micronaps, they wanted her to be stumbling down this aisle with the camera moving and dipping in and out of Freddy’s world. It had to be choreographed quite closely: when we’re in the pharmacy we had to see her fighting against nothing, and when she’s in Freddy’s world, which is a boiler room, there’s a narrow hallway with pipes to match the scale of the pharmacy. We set up the sequence and Halon previs’d it.

I took the previs and cut up the shots for Sam to sell the idea to the producers. We had a moco rig that barely fit in the aisles of the pharmacy set, and then two weeks later shot Nancy in a real location, an old paint factory with a fake wall on one side. We had mixers there to make sure we were roughly matching the earlier performance. There are a couple of shots where we did a flickering effect in post, but most of it was done as quick cuts. It’s fun when it falls on your shoulders to figure that kind of stuff out. I love it. And I think if we had done a lot of digital effects work there to peel layers away, say, it could have turned out cheesy, so I’m glad it was more of a ‘blink of an eye’ thing.

Interview by Ian Failes

Related Links

Official film site

Method Studios
