Sun 15th Feb 2015, by Trevor Hogg | Production
© Warner Bros. Entertainment Inc. and Paramount Pictures Corporation
Interstellar is on a roll: fresh from a VFX award at the BAFTAs, it now heads to the big one, the Oscars. We have an in-depth talk with Paul Franklin, VFX supervisor of the Oscar-nominated Interstellar.
When it came to supervising the visual effects for his space epic Interstellar, filmmaker Christopher Nolan (The Prestige) recruited his frequent collaborator Paul Franklin (Harry Potter and the Order of the Phoenix), who won an Academy Award for Inception (2010). “We did extensive previs because it was useful in how we were planning to approach the miniatures and what kinds of setups we would need for that,” explains Franklin. “Previs also informed the development of many of our heavy computer graphics sequences like the big wave, all of the stuff with the wormhole and black hole, and, in particular, the end of the Tesseract when it turns itself inside out through a hypercube transformation. Chris came from a background of indie filmmaking where previs was the furthest thing from his mind; he has eased himself into the process of previs over the years. We didn’t use much previs at all with Batman Begins [2005] and with each successive film we’ve used more. The overall philosophy of the visual effects in Interstellar had a lot in common with the way we approached Inception. The visual effects work is at the heart of the storytelling in the film. You cannot go into a massive black hole spinning close to the speed of light without using visual effects. Chris was happy to embrace the visual effects but at the same time he pushed me to think about innovative new ways to create them.”
Director, co-screenwriter and producer Christopher Nolan on the set of INTERSTELLAR from Paramount Pictures and Warner Brothers Entertainment.
Photo Credit: Melinda Sue Gordon
“Chris is always reluctant to rely on green screen to create environments,” notes Paul Franklin. “He doesn’t like deferring a lot of the creative decision-making process six months down the line into the post-production cycle of the film. We had long thought about whether it would be possible to create effects on-set in-camera using front projection, in the same way that films did in the past. You think of all of those Hitchcock movies with Cary Grant [North by Northwest] driving a sports car along the Long Island coast when he’s actually in front of a projection screen somewhere in Burbank. We felt we could reinvent the front projection process without it becoming so restrictive that it impacted on our usual way of working. Chris likes to move fast on-set. It’s not unusual for us to shoot 30 setups or more each day when working on one of his films. The digital projectors meant that we could keep pace with that. We had two 40,000 lumen projectors in a special custom-made cage on this big forklift which could tilt onto any area of a huge projection screen that we built outside of the set. The screen was like 80 feet high and 300 feet long; it wrapped all the way around our big spacecraft set. It gave us brilliant results in-camera because we were capturing all of this stuff in one pass on the set, but it was also well received by the cast, who all came to us after the fact and said how much they appreciated being able to look out and see a massive black hole spinning outside the window of the spacecraft, rather than having to imagine themselves in that situation.”
“In the end we got at least 170 shots completely in-camera with no additional digital visual effects through this process,” remarks Paul Franklin. “It certainly meant that I had my work cut out for me because normally the role of the visual effects supervisor is making sure that they are shooting things in the right way and advising on things that might help out later on in post. But in addition to all of that I was running the projectors and working with our projection crew. We doubled up the images to get enough exposure onto the negative to get the look of raw daylight outside the window. I explained to Chris early on at the beginning of production that, ‘The guys tell me that it takes at least a week to set up two projectors and converge them onto the same area of the screen.’ He listened patiently, thought about it, and immediately cut that to 15 minutes between setups. The projection guys said, ‘That’s impossible. We can’t do that.’ But somehow they managed to do it. They kept up with the frantic pace of the Nolan set. We were moving projectors, recalibrating them, and creating new content live on-set. Andy Lockley [Rush], my co-supervisor, and I had workstations set up on the stage. It was interesting because we were creating content in response to how the drama was developing in the scene.”
“Our models gave us amazing mileage,” states Paul Franklin. “I have to admit when we started out I said, ‘Okay, we’ll use miniatures for the destruction sequence where The Endurance blows up. We’ll maybe use them for a docking sequence early on in the movie.’ I honestly thought that for the rest of it, particularly the approach to the black hole, we would be using the computer graphics version of the spacecraft. But an interesting thing was happening while we were shooting the miniatures. We worked out how to light them so that we could get them all in one pass; that meant we could get a lot more material out of the miniatures stage. We were shooting anything up to eight setups a day. Often you expect to get through one or two setups a day. We shot everything against black so the material we got from the miniature unit was a good match for the way Chris was shooting on the stage with the full-size spacecraft models.” Franklin adds, “When you see the spacecraft orbiting the Earth, travelling through the wormhole, and approaching the black hole, that is all miniature work. We ended up doing something like 95 per cent miniatures. It turned out to be efficient, cost effective and looked great. The miniatures fitted the texture of our film. If we had gone full digital with the spacecraft those sequences would have had a different feel.”
“Our scientific advisor on the movie was Kip Thorne, one of the world’s leading theoretical physicists,” states Paul Franklin. “Kip gave me a crash course on relativity, wormholes, space time and black holes, and pointed out where science fiction films had got it right in the past and where, more often than not, they had got it wrong in terms of their representation of these things. Kip was able to give us the physics that underlies all of these things; he worked out how to turn Einstein’s general relativity equations into computer code. Kip tested it out initially with the mathematical modelling package Mathematica, which is made by Wolfram Research. Kip passed that on to our R&D team led by our chief scientist Oliver James over at Double Negative. Oliver took Kip’s equations and implemented them in a new relativistic renderer that calculates the paths of all of the light rays and light beams through the relativistically warped space time surrounding the black hole. We calculated the orbits of the individual photons, which allowed us to visualize the highly warped space created by this massive black hole that is spinning close to the speed of light. The other thing was that we were doing this for IMAX, which is a very high resolution format. It was at a level of detail that no one had ever attempted before because this is not something generally needed by the theoretical physics community. The unexpected by-product of this was that we observed these interesting ways that the space time was folding around the black hole, which was not something we expected to get when we set out to make this image for the movie. That’s what our paper [with Kip Thorne] is about [which is to be published in the journal Classical and Quantum Gravity].”
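The core idea Franklin describes, computing the paths of individual photons through curved spacetime, can be illustrated with a toy integrator. This sketch is purely illustrative and is not Double Negative's DNGR code: it handles only a non-rotating (Schwarzschild) black hole in geometric units, whereas the film solved the far harder spinning (Kerr) case at IMAX resolution.

```python
# Toy null-geodesic integrator for a non-rotating black hole,
# in geometric units (G = c = M = 1). Illustrative only -- the
# film's DNGR renderer handled a spinning (Kerr) black hole.
import math

def photon_deflection(b, dphi=1e-4):
    """Bending angle of a light ray with impact parameter b.

    Integrates the photon orbit equation u'' = 3u^2 - u (u = 1/r)
    with RK4 from infinity (u = 0) past the black hole and out again.
    """
    def deriv(u, v):
        return v, 3.0 * u * u - u

    u, v, phi = 0.0, 1.0 / b, 0.0
    while True:
        k1u, k1v = deriv(u, v)
        k2u, k2v = deriv(u + 0.5 * dphi * k1u, v + 0.5 * dphi * k1v)
        k3u, k3v = deriv(u + 0.5 * dphi * k2u, v + 0.5 * dphi * k2v)
        k4u, k4v = deriv(u + dphi * k3u, v + dphi * k3v)
        u_new = u + dphi * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        v += dphi * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        phi += dphi
        if u_new < 0.0 and phi > 1.0:      # ray has escaped back to infinity
            frac = u / (u - u_new)          # interpolate the exact crossing
            return phi - dphi + frac * dphi - math.pi

# A ray passing 100 mass-units from the hole bends by roughly 4/b radians,
# Einstein's classic weak-field result.
```

A production renderer traces millions of such rays per frame backwards from the camera, which is why the fine lensing structure around the accretion disk only emerged once the team pushed the calculation to IMAX detail.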
Kip Thorne
“Right at the beginning Chris said, ‘I want this to be grounded in real physics but if this stuff doesn’t look interesting enough or looks too weird then we’re going to have to think of a way to bend it more towards our storytelling process,’” recalls Paul Franklin. “But what we found was that the early test images we started to get out of the early versions of the DNGR [Double Negative General Relativity] renderer were so exciting. I went to Chris and showed him the first image of the black hole with the accretion disk and the way it was being warped by the gravity of the black hole. The Einstein lens produces an amazing halo effect around the black hole, which is a pretty abstract, quite extreme image; however, at the same time you could comprehend that you’re looking at a sphere and can see the power and energy that’s coming off of it. Black holes, in rather contradictory fashion, are amongst the brightest objects in the universe; they emit huge amounts of light from the accretion disk as the material in the disk heats up through friction. Chris said, ‘That’s an exciting image. That’s something we have not seen before.’ At that point I knew we had something that was going to work for our film.”
“Ultimately, in the images that you see of the black hole and also the wormhole, you are looking at an image that is 99 per cent physics,” states Paul Franklin. “It’s what the software told us should be happening with the relativistically warped space around these objects. We added a little bit of a surface texture to the wormhole and a lens flare to the black hole. That’s it. We didn’t have to dress it up. We didn’t have to have that layer of science fiction pixie dust that a lot of movies tend to use.” Franklin notes, “The wormhole was decidedly easier to make because we were only simulating the effect of the wormhole on the background universe. The gravitational lens that is around it is only affecting the distant star fields. The complication with the black hole is that we have the accretion disk, the disk of debris which is orbiting the black hole, so that had to pass through the lens as well. We had to work out how to get the lens to work with the geometry and volumetric simulation of the accretion disk. Clouds of superheated gas are swirling around in there. That was another order of difficulty.”
A massive dust storm encroaches upon a sporting event. “What you see on-screen for the most part is physical in-camera special effects,” remarks Paul Franklin. “Scott Fischer, who was our fantastic special effects supervisor, brought enormous wind machines, essentially aircraft propellers inside big fans. Things called Ritter fans. We had about 12 of these things on-set and they blew huge clouds of a very finely ground-up cellulose powder [C-90], which is apparently used as a food additive. We used it to simulate the dust storm. When you see Matthew McConaughey [A Time to Kill] and Mackenzie Foy [The Twilight Saga: Breaking Dawn] running through the dust clouds, that’s how we created that. We did do two establishing shots at the baseball game where you see the big dust clouds growing, and those are visual effects shots. The guys did extensive fluid simulations to create the big clouds rolling in and they were based on reference that we looked at. There is lots of video footage out there of dust storms. We also looked at the Ken Burns documentary about the Dust Bowl. When you see those senior citizens at the beginning of the movie talking about the dust storm, they are real survivors of the 1930s Dust Bowl. We wanted to be true to the experience those people had had.”
“When we created the big wide shots we were dealing with volumetric rendering; we were calculating a fluid simulation with voxels, so we were not doing individual particles in the wide views,” explains Paul Franklin. “We did calculate the paths of individual dust particles for the scenes inside the farmhouse, like when lines of dust are falling down and making this interesting pattern on the floor. Cooper [Matthew McConaughey] deduces it to be a binary code giving him the coordinates of the secret NASA facility. What he doesn’t realize at this point is that he sent this message to himself from the future. Special effects filled the room with swirling dust clouds and then we matchmoved it and created the columns of falling dust as a precisely controlled particle simulation. Dust enters the fields of gravitational attraction and is pulled straight down in lines, eventually forming the pattern on the floor. The pattern on the floor was something I designed myself on the day. I cut a stencil from a binary pattern that I had originally developed from the coordinates of my house in London. It turns out that I actually plotted the points somewhere in the Pacific Ocean so I’m safe!”
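The film never spells out the exact encoding Cooper reads off the floor, only that the dust lines form binary. As a purely hypothetical illustration of the idea, one could treat a thick line as a 1 and a thin line as a 0 and read the sequence as a binary number:

```python
# Purely hypothetical -- the film does not specify the encoding.
# Assume a thick dust line is 1, a thin line is 0, and the run of
# lines is read as one binary number.

def decode_dust_lines(lines):
    """Turn a sequence of 'thick'/'thin' dust lines into an integer."""
    bits = ''.join('1' if line == 'thick' else '0' for line in lines)
    return int(bits, 2)

# Eight lines of dust read as one binary value:
pattern = ['thin', 'thick', 'thin', 'thin', 'thick', 'thin', 'thin', 'thick']
# 01001001 in binary is 73
```

In the scene, pairs of such numbers would give the latitude and longitude of the NASA facility.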
A trip to a world immersed in water results in the galactic travellers having an encounter with a natural phenomenon. “Chris said that he wanted it to be this endless ocean. There’s no land on this planet. We went to look for a location where we could film this. We tried a few different locations. We looked at some tidal bays here in the United Kingdom but eventually we found this coastal lagoon in Iceland which is fed by the meltwater coming off one of the glaciers there, so it is very cold. It’s about two feet deep and three or four miles across, so we were able to stand out in this for a week or so shooting all of the scenes with our cast sloshing around in the water. I have to say that it was one of the most awkward locations I’ve ever been on because you couldn’t sit down. There was nowhere to put stuff. It looks dramatic on film but it was horrible while we were there. We took the footage back into visual effects, erased the distant landscape of Iceland that we could see in a few places, added the endless ocean in the distance, and created the giant wave. Chris had said to me that he wanted this wave to be really big. I said, ‘Like the big waves off the coast of Hawaii where the surfers go?’ He said, ‘No, bigger than that. I want them to be 4,000 feet high.’ The idea is that the tidal gravity from the black hole is propagating these enormous waves across the surface of the planet. We later came up with a plausible scientific explanation for why these waves are so large and why the water is so shallow. Kip tells me that they are solitary standing waves that can travel great distances. It’s to do with the way the planet, which is tidally locked by the black hole, is oscillating in its orbit. It has only reached this position in the last 100 million years. Apparently, that would generate these huge waves. I’m not going to argue with the Feynman Professor of Theoretical Physics on this one!”
“Rendering the giant wave was a huge challenge,” states Paul Franklin. “You can look at the biggest waves on Earth and they’re not even close to the size of these waves, so we spent a lot of time looking at things like mountains and cliff faces trying to get a sense of how big structures like this might look. The guys came up with a straightforward system for blocking out scenes. We animated the waves using keyframe deformers to make the big wave shapes travel through. Getting the layout was quick; the guys laid it out in previs over a period of a few days.” Surface details had to be added such as foam, spray, wavelets and bubbles to create a sense of scale for the audience. “It was done in a combination of Houdini and our own proprietary software called Squirt Ocean that we have at Double Negative, which does all of the wave simulations on the surface. I would sign off on a layout and it would be a month to two months before I saw something that had the wave surfaces applied because it took that long to crunch the numbers for the simulations. We were rendering it in a combination of RenderMan and Mantra to create the wave surfaces. At one point they worked out how to break the relationship between the little waves and the big waves, which meant they could scale down the little waves to sell even more scale on the overall big wave. It was an interesting innovation. The shot where you see the spaceship cresting over the top of the wave and surfing down the back of the wave benefitted from that approach.”
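The scale trick Franklin mentions, decoupling the small surface waves from the big swell so the detail can be shrunk independently, can be sketched in a few lines. This is a toy 1D version under assumed, made-up wave parameters; the production work was done in Houdini and Double Negative's own ocean tools:

```python
import math

# Toy illustration of decoupled wave scales: the huge swell and the
# fine ripple are computed independently, so detail_scale shrinks the
# ripples without touching the swell. All constants are invented.

def wave_height(x, t, detail_scale=1.0):
    """Height of a 1D wave: one huge swell plus fine surface ripple."""
    big = 1200.0 * math.sin(0.001 * x - 0.2 * t)              # the giant swell
    ripple = 5.0 * detail_scale * math.sin(0.8 * x - 3.0 * t)  # surface detail
    return big + ripple
```

Halving `detail_scale` leaves the swell untouched but shrinks the ripples, which exaggerates the apparent size of the big wave, the same perceptual cue the team exploited.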
“The Tesseract was interesting because that was the thing we started on the earliest of all,” notes Paul Franklin. “Chris wanted to have an environment where time was represented as a physical dimension and this would allow the hero to essentially navigate along his personal timeline and be able to see images of the past and to interact directly with them. We started thinking how we might justify this on a scientific level. The idea is that Cooper is existing in a higher dimensional space and is able to comprehend it in the same way that you and I can comprehend the three physical dimensions that we’re able to perceive with our senses in our own universe. Cooper is able to apply pressure to the four dimensional universe and send a wave through space time, which is gravity. It was all very well talking about these abstract scientific concepts, but what is this going to look like? The script didn’t give me much of a clue.”
The Tesseract
“We started researching ways people had gone about representing time pictorially in the past,” explains Paul Franklin. “I looked at the work of painters like the German painter Gerhard Richter, who does these wonderful images where he scrapes out the paint, leaving a history of all the marks that he has made across the canvas. I also looked at slitscan photography. Slitscan in the visual effects community is always associated with the stargate in 2001: A Space Odyssey [1968] but it predates that. It goes back to the dawn of photography. Slitscan is a process where, rather than capturing all of the light in a room or an environment through a single aperture, you make a thin slit that records only one position in space and then you move the negative past the slit as objects are moving through the scene. Anything that moves past the slit is recorded onto the negative as it moves and anything that is standing still becomes a long streaked blur. It is the sort of thing used in those photo-finish images of racehorses. What was interesting to me about that was immediately you had a way of representing time in an image because the horizontal axis of the image becomes time. I figured that was a way to lead into this idea that every piece of matter is leaving a trail behind it in space time. You’re leaving a trail through your past and there’s a trail in front of you stretching out into your future.”
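The slitscan process Franklin describes maps directly onto a tiny algorithm: each frame of a moving scene contributes one fixed column of pixels, so the horizontal axis of the assembled image becomes time. A minimal sketch, with frames represented simply as lists of pixel columns rather than real image arrays:

```python
# Minimal slitscan sketch: sample one fixed column (the "slit") from
# every frame and lay the samples side by side. In the result, the
# horizontal axis is time: static content smears into a constant
# streak, while anything moving past the slit is recorded as it passes.

def slitscan(frames, slit_index):
    """Assemble a slitscan image: one column sampled per frame."""
    return [frame[slit_index] for frame in frames]
```

A real implementation would take the same column from successive video frames (e.g. with NumPy slicing) and stack them into the output image, which is exactly how photo-finish cameras build their pictures.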
“The Tesseract eventually became a lattice of intersecting timelines which are the physical representation of every object that’s inside Murph’s [Mackenzie Foy] bedroom. Where the timelines intersect reveals the three dimensional image of the room. Each successive instance of the room is another moment in time. Each room is separated from the next by three or four seconds. The idea is that this lattice goes on forever. If Cooper travels far enough in the Tesseract, he will find any moment in time that has passed inside Murph’s room, including all of the moments he was there with Murph in the past. This was the other challenge that Chris gave us. He said, ‘Once you work out how to represent time as a physical dimension we need to build it. I don’t want to just hang Matthew in front of a green screen.’” Months were spent by the visual effects team developing various ideas. “We built virtual 3D slitscan cameras to try to work out how to record time physically. It produced some very abstracted imagery of the type that wouldn’t pass muster on a big mainstream tentpole movie. The end result was that we were able to pass our model to the Art Department, which then constructed a part of the Tesseract as a physical set. The set was something like 100 feet long, 60 feet high and 60 feet wide and had one cell from the Tesseract in it. It had six rooms in it, one of which had an actual floor so we could put Mackenzie and Matthew inside the room. We used our digital model to extend it off to infinity and then to overlay this fine tracery of the threads of time.”
“Every object had to be tracked and all of the actors had to be body tracked. We were emitting three dimensional threads from all of their surfaces, picking up their local colour. We had to fine tune all of this because we didn’t want it to get in the way of the emotional story we’re trying to tell. The most important aspect of the Tesseract from the storytelling point of view is that when Cooper looks into that room and sees his 10-year-old daughter from decades in the past, to him it looks like he is seeing reality. Cooper is seeing his daughter for real; he is not seeing a crazy four dimensional hologram. Cooper responds to it as if she is really there and that’s why he tries to call to her. That’s why Cooper tries to send her a message in the past. It’s not like he is watching a videotape. That was one of the trickiest things to get right. We wanted to show that the Tesseract was an extraordinary environment. We wanted it to look spectacular. But we didn’t want it to get in the way of the emotional intimacy that we had got from the principal photography.”
The Tesseract
A baseball game takes place on a cylindrical space station. “Chris has always been intrigued by inverting the real world,” remarks Paul Franklin. “It’s a cousin to the folding city in Inception. You look up and see people walking around above you. Chris has always been fascinated by the drawings of the Dutch artist M.C. Escher. It was also grounded in those amazing concept images from when NASA was talking about huge cylindrical colonies in space. The whole idea is that it is a giant cylinder which is rotating, creating gravity, so everyone is pinned against the inner surface. But we also wanted it to look rather matter-of-fact and ordinary, so the foregrounds were all shot on location in Alberta. The baseball diamond is the Little League field right next door to the baseball stadium where we shot the dust storm scenes at the beginning of the movie. What I did to create the interior of the cylinder was to go up in a helicopter to shoot IMAX plates of the farmland, which is laid out in these amazing geometric patterns. We made a geometric patchwork of fields, wrapped them onto a cylinder, and created the interior world of Cooper Station. When you see the window being smashed by the baseball, the way we did that was to get up on a cherry picker, point down at the rooftops of the surrounding houses, shoot the plates with our stills cameras and then flip them upside down. To keep the image alive we added people walking around in the shot; those are staff members of Double Negative filmed in our atrium here.”
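The "rotating cylinder creates gravity" idea Franklin references is simple centripetal physics: the apparent gravity at the inner surface is a = ω²r. A back-of-the-envelope check, with purely illustrative numbers rather than anything from the film's production design:

```python
import math

# Artificial gravity in a rotating habitat: centripetal acceleration
# a = omega^2 * r pins everyone to the inner surface. Numbers below
# are illustrative only, not taken from the film.

def spin_rate_rpm(radius_m, accel=9.81):
    """Rotations per minute needed to simulate `accel` at `radius_m`."""
    omega = math.sqrt(accel / radius_m)   # rad/s, from a = omega^2 * r
    return omega * 60.0 / (2.0 * math.pi)

# A cylinder with a 1 km radius needs just under 1 rpm for 1 g.
```

The gentle spin rate for a large radius is one reason the classic NASA colony concepts Franklin mentions were drawn as enormous cylinders: a small habitat would have to spin fast enough to make the occupants dizzy.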
“The images of the Earth were based on low Earth orbit photography taken primarily from the International Space Station,” remarks Paul Franklin, who also examined archive films from the Gemini and Apollo spacecraft missions. “One of the key references came to us while we were in prep: Chris Hadfield, the Canadian commander of the ISS, had released his Major Tom video on YouTube. There’s a brilliant shot of Chris Hadfield in this amazing observation dome; he’s singing away and you’ve got the Earth rotating behind him. The sun is hitting the Earth full on and it’s totally blown out. It’s raw unfiltered sunlight and that’s something you generally don’t see in science fiction films. Sci-fi films always have the tendency to balance out exposures so you can see all of the detail and everything looks like a beautiful picture book. We created two versions of the Earth. We created a full 3D version which we used later on in post; it was built in Maya and rendered in RenderMan. We could rotate the Earth, offset the clouds and put the sunlight at any given position. We also made a slightly rougher version using Nuke’s 3D capabilities to give us one that we could put together quickly to create the projection material which we used live on-set. We did the same thing with Saturn. The primary reference was the imagery that came back from the Cassini probe, which was relatively low resolution compared to what we needed for IMAX. We had to produce large matte paintings for the surface texture on the planet to create all of the subtle cloud patterns that are on Saturn.”
“Zero gravity we did old school, borrowing ideas from classic films like 2001 [1968] and Apollo 13 [1995], and in fact we used some of the same special effects rigs that were used on 2001,” states Paul Franklin. “We had two main ways of doing that. We had a rig which is a giant metal seesaw. It clamps onto the hips of the actor and at the other end you have a big handle with which Chris Nolan could puppeteer the actors around the set. It will go up and down, and it’s on a dolly track so we could push it through the set and pivot around. That’s how we created most of the zero gravity for Cooper and Brand [Anne Hathaway], and everyone else floating around inside the spacecraft. They’re serious rig removals. Sometimes the rig was so obtrusive in the shots that we had to replace the lower half of the actor’s body with a digital double. There are a couple of shots with Matthew’s legs and Mann’s [Matt Damon] lower half which are completely CG.” Franklin adds, “The other way we did zero gravity was wirework. The trick there is to build the set on its side, drop people in from the top of the set and film them from underneath, so the wires are hidden for the most part and they can spin around. We had to paint out the rigs and add little bits of floating zero gravity debris.”
Imperfections were incorporated into the imagery to further the sense of realism. “A good example of that would be the robots we created,” observes Paul Franklin. “Chris was adamant that he wanted these to be physical performers on the set; he wanted these to be actual robots that would be full members of the cast. But Chris also wanted them to be giant walking blocks of metal, like minimalistic hinges striding around the set. Special effects came up with an elaborate puppet that could be performed live on-set in-camera. Most of the robot shots are captured in-camera with our amazing actor Bill Irwin [Rachel Getting Married] puppeteering the robots and delivering the performance simultaneously. When the robot goes into high speed mode, starts thrashing its way through the water, running across the ice or floating around in zero gravity in the spacecraft, it becomes digital. We had to capture the body language of the puppet, which wobbled and shuddered a little bit.” TARS rescues Brand from a monstrous ocean wave. “When Brand gets picked up by the robot she is actually getting picked up by one of our stuntmen, Diz Sharpe [Skyfall], who was rotoed out and replaced with the robot. When the robot is running with Anne we had a special effects rig, a forklift and a stunt double [Alicia Vela-Bailey]. The rig was attached to a quad bike. We replaced the quad bike with the running robot. When you see the robot running in the wide shot carrying Brand, that’s a digital robot with a digital Anne Hathaway.”
“The biggest intellectual challenge was definitely the Tesseract,” notes Paul Franklin. “It was the first thing we started to work on and the last thing we designed; that was a close collaboration with the Art Department to build. In terms of the technical challenge, by far and away the most challenging thing was the black hole. It took a long time to get it to a point where we were happy that it was going to hold up to scrutiny in IMAX. We had amazing early success that allowed us to create the imagery for the projection material, but later on the final sequence of flying to the black hole required a major effort. Kip Thorne didn’t just spend a couple of days writing down his equations and sending them to us through email; he was at it for at least three months, and then the R&D team at Double Negative worked for considerably longer and were continually conferring with Kip. I was copied on all of the emails between Kip and our R&D team. It was like a conversation between two higher dimensional beings that can speak English. The first email is, ‘Hey, how are you doing? It’s lovely to be here.’ After that they were talking in Imperial Code. I was completely lost.”
“There are places in the movie where we stretch reality, and the Ice Planet is a good example,” remarks Paul Franklin. “Those huge glaciers hanging over your head probably wouldn’t work. What we do with those sorts of things is to present them with as much reality and conviction as we can.” Interstellar has inspired some creative imitators on YouTube. “It has been quite gratifying to see people going around trying to work out how to create our black hole effect. I’ve seen one guy who made it in computer graphics. It’s a pretty fair replica of what we did. There was another bunch who built it all out of practical elements with long exposures, sparklers, torches and flame effects. It’s fun to see that. You know when you’ve crossed over into some new territory of popular culture when people are trying to copy your effects on YouTube.” For his visual effects efforts, Franklin received BAFTA and Oscar nominations. “It was a great collaboration between special effects, visual effects and also our miniature vendor New Deal Studios. Scott Fischer’s special effects team did an amazing job. I couldn’t be more pleased with the work from the team at Double Negative in London. It’s a high point in a career, you can say! I’m a lucky man. I get to work with the greatest filmmakers in the world, so I couldn’t ask for more than that as a visual effects guy.”