Zoic’s Andrew Orloff on ‘V’
ABC’s V, a re-imagining of the eerie 1980s alien television show, features visual effects from Zoic Studios, whose contributions include a massive spaceship and numerous virtual sets. Visual effects supervisor Andrew Orloff discusses Zoic’s work for the pilot episode and what to expect from the ongoing series.
Were you a fan of the original show?
Absolutely! I’ve been a lifelong fan of visual effects on TV. V was something that captured my imagination – just the whole idea of this alien race coming down as saviours and then turning out to be fascists taking over the U.S. It was just the right combination of sci-fi and scary. Actually, the most interesting thing about the original series was that it wasn’t even supposed to be about aliens. It was supposed to be about a fascist party that got elected in the U.S. The network thought the idea was too heavy and made it aliens instead. The point is, really good and relevant sci-fi series are never just about tech and spaceships. That’s part of it, but they also hold a mirror up to society, to what’s going on in the world.
Did you look to reference from the 80s series?
Definitely. Scott Peters, the creator and writer of the pilot, and the executive producers Steve Pearlman and Jace Hall brought us in at the very beginning, before the show was even greenlit, to get our ideas on design and a methodology for the effects. So we were very heavily involved with them and with Ian Thomas in the production design department, especially in conceptualising what the ships were going to be.
That was something that we really wanted to make sure we referenced and paid homage to from the original series, but updated. So there are some design similarities in the shuttle and the mothership that come across from the original show. You’ll notice the surfacing and that it’s a grey saucer-shaped ship with plating at the top and bottom. Also, in the equator of the ship you can see all the decking. That general setup of the ship comes from the design of the original series. Of course, our shape is different. It’s updated and much more fleshed out but the main hits of the design components are there. I think part of the success of doing a show like this is paying homage to the fans.
How did you design the ship?
It is a huge model – in the neighbourhood of 3 million polygons, with so much detail. When there’s a ship that large, you have to make sure you convey scale. There are a lot of small lights and radio towers and little bits of detail that give you an idea of how big it is. That’s the artistic challenge of designing a vehicle of that size. It’s a quarter of a mile wide.
The panelling is something we went into as well. If you look at the SR-71 Blackbird or the stealth bomber, they’re built from highly technical composite materials that have a black sheen to them, and we really wanted to capture that look because we felt it made the ship look a lot more real. But we also wanted the pattern not to be recognisable as something you would see on earth. So we went through a bunch of different designs to make it look like some kind of other-worldly material.
How were those flipping panels created for the ship?
That’s the underside of the ship, where the hexagonal panels flip over. It was in the script that the ship opens up in some way and then the panels flip to reveal Anna speaking down below. We had to figure all that out. We couldn’t make a completely smooth bottom to the ship, because later they have to go up into it, and it’s not all that interesting for it to be totally smooth. So the idea was that we’d take a macro approach where the large panels would come off the surface and re-configure until the underside was smooth. Then at that point the smaller sub-panels would flip over and lock into place.
All the mechanics for this were designed in previs. The flipping of the small panels was done by writing special scripts in Lightwave to handle that huge number of panels flipping at once. The shimmering effect, when Anna comes onto the screen, we did in compositing. It’s supposed to look like her image is being revealed as the panels flip over. I think it’s a great shot dramatically, because there’s so much happening with the ship and at first you think it might be some kind of weapon that’s arming. That was kind of an unintended consequence of all the movement. Once the director saw all our tests, he really latched onto that and had the idea that people would fear there was going to be an Independence Day-type decimation of their city.
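Zoic’s actual Lightwave scripts aren’t shown here, but the general idea of driving a mass panel flip procedurally rather than keyframing each panel by hand can be sketched in plain Python. The grid of panels, the wave speed and the flip duration below are all made-up values purely for illustration.

```python
# Illustrative sketch (not Zoic's actual script): stagger a 180-degree flip
# across a grid of hull panels so the reveal ripples outward from a centre point.
import math

FLIP_DURATION = 12          # frames each panel takes to rotate 180 degrees
WAVE_SPEED = 0.5            # grid units the wave front travels per frame

def flip_keys(panels, origin=(0.0, 0.0)):
    """Return {panel_id: [(frame, angle_degrees), ...]} flip keyframes.

    `panels` maps a panel id to its (x, y) position on the hull grid.
    Each panel's start frame is delayed by its distance from `origin`,
    so the flips radiate outward instead of firing all at once.
    """
    keys = {}
    for pid, (x, y) in panels.items():
        dist = math.hypot(x - origin[0], y - origin[1])
        start = int(dist / WAVE_SPEED)
        keys[pid] = [(start, 0.0), (start + FLIP_DURATION, 180.0)]
    return keys

# Tiny usage example: a 3x3 patch of panels flipping away from the centre.
panels = {(col, row): (float(col), float(row)) for col in range(3) for row in range(3)}
for pid, kf in sorted(flip_keys(panels, origin=(1.0, 1.0)).items()):
    print(pid, kf)
```

Each panel ends up as just two keyframes whose start is offset by its distance from the reveal point, which is what makes thousands of panels manageable from a script.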
Can you talk about that pullback shot in New York?
The pullback was a tricky shot from a technical perspective. You’re marrying a pullback on a greenscreen stage to a pullback on a CG spaceship to a pullback of a helicopter flying over Manhattan. We spent a lot of time stabilising, re-tracking, stabilising again, re-tracking and warping plates together so that everything would fit seamlessly. There was so much movement in the original helicopter shot that we had to first stabilise it and then re-paint portions of the background to get enough space to make it look continuous.
What compositing solution did you use for the show?
We used After Effects for the pilot, but we switched to Nuke as we went into the series. The reason we switched to Nuke was the flexibility it gives us as far as scripting goes. We can now build our comps based on the EDL: when a compositor sits down to do their work, a script has already run in the background and generated a starting point for their composite, so they don’t have to load all the frames by hand or check all the frame counts on their own. Nuke allows us to do a lot of Python scripting that ties into our main shot-tracking database.
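The pipeline scripts themselves aren’t public, but a minimal sketch of the idea – generating a starting Nuke script per shot from records that would normally come out of the EDL and the shot-tracking database – might look something like the following. The shot names, plate paths and output locations are hypothetical, and it would be run with Nuke’s command-line Python interpreter (nuke -t).

```python
# Hypothetical sketch of auto-building a starting comp per shot.
# Shot records, paths and node choices are illustrative only; in a real
# pipeline they would come from the EDL and the shot-tracking database.
# Run with Nuke's Python interpreter, e.g. `nuke -t build_comps.py`.
import nuke

SHOTS = [
    {"name": "v_101_0010", "plate": "/plates/v_101_0010/plate.%04d.dpx",
     "first": 1001, "last": 1096},
    {"name": "v_101_0020", "plate": "/plates/v_101_0020/plate.%04d.dpx",
     "first": 1001, "last": 1144},
]

for shot in SHOTS:
    nuke.scriptClear()

    # Greenscreen plate with the cut's frame range already set.
    plate = nuke.nodes.Read(file=shot["plate"],
                            first=shot["first"], last=shot["last"])

    # Placeholder key and stand-in background for the compositor to replace.
    key = nuke.nodes.Keyer(inputs=[plate])
    bg = nuke.nodes.Constant()
    comp = nuke.nodes.Merge2(inputs=[bg, key], operation="over")
    nuke.nodes.Write(inputs=[comp],
                     file="/comps/%s/%s_comp.%%04d.exr" % (shot["name"], shot["name"]))

    # Match the project frame range to the shot and save the starting script.
    nuke.root()["first_frame"].setValue(shot["first"])
    nuke.root()["last_frame"].setValue(shot["last"])
    nuke.scriptSaveAs("/comps/%s/%s_v001.nk" % (shot["name"], shot["name"]), overwrite=1)
```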
Can you talk about the virtual sets being used on the show?
The majority of the shot count was actually taken up by the virtual sets. As far as visual effects goes, there were over 150 shots on greenscreen. Every single one of the ship interiors was a completely CG set, which is continuing on into the series. It was decided early on that there was no money or time to build that many physical sets that quickly, and extending partial sets wasn’t practical either for the making of a pilot. So we decided early in the production phase that we were going to have to do photo-real virtual set work. We worked with Ian Thomas, who would design his sets in the traditional way, but instead of sending his plans off to the construction department, he would give them to us. We worked with him and the DP on the lighting of the sets. They were mostly complete by the time they started shooting, so we could show the actors what it would look like when the shot was finished.
What tools did you use for building the virtual sets?
On the production side, they were using Google SketchUp. They would take the original architectural drawings and convert them to SketchUp to get an idea of what the sets would look like in 3D. It’s becoming a very common tool for production designers to use to visualise their sets. Then they sent the files to us and we converted them to Maya. We rendered with mental ray, which let us use real-world lighting models. It has a specific type of light that lets you plug in the characteristics of existing, real-world lights, such as the fall-off, the colour of the light and the detail of the light. We can plug those values in to suit the direction given by the DP. We did all of our camera tracking in boujou and SynthEyes.
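As a rough illustration of what plugging in a real light’s characteristics can look like on the Maya side, here is a small maya.cmds sketch. The light name, position, colour and intensity are invented values, and a full mental ray setup (photometric profiles and so on) would involve more than this.

```python
# Illustrative only: create a CG practical light in Maya whose falloff and
# colour can be dialled to match notes from the DP. All values are made up.
import maya.cmds as cmds

def make_practical_light(name, position, rgb, intensity):
    # decayRate=2 gives quadratic (inverse-square) falloff, the way a
    # real-world practical light actually behaves.
    return cmds.pointLight(name=name, position=position, rgb=rgb,
                           intensity=intensity, decayRate=2)

# e.g. a warm overhead practical for a corridor set
make_practical_light("corridorKey", position=(0.0, 4.5, 2.0),
                     rgb=(1.0, 0.85, 0.7), intensity=80)
```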
Additional shots for the pilot were done with a real-time tracking system called Lightcraft and a hardware setup we call Zeus – the Zoic Environmental Unification System. That allows us to go on stage, shoot with the camera and see a real-time previs composite with the CG background as we go. We used that at the very end of the pilot and will use it throughout the series.
Can you tell me more about Zeus?
The way it works is we get all the designs from production, we add the lighting details by working with the DP, and then we use Lightcraft’s real-time 3D graphics engine, like what you would find in a video game. We convert our digital sets, which would normally take hours to render, into assets the graphics engine can draw in real time.
There’s a tiny lipstick camera attached to the main camera on set which shoots up at the ceiling to read a bunch of tracking markers and give us a real-time track of the set. It makes the virtual set move around in exactly the same way as the main camera is moving. We map all the lenses to give us real-time lens distortion and depth of field. It also does a real-time composite with the greenscreen. So what we’re seeing on the day is real-time feedback of what the final shot is going to look like. The actors can look at it and know where to stand. The DP can use it to know which direction the light is coming from so they can light more consistently. And the director can see the architecture of the set really well, so they can choreograph much more complicated camera moves past interesting pieces of the set, moves you could never do without real-time feedback.
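The keying and compositing in that loop are handled by the Lightcraft hardware itself, but the per-frame operation it performs is conceptually the standard key-and-over shown below. This NumPy sketch is purely illustrative, using a naive green-difference matte rather than the system’s actual keyer.

```python
# Illustrative only: a naive green key and "over" composite in NumPy,
# showing the per-frame operation conceptually.
import numpy as np

def green_difference_matte(fg, softness=0.15):
    """Alpha matte from how much green exceeds the other channels (fg is float RGB, 0-1)."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    spill = g - np.maximum(r, b)                  # large on the greenscreen, small on the actor
    return np.clip(1.0 - spill / softness, 0.0, 1.0)

def comp_over(fg, bg, alpha):
    """Keyed foreground over the CG background using the matte."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])

# Tiny synthetic example: a 4x4 pure-green frame with one "actor" pixel.
fg = np.zeros((4, 4, 3))
fg[..., 1] = 1.0                                  # the greenscreen
fg[1, 1] = (0.8, 0.6, 0.5)                        # the actor
bg = np.full((4, 4, 3), 0.2)                      # stand-in CG set render
out = comp_over(fg, bg, green_difference_matte(fg))
print(out[1, 1], out[0, 0])                       # actor kept, greenscreen replaced
```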
At the post end, the real-time comps are being used by editorial for temp visual effects, which is really helpful. We save the camera data and bring it in for our final shots, which cuts down the time it takes to complete the effects. For a lot of the shots, especially the close-ups where the background is blurry, we use the real-time set as the final set. It cuts down the amount of time we have to spend tracking and rendering the backgrounds, and it makes it viable on a budgetary and scheduling basis to do these virtual sets.
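How that saved camera data comes back into the final shots is pipeline-specific and isn’t detailed here, but as a hedged sketch, if the on-set track were exported as a simple per-frame CSV (a made-up format), keyframing a Maya camera from it could look like this.

```python
# Hypothetical sketch: re-using saved on-set camera tracking data for a final
# shot by keyframing a Maya camera from a per-frame CSV. The file format
# (frame, tx, ty, tz, rx, ry, rz, focal) is invented for illustration.
import csv
import maya.cmds as cmds

def import_tracked_camera(csv_path, camera_name="shotCam"):
    cam_transform, cam_shape = cmds.camera(name=camera_name)
    with open(csv_path) as fh:
        for row in csv.reader(fh):
            frame, tx, ty, tz, rx, ry, rz, focal = map(float, row)
            # Key the camera transform to match the on-set move...
            cmds.setKeyframe(cam_transform, time=frame, attribute="translateX", value=tx)
            cmds.setKeyframe(cam_transform, time=frame, attribute="translateY", value=ty)
            cmds.setKeyframe(cam_transform, time=frame, attribute="translateZ", value=tz)
            cmds.setKeyframe(cam_transform, time=frame, attribute="rotateX", value=rx)
            cmds.setKeyframe(cam_transform, time=frame, attribute="rotateY", value=ry)
            cmds.setKeyframe(cam_transform, time=frame, attribute="rotateZ", value=rz)
            # ...and the lens, so zooms carry through to the final render.
            cmds.setKeyframe(cam_shape, time=frame, attribute="focalLength", value=focal)
    return cam_transform

# e.g. import_tracked_camera("/tracks/v_101_0150_camera.csv")
```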
Related Links
Zoic Studios
IDesignYourEyes, Zoic’s blog