Matt Hoyle Photography

The work of Matt Hoyle interests me and definitely deserves a post.

III. Organization and Workflow

Starting with post 1995 and up to post 2000, over the next several days I’ll be posting excerpts from a book I’ve been writing (in my copious free time). Feel free to discuss the items mentioned.

The great thing about this book is that it’s not the final say on any technique. The tips that follow are among the methods a compositor will try to make a shot work. It is up to you and your problem-solving skills to figure out the best approach to perfecting a shot within your time constraints. There’s no right or wrong way to composite, only a quick way and a slow way, and comp veterans know the best quick way.

Managing your script with notes and clearly labeled nodes will make your life easier. The key is consistency. These are simple adjustments that can be made immediately upon starting a new composite. Elements you bring in should clearly show which version they are. Live action plates can have abbreviated names that show what they actually are. Instead of BL0450_plate, naming it BL0450_greenscreen or BL0450_cleanplate will make your script much easier to navigate. I often abbreviate the names even further: greenscreen becomes gs, bluescreen becomes bs, a stabilized plate becomes stab, and a dustbusted plate becomes db. Here are some others you may find useful:

_rt – a retimed plate
_mt – an articulated matte or mask
_tag – a color channel, or several
_CC – a color-corrected plate
_grade – a graded plate

Make up your own names and abbreviations that match the image you’re bringing in. The key is consistency.
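As a quick illustration, the abbreviation scheme above could be captured in a small helper (a plain-Python sketch, not part of any compositing package; the ABBREVIATIONS table and plate_name function are hypothetical names):

```python
# Map long element descriptions to the short suffixes used above.
ABBREVIATIONS = {
    "greenscreen": "gs",
    "bluescreen": "bs",
    "stabilized": "stab",
    "dustbusted": "db",
    "retimed": "rt",
    "matte": "mt",
    "colorcorrected": "CC",
    "graded": "grade",
}

def plate_name(shot, element):
    """Build a consistent plate name like 'BL0450_gs'."""
    suffix = ABBREVIATIONS.get(element, element)
    return f"{shot}_{suffix}"
```

Anything not in the table falls through unchanged, so a name like cleanplate still yields BL0450_cleanplate; the point is that one shared table keeps the whole team consistent.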

In addition to naming conventions, the way you organize your script will drastically improve your speed as a compositor. Organized, coherent scripts can be easily navigated by other artists if you need to point out a necessary technique. It will also make you faster, as you’ll know where to go to fix a problem. Each compositor has their own method to their madness. It’s up to you to decide how best to organize your work.

Layer organization. Channel management. These two (or four) words mean quite a bit to compositors these days, as comps get bigger and bigger and supervisors and directors want more and more. There must be a way to organize your scripts and trees into coherent bits of information that can be readily adjusted days, weeks, or months down the line. Today I’ll explain my methodology of organization, which you may have seen on sites like VFXTalk. If you’ve delved into my gallery, you’ll notice that most of the scripts I present are organized, or at least start off that way, before blossoming into some kind of freak, mangled tree. Once in a while the tree gets trimmed and returns to some semblance of order. Follow the link below to read how I organize my mind, and how I get through some of the more difficult comps I’ve been tasked with.

Some of the terminology that I’ll be going over is pretty specific, so I’ve labeled the image below with the jargon I’ll be using.

In each comp, you’re given a set of instructions: given a live action plate, add the requisite effects to make it look like everything was filmed at once. That sentence is usually harder in practice than it sounds. Most compositors build their scripts from the top down, where you have your plate and add things to it one at a time. This is usually pretty effective, since you have control over every single element that you pipe into the main trunk. Other times, building branches of effects and then piping those into your main trunk will be easier, more controllable, and often faster. The roots of your comp are usually the final outputs, but often you will have roots further up the tree as precomps, which enable quicker interactivity while compositing.
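To make the trunk-and-branch idea concrete, here is a toy sketch in plain Python (not Shake or Nuke code; the Node class and the element names are invented for illustration). Each branch is assembled on its own and then piped into the main trunk with an over:

```python
class Node:
    """A toy comp node: an operation applied to zero or more inputs."""
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = list(inputs)

    def describe(self, depth=0):
        """List the tree top-down, indenting each level."""
        lines = ["  " * depth + self.op]
        for child in self.inputs:
            lines += child.describe(depth + 1)
        return lines

# The main trunk starts from the plate...
plate = Node("Read: BL0450_plate")

# ...while branches (a smoke element, the lumped CG) are built separately...
smoke = Node("Grade", Node("Read: smoke_element"))
cg = Node("Grade", Node("Read: cg_render"))

# ...and then piped in one at a time: A (left input) over B (right input).
trunk = Node("Over: CGoverPLATE", cg,
             Node("Over: SMOKEoverPLATE", smoke, plate))
```

Printing "\n".join(trunk.describe()) walks the tree from the final over down to the plate, the same way you would read a tidy script top to bottom.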

My method is a combination of the two, along with copious node labeling and note taking. This makes it easier for another artist to quickly take over a comp I’ve worked on, or take select bits of my comp to use in theirs. Some artists I’ve known are very protective of their techniques, which sometimes work and sometimes don’t. I have often taken a small script, weeded through the extraneous bits, and kept the important nodes for my own comp.


I approach the integration of CGI and live action elements in several ways. What is the goal of the shot? What pieces will I have to replace or matte out? What areas will I have to roto or key? Years of production experience will help you here, and that is why comp leads are comp leads: they have the experience to direct you in the quickest, most efficient manner. In organizing my comp, I will usually lump all CG elements into one branch. Each live action element (smoke, debris, fire, etc.) goes in its own separate branch. This allows me to adjust the CGI as one unit, which can be prerendered if necessary to speed up compositing. Sometimes the sheer number of CG elements will require individual masks from the live action plate, and that work happens in that branch before it joins the main trunk. Take a look at this script. Can you figure out where the trunk is? What about the CG elements? Keep in mind that in this Shake comp I use Over nodes, where A (the left input) is over B (the right input). Even though it’s been two-plus years since I’ve seen this shot, I know exactly where my original plate is and what I’ve done to it. Here’s where the vital information is.

One of the key things you should be able to do is organize in a coherent manner. This means lining up nodes and labeling them intuitively: a regular over becomes CGoverPLATE or NEOoverSKY, and so on. Groups are beneficial as well. You should be able to group methods of working, like enclosing a sequence of nodes and labeling it Grain Work, or pulling several keys, putting them together, and calling that FG Key. Some compositing programs can automatically append the version numbers of your CGI inputs in your script. If yours doesn’t, it is often beneficial to do it yourself, so you can see at a glance which versions you have if anyone asks. When organizing, do it continuously. It can be a pain when your comp grows larger and larger and you run out of space between nodes to place new ones. Continually improving the layout of your comp as you work is an advantage that will surely help. I’ve found that as the deadline approaches near the end of a comp, it often becomes easier to just place a node between two others and slowly build out a bulge on a branch of the tree. This can get messy very quickly if there are numerous revisions near the deadline of a shot!
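If your package can’t append versions automatically, a few lines of scripting can pull them out of the render filenames for you. This is a sketch assuming the common _v### naming convention; the version_label function name is made up:

```python
import re

def version_label(filename):
    """Pull a version token like 'v012' out of a render filename,
    so it can be copied into a node label at a glance."""
    match = re.search(r"_v(\d+)", filename)
    return f"v{match.group(1)}" if match else "no version found"
```

For example, version_label("BL0450_cg_v012.exr") returns "v012", while a file with no version token falls back to a visible warning string rather than failing silently.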

Another method of organization has surfaced, beyond simple RGBA color management. Combustion has had this for a while with its RPF file format, which allows different bits of information to reside in other channels of the image, and now Nuke has a similar capability. It allows you to have up to 1024 channels of image information, each of them floating point. This lets artists use any channel as a matte, not just the alpha channel included in RGBA layering. For example, my comp usually contains the RGBA layering, labeled rgba.r, rgba.g, rgba.b, and rgba.a. For other layers there is usually a depth.z, a uv.u and a uv.v, and things like spec.r, spec.g, and spec.b. Keep in mind that when you view each of these channels, they are grayscale (of course) and can contain selected items in the scene. In addition to these regular channels, I can add my own, which I usually do if I’m dealing with a large script involving multiple channels of information. When I create my masks, I usually name them maskA, maskB, maskC, and so forth. Each of these masks can contain four channels, which I label maskA.r, maskA.g, maskA.b, and maskA.a. However, I don’t have to follow the r, g, b, a naming convention; I could just as easily label them maskA.front, maskA.middle, and maskA.back.
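The layer-dot-channel naming can be mimicked with a tiny registry, again as a plain-Python sketch rather than Nuke’s actual channel API (the ChannelSet class and its methods are invented for illustration):

```python
# A toy registry of named layers, each holding named channels,
# mirroring names like rgba.r or maskA.front from the text above.
class ChannelSet:
    def __init__(self):
        self.layers = {}

    def add_layer(self, layer, channel_names):
        """Register a layer with its list of channel names."""
        self.layers[layer] = list(channel_names)

    def channel_names(self):
        """Return every channel as a dotted 'layer.channel' name."""
        return [f"{layer}.{ch}"
                for layer, chans in self.layers.items()
                for ch in chans]

channels = ChannelSet()
channels.add_layer("rgba", ["r", "g", "b", "a"])
channels.add_layer("depth", ["z"])
channels.add_layer("maskA", ["front", "middle", "back"])
```

channels.channel_names() then yields the familiar dotted names (rgba.r through rgba.a, depth.z, maskA.front and friends), and nothing forces a layer to use the r, g, b, a convention.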

There is no wrong way to comp, only easier and faster. Only production experience will get you to the easier and faster method, simply because you’re running on someone else’s schedule, not your own. As in almost every industry, the acronym KISS (Keep It Simple, Stupid) is the most important thing to remember. Hopefully the above tips will help you in production and make you a quicker, more efficient compositor!

Making Of: Seattle International Film Festival 2009

Our previous post about Digital Kitchen’s work for the Seattle International Film Festival tickled my curiosity: How exactly were the animation rigs set up? How did they maintain control over the acetate layers?

So I asked DK if they’d be willing to share some making-of morsels, and they came through with the goods!

A little explanation from DK:

In executing the piece, we created a small set-up in our Seattle studio consisting of two rear-illuminated lightboxes made of 5 panes of glass layered on top of a diffusion layer. Each layer of glass had an element that was either animated frame by frame or was static to create the environment.

For example, a scene might have a layer of diffusion, a painted layer, a layer of characters that we could articulate, and a layer of organic materials, etc. that created the environment. We mounted a Canon Rebel XSi over animation stands, and connected directly to a Mac Pro workstation running the stop-motion software Dragon.

DK also acknowledges their inspiration for the project:

DK developed an approach that not only fit within the overarching campaign, but celebrated the hand-made qualities of early stop motion animation as well as the universality and diversity of SIFF – films from around the world that range from the highest production value to the most raw, stripped down filmmaking.

In doing so, DK sought to pay homage to one of the oldest feature-length films, The Adventures of Prince Achmed, by German animator Lotte Reiniger, and also drew inspiration from the work of Jamie Caliri, Kara Walker, Kim Keever, and the Quay Brothers.

Check out the finished project on Digital Kitchen’s site.

Posted on Motionographer

Seth Weisfeld – in8 Design

One glance at Seth’s work is all you need. He has been behind some of the most talked-about launches of the past year, including Hotel 626 and Made For Each Other.

Tom Hines

Great work by photographer Tom Hines.

[CS3] Single Frame render different from Video Render

I’ve noticed a few times now that render errors can appear when rendering to video, while the broken frames are fine when rendered as single frames. This topic is for reporting such errors, because if your project requires a long render time, it’s good to know in advance what should be rendered as single frames.

These aren’t user errors, as the frames are fine both in non-cached preview and in single-frame renders; Render to Video is where it goes wrong somehow.

Earlier I found that rotoscoping inside subcomps sometimes causes the mask to apply itself too early or too late, of course causing horrible results.

Now I’ve found another such bug. I animated the Offset Turbulence property of some Fractal Noise with the expression “[time*6000,0]”. At approximately frame 90, the noise stops animating in my video render. It works fine using the script in the above-mentioned topic.

Ball and Socket Rig

Man, I’ve got to get this thing rigged soon. I’ve messed around with this for too long now.

Back in august of last year, I posted this: http://www.vfxtalk.com/forum/two-joi…all+socket+rig

Since then, I’ve been able to create a full setup from head to toe, but I’m STILL having problems getting the hips and shoulders to work the way I want them to.

They’re essentially ball and socket joints. I’m a compositor at heart, so doing all of this is like jumping into the deep end without my water wings. >_<

I just need some guidance in how to create a proper ball and socket rig. I heard someone quietly mention a "broken Hierarchy rig", but I can only imagine what that means.

Anyone have experience with this? If you’d like the model to mess around and test things out with, just let me know here, and I’ll send it to you in a PM.

I essentially need the arms and legs to work with IK but still be able to rotate around the ball joint. I’m willing to settle for a solution to that before getting tied up figuring out how to get the rotation out of the elbows like I originally wanted.

Real Wave – – ??

Hey guys, does anyone know how to achieve this with RealWave?

http://physbam.stanford.edu/~fedkiw/…t_straight.wmv

Realflow interaction

Hi there.

I’m not sure if the "plugins" section is the appropriate place to post this topic but well… seemed close to me.

Ok. Here is the question:
I started playing with RealWave [RealFlow] to generate some waves interacting with a geometry.
Here is the little problem.

I have a large-scale RealWave plane (50*50) with high resolution (0.2 polygon size), and a sphere (a RealFlow sphere) in the very middle of my scene.
I also have a Spectrum connected to my RealWave (as my main wave generator).

However, the problem is that the Spectrum waves are not "interacting" with my sphere. I mean, the ripples are not reflected (they don’t ripple back).

How can I create this?

Am I doing anything wrong?

full face for tracking

hi!
We are working on a commercial and need to replace the actors’ heads with 3D objects. Does anyone know where we can find a green or blue full-face cover with tracking markers?