Building a NukeX and Mari Linux Workstation

Hey Guys,
I’m getting ready to drop some money on a Linux workstation for NukeX and Mari, and I was wondering if anyone had any advice. Specifically, I’m not sure whether it’s better to have a single-CPU quad-core or six-core machine, or a dual-CPU eight-core or twelve-core machine. And if a dual-CPU machine is better, how do I know whether a motherboard I’m looking at supports two CPUs? My goal is to have a very high-end compositing suite that will also come in handy for 3D work in Maya and Mari. Any tips or suggestions are greatly appreciated. Thanks!

Matching color values between two clips

When we want to match color values between two clips in Fusion, we can simply pipe the BG of a ‘Color Correction’ node to the raw footage, pipe the FG to the footage we want to match the color to, and then click ‘Match’ under Histogram.

Do we have anything of that sort in Nuke?
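The closest I’ve come up with myself is matching just the average values: two CurveTool nodes analysing each clip, driving a Grade with expressions. It’s only a rough sketch with made-up node names (Read_source, Read_reference), and it’s nothing like Fusion’s histogram match, so I’m hoping there’s something better built in.

import nuke

# Made-up node names: Read_source is the clip to change, Read_reference is
# the clip whose colours we want to match.
src_avg = nuke.nodes.CurveTool(name='Avg_source', operation='Avg Intensities',
                               inputs=[nuke.toNode('Read_source')])
ref_avg = nuke.nodes.CurveTool(name='Avg_reference', operation='Avg Intensities',
                               inputs=[nuke.toNode('Read_reference')])

# Run the analysis so intensitydata is filled in (same as pressing Go!).
nuke.execute(src_avg, nuke.frame(), nuke.frame())
nuke.execute(ref_avg, nuke.frame(), nuke.frame())

# Grade the source so each channel's average lands on the reference average.
grade = nuke.nodes.Grade(name='MatchAverages', inputs=[nuke.toNode('Read_source')])
for idx, ch in enumerate(('r', 'g', 'b')):
    grade['multiply'].setExpression(
        'Avg_reference.intensitydata.{0} / Avg_source.intensitydata.{0}'.format(ch), idx)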

Thanks in Advance
Amstos

Stabilizing both the plate and the 3D camera

Hi all,

I’m having a bit of a problem that I haven’t been able to solve.

– I have a 3D-tracked camera and a plate.
– I have stabilized the plate by converting two 3D points to 2D and using that data in a Tracker node (set to stabilize).

I then get a smooth plate that is stabilized in x and y. But now, as I want to continue working, my camera doesn’t line up any more, since its movement is based on the original plate. I was thinking that, since all I’m doing to the plate is x and y stabilization, I should be able to use that data on the camera as well and stabilize it. Somehow.

The reason I need to stabilize it at all is that I need to get a smooth camera exported from Nuke and use it to shoot elements on a motion control rig.

After I’ve composited my elements I want to bring back my original movement, and that should be as easy as just reapplying the motion with the inverse of the original track.

Previously I’ve stabilized the plate and then done a new 3D track of that stabilized plate, but it feels like there should be an easier, faster way, as all I’m doing is a simple stabilization.
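To show what I mean by reapplying the inverse, this is roughly what I have in mind for the 2D side, assuming the stabilization is pure x/y translate coming from a Tracker node I’ve called Tracker1:

import nuke

# Assumption: Tracker1 is the stabilizing Tracker and its 'translate' knob
# holds the x/y offset that was removed from the plate.
restore = nuke.nodes.Transform(name='RestoreOriginalMove')

# Negate the stabilization offset to bring the original movement back.
restore['translate'].setExpression('-Tracker1.translate.x', 0)
restore['translate'].setExpression('-Tracker1.translate.y', 1)

What I’m missing is the equivalent trick for the 3D camera itself.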

Anyone have any idea?

Efficient network workflow

Hi, I’m facing a network workflow problem with Nuke.

I’m working with a team of compositors, and we need to pass the same script back and forth, so the Read nodes must point at the same network location. Using local files isn’t an option because we need to exchange scripts very frequently.

The problem is speed: working over a 1 Gb Ethernet network is roughly ten times slower than working locally. I don’t think working this way is reasonable given the low efficiency, but at the same time it’s better than having to relink all the files constantly. I’ve tried putting a DiskCache after every Read node, but it only caches files for the current script, so if I save a new version the cache is missed and I have to pre-cache everything again…

I’m very frustrated because Nuke is supposedly designed with networks and teamwork in mind, yet I can’t see how that’s possible with the current workflow, since working over a slow 1 Gb network (not Fibre Channel) isn’t practical.
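The best workaround I’ve come up with is a bit of Python that mirrors each Read’s frames to a local disk and repoints it, then swaps the paths back before I hand the script over. It’s only a sketch with made-up mount points, and it feels like reinventing something Nuke should already handle:

import os
import shutil
import nuke

NETWORK_ROOT = '/mnt/server/project'   # made-up network mount
LOCAL_ROOT = '/local/cache/project'    # made-up local mirror

def localize_reads():
    """Copy each network Read's frames to the local mirror and repoint it."""
    for read in nuke.allNodes('Read'):
        path = read['file'].value()
        if not path.startswith(NETWORK_ROOT):
            continue
        local = path.replace(NETWORK_ROOT, LOCAL_ROOT, 1)
        first = int(read['first'].value())
        last = int(read['last'].value())
        for frame in range(first, last + 1):
            src = path % frame if '%' in path else path
            dst = local % frame if '%' in local else local
            if os.path.exists(dst):
                continue
            dst_dir = os.path.dirname(dst)
            if not os.path.isdir(dst_dir):
                os.makedirs(dst_dir)
            shutil.copy2(src, dst)
        read['file'].setValue(local)

def delocalize_reads():
    """Point the Reads back at the network before saving/sharing the script."""
    for read in nuke.allNodes('Read'):
        path = read['file'].value()
        if path.startswith(LOCAL_ROOT):
            read['file'].setValue(path.replace(LOCAL_ROOT, NETWORK_ROOT, 1))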

Any suggestions?

I would also like to know how you all are working with teams in network environments with Nuke.

Many thanks.

Fire in the Wood

Hi, I’m a young filmmaker and visual effects compositor.
I am working on a music video and I have several shots with a lot of digital fire.
Is there a specific formula for compositing the fire?
I have the Digital Juice and Video Copilot libraries.
Thank you all.
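So far the only “formula” I’ve found is to treat the fire as an additive element, since the stock clips are on black: grade it to sit in the plate, then merge with screen or plus rather than over. Something like this, with made-up node names — is that the right direction?

import nuke

# Made-up node names. Fire shot on black is usually combined additively
# ('screen' or 'plus') rather than with a straight 'over'.
plate = nuke.toNode('Read_plate')   # background plate
fire = nuke.toNode('Read_fire')     # stock fire element on black

balance = nuke.nodes.Grade(inputs=[fire])     # grade the fire to match the plate
comp = nuke.nodes.Merge2(operation='screen',  # or 'plus'
                         inputs=[plate, balance])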

Rendering aspect ratio off

I have a 1440×1080 clip I am editing in Nuke. When I go to render via the Write node, it renders at a 4:3 ratio, despite the full-size format settings in the project matching the ratio of the video.
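My best guess is that it has to do with the pixel aspect ratio: 1440×1080 with square pixels is 4:3, and it only displays as 16:9 when the pixel aspect is 1.33 (1440 × 1.333 ≈ 1920). This is how I’m declaring the format at the moment (the name is just something I made up), in case someone spots what I’m missing:

import nuke

# 1440 x 1080 only reads as 16:9 with a 1.333 pixel aspect.
# 'HDV_1440_anam' is just a name I made up for the format.
nuke.addFormat('1440 1080 1.333 HDV_1440_anam')
nuke.root()['format'].setValue('HDV_1440_anam')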

Toggle Python expression

Hi, does anyone know how to toggle the Python expression button and the ‘R’ button next to it (the ones accessible when editing an expression)? I want to set a multi-line Python statement using .setExpression(), but it always sets the expression mode to TCL. I mean, how do I toggle it from a script rather than by clicking it? Can anyone help?
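The closest workaround I’ve found is leaving the knob in TCL mode and calling Python through the TCL bridge instead of toggling the real Py button. Grade1 and multiply below are just placeholders, and the -execlocal form (where the block has to assign its result to a variable called ret) is something I’ve only read about in the manual, not verified:

import nuke

node = nuke.toNode('Grade1')   # placeholder node
knob = node['multiply']        # placeholder knob

# Single-line Python wrapped in the TCL [python] command:
knob.setExpression('[python {nuke.frame() / 24.0}]')

# Multi-line version via 'python -execlocal'; the result goes into 'ret':
knob.setExpression('[python -execlocal x = nuke.frame()\nret = x * 2]')

Even if that works, I’d still like to know how to flip the actual Py and R buttons from a script.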

Attach 2 cameras and run a third in between?

If you have two orbiting, animated cameras, can you attach a third in between? As if there were a rope between the two cameras, and you want the third camera to be able to slide back and forth along that rope.
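To be clearer, something like this is what I’m imagining: a third camera whose transform is a simple blend between the two, driven by a 0–1 slider (Camera1 and Camera2 are the two animated cameras). I suspect a straight blend of the rotations isn’t quite right, which is partly why I’m asking:

import nuke

# Assumes the two animated cameras are called Camera1 and Camera2.
cam = nuke.nodes.Camera2(name='CameraBetween')

# A 0-1 slider: 0 sits on Camera1, 1 sits on Camera2, values in between
# slide along the "rope".
blend = nuke.Double_Knob('blend', 'blend')
blend.setRange(0, 1)
cam.addKnob(blend)

for idx, axis in enumerate(('x', 'y', 'z')):
    t_expr = ('Camera1.translate.{0} + '
              '(Camera2.translate.{0} - Camera1.translate.{0}) * blend').format(axis)
    cam['translate'].setExpression(t_expr, idx)
    # Naive for rotation: blending Euler angles ignores wrap-around and
    # doesn't follow a true arc between the two cameras.
    r_expr = ('Camera1.rotate.{0} + '
              '(Camera2.rotate.{0} - Camera1.rotate.{0}) * blend').format(axis)
    cam['rotate'].setExpression(r_expr, idx)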

Overscanned matte painting alignment issue

Hi,

I’m working with undistorted RED 4K 16×9 footage in Nuke. After undistorting the plate in PFTrack and exporting, I ended up with a 4231×2380 plate.

The camera move is a left-to-right track; we are building geometry and projecting patches onto walls and ceilings.

I sent the plate to a matte painter, and she painted the whole ceiling revealed in the shot and returned a 4800×2380 patch painted on a certain frame to cover the whole shot.

Now when I project onto the geometry and put it through a ScanlineRender, the matte painting doesn’t align with the live action.

I attached a Reformat node to the ScanlineRender with a 4800×2308 format and merged it under my live action, which is 4231×2380.

What am I doing wrong?

I checked in After Effects and they fit perfectly.
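In case it helps, this is roughly how my overscan step is set up (the format name is just what I called it, and I’m not certain that ‘resize: none’ plus ‘center’ is the right combination for a painted overscan patch):

import nuke

# 'MP_overscan' is just my name for the painter's 4800-wide delivery format.
nuke.addFormat('4800 2380 1.0 MP_overscan')

ref = nuke.nodes.Reformat()
ref['format'].setValue('MP_overscan')
ref['resize'].setValue('none')      # don't rescale, just change the canvas
ref['center'].setValue(True)        # keep the image centred in the new format
ref['black_outside'].setValue(False)
ref['pbb'].setValue(True)           # preserve the bounding box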

How to load the last Framecycler session by default?

Hi, I’m getting very frustrated doing flipbooks with Framecycler and having to adjust the same things every time: the LUT, the crop, zoom to fit (Ctrl+Home), and minor color grading adjustments.

I’ve read the Framecycler manual, and it mentions the "session" concept, but it seems the Framecycler bundled with Nuke doesn’t remember the last session’s settings.

Is there any way to make Framecycler remember the last settings or session?

Thanks.