I’m getting ready to drop some money on a Linux workstation for NukeX and Mari, and I was wondering if anyone had any advice. Specifically, I’m not sure whether it’s better to have a single-CPU quad-core or 6-core machine versus a dual-CPU 8-core or 12-core machine. And if a dual-CPU machine is the better choice, how do I know whether the motherboard I purchase supports two CPUs? My goal is a very high-end compositing suite that will also come in handy for 3D work in Maya and Mari. Any tips or suggestions are greatly appreciated. Thanks!
Do we have anything of that sort in Nuke?
Thanks in Advance
Amstos
I’m having a bit of a problem that I haven’t been able to solve.
– I have a 3d tracked camera and a plate.
– I have stabilized the plate by converting two 3D points to 2D and using that data in a Tracker node (stabilize).
I then get a smooth plate that is stabilized in x and y. But now, as I want to continue working, my camera doesn’t line up any more, since its movement is based on the original plate. I was thinking that, since all I’m doing to the plate is x and y stabilization, I should be able to use that data on the camera as well and stabilize it too. Somehow.
The reason I need to stabilize it at all is that I need to get a smooth camera exported from Nuke and use it to shoot elements on a motion control rig.
After I’ve composited my elements I want to bring back my original movement, and that should be as easy as just reapplying the motion with the inverse of the original track.
Previously I’ve been stabilizing the plate and then doing a new 3D track of the stabilized plate, but it feels like there should be an easier, faster way, since all I’m doing is a simple stabilization.
Anyone have any idea?
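To make the re-apply step concrete, here is a rough Nuke Python sketch of what I mean by inverting the stabilization, assuming the stabilize was driven by track1 of a classic Tracker node named Tracker1 and that the plate was held to a reference frame (the node name, track number and reference frame are assumptions, and knob names may differ between Tracker versions):

# Rough sketch (Nuke Python). "Tracker1", track1 and the reference frame are assumptions.
import nuke

REF = 1001  # frame the plate was stabilized to

restore = nuke.nodes.Transform(name='RestoreMove')

# Inverse of the stabilize: push each frame back out by the tracked 2D offset.
restore['translate'].setExpression('Tracker1.track1.x - Tracker1.track1.x(%d)' % REF, 0)
restore['translate'].setExpression('Tracker1.track1.y - Tracker1.track1.y(%d)' % REF, 1)

Dropping that Transform after the comped elements would, in principle, put the original x/y movement back without a second 3D track.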
Efficient network workflow
I’m working with a team of compositors and we need to pass the same script to each other, so the Read nodes must point at the same network location. Using local files isn’t an option because we pass the scripts back and forth very frequently.
The problem is speed: working over a 1Gb Ethernet network is something like 10 times slower than working locally. I don’t think working this way is reasonable given how inefficient it is, but at the same time it’s better than having to relink all the files constantly. I’ve tried putting a DiskCache after every Read node, but it only caches files for the script that created it, so if I save a new version the cache is missed and I have to precache everything again…
I’m very frustrated, because Nuke is supposedly designed with networks and teamwork in mind, and I can’t see how that’s workable with our current setup, since working over a slow 1Gb network (not Fibre Channel) isn’t practical.
Any suggestions?
I’d also like to know how you all are working with teams in network environments with Nuke.
Many thanks.
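One rough workaround that comes to mind (it isn’t a built-in Nuke feature) is a small script that copies each Read node’s frames to local disk once and repoints the Reads at the copies, so playback stops going over the network. A sketch of the idea, where LOCAL_CACHE and the printf-style (%04d) frame padding in the paths are assumptions:

# Hypothetical localization helper, not part of Nuke itself.
import os
import shutil
import nuke

LOCAL_CACHE = '/var/tmp/nuke_local_cache'  # assumed local cache directory

def localize_reads():
    if not os.path.isdir(LOCAL_CACHE):
        os.makedirs(LOCAL_CACHE)
    for read in nuke.allNodes('Read'):
        src = read['file'].value()                      # e.g. /net/plates/shot.%04d.exr
        dst = os.path.join(LOCAL_CACHE, os.path.basename(src))
        first = int(read['first'].value())
        last = int(read['last'].value())
        for frame in range(first, last + 1):
            s = src % frame                             # expand the frame padding
            d = dst % frame
            if os.path.exists(s) and not os.path.exists(d):
                shutil.copy2(s, d)
        read['file'].setValue(dst)                      # play back from the local copy

localize_reads()

The catch is that the file knobs would have to be pointed back at the network paths before handing the script to someone else, since the whole point is that everyone’s Reads resolve to the same shared location.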
Fire in the Wood
I’m working on a music video and I have several plates with a lot of digital fire.
Is there a specific formula for compositing the fire?
I have the Digital Juice library and Video Copilot elements.
Thank you all.
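For what it’s worth, there isn’t a single fixed formula, but since fire is emissive and stock elements are usually over black, the common starting point is an additive merge (plus or screen) over the plate, often with the plate graded down a little first. A rough Nuke Python sketch, where the "Fire" and "Plate" node names and the grade value are assumptions:

# Rough sketch: comp an over-black fire element additively over the plate.
# Node names and values below are placeholders.
import nuke

fire = nuke.toNode('Fire')
plate = nuke.toNode('Plate')

# Darken the plate slightly so the added fire reads as the brightest thing in frame.
grade = nuke.nodes.Grade(multiply=0.8, inputs=[plate])

# Additive merge ('plus' or 'screen') keeps the emissive look of the fire.
comp = nuke.nodes.Merge2(operation='plus', inputs=[grade, fire])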
rendering aspect ratio off
toggle python expression
I’m working with undistorted RED 4K 16×9 footage in Nuke. After undistorting the plate in PFTrack and exporting, I ended up with a 4231×2380 plate.
The camera is a left-to-right track; we are building geometry and projecting patches onto walls and ceilings.
I sent the plate to a matte painter, and she painted the whole ceiling revealed in the shot and returned a 4800×2380 patch painted on a certain frame to cover the whole shot.
Now when I project onto the geometry and ScanlineRender it, the matte painting doesn’t align with the live action.
I attached a Reformat node to the ScanlineRender with a 4800×2308 format and merged it under my live action, which is 4231×2380.
What am I doing wrong?
I checked in After Effects and they fit perfectly.
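If it helps to sanity-check, a quick Nuke Python snippet like this would print what each node in the chain is actually outputting, so any resolution or pixel-aspect mismatch shows up immediately (the node names are placeholders for the actual Read, Reformat and ScanlineRender nodes):

# Hypothetical debug helper; node names below are placeholders.
import nuke

for name in ('Plate', 'MattePainting', 'ScanlineRender1', 'Reformat1'):
    node = nuke.toNode(name)
    if node is None:
        print('%s: not found' % name)
        continue
    fmt = node.format()
    print('%s: %dx%d, pixel aspect %.3f' % (name, node.width(), node.height(), fmt.pixelAspect()))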
I’ve read the FrameCycler manual and it mentions the "session" concept, but it seems like the FrameCycler bundled with Nuke doesn’t remember the last session settings.
Is there any way to make FrameCycler remember the last settings or session?
Thanks.