Tracker and Python
Does anybody know the Python or TCL commands for the Tracker buttons (track to the last frame, track the next frame)? Or is there some other way to drive the Tracker via a Python command?
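I'm not aware of a documented command for those buttons, but everything on a Tracker panel is a knob, so the Script Editor can usually find them. A minimal sketch, assuming a node named Tracker1 exists; the button knob name at the end is a guess, so print the knob list first to find the real one:

import nuke

node = nuke.toNode('Tracker1')  # assumes a Tracker named Tracker1 in the script

# Dump every knob so the track-forward/track-backward buttons can be spotted
for name, knob in node.knobs().items():
    print('%s  %s' % (name, type(knob).__name__))

# If the button turns out to be a script-style knob, it can be fired with
# execute(), e.g.:
# node['track_forward'].execute()  # hypothetical knob name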
I cannot get the menu to appear with the method from the thread the author mentioned. I can make a menu and submenu with gizmos… but this has me stumped.
Any help, please.
Thanks
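For what it's worth, here is a bare-bones menu.py sketch that builds a menu and submenu with a command; all the names are placeholders, not anything from the thread:

import nuke

toolbar = nuke.menu('Nodes')        # the node toolbar down the left side
m = toolbar.addMenu('MyTools')      # top-level menu, created if missing
sub = m.addMenu('Mattes')           # submenu
sub.addCommand('My Gizmo', "nuke.createNode('MyGizmo')")

Also make sure the file actually gets picked up: menu.py has to live somewhere on the NUKE_PATH (e.g. your ~/.nuke folder), otherwise nothing appears.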
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:/Program Files/Nuke6.0v5/plugins\nukescripts\renderpanel.py", line 31, in render_panel
nuke.executeMultiple(_list, frame_ranges )
RuntimeError: O_Solver2: Failed to find sufficient feature matches.
How do I solve it?
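The error is not coming from the render panel itself: an Ocula O_Solver node upstream of the Write is failing because it cannot find enough feature matches between the views, so the render aborts. If it is unclear which output trips it, one way to narrow it down is to execute the Write nodes one at a time (the frame numbers below are placeholders):

import nuke

first, last = 1, 100                     # placeholder frame range
for w in nuke.allNodes('Write'):
    try:
        nuke.execute(w, first, last)
    except RuntimeError as err:
        print('%s failed: %s' % (w.name(), err))

Beyond that, it usually comes down to giving the solver more to work with: better analysis frames or manual match points on a plate that is too flat or too blurry for automatic matching.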
I'm thinking of a couple of ways to fix it: running the tracker pass and doing a smooth on that, or using F_Steadiness. But I was wondering if anyone else has had the wiggle in their comp and how they dealt with it. Even if it relates to Shake, I may be able to adapt it if anyone has ideas. I'm at work right now, so I can't try my ideas yet.
I don't remember the wiggle being addressed by Tahl in the videos, but it's been a while since I watched them, and to challenge myself to learn Nuke I'm trying to avoid watching someone else comp the shot.
So I'm just checking with the community; let me know if you can think of anything.
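On the smoothing idea: if the track is baked into a Transform, the curve can also be smoothed from Python rather than by hand in the curve editor. A rough sketch of a three-key moving average, assuming a node named Transform1 with an animated translate (node and knob names are assumptions):

import nuke

node = nuke.toNode('Transform1')
knob = node['translate']
for chan in range(2):                    # x and y channels
    curve = knob.animation(chan)
    if not curve:
        continue                         # channel not animated
    keys = [(k.x, k.y) for k in curve.keys()]
    for i in range(1, len(keys) - 1):
        avg = (keys[i - 1][1] + keys[i][1] + keys[i + 1][1]) / 3.0
        knob.setValueAt(avg, keys[i][0], chan)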
I am doing a matte painting with 3D projection and would like the fog to sit in 3D space to help blend the gaps, etc.
Is the fog in Nuke 3D? Or would I have to render fog passes out of Maya, etc.?
Thanks!
Hope to hear from you soon. Thanks, lala
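As far as I know there is no true volumetric fog node in Nuke's 3D system, so the usual trick is to fake distance fog in comp from a depth pass rather than render fog out of Maya. A rough sketch with an Expression node, assuming the render carries a depth.Z channel where larger values mean farther away; the fog colour and the 0.1 falloff are made-up numbers to tune by eye (note that ScanlineRender writes 1/z by default, so invert the logic for that convention):

import nuke

fog = nuke.nodes.Expression()
# Push each channel toward a fog colour as depth increases
fog['expr0'].setValue('r + (0.60 - r) * (1 - exp(-depth.Z * 0.1))')
fog['expr1'].setValue('g + (0.65 - g) * (1 - exp(-depth.Z * 0.1))')
fog['expr2'].setValue('b + (0.70 - b) * (1 - exp(-depth.Z * 0.1))')
fog.setInput(0, nuke.toNode('ScanlineRender1'))   # hypothetical node name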
A dumb question, I know, but I am stumped. I have an old animated matte that I painted on a Quantel box some years back. I want to use that animation as a matte within Nuke.
I tried to add it as a mask on my Merge, but it affected the whole Merge node.
Thanks
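The mask knob on a Merge limits the whole operation, which sounds like what you ran into. What usually works instead is copying the old matte into the alpha of the A branch first, then merging as normal. A sketch, assuming the Quantel matte comes in as Read2 and the foreground as Read1 (names are placeholders):

import nuke

copy = nuke.nodes.Copy(from0='rgba.red', to0='rgba.alpha')
copy.setInput(0, nuke.toNode('Read1'))   # B input: the foreground element
copy.setInput(1, nuke.toNode('Read2'))   # A input: the old Quantel matte

Feed the Copy into the A input of your Merge (with a Premult after it if the element is unpremultiplied) and only the matted area will come through.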
So here is the example and the question:
I have two renders from 3D, one of a sphere and one of a cube, both of which live in the same 3D scene. From this particular angle, the sphere is in front of the cube (i.e. closer to the camera) and partially overlaps it. But from other angles, the cube will be in front of the sphere and overlap it.
For our particular final 2D deliverable, we need to deliver the sphere and the cube on different layers (think Photoshop); however, the area where the sphere overlaps the cube must be "cut out". So the sphere would be whole, but the cube would have, say, a half-sphere cut out of it.
What I am hoping to do is render a depth pass (i.e. a floating-point grayscale image indicating distance from the camera) for the sphere and a separate depth pass for the cube. Then my hope is that Nuke can determine that, for pixel 1,1 (just making up terms here…), the sphere is closer to the camera than the cube, and thus "keep" the sphere's pixel and delete the cube's pixel.
So essentially, Nuke would have to be able to determine from the depth passes which pixel is closer to camera, and then keep or delete the corresponding pixels in the main render.
My question is whether this functionality exists in Nuke today, or if you are aware of it in a plugin, script, or other custom code. Or even whether it seems viable to try to write it ourselves.
Let me know if you need clarification of the example above – I’ve kept it very basic. In our actual deliverables, there are thousands if not millions of these operations in a single still frame – so please don’t respond by marginalizing the question (e.g. "Just mask it in Photoshop…") because the example is so simple…
Thanks a bunch.
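Short answer: yes, this exists in stock Nuke. The ZMerge node combines inputs by keeping whichever is nearer according to a Z channel, and for the per-layer cutout you describe, a MergeExpression comparing the two depth passes does exactly the per-pixel keep/delete. A sketch for the cube-with-a-hole layer, assuming both renders carry depth.Z, that smaller Z means nearer (flip the comparison if your renderer writes 1/z), and hypothetical Read node names:

import nuke

cut = nuke.nodes.MergeExpression()
# Keep the cube's pixel (B input) only where the cube beats the sphere's depth
cut['expr0'].setValue('B.depth.Z < A.depth.Z ? Br : 0')
cut['expr1'].setValue('B.depth.Z < A.depth.Z ? Bg : 0')
cut['expr2'].setValue('B.depth.Z < A.depth.Z ? Bb : 0')
cut['expr3'].setValue('B.depth.Z < A.depth.Z ? Ba : 0')
cut.setInput(0, nuke.toNode('cube_read'))     # B: the layer being cut
cut.setInput(1, nuke.toNode('sphere_read'))   # A: the occluder

Since it is all expressions and built-in nodes, this scales to as many layer pairs as your script can hold, with no custom plugin needed.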