I mean, when I render out from the lighting stage I need to write out all the passes (diffuse, spec, reflection, etc.) for the compositor at my light-composite stage.
I am just trying to figure out how to write the images in parallel instead of recomputing the tree again and again for every single Write node…
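If I recall correctly, Nuke's Python API has `nuke.executeMultiple()` for rendering several Write nodes together so the shared upstream tree isn't re-cooked per Write (worth checking the docs). The general idea, "compute once, write many", can be sketched in plain Python; the pass names and file layout here are made up purely for illustration:

```python
import os
import tempfile

def render_passes(frame):
    """Stand-in for the expensive upstream graph: evaluated ONCE per frame."""
    render_passes.calls += 1
    return {"diffuse": [0.5] * 4, "spec": [0.1] * 4, "reflection": [0.2] * 4}
render_passes.calls = 0

def write_all(frame, out_dir):
    """Write every pass from a single upstream evaluation, instead of
    re-evaluating the graph separately for each Write node."""
    passes = render_passes(frame)            # one computation...
    paths = []
    for name, pixels in passes.items():      # ...many writes
        path = os.path.join(out_dir, "%s.%04d.txt" % (name, frame))
        with open(path, "w") as f:
            f.write(" ".join(str(p) for p in pixels))
        paths.append(path)
    return paths

out_dir = tempfile.mkdtemp()
paths = write_all(1, out_dir)
print(len(paths), render_passes.calls)   # 3 files written, 1 upstream evaluation
```

Inside Nuke itself something like `nuke.executeMultiple((write1, write2), ((first, last, 1),))` should do the equivalent, though I haven't profiled whether it actually shares the cache between the Writes in every case.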
I want to comp my 3D scene rendered from Maya using the .exr format (with several passes contained in every file).
The problem is that Maya generates the names of the passes with this format:
<SHORT_NAME_PASS>_<long_name>_<set_name>_<camera_rendered>
for example:
AMB_ambient_BG_persp
DIFF_diffuse_FG_persp
but the maximum length is 32 chars for every pass name in an EXR file, and I have quite long camera names (necessary for production), so the names are truncated and padded with random digits in order to distinguish them from each other; i.e. I get names like this:
AMB_ambient_set1_cam_name_truncated12345
AMB_ambient_set2_cam_name_truncated67890
so the question is:
is there any way of making a manual Shuffle that takes as its source channel the one whose name matches an initial substring? (E.g., with the two truncated names from the previous example: if I wanted to select the ambient pass from set 1 as the source, I would specify the starting substring "AMB_ambient_set1".)
Thank you very much!
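One possible fallback is to script the prefix match in Python and set the Shuffle's source from the result. The matching itself is plain string logic; the layer names below are invented to mirror the truncated examples above:

```python
def find_layer(layers, prefix):
    """Return the first layer name starting with `prefix`, else None."""
    for name in layers:
        if name.startswith(prefix):
            return name
    return None

# Truncated layer names as Maya might emit them (illustrative only):
layers = [
    "AMB_ambient_set1_cam_name_trunc1234",
    "AMB_ambient_set2_cam_name_trunc6789",
    "DIFF_diffuse_set1_cam_name_trunc555",
]

src = find_layer(layers, "AMB_ambient_set1")
print(src)  # AMB_ambient_set1_cam_name_trunc1234
```

Inside Nuke I'd expect something like `shuffle['in'].setValue(find_layer(nuke.layers(read), 'AMB_ambient_set1'))` to work, using `nuke.layers()` to list the layers in the stream, but treat that as a sketch rather than a tested recipe.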
3D objects as comp mattes
Posted in: NUKE from The Foundry
By some chance I took a look at the Merge Expression node, changed its alpha contribution to -1, and piped that into the ApplyMaterial node for a 3D object. It seems to work pretty well, but you need to premult by the alpha channel after the ScanlineRender node to get the alpha used properly. And it can give some strange results at times.
I have absolutely no idea if there is a better or proper way to do this (I’m very new to nuke) but if someone has another way to achieve this I’d love to hear it.
Test script attached.
Cheers
Dave
How do I expression-link the TMI controls to a global node? And I'd be very happy to learn how to set up reverse controls, i.e. when I increase one slider the other slider goes in the opposite direction…
It's ultimately for the TMI control… please help!
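Assuming a NoOp named `GlobalCtrl` with a user slider called `tmi_master` (both names are my invention), a hedged sketch of the knob expressions would be something like this, added via right-click > Add expression on the target knob:

```
# On the knob that should follow the master:
GlobalCtrl.tmi_master

# On the knob that should move the opposite way:
-GlobalCtrl.tmi_master

# Or, for a 0-1 slider that should mirror around the range:
1 - GlobalCtrl.tmi_master
```

The same links can reportedly be set from Python with `knob.setExpression(...)`, but double-check the knob names on your actual global node first.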
Better Nuke Workflow?
Posted in: NUKE from The Foundry
I was wondering if there are any workflow patterns one should follow in order to keep renders as quick as possible, or does the workflow pattern not matter? Are there certain "don'ts" that would either cause countless errors or further slow down renders?
Also, does Nuke work best with .mov files or image sequences?
When writing out, are there any particular compression settings that Nuke works best with?
Sorry for all the questions; I just want to make sure that I can fully optimize my current workflow. Thank you!
Regrain Gizmo
Posted in: NUKE from The Foundry
One thing I noticed is that if you use F_DeGrain (a Furnace tool) and push it to 3, you can blur your greenscreen image and make for an easy key with nice motion blur on really grainy footage. You can also export the grain instead of the degrained result, and you'll get a black image with nothing but grain. Render that out to an uncompressed 32-bit .exr and you can simply merge it over the degrained footage, and it restores it completely. So you can key, then restore.
I’m putting this footage onto a bg plate that needs to be defocused. I was just adding a grain plate…but you can still tell that the true grain in the fg image does not truly match the fake grain in the bg image.
Regrain takes the grain as an input plus the defocused BG image, and adds the grain to it in a natural way. There are controls for red, green, and blue, and also for how much grain passes through dark areas of the image (since really dark areas tend to not have much grain, and black areas have none). So this can be used in a couple of ways. One is to just apply the original greenscreen footage's grain to the BG so it matches perfectly; or, if you have one, take a clean green plate without your talent, extract its grain plate, and apply that.
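The "export the grain, merge it back" trick boils down to per-pixel arithmetic: grain = original - degrained, and degrained + grain reconstructs the plate exactly, which is why a 32-bit float .exr matters (grain values go negative). A toy single-channel sketch, with invented pixel values rather than real footage:

```python
def extract_grain(original, degrained):
    """Grain plate = original minus degrained (can be negative, hence 32-bit float)."""
    return [o - d for o, d in zip(original, degrained)]

def regrain(clean, grain, amount=1.0):
    """Add the grain plate back over a clean image (e.g. a defocused BG)."""
    return [c + amount * g for c, g in zip(clean, grain)]

original  = [0.52, 0.47, 0.55, 0.49]   # grainy source pixels
degrained = [0.50, 0.50, 0.50, 0.50]   # after degraining

grain = extract_grain(original, degrained)
restored = regrain(degrained, grain)   # recovers the original plate

print([round(p, 6) for p in restored])  # [0.52, 0.47, 0.55, 0.49]
```

The gizmo's dark-area control would amount to scaling `amount` down by the clean image's luminance before the add, so near-black pixels receive little or no grain.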
BG_Passthru Gizmo
Posted in: NUKE from The Foundry
I’m working with scanned 16mm film, and it’s grainy as hell. What’s worse is that it was shot in lower light than usual. The actress has brunette hair tightly bound into a braid, but there are small pieces of hair sticking out from her head.
Anyone have any suggestions for tackling this? Unfortunately I can’t upload anything, so I’m not expecting any really detailed help, but suggestions would be great.
My current plan of attack has been to generate core and edge mattes, then use separate keyers for pieces of the hair. But damn, everything is so grainy I just can’t avoid sizzling edges. I tried degraining the green screen, then comping the foreground back over it and keying that. That got me a little more progress, but not enough.
The primary keyers I’m using are the IBK and Primatte. Lumakeys just aren’t getting the detail I need.
Is it possible to render the wireframe view out? Without the lights and cameras visible of course.
I have something like a brick wall and a projection going onto the wall. The bricks aren’t perfectly lined up (apparently) and I’m getting lines instead of a smooth surface on some blocks, but actually I wouldn’t mind getting the edges and using them to show something like grooves.
cheers