Defocus Masked

Hi guys,

I was trying to fake a ZBlur with a Defocus, since ZBlur is very CPU intensive, but I realized that the mask actually works on the mix of the Defocus rather than on the blur size itself, which gives really poor results.

Does anyone know a workaround for this issue, or should I just use the ZBlur?

See the images attached (the fake DOF is a ramp from right to left, by the way).

Thank you!
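For what it's worth, the workaround I'm experimenting with is blending between two Defocus nodes of different sizes with the ramp, so the apparent blur size varies across the frame instead of just the mix. A minimal Nuke Python sketch (node names like Read1/Ramp1 are placeholders, and the knob names are from memory, so treat them as assumptions):

```python
import nuke

# Placeholder nodes: the plate and the right-to-left ramp.
plate = nuke.toNode('Read1')
ramp = nuke.toNode('Ramp1')

# Two Defocus nodes standing in for the near and far blur amounts.
near = nuke.nodes.Defocus(inputs=[plate])
near['defocus'].setValue(2)    # small blur
far = nuke.nodes.Defocus(inputs=[plate])
far['defocus'].setValue(20)    # large blur

# Keymix blends the two results using the ramp as the mask, so the
# *size* of the blur appears to change across the frame rather than
# just the mix of a single Defocus.
mix = nuke.nodes.Keymix(inputs=[near, far, ramp])
```

More bands give a smoother falloff; it's still an approximation, but much cheaper than ZBlur.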

Attached thumbnails: defmasked.jpg, defmasked1.jpg

primatte for despill?

Howdy – has anyone figured out a way to use Primatte strictly for despill in Nuke? Like with Keylight, where you can plug a full-white constant into the garbage matte input so it won't try to key your footage and will just perform the despill op.

Has anyone found a way to do that with Primatte? It has some really nice spill tools… I seem to remember we used to do this kind of thing in Shake, but I might be dreaming.
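In the meantime I'm falling back on the classic expression-based despill, which does the spill part without pulling a key. A minimal sketch (greenscreen assumed, and Read1 is a placeholder name):

```python
import nuke

plate = nuke.toNode('Read1')  # placeholder plate node

# Classic green despill: wherever green exceeds the average of red and
# blue, clamp it down to that average. No key is pulled; only spill moves.
despill = nuke.nodes.Expression(inputs=[plate])
despill['expr1'].setValue('g > (r + b) / 2 ? (r + b) / 2 : g')
```

It obviously lacks Primatte's nicer spill controls, but it's a predictable despill-only op.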

best lightwave to nuke method?

We are making a feature-length 3D movie, and I've been given the task of making LightWave and Nuke play nice. We've modeled our environments in LightWave, and now it's time to get those environments into Nuke so I can composite, color correct, add FX, etc.

So, when I save an OBJ in LightWave and load it into Nuke, it loads without textures. I've scoured the forums and learned to use a ReadGeo node to add a single texture, but it maps that one texture onto the entire model. So my question is: what is the best way to get objects from LightWave into Nuke with their textures? I was not on the LightWave team, so I don't know the program at all. I did do some searching and came across "texture baking". Is that the best approach, or should I be looking at another solution, such as compositing in Layout and then bringing the composited footage into Nuke?
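One approach I'm considering, in case it helps anyone else: export each material group as its own OBJ and give each one its own ReadGeo/Read pair, merged in a Scene node. A rough Nuke Python sketch (all paths are hypothetical, and depending on your Nuke version the geometry node class may be ReadGeo or ReadGeo2):

```python
import nuke

# One (OBJ, texture) pair per material group -- hypothetical paths.
parts = [
    ('/path/env_walls.obj', '/path/walls_diffuse.tif'),
    ('/path/env_floor.obj', '/path/floor_diffuse.tif'),
]

scene = nuke.nodes.Scene()
for i, (obj_path, tex_path) in enumerate(parts):
    tex = nuke.nodes.Read(file=tex_path)
    geo = nuke.nodes.ReadGeo(file=obj_path)  # may be ReadGeo2 in your build
    geo.setInput(0, tex)   # the image input textures this piece of geo
    scene.setInput(i, geo)
```

Texture baking would give one flattened map per object, which fits this one-texture-per-ReadGeo limitation quite well.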

Smoke with Nuke-Particles

Hi everyone,

I watched the very informative videos on the Foundry Vimeo Channel:

http://vimeo.com/thefoundry/videos/sort:newest

Looks great to me! It seems The Foundry took some inspiration from Trapcode Particular for their own particle system, and from Mocha for the Planar Tracker.

Anyway – what would be the approach for creating Particular-like smoke in Nuke? As far as I can see there are no "cloudlet" particles or anything similar. How can you do this? Using blurred spheres? That sounds like pretty heavy computation to me.
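One idea I had (untested, just a sketch): the emitter can take geometry to use as the particle representation, so a Card textured with a soft, pre-blurred puff might act like a Particular-style cloudlet sprite. Node names and input indices below are assumptions from memory:

```python
import nuke

# Build a soft round "puff" texture: a Radial softened with a Blur.
puff = nuke.nodes.Radial()
puff['area'].setValue([0, 0, 256, 256])
soft = nuke.nodes.Blur(inputs=[puff])
soft['size'].setValue(30)

# Card textured with the puff, to be used as the particle geometry.
card = nuke.nodes.Card(inputs=[soft])

# ParticleEmitter (NukeX 6.3): feeding the card into its particle input
# should render every particle as this soft sprite.
emitter = nuke.nodes.ParticleEmitter()
emitter.setInput(1, card)  # which input takes the particle geo is an assumption
```

Thousands of semi-transparent cards would still be heavy, but presumably nowhere near as heavy as blurred spheres.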

Anyone tried this yet?

Cheers!

ARRI Alexa experiences?

Hi Guys,
a TV show I'll be working on later in the year is thinking of shooting on ARRI Alexa cameras. I was wondering what people's experiences with it are in a Nuke workflow. How good is the ProRes 4:4:4 compared to the ARRIRAW? I got some basic test footage to look at, and something that strikes me straight away is that the AlexaV3LogC colourspace in the Nuke 6.3 Read node settings still seems quite flat for what I would expect (lifted blacks).
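For reference, this is the Read setup I mean, as a Python snippet (the file path is just a placeholder):

```python
import nuke

# Hypothetical Alexa test plate.
read = nuke.nodes.Read(file='/path/alexa_test.####.dpx')

# Interpreting the footage as AlexaV3LogC is supposed to linearize the
# log encoding into Nuke's working space; this is the setting whose
# result still looks flatter than I'd expect.
read['colorspace'].setValue('AlexaV3LogC')
```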

I'd be interested to hear people's experiences with footage from this camera at any rate.

Cheers 🙂

modeler only worked once.

hey,

I just tried the Modeler node in 6.2v4. I was able to create a plane in the Modeler as expected and all was OK. Later I tried again, and now I'm not able to use the Modeler any more. o.O

After I connect it to my camera and the backplate as needed and view the Modeler in the viewer:
– the viewer always switches to the 3D display;
– if I change to 2D, the image is plain black (I could create polygons there, but that makes no sense).

Any ideas? A reboot, a different user and even a different computer were all tried without success. A NukeX licence is definitely available.

cheers

alex

P.S.: the attached image should produce an image at least, right? (This is only to show the problem; of course I'm trying it with my real sequence and the tracked camera.)

Attached thumbnail: modeler.png

Add Normals to Point Cloud

Hello there, I'm having a little problem here and was wondering if anyone could help me. What I'm trying to do is create a point cloud from a pWorld pass (with the PositionToPoints node) and use it to emit particles with the new 6.3 3dParticle node. The particle creation works fine, but with just a pWorld pass the particle system still needs to fetch the normals of the rendered surface from somewhere.

Hence I've tried to use the Normals node to add the information from my nWorld pass to the point cloud generated from the pWorld. I tried the "set" mode of the Normals node, plugging the nWorld channels into the x, y, z inputs with an expression (just referencing Read2.red etc.). But the expression doesn't seem to be sampled per pixel, and all the normal vectors get the same coordinates, so the particles just move in one direction instead of correctly moving away from the mesh based on its normals. Is there any way to get my nWorld normals into the point cloud?
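One direction I'm considering: copy the nWorld channels into a dedicated normals layer with a Copy node, which unlike a knob expression is evaluated per pixel, and read the normals from the stream. A sketch of what I mean (the layer name 'N' and the Read1/Read2 node names are assumptions, and I haven't verified that the particle system picks the channels up this way):

```python
import nuke

# Register an 'N' layer for the normals if it doesn't exist yet.
nuke.Layer('N', ['N.X', 'N.Y', 'N.Z'])

pworld = nuke.toNode('Read1')  # pWorld pass (placeholder name)
nworld = nuke.toNode('Read2')  # nWorld pass (placeholder name)

# A Copy node shuffles the nWorld RGB into the N channels per pixel --
# unlike a knob expression, it is evaluated for every pixel.
copy = nuke.nodes.Copy(inputs=[pworld, nworld])
copy['from0'].setValue('rgba.red')
copy['to0'].setValue('N.X')
copy['from1'].setValue('rgba.green')
copy['to1'].setValue('N.Y')
copy['from2'].setValue('rgba.blue')
copy['to2'].setValue('N.Z')
```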

No 32-bit Nuke 6.3?

Let me start by saying the new features are really really great.

I know, I know, we all use 64-bit capable hardware. That said, there are many situations where for convenience we use 32-bit systems, and we'd really like a 32-bit copy of Nuke 6.3. I'll even take a 32-bit version with limited functionality. I guess we're sticking with 6.2v3 for a while.

If there is already a plan for a 32-bit release, I apologize.

Nuke 6.3 & Mari Event in LA Thursday July 21st

NUKE | MARI LA Event
http://www.thefoundry.co.uk/articles…dreamworks-la/

Venue
Dreamworks Studios: Theatre

1000 Flower Street
Glendale
California
91201

Date and time
Thursday 21 July, 2011

Registration starts at 6pm. Presentations start at 7pm. The event ends at 9.30pm.

http://www.thefoundry.co.uk/events/1…-event/signup/

1920×1080 clip importing into nuke as 1440×1080

I am pulling my hair out. Why is it doing this?!
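For what it's worth, I've read that some HD codecs (HDV, DVCPRO HD) store 1920×1080 frames as 1440×1080 with a 1.33 pixel aspect ratio, and Nuke reports the stored resolution. A quick check from the Script Editor (Read1 is a placeholder name):

```python
import nuke

read = nuke.toNode('Read1')  # placeholder node name
fmt = read.format()

# If this prints 1440 x 1080 with a pixel aspect of ~1.33, the clip is
# anamorphic HD: 1440 stored pixels displayed as 1920.
print('%d x %d, PAR %.2f' % (fmt.width(), fmt.height(), fmt.pixelAspect()))
```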