Project3D, UVProject and DepthMap

I may be overlooking something obvious, but I'm running into a few issues.
I can use Project3D and ApplyMaterial on a card and it displays correctly no matter the angle or distance (the image stays the same size).

If I use a DisplaceGeo driven by a depth map, though, the depth map doesn't line up. I'm assuming this is because of the difference between simply projecting and using UVs.

So I switch to a UVProject, which lines up better, but now the image seems to change size in the areas that are shifted by the displacement.

Is there a way to use a 3D-generated depth map to do a real 3D displace (DisplaceGeo) and have the image stay the same size throughout, simply offset in Z, so that a camera slightly off axis sees that depth difference?

I know I can fudge it with an IDistort, but I'd like an accurate representation with the correct image at the correct plane.
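For reference, here is a rough Python sketch of the kind of graph described above: a card with UVs projected through the camera, displaced in Z by the depth map, and textured with the plate. The node classes are standard Nuke nodes, but the wiring and input indices are assumptions and may need adjusting for your setup.

Code:

import nuke

# A minimal sketch of the setup described above; file paths are placeholders
# and the input indices are assumptions -- check them against your Nuke version.
cam    = nuke.nodes.Camera()                      # projection/render camera
plate  = nuke.nodes.Read(file='plate.####.exr')   # hypothetical plate path
depth  = nuke.nodes.Read(file='depth.####.exr')   # hypothetical depth-map path

card   = nuke.nodes.Card()
uvproj = nuke.nodes.UVProject()                   # bake the projection into UVs
uvproj.setInput(0, card)                          # geometry input (assumed index)
uvproj.setInput(1, cam)                           # camera/axis input (assumed index)

disp   = nuke.nodes.DisplaceGeo()                 # push the card along Z by the depth map
disp.setInput(0, uvproj)                          # geometry input (assumed index)
disp.setInput(1, depth)                           # displacement source (assumed index)

mat    = nuke.nodes.ApplyMaterial()
mat.setInput(0, disp)
mat.setInput(1, plate)                            # texture via the baked UVs (assumed index)

scene  = nuke.nodes.Scene()
scene.setInput(0, mat)

render = nuke.nodes.ScanlineRender()
render.setInput(1, scene)                         # obj/scene input (assumed index)
render.setInput(2, cam)                           # camera input (assumed index)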

Thanks.

Fine Tuning

Hello All,

I posted about a keying issue with the IBK keyer last week. I've since revised my comp using the IBK keyer and Primatte, but I still need to figure out how to get the wispies of her hair more refined. I've attached my Nuke script; if anyone has any ideas it would be much appreciated.

Thanking all in advance,

cheers,
rdamani

Attached Thumbnails

Screen shot 2010-10-04 at 11.27.39 AM.png (117.5 KB)
Screen shot 2010-10-04 at 11.28.30 AM.png (995.9 KB)

Stereoscopic/DispGeo Problem

When I view the left eye in 3D, my card is displaced properly. However, when I view the right eye, it is still a flat image in 3D space. I have checked and rechecked my work and I'm not sure what I'm doing wrong.

How to tell if all 16 bits have data in them

Hi there,
We have some 16-bit TIFFs and suspect there are only 10 bits of data in them (i.e. they came from a 10-bit source). Is there any way to find out whether all 16 bits carry data or whether the values are quantised to 10 bits?
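One rough way to check, assuming you can get the pixel data into a numpy array (the imageio call below is just one example of a 16-bit-capable TIFF reader): a genuine 10-bit source can contribute at most 1024 distinct code values per channel, so counting distinct values gives a quick answer.

Code:

import numpy as np
import imageio.v3 as iio   # any reader that preserves 16-bit TIFF data will do

def distinct_levels(path):
    """Count distinct code values per channel of an image file.

    A true 10-bit source can produce at most 1024 distinct values per
    channel, so counts well above that suggest the full 16 bits are used.
    """
    img = iio.imread(path)                 # expect a uint16 array
    if img.ndim == 2:                      # greyscale: treat as a single channel
        img = img[..., np.newaxis]
    return [int(np.unique(img[..., c]).size) for c in range(img.shape[-1])]

# Example (hypothetical filename):
# print(distinct_levels('frame_0001.tif'))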

Cheers!

rendered vectors to help furnace nodes

Hi!

This is my first post here on the forums, and I don't want to sound like a noob who can't read before asking, but I'd like to ask for your help; after a few hours of browsing through the topics I couldn't find anything related.

So, is there a way to use rendered motion vectors to speed up Furnace nodes like F_MotionBlur or F_Kronos? I badly need to retime a rendered clip, and Kronos gives great results, but the vector field it generates is not very accurate. I have a motion channel saved with the render, so I thought I'd give it a shot.

The motion channel is in camera/screen space, float, unclamped, and gives accurate results with the VectorBlur node; however, when I plug it into the Furnace nodes as foreground/background vectors, they can't use it.

Any suggestions? Is this possible at all, or are the motion vectors generated by the Furnace vector generator completely different from the vectors rendered by my rendering engine (V-Ray)?

Thank you for your help in advance!

Best regards,
A.

See render time?

Hi.

I have just a little question: is there any way to see how much time it took to render out a sequence via a Write node? You can see roughly how long it will take when you hit render, but I'm hoping there is an info box somewhere that says how much time it really took, ideally even on a frame-by-frame basis.
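For reference, here is a rough sketch of timing a render yourself with Nuke's Python render callbacks (the per-frame and whole-render variants); the print formatting is just an example.

Code:

import nuke
import time

_timers = {}

def _render_start():
    # called once before the Write node starts rendering
    _timers['render'] = time.time()

def _frame_start():
    # called before each frame is rendered
    _timers['frame'] = time.time()

def _frame_done():
    # called after each frame: report the per-frame time
    print('frame finished in %.2f s' % (time.time() - _timers['frame']))

def _render_done():
    # called once after the whole render finishes
    print('render took %.2f s total' % (time.time() - _timers['render']))

nuke.addBeforeRender(_render_start)
nuke.addBeforeFrameRender(_frame_start)
nuke.addAfterFrameRender(_frame_done)
nuke.addAfterRender(_render_done)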

Thx!

importing roto shapes

Wondering if it’s possible to import an SVG document and convert it to a roto shape?
Thanks

Adding Atmosphere

I've seen a number of shot builds on people's demo reels where a layer of "atmosphere" or "haze" is added to the comp to better tie CG elements in with the footage, for example a CG building or an airplane.

Can anyone give me an idea of how this technique is achieved?

Check out what's happening at around 2:40 in this reel.

http://vimeo.com/15052997

Syntheyes Pointcloud to Nuke?

Hi.

I have been wondering: when you 3D track inside Nuke, you get this nice point cloud where you actually see the tracked points in their 3D positions. Since we matchmove in SynthEyes, is there any way to get a similar point cloud into Nuke? That would be great!

Thanks a lot!

how to create new class based on existing Node

Hi, I'd like to add some new functionality to an existing node. I tried doing it with a gizmo but wasn't happy with some refreshing issues, so I want to go another way: creating a new class based on an existing node. Here's my attempt, just for testing:

Code:

class newblur(nuke.Node):
    def __init__(self):
        nuke.Node.__init__(self)
        # try to attach an extra knob as part of construction
        self.x = nuke.Int_Knob('test', 'test')
        self.addKnob(self.x)

a = newblur('Blur')


It creates a node with no error, but as you can see in the script I'm trying to add a new knob, and no new knob appears. Yes, I can use this script to get the new knob:

Code:

a = newblur('Blur')
a.addKnob(nuke.Int_Knob('test', 'test'))


But that's not what I'd prefer. I want my new node class to come ready with its new knob as soon as I instantiate it. Since I get no error, it's hard to find out what's wrong. Can anyone help me out? What am I missing? Any Pythoners out there?
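For what it's worth, here is a workaround sketch (not an explanation of why the subclass approach fails): a small factory function that creates the node and attaches the extra knob in one call, so the knob is there as soon as you get the node back. The function name is just an example.

Code:

import nuke

def new_blur():
    """Create a Blur node that already carries the extra knob."""
    node = nuke.createNode('Blur')
    node.addKnob(nuke.Int_Knob('test', 'test'))
    return node

# usage: one call gives back a Blur with the knob already attached
a = new_blur()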

thanx