Lens distortion matchmove workflow issue

Hi,
I’ve stumbled upon a workflow issue between Nuke and Houdini when matchmoving (though I’m sure this applies to any 3D app: Maya, 3ds Max, Cinema 4D, etc.).

I’ve set up and tracked a shot in Nuke using the ‘CameraTracker’ node. Before tracking, I’ve undistorted the lens using the ‘LensDistortion’ node, so my track and camera are based on the undistorted lens.

I’ve also used a ‘modeler’ node to create some rough location geometry as reference for when I take the scene into Houdini. Again, all based on the undistorted lens and footage.

Importing the FBX works beautifully, apart from the fact that the camera, track and modeler geometry are all based on the undistorted lens, so inside Houdini the Nuke elements obviously don’t agree with the background plate. And even if they did, if I create, say, a fluid sim in the Houdini scene, it obviously won’t match the undistorted scene in Nuke.
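To illustrate why nothing lines up: the undistort and the original lens distortion are inverses of each other, so anything built in undistorted space (camera, track, geo, sims) has to be pushed back through the distortion before it will sit on the distorted plate. A minimal sketch of that round trip, assuming a simple one-parameter radial model (not the actual LensDistortion math):

```python
def distort(x, y, k1):
    """Apply a one-parameter radial distortion to normalized,
    centred coordinates (k1 < 0 gives barrel distortion)."""
    scale = 1.0 + k1 * (x * x + y * y)
    return x * scale, y * scale

def undistort(x, y, k1, iters=20):
    """Invert distort() by fixed-point iteration."""
    ux, uy = x, y
    for _ in range(iters):
        scale = 1.0 + k1 * (ux * ux + uy * uy)
        ux, uy = x / scale, y / scale
    return ux, uy

# A point tracked in undistorted (tracking/CG) space only lands back
# on the distorted plate after being re-distorted:
k1 = -0.1
xu, yu = undistort(0.4, 0.3, k1)   # where the tracker/CG sees it
xr, yr = distort(xu, yu, k1)       # back to (0.4, 0.3) on the plate
```

In practice that means rendering the Houdini elements with some overscan and running them through the inverse of the same LensDistortion setup before comping them over the original plate.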

Is there anything I can do in either Nuke or Houdini to make this workflow…work? 😀

I reeeeeally hope this makes sense

Thanks in advance.
Chaz

Secondary color correction instead of keying

Hi, I’m on a shot where I have to change the color of a window. There’s a bluescreen behind it (not lit very evenly), and I need to make it completely black.

I’ve tried keying the window with IBK, Primatte, Keylight, a luma key and min/max, but none of them seem to work well… I also tried HSVTool, but nothing…

I’d just like to know if there’s another approach for this, like some kind of secondary color correction.
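As an illustration of the "color correction instead of a key" idea: a blue-difference expression (roughly what colour-difference keyers are built on) can be used directly as a mask to pull the window toward black, rather than as an alpha to cut it out. A per-pixel sketch, assuming float RGB in 0–1:

```python
def blue_difference_matte(r, g, b):
    """Near 1.0 where blue clearly dominates, 0.0 elsewhere; the
    basic expression colour-difference keyers start from."""
    return max(0.0, min(1.0, b - max(r, g)))

def window_to_black(r, g, b, gain=1.0):
    """Instead of keying the window out, use the matte as a mask
    to pull the blue region toward black."""
    m = min(1.0, blue_difference_matte(r, g, b) * gain)
    return r * (1.0 - m), g * (1.0 - m), b * (1.0 - m)
```

In Nuke this could be an Expression node generating the matte (blurred and graded to taste) feeding the mask input of a Grade that crushes the window to black. An unevenly lit screen often responds better to this than to a clean key, because the matte never has to hold a hard edge.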

Thanks.

Motion control rig or 3D zoom?

Hi all,
I am always confused when I see a typical green screen composite with a dolly push-in/pull-out and a set extension.

Does anyone know if this is "usually" done on cards in 3D space or a motion control rig?

I don’t see any tracking markers on the green screen to suggest a 3D track.

The still image shot is the last shot in the video: http://aimediaserver.com/studiodaily…480&height=310


Warping shadow on a matte painting

Hi everybody,

I’m a French student at a VFX school and I’ve hit a dead end here.
I’m working on a short movie which is set in the Middle Ages.
I would like to warp the shadows from a shot to make them match a matte painting I made (still a WIP).

Here is a part of the shot to illustrate what I’m trying to do:

Original shot:

Matte painting:

Shot composited (here quickly in photoshop for the example):

The problem is that I want to keep the beginning of the shadow on the floor (less rotoscoping) and then warp it onto the walls. I have a 3D layout that can be used for projection.

Any idea?

Thanks in advance

Which normal vector is needed for relighting?

Hi, I’m learning relighting, so I need to feed in Normal and Position passes. I’ve checked and tested rendering a Normal pass from a couple of 3D apps. Some of them output the normal vectors relative to the camera; others output them as absolute world-space XYZ. Which one is needed for relighting in Nuke? For example, Houdini and Modo render them differently: in Houdini the normals come out as XYZ (RGB) relative to the render camera, while Modo writes them as absolute world-space XYZ.
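For what it’s worth, relighting setups in Nuke conventionally expect world-space N and P (check your particular gizmo, though). Either way the two conventions are only a rotation apart: a camera-space normal becomes a world-space one when rotated by the camera’s world-matrix rotation (rotation only, no translation, since normals are directions). A sketch with a hypothetical camera rotated 90° about Y:

```python
import math

def mat_vec(m, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def rot_y(deg):
    """Rotation matrix about the Y axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

# Hypothetical camera: rotated 90 degrees about Y in world space.
cam_to_world = rot_y(90.0)

# A camera-space normal pointing straight back at the camera (+Z):
n_cam = (0.0, 0.0, 1.0)
n_world = mat_vec(cam_to_world, n_cam)   # same normal, world space
```

In a comp this rotation is often applied either at export time (changing the render space) or with a ColorMatrix node carrying the camera’s rotation, since the normal pass is just a vector stored in RGB.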

thanx

set alpha help?

Hello, I’m having trouble with "set alpha" in Nuke.

Let’s say I have picture A (no alpha) and I want to set A’s alpha from picture B.

I’ve been trying to use Shuffle/ShuffleCopy, but I think I’m lost…
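Per pixel, what a Copy/ShuffleCopy does here is very small; a sketch, assuming pixels as dicts of float channels:

```python
def luma(px):
    """Rec. 709 luma, handy when B has no alpha and you want to
    key the alpha off B's brightness instead."""
    return 0.2126 * px['r'] + 0.7152 * px['g'] + 0.0722 * px['b']

def set_alpha(pixel_a, pixel_b, src='a'):
    """Keep A's RGB untouched and take the alpha from one of B's
    channels ('a', 'r', 'g', 'b') or from B's luma."""
    out = dict(pixel_a)
    out['a'] = luma(pixel_b) if src == 'luma' else pixel_b[src]
    return out
```

In node terms: A into the first input of a Copy node, B into the second, copying B’s alpha (or use a Shuffle on B first to route, say, red into alpha), usually followed by a Premult if the result will be merged over something.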

corner pin or camera projection

Hello, I need to replace the number plate on a car because it was flipped for continuity reasons. It’s a driving shot, and sometimes other cars pass in front of it.
I’m not sure whether I should just do it with a CornerPin, or maybe a Card3D, or whether I should use a camera projection; I’m never sure when something needs to be done in 3D.
Can someone please explain?
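For context on the CornerPin option: a corner pin is a 2D perspective transform (a homography) fitted to four point pairs, so it holds up exactly as long as the patch being replaced is planar, which a number plate is. A from-scratch sketch of what such a node solves internally:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """3x3 perspective transform mapping 4 src corners to 4 dst corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, x, y):
    """Map a point through the homography (perspective divide by w)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

The practical rule of thumb that follows: if the region is flat and never needs true parallax against other geometry, a tracked corner pin is usually enough, and the cars passing in front can be handled with roto holdout mattes; 3D (Card3D or projection) starts to pay off when the surface or the camera move breaks the planar assumption.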
Thanks

Nuke tracked rotoshape to Silhouette

Heyy guys,

I have a shot for which I need to deliver rotoshapes of objects in Silhouette (sfx format). The thing is, I’m trying to avoid rotoing these objects over the entire range. So I neatly tracked the shot in Nuke and got my rotoshapes animated (tracked) nicely. Now I want to export those rotoshapes to Silhouette.
I’ve tried the ‘auto-trace’ option in After Effects, but it’s not working well with my shot.
Is there a way I can bake keys onto the tracked rotoshapes in Nuke and then transfer them to Silhouette? Or any other option? Please help… I’m stuck!

Tracking eye movement

Is there anything special about tracking eye movement, or should I tackle this like I would tackle any other track? I mean, if I want to follow the eye movement.

Map a projected image to a frame

Hey.

I have:

A tracked, imported camera.
Geo of a guy’s jacket.
A plate of the guy’s jacket, with a rig on his back that needs to be removed.

I’d like to go to a specific frame, map a projection of the jacket from that frame onto the jacket geo, then "bake" that into the geo, so that when the geo moves it’s no longer projection mapped; it sticks to the jacket. I’ve tried using UVProject. It works; the problem is that I can’t use the animation baked into the jacket FBX before the UVProject. I have to use a TransformGeo downstream of the UVProject, but then I can’t retain my baked-in animation.

This should be simple, right? Like the Fusion Merge3D. Am I crazy? I mean, I know I can take the geo into Maya or something, map normals, etc., but I want a more efficient workflow.
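The "bake" step itself is just: at the hold frame, push every vertex through the camera and store the resulting screen position as that vertex’s UV; from then on the UVs ride along with the animation instead of re-projecting each frame. A toy sketch, assuming camera-space vertex positions, a pinhole camera, and made-up focal/aperture values:

```python
def bake_uv(p_cam, focal=50.0, haperture=36.0, vaperture=24.0):
    """p_cam: vertex position in camera space at the hold frame
    (camera looking down -Z). Returns (u, v) with the frame centre
    at (0.5, 0.5); focal and apertures in the same units (mm)."""
    x, y, z = p_cam
    u = (focal / haperture) * (x / -z) + 0.5   # perspective divide,
    v = (focal / vaperture) * (y / -z) + 0.5   # fitted to the film back
    return u, v

# Vertices of the jacket at the hold frame (camera space), e.g.:
verts = [(0.0, 0.0, -1.0), (0.36, 0.24, -1.0)]
uvs = [bake_uv(p) for p in verts]   # stored once, then stick to the geo
```

Which suggests the usual workaround in comp terms: feed the projection from a held frame (e.g. a FrameHold on the projection camera) so the UVs are computed once at that frame, with the animated transform applied downstream of the frozen UVs rather than upstream; whether that maps cleanly onto UVProject/TransformGeo ordering in a given Nuke version is worth testing on the actual scene.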