I was wondering if anyone could advise on the best method for dealing with lens distortion when using the CameraTracker node in Nuke (while still allowing for rotoing before tracking).
Currently, I’ve attached a ‘LensDistortion’ node to the footage, used the ‘Line Analysis’ section to draw some vertices in, and analysed the shot. This seems to have done an awesome job straight off.
So the next step was to roto out the moving people in the scene before tracking (it's a busy London shot, so this took a while).
Only at that point did I attach the 'CameraTracker' node and start the tracking process, after which I hooked up a 'PointCloudGenerator' node (roughly the tree sketched below).
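In case it helps to see the tree written out, here's a rough Nuke Python sketch of what I've built. The Read path is just a placeholder, node class names can differ between Nuke versions, and the Line Analysis, roto/mask settings and the actual solve are all done in the Properties panels rather than scripted:

```python
import nuke

read = nuke.nodes.Read(file="footage/busy_london_shot.####.exr")  # placeholder path

# LensDistortion applied to the plate; Line Analysis is run interactively
# on this node to analyse and undistort the shot.
undistort = nuke.nodes.LensDistortion()
undistort.setInput(0, read)

# Roto of the moving people, drawn against the undistorted plate.
roto = nuke.nodes.Roto()
roto.setInput(0, undistort)

# CameraTracker fed by the undistorted, rotoed branch (the roto alpha is
# used to mask out the moving people before tracking/solving).
tracker = nuke.nodes.CameraTracker()
tracker.setInput(0, roto)

# PointCloudGenerator hooked up after the solve; its camera input is
# connected to the solved/exported camera in the GUI.
pcg = nuke.nodes.PointCloudGenerator()
pcg.setInput(0, tracker)
```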
The resulting point cloud is massively disappointing though! Absolutely tiny, with little detail, and arguably less useful than the standard CameraTrackerPointCloud :confused:
So now I’m wondering if I’ve gone about this the wrong way?
I saw a tutorial on The Foundry's site where they branched the 'LensDistortion' node off separately from the footage, hooked the 'CameraTracker' node up directly to the clip itself, and just dragged and dropped the analysed distortion properties from the 'LensDistortion' node into the lens distortion section of the 'CameraTracker' node (roughly the layout sketched below). They never actually included the 'LensDistortion' node in the CameraTracker tree the way I have.
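And here's roughly how I understood the tutorial's layout, again just as a hedged sketch with the same caveats as above; the drag-and-drop of the distortion values is a GUI step and I don't know the exact knob names, so it isn't scripted here:

```python
import nuke

read = nuke.nodes.Read(file="footage/busy_london_shot.####.exr")  # placeholder path

# LensDistortion branched off the plate purely for the Line Analysis;
# it never sits in the CameraTracker branch itself.
analysis = nuke.nodes.LensDistortion()
analysis.setInput(0, read)

# CameraTracker connected straight to the (distorted) clip.
tracker = nuke.nodes.CameraTracker()
tracker.setInput(0, read)

# The analysed distortion values are then dragged and dropped from the
# LensDistortion properties into the CameraTracker's Lens Distortion
# section in the GUI (knob names vary by version, so not scripted here).
```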
They didn't do any roto in that tutorial, though. If I did it that way, would I have to roto from scratch on the original (distorted) plate, or could I somehow reuse the roto I did on the undistorted one?
I'm really hoping I can reuse my existing Roto node, because I've already spent a couple of evenings rotoing everything against the undistorted image.
I really hope all this makes sense. Lol. Apologies for the HUGE post!
I'm just wondering whether I'm actually going about this the right way.
Thanks in advance.
Chaz