Understanding UVMaps - Warping with STMap - Pt. 1

Written by Florian Gellinger.

Here at RISE we started using UVMaps to apply camera lens distortion to matte paintings, CG renders and all sorts of footage back when there was no (or at least no cheap) way to get the lens distortion values from your favourite 3D tracking application into Nuke. In this first example I'm going to show how to do this - although there are now plenty of ways to get e.g. Syntheyes' or PFTrack's lens distortion values into Nuke, or to use Nuke's own tracker right away for undistorting and redistorting.

Usually you would start shooting a grid with your lens of choice:

001

Then you would undistort it using the 3D tracking software (note the increase in image size caused by the warp):

002

You can now use your undistorted footage in your 3D application of choice as background and all track predictions will stick perfectly (as long as your 3D camera's focal length takes the overscan image size into account). Render your CG images with overscan from any application and renderer you want. You can now apply the lens distortion to your renderings using the 3D tracking package.

But - what about Nuke's 3D space? Do I have to scanline-render everything at overscan resolution, distort my renderings with the external 3D tracker and then read the image sequence back into Nuke to do my final 2D comp? What about 2 billion EXR channels? Is there any 3D tracking software that can distort EXR sequences including all their channels? I don't think so (correct me if I'm wrong)...

If you want to do proper redistorts in Nuke you need a pixel-accurate warp that does exactly what your 3D tracking package does to your images. Consider this: a standard UVMap has a unique colour value, or ID, for each pixel. If you apply a transformation to an image (like redistorting it), you can apply the same transformation to a UVMap and later reconstruct exactly what changed by comparing the warped pixel colours with the standard UVMap values. Here is how to create a standard UVMap with just two linear Ramp nodes (at the same resolution as your undistorted images - so the UVMap has to have an overscan too):

003
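The two-ramp setup above just encodes each pixel's own normalized position in red and green. Here is a minimal Python sketch of that identity UVMap (the resolution values are hypothetical - your map must match the overscan size of your undistorted plate):

```python
def identity_uvmap(width, height):
    """Identity UVMap as produced by two linear ramps: u (red) goes
    0 -> 1 left to right, v (green) goes 0 -> 1 bottom to top, sampled
    at pixel centres. Returns rows of (u, v) pairs, row 0 = bottom."""
    return [
        [((x + 0.5) / width, (y + 0.5) / height) for x in range(width)]
        for y in range(height)
    ]

uv = identity_uvmap(4, 2)
# bottom-left pixel samples (0.125, 0.25); top-right samples (0.875, 0.75)
```

Whether your ramps sample at pixel centres or pixel corners is exactly the half-pixel question that comes up in the comments below, so it is worth being deliberate about it.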

Now write the image to disk and re-distort it with your 3D tracking software of choice. It still has overscan resolution but it has been re-distorted:

004

Connect the re-distorted UVMap to the stmap input of the STMap node. The source input of the STMap node will now be warped in exactly the same way the UVMap was warped by the 3D tracking software. Both inputs have to be the same resolution, of course. You can then crop the image back to your original source resolution:

005
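Under the hood the STMap node is just a per-pixel lookup: for every output pixel it reads (u, v) from the map and fetches that position from the source. A rough nearest-neighbour sketch of the idea (the real node filters the lookup properly, which is also why its output is slightly soft):

```python
def apply_stmap(source, uvmap):
    """Nearest-neighbour sketch of the STMap lookup. source and uvmap
    are 2D lists of equal size, row 0 = bottom; uvmap holds (u, v)
    coordinates in the 0..1 range."""
    h, w = len(source), len(source[0])
    out = []
    for row in uvmap:
        out_row = []
        for u, v in row:
            # clamp coordinates to the frame (the edge behaviour here
            # is an assumption, not necessarily what the node does)
            x = min(w - 1, max(0, int(u * w)))
            y = min(h - 1, max(0, int(v * h)))
            out_row.append(source[y][x])
        out.append(out_row)
    return out
```

An identity map returns the source unchanged; a warped map moves pixels exactly the way the map itself was warped - which is the whole trick of this tutorial.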

The lensgrid that was undistorted using the 3D tracking application is now redistorted and cropped to match the original lensgrid that was captured on set. If you want to use this technique to distort Nuke's 3D-space output from a ScanlineRender node to match the distortion of the filmed background plate - just set the background of the ScanlineRender to the same overscan resolution as the undistorted images (with rgba turned off) and apply the correct overscan focal length to the camera in your 3D scene. Then apply the STMap node with your distorted UVMap to warp the output of the ScanlineRender.

006
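How the camera "takes the overscan image size into account" depends on your package, but one common convention (sketched below with hypothetical numbers) is to keep the physical focal length and widen the film back by the pixel-count ratio, which widens the field of view accordingly:

```python
import math

def overscan_camera(focal_mm, haperture_mm, plate_width, overscan_width):
    """Scale the horizontal aperture (film back) by the overscan ratio;
    the focal length itself stays the same. Returns the new aperture
    and the resulting horizontal FOV in degrees."""
    scale = overscan_width / plate_width
    hap = haperture_mm * scale
    hfov = 2 * math.degrees(math.atan(hap / (2 * focal_mm)))
    return hap, hfov

# e.g. a 2048 px plate undistorted to 2148 px on a 24.576 mm back:
hap, hfov = overscan_camera(35.0, 24.576, 2048, 2148)  # hap = 25.776 mm
```

In Nuke you would typically express this by scaling the camera's haperture knob - but check what your tracking package actually exports, as some bake the ratio into the focal length instead.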

The next part of this tutorial will cover warping of paint strokes and Bezier shapes using projected UVMaps to make color correction on 3D renders a little easier - or to add other fancy stuff that shifts perfectly in perspective with the rest of the scene without being 3D...

Comments   

 
# vipin garg 2010-07-26 20:43
it was great ......awesome
 
 
# behram patel 2010-07-27 03:21
nice.
Look forward to more.

b
 
 
# ayman ali 2010-08-03 07:21
It's great thanksssssssss
 
 
+1 # Randy Little 2010-08-17 21:27
Just remember that lens distortion changes with focus distance, so if the lens is focused at infinity the distortion is different than when it is focused at 2 feet - probably by a pretty good amount, depending on the lens. Lens charts should be shot as close to the actual focus distance as possible. Also, FYI, the focal length of a lens changes by the same means: focal length is defined by the distance from the rear nodal point to the image plane when the lens is focused at infinity, and it becomes longer as focus moves closer to the lens.
 
 
# Tom van Dop 2010-08-20 09:46
Great stuff!
I was wondering though: when I use the STMap in combination with a UV map (float EXR) made from the Nuke lens tool, the result is a bit softer than when I just copy and paste the lenstool and set it to distort.
Is this always the case or am I doing something wrong?
I tried blurring the UV map slightly and playing with the filter settings in the STMap.
 
 
# Florian Gellinger 2010-08-23 07:39
Hi Tom!
Yes, you're right - the result of the STMap node is always a little softer than the original. But as long as you use it to distort ultra-crisp CG renderings that need to be softened anyway - why bother? I'd keep it away from filmed plates, though. I did this tutorial not so much to show how lens distortion can be applied with an STMap node, but because I thought it was a good way to show what the node does in general - the new lens distort node in Nuke is much better. The cool stuff that I use it for on a daily basis is covered in Pt. 2. I'm thinking about doing Pt. 3 for stereo projects. Might be online sometime soon.

Rock on!
 
 
# Tom van Dop 2010-08-23 07:54
Thanks for the info Florian!
And keep this stuff coming :-)
 
 
# djati waskito 2010-09-03 05:36
very good info--indeed.
keep up rolling good stuff Flo
 
 
# Mohamed Selim 2010-12-03 23:38
Great, thanks for the info!
 
 
# Kevin Landry 2011-01-28 09:35
I tried it and it always gives me a result that is different from the undistorted plate exported from the tracking application, which is kind of weird because the UV map was exported using the same undistort function in the tracking software. So technically, it should be the same. The image undistorted with the UV map is always offset 1/2 pixel right and up compared to the image undistorted in the tracking software. Could this be due to some pixel filtering, or do you have any idea how to fix this?
Thanks a lot!
 
 
# Salvador Zalvidea 2011-03-09 03:09
Kevin,

I had the same issue and solved that problem by offsetting the ramps by 1/2 pixel (if your plate is 100 pixels wide, then your ramp goes from -.5 to 99.5). It's closer, but still not exactly the same.

I wonder if this could be an issue in a stereo project where pixel accuracy is more important.
 
 
# Ian Northrop 2011-06-20 13:59
Kevin,

Your tracking software may also be undistorting from a centre point that is 1/2 a pixel away from the centre point of your image. Unless Nuke is taking this off-centre point of origin into account, the results you are referring to would be expected. Hope this helps.
 
 
# Giso Spijkerman 2012-01-06 05:01
Hi Kevin,

nice tut, thanks!

btw, you were questioning whether there is any tracking package that can distort EXRs with all their channels, but I think PFTrack can. I'm not sure, I haven't tried it yet, but since it exports files with the same settings as they were read in, I think it'll export the channels as well.

Cheers
Giso
 
