This week, I began tracking my DLR shot, and I focused on this process over the following days.
Loading the footage
I first exported the footage as a .dpx sequence and brought it into 3DEqualizer. This turned out to be the wrong decision: .dpx footage uses a different colour space from sRGB, so it did not display correctly. I therefore switched to a .png sequence, which worked well.
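The display problem above comes down to transfer functions: .dpx footage is typically stored in a linear or log space, while a monitor expects sRGB-encoded values. A minimal sketch of the standard sRGB encoding (the exact colour space of the original .dpx files is an assumption here) shows why unmanaged footage looks too dark:

```python
# Minimal sketch: the standard sRGB encoding function (IEC 61966-2-1).
# Displaying raw linear values without this encoding makes footage
# appear dark and overly contrasty.

def linear_to_srgb(c: float) -> float:
    """Encode a linear-light value in [0, 1] to sRGB."""
    if c <= 0.0031308:
        return 12.92 * c  # linear toe segment near black
    return 1.055 * c ** (1 / 2.4) - 0.055  # power-law segment

# A mid-grey linear value of 0.18 encodes to roughly 0.46 in sRGB,
# so viewing the raw 0.18 directly looks far darker than intended.
print(linear_to_srgb(0.18))
```

This is only an illustration of the mismatch; in practice the conversion is handled by the image format or the viewer's colour management rather than by hand.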
Tracking the camera
I used about 150 manual tracking points in conjunction with filtered autotracked points. The result was already very good at this first stage, so in the next stage I focused solely on calculating the focal length and lens distortion properly, using the Parameter Adjustment Window. I first estimated the values with a wide range and the brute-force method, then refined them with the fine Adaptive method. The results were good and the lens distortion was no longer visible. I then exported the solve as a .mel file for use in Maya and rendered the first matchmove playblast, included below. I also modelled some geometry, which I will use later in comp for rough roto work.
Survey points define the scene scale and the position of the scene origin (the point with zero X, Y, Z values). I used data from Google Earth to survey the scene. This works by clicking a point and choosing 'Exactly Surveyed' under the survey type; that point in the image becomes my scene origin.
After setting the scene origin, I selected another point that I assumed sat further back on the Z axis. I roughly estimated its distance, lined up the horizon correctly and exported the scene into Maya. I then used measurements from Google Earth to scale the scene to real-world size inside Maya.
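The scaling step above boils down to one ratio: the real-world distance between two landmarks (measured in Google Earth) divided by the distance between the same two points in the tracked scene. A small sketch, with made-up distances for illustration (this is plain arithmetic, not the Maya or 3DEqualizer API):

```python
# Hedged sketch of the real-world scaling step.
# scene_distance: distance between two tracked points in scene units.
# real_distance:  distance between the same landmarks per Google Earth (metres).
# The returned factor is applied uniformly to the whole scene group in Maya.

def scale_factor(scene_distance: float, real_distance: float) -> float:
    """Uniform scale so that scene_distance maps onto real_distance."""
    if scene_distance <= 0:
        raise ValueError("scene distance must be positive")
    return real_distance / scene_distance

# Example with invented numbers: two points sit 2.5 scene units apart,
# but Google Earth measures the same landmarks as 40 m apart.
print(scale_factor(2.5, 40.0))  # → 16.0
```

In Maya this factor would be applied as a uniform scale on the top group of the imported matchmove scene, so the camera and geometry keep their relative positions.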