Tuesday 9 April 2013

Final year project diary update

I am posting this as an update to my final major project progress diary, but have opted for a blog entry as I need to use a lot of screenshots and I don't have a printer! I will be providing a step-by-step account of how I have achieved my work so far, along with any issues I faced and the learning I had to undertake to overcome them.

Once I had made my selects from the rushes captured on the day of the shoot, I exported from Premiere a full-quality DPX sequence in linear colorspace, covering the frames that comprise the VFX shot I am working on for my project.
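For anyone wondering what "linear colorspace" actually buys you: display-referred footage has a gamma-like transfer curve baked in, and VFX maths (merges, blurs, CG renders) only behaves physically when that curve is undone. The exact conversion Premiere applies isn't something I've dug into, but the standard inverse of the Rec.709 transfer function looks roughly like this sketch:

```python
def rec709_to_linear(v: float) -> float:
    """Invert the Rec.709 (BT.709) transfer function for one
    normalised code value v in [0, 1], returning scene-linear light."""
    if v < 0.081:
        # The curve has a short linear toe near black.
        return v / 4.5
    # Everywhere else it is a power function with an offset.
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

# Black stays black, white stays white, and the two segments
# meet where v = 0.081 corresponds to linear 0.018.
print(rec709_to_linear(0.0), rec709_to_linear(1.0))
```

In practice Nuke's Read node does this per channel for you; the point is just that the DPX frames carry light values, not monitor values.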


Following this I imported the sequence into Nuke and configured the project, locking the frame range to the frames where work is required. This prevents any unnecessary scrubbing, rendering and caching.
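If you open the saved Nuke script in a text editor, that whole setup boils down to a few knobs on the Root node plus the Read node for the plate. A minimal fragment looks roughly like the following (the frame numbers and file path here are placeholders, not my actual values):

```
Root {
 first_frame 1001
 last_frame 1120
 lock_range true
}
Read {
 inputs 0
 file plates/shot01.%04d.dpx
 first 1001
 last 1120
 colorspace linear
 name Read1
}
```

Setting `lock_range` is the same as ticking "lock range" in the Project Settings, and keeps the viewer from wandering outside the shot.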


The next step was to track the camera. I knew we used a 50mm lens for this particular shot, so I entered that along with our camera's sensor size. I found this information online; thankfully my camera has a full-frame 35mm sensor, which measures 24.89mm x 18.67mm. I then tracked 8 solid points in my sequence (rec709 colorspace) and fed them into the camera tracker. The result was encouraging! I exported the resulting camera and point cloud as an FBX.
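The focal length and sensor size matter because together they fix the camera's angle of view, which is what the tracker is really solving around. As a sanity check on the numbers above, the angle of view follows from simple trigonometry (thin-lens approximation, focus at infinity):

```python
import math

def fov_degrees(sensor_mm: float, focal_mm: float) -> float:
    """Angle of view for one sensor dimension and a focal length,
    using the thin-lens approximation."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# 50mm lens on a 24.89mm x 18.67mm sensor (the values from the shoot):
h = fov_degrees(24.89, 50.0)  # horizontal angle of view
v = fov_degrees(18.67, 50.0)  # vertical angle of view
print(f"{h:.1f} x {v:.1f} degrees")  # roughly 28 x 21 degrees
```

If the solve had come back with a wildly different field of view, that would have been a strong hint the sensor dimensions I found online were wrong.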


After tracking the camera I had to lay out a scene to help verify my track and also matchmove within it. I acquired reference images of the location, most usefully the aerial view from Google Maps. I made some curves in Maya that roughly matched the profiles of the ramps, and aligned these, along with primitive shapes, to match the contours and buildings on location that feature in the shot. I decided not to take the time required to perfect the model against the plate, but I think it is around 90% of the way there as it is.


This is a technique I learnt from The Art and Technique of Matchmoving. It was fantastically useful for positioning my camera, and it also came in very handy when hand-matching a 3D BMX to the plate, ensuring it made sense in the scene. I proceeded to 'scooch' the camera into position, trying to align the wireframe geometry with its corresponding part of the image. All the while I bore in mind the information I had from set, and referred to a photograph of the camera on the day as a sanity check. I also used the locators in my point cloud to see whether the features I originally tracked lined up sensibly with the scene in 3D space. They did. Perfectly!







