Year after year, Blender keeps growing and surprising us. The short film Tears of Steel proves this, mainly through one of Blender's new features: camera tracking.
This article shows some tests with this technology in conjunction with the Python Photogrammetry Toolbox (PPT). First, we attempted to partially reconstruct a scene with PPT and match it with the footage.
Second, we used the tracked camera and imported another scene (a sphinx) so that it follows the real movement of the camera.
Why use PPT instead of modeling over a picture?
1) Because a reconstruction made over a single picture is subjective and suffers from perspective distortion.
2) Because scanning complex objects can be easier than modeling them (think of a broken statue, or an asymmetric vase).
3) Because making the texture is easier when we use the reference pictures.
4) Because you can use the frames of the footage themselves to reconstruct the scene.
5) Because the lighting work can be easier, since the texture is already illuminated and the scene (background) is ready.
How can I use camera tracking in Blender? The process can be easier than you think. A good video tutorial can be found here. Once you have the scene tracked, you can do the reconstruction using PPT.
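For those who prefer scripting, below is a minimal sketch of the setup using Blender's Python API (the file path is just an example; the tracking and solving themselves are still done interactively in the Movie Clip Editor, as in the tutorial):

import bpy

# Load the footage as a movie clip (example path: first frame of an image sequence)
clip = bpy.data.movieclips.load("/path/to/footage/frame_0001.jpg")

scene = bpy.context.scene
scene.active_clip = clip

# Make the scene camera follow the solved camera motion
camera = scene.camera
solver = camera.constraints.new(type='CAMERA_SOLVER')
solver.use_active_clip = True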
The image above is a frame of the original footage. As we said, you can use the video to make the reconstruction with PPT. You will have to convert the video into an image sequence, using FFmpeg for example (see the previous articles).
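Just as a reminder (file names here are only examples), a command like the following extracts one JPEG per frame:

ffmpeg -i footage.avi frame_%04d.jpg

The resulting pictures can then be fed to PPT like any other photo set.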
The great news is that we discovered (thanks to rgaidao!) an addon that imports Bundler files (bundler.out) into Blender.
With this, you can import the cameras together with the pictures and use them to project the texture onto the model.
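To give an idea of what the addon reads, here is a minimal sketch (not the addon's own code, and the file name is just an example) of parsing the camera block of a Bundler file; the documented format stores, for each camera, a focal length, two radial distortion coefficients, a 3x3 rotation matrix and a translation vector:

# Read the camera parameters from a Bundler output file
def read_bundler_cameras(path):
    with open(path) as f:
        f.readline()  # header line, e.g. "# Bundle file v0.3"
        num_cameras, num_points = [int(x) for x in f.readline().split()]
        cameras = []
        for _ in range(num_cameras):
            focal, k1, k2 = [float(x) for x in f.readline().split()]
            rotation = [[float(x) for x in f.readline().split()] for _ in range(3)]
            translation = [float(x) for x in f.readline().split()]
            cameras.append({'focal': focal, 'k1': k1, 'k2': k2,
                            'R': rotation, 't': translation})
        return cameras

cameras = read_bundler_cameras("bundle.out")
print(len(cameras), "cameras found")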
And produce a model that closely resembles the original.
Note: unfortunately this reconstruction wasn't made by Luca Bezzi, the master of PPT reconstruction. So we did the best we could using MeshLab and Ball Pivoting reconstruction. This was sufficient to produce a model that matched the original in the important areas.
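For reference, this meshing step can also be run in batch with meshlabserver, assuming you first save a filter script from the MeshLab GUI (Filters → Show current filter script → Save) containing the "Surface Reconstruction: Ball Pivoting" filter; the file names below are only examples:

meshlabserver -i dense_cloud.ply -s ball_pivoting.mlx -o mesh.ply -om vc vn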
With the scene tracked, reconstructed and matched, you can expand the possibilities of animation and make the impossible... like the picture above and the videos at the top.
In archaeology, Blender tracking can be used, for example, to reconstruct ancient buildings over current ruins.
The uses are many; your creativity is the only limitation.
A big hug!
Thank you for the "master", Cicero, but Alessandro is more of an expert than me in this technique :). Anyway, we want to try to take a video with the UAV. It would be nice to try camera tracking and Blender from such a perspective...
Excellent job, Cicero! Using an SfM output (point cloud + cameras) with Blender camera tracking was very original. The use of dense point clouds as a base for traditional polygonal 3D models is a reality, but there is still a lack of good FLOSS tools to work with them. If I'm not mistaken, Blender itself still doesn't display vertex colors. rgaidao.
Hi Luca! Ok, you two are the masters ;) I'm waiting for the news. rgaidao, thank you again for the link to the Blender script. Blender works with vertex colors, but projected onto the faces. A big hug!