
Thursday, 20 June 2013

Kinect - Infrared prospections

Despite what I wrote at the end of this post, it looks like Kinect is not really the best option for archaeological underground documentation, or for any other situation in which it is necessary to work in darkness.
I had already tested the hardware and the software (RGBDemo) at home, simulating the light conditions of an underground environment, and the result was that Kinect scanned some parts of an object (a small table) in 3D, with great difficulty.
My hope was that the infrared sensors of the Kinect would be enough to record object geometry even in darkness, and in fact they were. The problem is that RGBDemo apparently also needs RGB values (from the normal camera) to work properly. Without colour information the final 3D model is obviously black (as you can see below), but, and this is the real difficulty, the software seems to lose a fundamental parameter it needs to keep tracking the object being documented, so the operation becomes too slow and, in most cases, it is impossible to record a whole scene. In other words the documentation process often stops, so afterwards it is necessary to start again or simply to save several partial scans of the scene, to reassemble at a later time.
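A rough way to see why RGB-based tracking fails in the dark: a frame captured in total darkness is almost uniformly black, so there is no image texture for the tracker to lock onto. The sketch below is only an illustration of that idea (the threshold value is an arbitrary guess, not anything taken from RGBDemo's internals):

```python
import numpy as np

def has_enough_texture(rgb_frame, std_threshold=10.0):
    """Heuristic: RGB-based tracking needs image texture. A frame shot in
    darkness is almost uniformly black, so its intensity spread is tiny.
    The threshold is an illustrative guess, not a calibrated value."""
    gray = rgb_frame.mean(axis=2)  # naive grayscale conversion
    return bool(gray.std() >= std_threshold)

# A pitch-black frame (what the Kinect RGB camera sees underground)...
dark = np.zeros((480, 640, 3), dtype=np.uint8)
# ...versus a frame with plenty of texture (a normally lit scene).
lit = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)

print(has_enough_texture(dark))  # False
print(has_enough_texture(lit))   # True
```

With no intensity variation at all, any texture-based criterion fails, which matches the behaviour I saw: the geometry was there (from the IR depth sensor), but the tracking kept dropping out.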
However, before discarding Kinect as an option for 3D documentation in darkness, I wanted to do one more experiment in a real archaeological excavation and, some weeks ago, I found the right test area: an ancient family tomb inside a medieval church.
As you can see in the video below, the structure was partially damaged, with a small hole on the north side. This hole was big enough to insert the Kinect into the tomb, so I could try to get a fast 3D overview of the inside, also to understand its real extent (which was not identifiable from the outside).




As I expected, it was problematic to record the 3D characteristics of such a dark room, but I got all the information I needed to estimate the real perimeter. I guess that on this occasion RGBDemo worked better because of the ray of light that entered the underground structure and illuminated a small spot on the ground, giving the software a good reference point for tracking all the surrounding areas.
Since the video quality is poor, it is difficult to evaluate the (low) resolution of the 3D reconstruction; you can get a better idea from this other short clip, where the final pointcloud is loaded in MeshLab.
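For anyone who wants to inspect a scan the same way, MeshLab happily opens plain ASCII PLY point clouds. Here is a minimal sketch of writing one from a list of (x, y, z) points; the file name and coordinates are just examples:

```python
def write_ascii_ply(path, points):
    """Write a bare-bones ASCII PLY point cloud that MeshLab can open.
    `points` is an iterable of (x, y, z) tuples."""
    pts = list(points)
    with open(path, "w") as f:
        f.write("ply\n")
        f.write("format ascii 1.0\n")
        f.write(f"element vertex {len(pts)}\n")
        f.write("property float x\n")
        f.write("property float y\n")
        f.write("property float z\n")
        f.write("end_header\n")
        for x, y, z in pts:
            f.write(f"{x} {y} {z}\n")

# Two example points, depths in metres.
write_ascii_ply("cloud.ply", [(0.0, 0.0, 1.2), (0.1, 0.0, 1.3)])
```

Open the resulting `cloud.ply` in MeshLab via File > Import Mesh.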



This new test of the Kinect in a real archaeological excavation seems to confirm that this technology is not (yet?) ready for documentation in the complete absence of light. However, the most remarkable result of the experiment was the use of one of RGBDemo's tools, which displays the infrared input directly on a monitor. This option proved to be a good prospection instrument for exploring and monitoring the inside of the burial structure, without other, more invasive methods. As you can see in the screenshot, it is possible to see the condition of the tomb's interior and to recognise some of the objects lying on the ground (e.g. wooden planks or human bones), but of course this could also have been done simply with a normal endoscope and some LED lights (as we did on this occasion).

RGBDemo infrared view
However, here it is possible to compare what the Kinect's normal RGB sensor is able to "see" in darkness with what its infrared sensors can do:

PS
This experiment was possible thanks to the support of Gianluca Fondriest, who helped me in every single step of the workflow.

Monday, 8 October 2012

Kinect 3D indoor: excavation test

To complete the "Kinect trilogy", today I am writing this post about our first test during real archaeological fieldwork.
Also in this case we (Alessandro Bezzi and I) used our "hacked Kinect" with the external battery connected to the rugged PC and, again, the software chosen for data acquisition was RGBDemo. This time we documented a layer in 3D during an "indoor" excavation, to avoid the problems with direct sunlight I described in this post.
The video below tries to summarize this operation...




... and here are some screenshots to have an idea of the final result:

The pointcloud (frontal view)


The pointcloud (side view)

The mesh

The mesh (wireframe)

As you can see, the general quality is lower than the results we can obtain with other techniques (e.g. SfM and IBM), but Kinect and RGBDemo have the benefit of acquiring and processing the data almost simultaneously, with the possibility of seeing the documentation process in real time.
Ultimately Kinect is one more option to consider for 3D indoor documentation, depending on the peculiarities of the archaeological project (light conditions, available time, required level of detail, etc.). Our experiments will now go on with some tests in particular situations where this technique could be the best option (especially in underground environments).
Have a nice day!

Saturday, 6 October 2012

Kinect 3D outdoor: first test

It was a sunny September Sunday, so I decided to take a walk with my wife Kathi and show her one of the hermitages located in the valley in which we live (Val di Non, Trentino, Italy). 
My second thought was that the ramble was a perfect opportunity to test the hacked Kinect and try to document in 3D the main wall of S. Gallo's ruins (the remains of the hermitage). So I prepared the backpack with the Kinect, the external battery and the rugged PC we normally use on archaeological excavations.
After half an hour's walk through apple orchards and woods we reached the hermitage. Along the way we also found a stunned rooster. That was strange: a rooster, in Italian "gallo", at the S. Gallo hermitage...
Anyway, we began trying to document the main wall of the ruins, which you can see in the picture below...

S. Gallo's hermitage, with the rooster

... but, probably due to the direct sunlight, Kinect and RGBDemo were not working properly.
In fact, as you can also read in the poster by M. Dalla Mura, M. Aravecchia and M. Zanin (presented during the "LOW COST 3D: sensori, algoritmi e applicazioni" workshop), "...The main issue is due to direct Sun illumination that leads to saturation in the depth acquisition...". Moreover the software (RGBDemo) was reacting very slowly, but this was probably due to the hardware (a Panasonic Toughbook), which is less powerful than the laptop I normally work with. RGBDemo also seems to work better on GNU/Linux (ArcheOS), the operating system running on my laptop, than on Windows, the rugged PC's OS (but this could just be my impression).
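That saturation shows up in the data as depth pixels with no usable reading. A quick sketch of quantifying the problem, assuming the raw 11-bit Kinect depth stream where (as reported by libfreenect) the value 2047 marks a pixel the sensor could not resolve:

```python
import numpy as np

# Assumption: raw 11-bit Kinect depth, where 2047 means "no reading"
# (e.g. for areas washed out by direct sunlight).
INVALID_DEPTH = 2047

def invalid_ratio(depth_frame):
    """Fraction of depth pixels without a usable reading."""
    return float(np.mean(depth_frame == INVALID_DEPTH))

# Synthetic 640x480 frame: left half valid, right half saturated by the sun.
frame = np.full((480, 640), 900, dtype=np.uint16)
frame[:, 320:] = INVALID_DEPTH
print(invalid_ratio(frame))  # 0.5
```

Watching this ratio while scanning would tell you immediately whether a wall is too sunlit to be worth recording, instead of discovering it afterwards in the pointcloud.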
Not being satisfied with the results I got from the 3D documentation of the ruins (software too slow to manage the whole data recording process, large errors on the sunny parts of the wall, etc.), I decided to look for another subject to document. Luckily there is no shortage of caves at the S. Gallo hermitage, so, with Kathi's help, I made a fast digital 3D copy of the cave you see in the picture below.

S. Gallo's cave


This time the software worked well, fast enough to use in the field and with negligible errors in data acquisition. In the video below you can see the final pointcloud (not complete, but large enough to judge the quality of a 3D "field" documentation with Kinect).



After this test, our Kinect was ready for the "trial by fire" of a real (indoor) archaeological excavation, which will be the topic of one of the next posts on ATOR.
Ciao.
This work is licensed under a Creative Commons Attribution 4.0 International License.