The first test scan was set up using the Leica P20 scanner at the photographic studio at Kingston University. Following this we’ve been working on a 3D rendering engine that will allow us to render the point-cloud data generated by the terrestrial laser scanner on a tablet.
This came about because of our immediate challenge: finding a way for a portable device to handle such high-detail data. The scan contains millions of points, and we needed to determine how this data would translate visually once rendered, how clear it would be, and how freely the user could navigate it.
Following the test day, we wrote a tool to convert the point-cloud data into much smaller binary files that can be read efficiently using a programming technique called “memory mapping”.
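As a rough illustration of the idea, the sketch below converts an ASCII point-cloud export into a compact binary file and then opens it through a memory map, so the operating system pages points in lazily rather than loading the whole file into RAM. The file layout (three float32 coordinates plus three uint8 colour channels per point) and the function names are assumptions for the example, not the project’s actual format.

```python
import numpy as np

# Assumed compact record: 12 bytes of position + 3 bytes of colour = 15 bytes/point
point_dtype = np.dtype([
    ("xyz", np.float32, 3),  # x, y, z position
    ("rgb", np.uint8, 3),    # red, green, blue colour
])

def convert_to_binary(ascii_path, bin_path):
    """Convert a plain-text export (x y z r g b per line) to packed binary."""
    raw = np.loadtxt(ascii_path, ndmin=2)      # columns: x y z r g b
    out = np.empty(len(raw), dtype=point_dtype)
    out["xyz"] = raw[:, :3].astype(np.float32)
    out["rgb"] = raw[:, 3:6].astype(np.uint8)
    out.tofile(bin_path)

def open_points(bin_path):
    """Memory-map the binary file: pages are read from disk only as touched."""
    return np.memmap(bin_path, dtype=point_dtype, mode="r")
```

Because the memory map behaves like an ordinary array, the renderer can index into millions of points while the OS keeps only the recently used pages resident.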
Once this work was complete we were left with a file that contained all the points and also fitted within the virtual memory constraints of the device (i.e. less than 700 MB). The software also allows the user to rotate around the scan and zoom in and out to see more detail.
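To make the 700 MB budget concrete, here is a back-of-the-envelope capacity check. The 15-byte record layout is an assumption for illustration (three float32 coordinates plus three uint8 colour channels); the actual on-device format may differ.

```python
# Assumed layout: 3 x float32 position (12 bytes) + 3 x uint8 colour (3 bytes)
BYTES_PER_POINT = 3 * 4 + 3 * 1          # 15 bytes per point
BUDGET = 700 * 1024 * 1024               # 700 MiB virtual-memory limit

max_points = BUDGET // BYTES_PER_POINT
print(max_points)                        # roughly 49 million points
```

So even a conservative packed format leaves room for tens of millions of points within the device’s limit.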
Here are some stills taken from the test scan; click the link below to view the footage: https://vimeo.com/112594848
The scanner also captures colour information for each point, allowing the scanned object to be rendered in colour. In this test scan that information was poor: desaturated and with the wrong white balance. To tackle this we arranged a second test scan with a different approach, using an external optical camera and more lighting.
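For context on what a software-side mitigation of a colour cast can look like, the sketch below applies a simple grey-world white balance to per-point RGB values: each channel is scaled so its mean matches the overall mean. This is purely illustrative; the project’s actual fix was the second scan with an external camera and better lighting.

```python
import numpy as np

def grey_world(rgb):
    """Grey-world white balance (illustrative, not the project's method):
    scale each colour channel so its mean equals the overall mean."""
    rgb = rgb.astype(np.float32)
    channel_means = rgb.mean(axis=0)          # per-channel average
    gain = channel_means.mean() / channel_means
    return np.clip(rgb * gain, 0, 255).astype(np.uint8)
```

A correction like this can neutralise a uniform colour cast, but it cannot recover saturation lost at capture time, which is why re-scanning with a proper camera was the better approach.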