Oct 25 2011

CAO Dreaming

“Breakthrough technology enables 3D mapping of rainforests, tree by tree” - the latest news from the Carnegie Airborne Observatory (CAO) - but also old news: since about 2006, the CAO has been the most powerful 3D forest scanning system ever devised, and Greg Asner has continually improved it.

The CAO was the original inspiration behind Ecosynth. In 2006/2007, I was on sabbatical at the Department of Global Ecology at the Carnegie Institution of Washington at Stanford, and my office was right next to Greg's. Though he was mostly in Hawaii getting the CAO up and running, he and his team at Stanford completely sold me on the idea that the future of ecologically relevant remote sensing was multispectral 3D scanning (or better, hyperspectral - but one must start somewhere!).

I coveted the CAO. I wanted so much to use it to scan my research sites in China. Our high-resolution ecological mapping efforts there had been so difficult, and the 3D approach seemed to offer the chance to overcome many of the challenges we faced.

Yet it still seemed impossible to make happen - gaining permission to fly a surveillance-grade remote sensing system over China? It would have taken years to overcome tremendous logistical and political obstacles. So I changed my thinking…

What if we could fly over landscapes with a small hobbyist-grade remote-controlled aircraft carrying a tiny LiDAR and a camera? Alas, no - LiDAR systems (which require high-grade GPS + IMU) are far too heavy, and will be for a long time.

Then I saw Photosynth, and I thought- maybe that approach to generating 3D scans from multiple photographs might allow us to scan landscapes on demand without major logistical hassles?  The answer is yes, and the result, translated into reality by Jonathan Dandois, is Ecosynth.

Can Ecosynth achieve capabilities similar to the CAO's? Our ultimate goal is to find out - and to make it cheap and accessible to all, as the first “personal” remote sensing system of the Anthropocene.

Jun 28 2011

Automated terrestrial multispectral scanning

3D scanning just keeps getting better (but not cheaper!).

A post from Engadget: Topcon's IP-S2 Lite (~$300K) creates panoramic maps in 3D, spots every bump in the road (video) http://www.engadget.com/2011/06/28/topcons-ip-s2-lite-creates-panoramic-maps-in-3d-spots-every-bu/.

More from Topcon:

http://www.topconpositioning.com/products/mobile-mapping/ip-s2

http://global.topcon.com/news/20091204-4285.html

 


In China recently, we had the good fortune to collaborate in using a wonderful new ground-based (terrestrial) LiDAR scanner (TLS) from Riegl: the VZ-400, which fuses LiDAR scans with images acquired from a digital camera (~$140K). Pictured at left: graduate students of the Chinese Academy of Forestry with us in the field - literally!

Dec 06 2010

Near-Infrared Structure from Motion?

Some time ago we purchased a calibrated digital camera for capturing reflectance of near-infrared (NIR) light from vegetation for our computer vision remote sensing research. The goal was to make 3D structure from motion point clouds with images recording light in a part of the spectrum that is known to provide very useful information about vegetation.

We purchased a Tetracam ADC Lite for use with our small aerial photography equipment. This camera has a small image sensor similar to what might be found in the off-the-shelf digital cameras we use for our regular applications, but it has a modified light filter that allows it to record light reflected in the near-infrared portion of the electromagnetic spectrum. Plants absorb red and blue light for photosynthesis and reflect green light, which is why we see most plants as green. Plants are also highly reflective of near-infrared light: light in the portion of the spectrum just beyond visible red. This light is reflected by the structure of plant cell walls, and that characteristic can be captured by a camera or sensor sensitive to this part of the spectrum. For example, in the image above the green shrubbery appears bright red because the Tetracam displays near-infrared reflectance as red color. Below is a normal-looking (red-green-blue) photo of the same scene.

Capturing NIR reflectance can be useful for discriminating between types of vegetation cover, or for interpreting vegetation health when combined with reflectance values in other ‘channels’ (e.g., red, green, or blue). A goal would be to use NIR imagery in the computer vision workflow so this additional information can be put to work for scene analysis.
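The classic way to combine NIR and red reflectance for vegetation health is the Normalized Difference Vegetation Index (NDVI); healthy leaves reflect strongly in NIR and absorb red, so they score near +1. Here is a minimal numpy sketch of the band math - the pixel values are invented for illustration, not taken from our Tetracam imagery:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Dense healthy vegetation scores near +1; bare soil and water fall
    near zero or below."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Toy 2x2 scene: left column is a bright-NIR "leaf", right column "soil"
nir_band = np.array([[200, 60], [210, 50]], dtype=np.uint8)
red_band = np.array([[40, 50], [35, 55]], dtype=np.uint8)
vals = ndvi(nir_band, red_band)
print(vals)  # leaf pixels ~0.67-0.71, soil pixels near 0
```

With the Tetracam's false-color output, the NIR band is what the software displays in the red channel, so this kind of index is computed from the stored bands rather than from the display colors.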

We have just started to play around with this camera, but unfortunately the leaves have already fallen from the main trees in our study areas. The newest researcher on our team, Chris Leeney, took these photos recently while experimenting with how best to use the camera for our applications.

It was necessary to import the images in DCM format into the included proprietary software to see the ‘false-color’ image above. I also ran a small set of images through Photosynth, with terrible results and few identified features, link here. I wonder whether the poor reconstruction quality comes from the grayscale transformation applied prior to SIFT? It is probably impossible to say exactly what Photosynth does internally, but I ran some initial two-image tests on my laptop with more promising results.
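One plausible mechanism behind that conjecture: standard RGB-to-grayscale conversions weight the channels by perceived luminance, so if the NIR band sits in the red slot of the false-color image it contributes only about 30% to the gray values. A numpy sketch of the idea - the channel ordering is an assumption about how the false-color file is stored, and the image is random data for illustration:

```python
import numpy as np

# Hypothetical 3-channel false-color array; assume the NIR band was
# written into channel 0 (the "red" slot of the display image).
img = np.random.default_rng(0).integers(0, 256, (4, 4, 3)).astype(np.float64)

# Standard ITU-R BT.601 luminance weights, typical of default
# RGB-to-gray conversions: the channel-0 (NIR) band gets only ~30%.
gray_default = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

# Alternative: hand the NIR channel straight to the feature detector,
# keeping the full NIR contrast of vegetation instead of diluting it.
gray_nir = img[..., 0]
```

If the vegetation detail lives mostly in the NIR band, feeding that single channel to SIFT (rather than a luminance blend) might be a cheap thing to test.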

I am running OpenCV on my Mac and am working with an open-source implementation of the SIFT algorithm, written in C by Rob Hess and blogged about previously (27 October 2010, “Identifying SIFT features in the forest”). Interestingly, Mr. Hess recently won 2nd place for this implementation in an open source software competition - congratulations!

Initial tests showed about 50 correspondences between two adjacent images. When I ran the default RGB-to-grayscale conversion, it was not readily apparent that a large amount of detail was lost, and a round of the SIFT feature detector still turned up thousands of potential features. The next step will be to get things running in Bundler, and perhaps to take more photos with the camera.
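For readers following along: the usual way SIFT correspondences like these are filtered is Lowe's ratio test - keep a match only if the nearest descriptor in the other image is clearly closer than the second nearest. This is a toy numpy sketch of that test on made-up 4-D descriptors (real SIFT descriptors are 128-D), not Hess's actual code:

```python
import numpy as np

def ratio_test_matches(desc_a, desc_b, ratio=0.8):
    """For each descriptor in desc_a, find its two nearest neighbors in
    desc_b (Euclidean distance); keep the pair only when the nearest is
    clearly better than the runner-up (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        nearest, second = dists[order[0]], dists[order[1]]
        if nearest < ratio * second:
            matches.append((i, int(order[0])))
    return matches

# Toy descriptors: a[0] has one clear match in b; a[1] is ambiguous
# (two near-identical candidates), so the ratio test rejects it.
a = np.array([[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 1.0, 0.0]])
b = np.array([[1.0, 0.1, 0.0, 0.0],
              [0.0, 1.0, 0.9, 0.0],
              [0.0, 0.9, 1.0, 0.0]])
print(ratio_test_matches(a, b))  # → [(0, 0)]
```

Ambiguity like this is exactly what you would expect in repetitive forest canopy texture, which may partly explain why only ~50 correspondences survive from thousands of detected features.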

Sorry to scoop the story, Chris - I was playing with the camera software, got the false-color images out, and just had to test it. I owe you one!