We have moved! Please visit us at ANTHROECOLOGY.ORG. This website is for archival purposes only.

Oct 14 2011

Mikrokopter and Computer Vision/Photogrammetry used for Landslide Modeling

Researchers at the Universität Stuttgart Institute for Geophysics in Stuttgart, Germany, have used manually flown Mikrokopters and semi-automated photogrammetric software to generate high resolution photo mosaics and digital terrain models of a landslide area for tracking terrain displacement.  

An article published this spring in the journal Engineering Geology demonstrated the value of using remote controlled aircraft and off-the-shelf digital cameras for high resolution digital terrain modeling.  The researchers used the photogrammetry and computer vision software VMS to make 3D terrain models from aerial images and compared the results to aerial LiDAR and TLS terrain models.  A network of ~200 GPS-measured ground control points was used to assist with image registration and model accuracy, with good results.

The authors appear to agree with our sentiments that RC based aerial photography and 3D scanning have the benefits of low cost and repeatability compared to traditional fixed-wing or satellite-based data collection.

Unlike our research, the authors of this study were interested only in the digital terrain model (DTM); vegetation was considered noise to be removed for more accurate surface modeling.

Again...just one more reason for me to get cranking on that next paper!

Image source: http://commons.wikimedia.org/wiki/File:Super_sauze_landslide.JPG

Oct 13 2011

WebGL Globe

I just stumbled on a great looking globe based data visualization tool: WebGL Globe.

The screen cap at right is from a browser-based visualization of 1990 global population data.  While not related to Ecosynth, this is a really cool technology that could be valuable for other research in our lab, like visualizing global data to understand the global relevance of locations and studies, as in the GLOBE project.

Some of the example global datasets are fun (Google technology user group meetings, blogger mood), but it would be interesting to see some more ecologically relevant data plotted this way.  For example, trends in estimates of forest cover or urban growth.

Can't wait to see this in 3D, or maybe in a holographic projection!

Aug 03 2011

Pentax WG-1 GPS camera–too slow for scanning

I loved the Pentax WG-1 GPS camera when it first arrived.  It looked cool, had a non-extending lens, and offered the potential for GPS tagging our photos during flight – a feature that could be very time-saving for reconstructions.

But out of the box I quickly noted some major drawbacks.  The first was that the GPS only updates every 15 seconds.  At the average Hexakopter speed of 5 m/s, that meant GPS logs would be something like 75 m apart!  The unit also has a slower continuous shooting mode than the SD4000, about 1 fps.  The biggest drawback by far, though, was the lag, which I can only assume is a memory write lag.

I set up the camera at the maximum image quality settings, in continuous shooting mode, and with the 15 second GPS refresh.  I was using a brand new SanDisk Extreme 16GB memory card, which should provide professional grade write speeds.  I strapped down the shutter button by lightly taping a plastic nut over the button and wrapping the unit with a velcro strap, just like we do with the SD4000s.  The Pentax WG-1 would take a continuous stream of about 30 photos and then stop.  It would show the ‘number of images remaining’ counting down and then just hang.  After 10-15 seconds it would resume taking photos continuously, only to repeat the same behavior after another 30 photos.  The camera was taking no photos for 10-15 seconds at a time while in continuous shooting mode.  At a flying speed of 5 m/s, that means no pictures would be taken for 50-75 meters of flight!
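To put those numbers in context, the gap math is simple enough to sketch (the values are the ones from this post; the helper function is just for illustration):

```python
# Back-of-the-envelope check of the coverage gaps described above.
# All numbers come from this post: ~5 m/s flight speed, a 15 s GPS
# refresh, and a 10-15 s continuous-shooting lag.

def gap_distance(speed_mps, gap_s):
    """Ground distance covered during a gap of gap_s seconds."""
    return speed_mps * gap_s

speed = 5.0  # typical Hexakopter cruise speed, m/s

# 15 s between GPS fixes -> ~75 m between logged positions
print(gap_distance(speed, 15.0))  # 75.0

# 10-15 s write lag -> 50-75 m of flight line with no photos at all
print(gap_distance(speed, 10.0), gap_distance(speed, 15.0))  # 50.0 75.0
```

At typical flying heights those gaps are longer than the footprint of a single photo, which is why the lag kills the overlap that reconstruction depends on.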

I repeated this test at increasingly lower camera settings until I got down to the lowest possible: maximum compression and 640x480 resolution.  This time the camera took many more photos (~100 or so) but still hit the same long lag with no photos.

It was this that finally made us decide to send the Pentax WG-1 back.

Based on my research, this GPS camera has the fastest GPS refresh time of any point-and-shoot style camera, but the continuous shooting ‘lag’ was a deal breaker.

Jul 29 2011

Multirotors on the Colbert Report

Check out multirotors on the Colbert Report!!!  The clip starts at about 15 minutes into the program.

The researcher, Missy Cummings, Associate Professor at MIT, is developing better human-multirotor interfaces to help people steer the units using only a smart phone, which makes me wonder how different it is from the Parrot AR.Drone.



Seeing this video reminded me of something I noticed when flying the Hexakopters on campus with Tom Allnutt last week (see his post here).  Many people stopped and asked, ‘What is that?’, as usual, while we were out practicing in the Quad at UMBC.  But almost everyone asked if we had put a camera on it, as if that was the obvious thing to do with such a cool device.  I explained our research and that we do usually fly with cameras, and thought to myself that something is different now than when we were practicing last year.  In September 2010, when people asked us what we were doing, they never asked if we were putting cameras on the devices, and they thought it an odd thing to do when we told them about our work.  Now the practice seems to be expected.  I hope this signals a shift in perception of autonomous vehicles as useful tools for research and for recreational aerial photography, and not just greater public awareness of the other uses of such devices.

UPDATE: I've been thinking about this post and, in all fairness, the researcher is discussing the use of multirotors by the armed forces.  I posted for the sake of noting the significance of the devices in pop culture.

Jul 18 2011

XBees Again

I think I have fixed the XBees, again, maybe…

I wanted to get our tablet laptop up and running again as a Hexakopter flying machine for the field – especially since I got the new Pentax WG-1 GPS camera in the mail today (I’ll post on that soon).  This laptop had already been running Mikrokopter-Tool v1.74a, allowing us to do 3D waypoint runs, but the XBees were not functioning at all.  I also had it in my head to install a SSD hard drive in this old laptop to give it a new lease on life – what better opportunity to try a fresh setup! 

A quick note to anyone who has found their way here with their own XBee woes: we are using XBee Pro DigiMesh 900 modules.  This post discusses the (hopefully) successful configuration of a pair of XBee Pro 900s, each mounted on an XBee Explorer USB.  In a previous post, Xbee Solutions?, I suggested that it is necessary to have an XBee Explorer Regulated on the MK end, but based on the results described below that may not be necessary.

I got all the standard drivers and software installed and running (XCTU and UART drivers) and plugged in the suspect XBees.  Windows 7 said it correctly installed the new hardware, but when I opened up MikroKopter Tool I could not get any XBee communication. AAAAAAAH!

Back to the internet, I found this long thread about Xbee problems that offered promise: http://forum.mikrokopter.de/topic-21969.html

Following the thread, I set up two XBees on the same machine in two instances of XCTU so I could effectively range test and compare parameters. Why had I never thought of that!? I read the modem configuration from each unit – mostly noting anything other than the default and confirming the baud rates were set correctly.  I quickly noted that the Modem VID numbers were different and read from the help dialog: “Only radio modems with matching VIDs can communicate with each other.”  One XBee was set to the default and the other was set to a specific number.  I didn’t remember making this change but decided to set them both to the same number.  The range test now worked perfectly (see post picture).  Back in MikroKopter Tool I was back in business with wireless telemetry, but I still couldn’t transfer waypoints.  I kept getting that ‘Communication Timeout’ error.

I tried another suggestion from this post in the same thread and manually adjusted the Destination Addressing fields on each unit.  I noted the high and low serial numbers for each unit (SH and SL) and manually configured the high and low destination addresses to point at each other: XBee1 DL = XBee2 SL, XBee1 DH = XBee2 SH, and vice-versa.
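For anyone trying to reproduce this, the cross-addressing rule can be sketched in a few lines (the serial numbers below are made up; real SH/SL values come from reading each module in XCTU):

```python
# Sketch of the destination-addressing rule described above, using
# hypothetical serial numbers. Each XBee's 64-bit address is the pair
# (SH, SL); for a point-to-point link, each unit's destination
# registers (DH, DL) must equal the other unit's (SH, SL).

xbee1 = {"SH": 0x0013A200, "SL": 0x40A1B2C3}  # made-up serials
xbee2 = {"SH": 0x0013A200, "SL": 0x40D4E5F6}

# Cross-configure the destination registers:
xbee1["DH"], xbee1["DL"] = xbee2["SH"], xbee2["SL"]
xbee2["DH"], xbee2["DL"] = xbee1["SH"], xbee1["SL"]

def points_at(a, b):
    """True if radio a's destination address is radio b's serial."""
    return (a["DH"], a["DL"]) == (b["SH"], b["SL"])

assert points_at(xbee1, xbee2) and points_at(xbee2, xbee1)
```

In XCTU this corresponds to writing each unit's SH/SL into the other unit's DH/DL fields and flashing the settings.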

I flashed these settings, booted up MikroKopter Tool and was wirelessly transferring waypoints and receiving telemetry with no problems.

Of course, now we just have to see if it’s actually going to work in the field!

Next up: playing with the GPS camera!

Apr 10 2011

Visualizing point clouds in your browser

Check out 3DTubeMe.com to see some of the latest in web-based 3D visualization.  I was directed to a post on Slashdot about the website by a professor and am totally thrilled about what this could mean for visualizing our own 3D point cloud data.  Currently you need to log in and add it as an app through Facebook to upload and view, but the website authors say they are going to get rid of this requirement soon.  I uploaded a small set of photos for processing, but was notified that my camera was not in their database and to wait to hear back about the processing of my cloud.  Maybe we could get this WebGL working to visualize our own point clouds? 

That’s all for now, back to the grind!

Apr 07 2011

Open Source Terrain Processing

I am very excited by the current prospects of incorporating free, open-source terrain processing algorithms into our workflow.  While we are ultimately interested in studying the trees in our 3D scans, it is necessary to automatically derive a digital terrain model (DTM) that represents the ground below the canopy for the purpose of estimating tree height.

A recent paper in the open-access journal Remote Sensing describes several freely available algorithms for terrain processing.  I am in the process of converting the entire ArcGIS workflow we used in our first paper into an automated Python workflow, and am excited about the prospect of incorporating other open-source algorithms into the mix.  Currently, by working with Numpy in Python, my processing code takes an input Ecosynth point cloud and applies two levels of ‘global’ and ‘local’ statistical filtering to remove outlier and noise elevation points in about a minute for 500,000 points.  This had previously taken hours with ArcGIS, but by formatting the data into arrays, Numpy effortlessly screams through all the points in no time. 
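As a rough sketch of what that two-stage filtering looks like in Numpy (the actual Ecosynth thresholds and neighborhood definition aren't spelled out here, so the 3-sigma cutoff and 5 m grid cells below are illustrative assumptions):

```python
import numpy as np

# Illustrative two-stage outlier filter: a 'global' pass against the
# cloud-wide elevation statistics, then a 'local' pass against each
# grid cell's statistics. Thresholds and cell size are assumptions,
# not the actual Ecosynth parameters.

def global_filter(z, n_sigma=3.0):
    """Keep points whose elevation is near the cloud-wide mean."""
    mu, sd = z.mean(), z.std()
    return np.abs(z - mu) <= n_sigma * sd

def local_filter(xyz, cell=5.0, n_sigma=3.0):
    """Keep points near the mean elevation of their grid cell."""
    ij = np.floor(xyz[:, :2] / cell).astype(int)
    keep = np.ones(len(xyz), dtype=bool)
    # Group points by cell and apply the same sigma test per cell.
    _, inverse = np.unique(ij, axis=0, return_inverse=True)
    for c in np.unique(inverse):
        idx = np.where(inverse == c)[0]
        keep[idx] = global_filter(xyz[idx, 2], n_sigma)
    return keep

# Synthetic demo: a noisy cloud with a few gross elevation outliers
rng = np.random.default_rng(0)
pts = rng.normal([50, 50, 10], [30, 30, 2], size=(10000, 3))
pts[:20, 2] += 500                     # inject gross outliers
mask = global_filter(pts[:, 2]) & local_filter(pts)
print(mask.sum(), "points kept of", len(pts))
```

Because everything stays in Numpy arrays, both passes are vectorized per group, which is the same reason the real workflow went from hours in ArcGIS to about a minute.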

I am going to focus on two pieces of software.  One is the Multiscale Curvature Classification algorithm (MCC-LIDAR) by Evans and Hudak, at SourceForge here, that was mentioned in the recent paper in Remote Sensing.  The other is the libLAS module for Python, included with OSGeo, which can be used to read and write the industry standard LAS data format for working with LiDAR data. Fun, fun!  This of course is going on in the meantime while I try to get my proposal finished.
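For a flavor of what ground classification does, here is a deliberately simplified stand-in. This is not the MCC algorithm (which iterates spline surfaces over multiple scales with curvature thresholds); it is just a minimum-elevation grid filter with arbitrary cell size and tolerance, to illustrate the ground/non-ground split these tools automate:

```python
import numpy as np

# Toy ground classifier: a point is 'ground' if it sits within tol
# meters of the lowest point in its grid cell. Cell size and tol are
# arbitrary illustrative values; real algorithms like MCC-LIDAR are
# far more robust to slopes and low vegetation.

def classify_ground(xyz, cell=2.0, tol=0.3):
    """Return a boolean mask: True where a point is likely ground."""
    ij = np.floor(xyz[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                    # shift cell indices to >= 0
    shape = ij.max(axis=0) + 1
    flat = ij[:, 0] * shape[1] + ij[:, 1]   # flatten 2D cell index
    # Minimum elevation per cell serves as a crude ground surface
    surf = np.full(shape[0] * shape[1], np.inf)
    np.minimum.at(surf, flat, xyz[:, 2])
    return xyz[:, 2] - surf[flat] <= tol

# Synthetic demo: near-flat ground plus an 8-20 m canopy layer
rng = np.random.default_rng(1)
ground = np.column_stack([rng.uniform(0, 50, (5000, 2)),
                          rng.uniform(0, 0.2, (5000, 1))])
canopy = np.column_stack([rng.uniform(0, 50, (200, 2)),
                          rng.uniform(8, 20, (200, 1))])
pts = np.vstack([ground, canopy])
mask = classify_ground(pts)
print(mask[:5000].sum(), "of 5000 ground points classified as ground")
```

Subtracting the resulting ground surface from canopy points is what ultimately yields the tree-height estimates we are after.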


Dandois, J.P.; Ellis, E.C. Remote Sensing of Vegetation Structure Using Computer Vision. Remote Sens. 2010, 2, 1157-1176.

Tinkham, W.T.; Huang, H.; Smith, A.M.S.; Shrestha, R.; Falkowski, M.J.; Hudak, A.T.; Link, T.E.; Glenn, N.F.; Marks, D.G. A Comparison of Two Open Source LiDAR Surface Classification Algorithms. Remote Sens. 2011, 3, 638-649.

Apr 05 2011

Mention of unmanned aircraft in new FAA Act

The Dayton Business Journal provides a short review of the language in the new FAA Reauthorization and Reform Act of 2011 that would lead to official FAA Unmanned Aerial Vehicle regulations and classifications, along with official test sites in Dayton.  You can have fun with the text of the legislation from the Library of Congress, here.  The section on unmanned aircraft is brief, but a few key points should be noted. 

  • Aircraft are classified as public ('...an unmanned aircraft system that meets the qualifications and conditions required for operation of a public aircraft...') or small ('...an unmanned aircraft weighing less than 55 pounds...')
  • The FAA will make the determination "...which types of unmanned aircraft systems, if any, as a result of their size, weight, speed, operational capability, proximity to airports and population areas, and operation within visual line-of-sight do not create a hazard to users of the national airspace system or the public or pose a threat to national security..."
  • That the determination of these aircraft types is to occur within 180 days from the enactment of the Act
  • The FAA has 270 days from the time of approval of the Act to issue a plan for developing regulations
  • The plan must be in effect by September 30, 2015.

In summary, it looks like we might get word on regulations for particular size, weight, and range classes of aircraft this summer, but the full regulatory system is still years away.  I think the language I highlighted in the second bullet is particularly important: it tells us that determinations are going to be made based on whether the craft poses a threat to users of the national airspace and to national security.  Arguably, working with small, lightweight aircraft (< 4 lbs) at altitudes below the national airspace (< 400 ft) will keep our research within the safe zone.  

As a side note, I think there is also some language in the Act about the popular passenger bill of rights, but I didn't take the time to look!


UPDATE: The AMA just sent out a post about this as well:

Mar 05 2011

Field Day with the 485 Class

We had a great field day with the GES 485 class on Saturday flying the Hexakopter at the Herbert Run site and developing field work and 'ground-synthing' techniques.

The weather was actually quite good for a data collection.  The sky was overcast and there was no wind, meaning the Hexakopter was able to stay on track, and the light was relatively diffuse so there are few shadows in the images.  I gave a large set of about 2000 photos to Photoscan for processing on Saturday afternoon and it is still running.  This is great software, but I don't yet have enough of my own benchmarking data with large sets to really test how it is going to perform. I hope the point cloud looks good!

Regarding our XBee testing, I used the MKUSB to upload waypoints, but then discovered that after I power down and power back up (to swap the battery and plug in the wireless XBee module) I cannot read waypoints from the MK; they are apparently not stored on board.  But I was able to upload waypoints wirelessly with the new XBee configuration, and the real-time telemetry communication during the flight was OK.  At least the current setup is no worse than what we had before.  More to come.

Mar 03 2011

New Field Equipment for 3D Forestry

Our new forestry mapping equipment is going to make collecting 3D tree and canopy data a lot easier!

We recently acquired a Trimble GeoXT GPS and a TruPulse 360B laser range-finder for use in our forestry field data collection work.  The GeoXT is a high-grade mobile mapping / mobile GIS GPS unit that offers sub-meter accuracy after post-processing in the lab. 

By itself this would allow us to collect sub-meter (0.5 m – 0.7 m) accurate positions of tree trunks or other features on the ground.  The TruPulse is used for measuring distances and heights using a built-in laser and inclinometer that automatically does all the pesky math that would otherwise be needed with an analog clinometer.  The 360B model has built-in Bluetooth communication, which means that with a little configuration in the lab the unit can wirelessly beam positional and height data to the GeoXT.

This combo is used for ‘offset-mapping’, where the user stands in one location with both GPS and laser in hand and uses the laser to map to the GPS the XYZ positions of objects some distance away (typically up to 200 m, depending on the power of the laser).  For us, this means I can map the position of tree tops in 3D space and automatically record tree height to the mapping GPS with relative ease and greater precision than with paper-and-pencil field notes.  This type of data collection is necessary for the calibration and validation of Ecosynth 3D point clouds, http://ecotope.org/ecosynth/methods/ecology/.
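The geometry behind offset-mapping is just trigonometry. Here is a minimal sketch; the exact fields the TruPulse streams over Bluetooth may differ from what's assumed here, and the axis convention (north = +y, east = +x) is an assumption for illustration:

```python
import math

# Given the observer's position and a laser shot (slope distance,
# inclination above the horizon, compass azimuth), compute the
# target's XYZ. This is only the underlying math, not the actual
# TruPulse/GeoXT data protocol.

def offset_position(obs_xyz, slope_dist, incl_deg, az_deg):
    """Return (x, y, z) of the target; map north = +y, east = +x."""
    incl = math.radians(incl_deg)
    az = math.radians(az_deg)
    horiz = slope_dist * math.cos(incl)   # horizontal distance
    dz = slope_dist * math.sin(incl)      # height above the observer
    dx = horiz * math.sin(az)             # east offset
    dy = horiz * math.cos(az)             # north offset
    x0, y0, z0 = obs_xyz
    return (x0 + dx, y0 + dy, z0 + dz)

# Example: a shot 100 m out at 30 degrees up, due east of the observer
print(offset_position((0.0, 0.0, 0.0), 100.0, 30.0, 90.0))
```

This is the "pesky math" the range-finder does internally; the advantage of the Bluetooth link is that the resulting offset lands directly in the GeoXT record instead of a field notebook.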

We will roll out this tech in the field in the coming few weeks as we move into the growing season, but in the mean time my initial results suggest that this will be a high-quality approach for mapping the position of tree crowns, a vital and challenging task.

The photo below doesn’t look like much, but it shows a sample of some of this 3D data.  This is an oblique shot looking through a 3D point cloud of the Knoll at UMBC.  The yellow area at the bottom is a digital terrain model of the land underneath the canopy; the blue points are the Ecosynth 3D point cloud of the site; and the red points are 3D points of tree tops and tree bases mapped using the GPS/laser combination.  This screen capture doesn’t do it justice, but trust me when I say that it looks good in 3D!

Hey Evan, are you sure you don’t want to come back to continue the forestry work?