We have moved! Please visit us at ANTHROECOLOGY.ORG. This website is for archival purposes only.


May 02 2012

First Group Field Day

On Saturday 4/29/2012 we had the first field day of the semester. The goal was to begin mapping the trees at HR and to refine our methods. However, we soon discovered that the 5x5 meter plots previously marked with PVC had much more error than we anticipated. To work around this, we mapped the trees in the corners of the 25x25 meter plots, since those corners contained the known survey points.

We managed to get 8 of the 5x5 meter plots surveyed and ready to document, and we have settled on a new method for plotting the 5x5 meter subplots. Our error came from one main source: when measuring the 5x5 subplots, we started by marking the perimeter, then laid a string across the plot and measured along it to mark our subplot points. While the points were 5 meters apart in one direction, they were not in the other. The reference string was not accurate enough and produced lines of points falling to the left or right of where they should.

To tackle this problem we purchased a straight-line laser that can shoot up to 1,000 ft. The idea is that it will give us a perfectly straight reference line: we will shoot the laser across the plot from one known perimeter point to the next and then mark the points within the plot that lie on this line. This should do away with the error that accumulates while measuring along an inaccurate reference line.

At the end of the day we learned a lot about our methods and what needs to be improved; this is all part of field work: design, test, and redesign. Hopefully we will have another group field day soon with corrected subplots, allowing much more mapping to be accomplished. I want to thank everyone from the Ecosynth team and the volunteers who made this day possible.

Apr 04 2012

Hexakopter Flying and Testing the GoPro

Stephen and I practiced flying the hexakopters.  We were able to fly Roflkopter (one of the hexakopters) from the lab to the library, over the library and the adjacent garage, and land on a 2 ft by 2 ft board.  In addition to the library expedition, we practiced maneuvering the hexakopters, landing on a target, and getting them flying at the correct altitude.  We also used the GoPro camera, mounted on the hexakopter, to capture video and pictures of the flights.  Unfortunately, the pictures had a lot of compression (as can be seen in the picture to the left, taken in the lab).  Next week we will test whether adjusting the settings yields better images.

Below is a link to a video from the GoPro as we flew through Academic Row.  The first half of the video shows the distortion; the second half is the cleaned-up version.

http://www.youtube.com/watch?v=YtPkQShCR8c&context=C451b577ADvjVQa1PpcFNA2j44Y1Kwcn_6Rdo149XVXfaZn7cl70E=


Mar 27 2012

Topography and the Mapping Grid

A new data sheet has been designed to address the specific needs of the forest we are working with. Because the method for mapping the trees has changed, the data sheets also needed to be altered. We are returning to the previously used method of laying out a 1x1 meter grid within our 5x5 meter grid; once this is complete, the location of each tree will be marked on the graph found on the data sheet. A "codes" column has also been added to the data sheet to flag trees that may need special attention: a leaning stem, a stem broken below breast height, or, as seen in the picture, multiple stems forming from one trunk below breast height. However, before the trees can be mapped, the grid must first be sectioned into 5x5 meter squares. Jonathan, fellow students, and I are hoping to get one of the 25x25 meter plots sectioned off so we can begin testing our tree mapping strategies. We are also tackling the problems we may face with drastic elevation changes. In summary, all of our supplies are ready and bagged; we just need to find a time to get dirty and see how our ideas work.



Mar 21 2012

Tree Mapping Technique

We have identified many possible methods for mapping the trees within our 25x25 meter grid. The one certainty we have decided on is that the grid must be sectioned into 5x5 meter cells before we can begin mapping. The picture on the left shows a method from the field guide Methods for Establishment and Inventory of Permanent Plots. This method uses geometry to determine the exact position of a tree, and we thought it could be more accurate and faster than other ideas. However, when we went to our forest to test it, we discovered that it was not only more tedious but may not improve accuracy by a reasonable amount, if at all. The problems arose when we needed to take measurements on unlevel ground: it required 3 or more people, much instruction, and handfuls of equipment, making it ineffective for our purposes. We plan another test run before the week ends to try a different method that will hopefully work for what we need.
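For reference, the distance-intersection geometry behind this kind of method can be sketched in a few lines of Python. This is a generic illustration, not the field guide's exact procedure: it assumes tape distances from a tree to two corner stakes a known baseline apart, and the function name is ours.

```python
import math

def locate_tree(d_a, d_b, baseline):
    """Locate a tree from tape distances d_a and d_b to two stakes
    set `baseline` meters apart along a plot edge.
    Returns (x, y): x along the baseline from stake A, y into the plot."""
    # Law of cosines: along-baseline offset of the tree from stake A
    x = (d_a ** 2 - d_b ** 2 + baseline ** 2) / (2 * baseline)
    y_squared = d_a ** 2 - x ** 2
    if y_squared < 0:
        raise ValueError("tape distances are inconsistent with the baseline")
    return x, math.sqrt(y_squared)
```

On sloped ground the tape distances must first be corrected to horizontal, which is exactly where the extra people and equipment come in.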

References:

Dallmeier, F. (Ed.) (1992). Long-term monitoring of biological diversity in tropical forest areas: Methods for establishment and inventory of permanent plots. MAB Digest 11. UNESCO, Paris.

Mar 21 2012

Herbert Run Update

On Monday (3/21/2012), Andrew and I went to Herbert Run to survey more points for the grid so that we can start mapping trees.  By the end of the day, we had surveyed enough points to have ten 25x25 meter plots marked and ready for tree mapping.  We begin mapping trees today (3/21/2012).

Dec 26 2011

Ecogeo versus spline codes

There was one last thing that I did for the error analysis. 

Going through the raw PLY data set from Herbert Run Spring 2010, in an arbitrary coordinate system, I picked out the locations of 5 buckets that were laid out in the shape of an X on campus: 100, 102, 108, 111, 114.

Using ScanView as before, I picked out each bucket location by individually choosing points within the area where a bucket should be that appeared to be part of a clump of orange. I took the average of the x, y, z coordinates of each set of points to obtain an approximate center for each bucket in the arbitrary coordinate system generated when the point cloud was made. I then paired these coordinates with the referenced GPS locations of each bucket.
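The averaging step amounts to taking the centroid of the hand-picked points; a minimal sketch (the helper name is illustrative, not the actual code):

```python
def cluster_center(points):
    """Return the centroid of a list of (x, y, z) points picked
    from one orange clump in the point cloud."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))
```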

This data was used by a different Python code, ecogeo4.py, which is another way of getting the 7 Helmert parameters needed to transform the arbitrary point cloud into the correct GPS coordinate system. The code takes one parameter text file in the following format:

arbitraryX arbitraryY arbitraryZ realX realY realZ,

one point per row, separated by spaces, not tabs.
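ecogeo4.py's own parser is not reproduced here, but a minimal reader for that row format might look like this (hypothetical helper):

```python
def read_point_pairs(lines):
    """Parse rows of 'arbX arbY arbZ realX realY realZ'
    (space-separated, one point pair per row).
    Returns two lists: arbitrary coordinates and real coordinates."""
    arbitrary, real = [], []
    for line in lines:
        fields = line.split()
        if len(fields) != 6:
            continue  # skip blank or malformed rows
        values = [float(v) for v in fields]
        arbitrary.append(values[:3])
        real.append(values[3:])
    return arbitrary, real
```

Usage would be something like `arb, real = read_point_pairs(open("bucket_pairs.txt"))`, with one row per bucket.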

Using the 5 buckets mentioned before, I ran the ecogeo code to obtain a new set of Helmert parameters. I then used the applyHelmert Python code to transform a list of the bucket locations in the raw point cloud, consisting of just 14 points.
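The applyHelmert code itself is not shown on this blog, and its internals may differ, but a standard 7-parameter (similarity) Helmert transform in its common small-angle linearized form looks like the following sketch; parameter names are ours:

```python
def helmert_transform(point, tx, ty, tz, scale, rx, ry, rz):
    """Apply a 7-parameter Helmert transform to point = (x, y, z).
    tx, ty, tz: translation; scale: scale factor;
    rx, ry, rz: small rotation angles in radians (linearized form)."""
    x, y, z = point
    xt = tx + scale * (x - rz * y + ry * z)
    yt = ty + scale * (rz * x + y - rx * z)
    zt = tz + scale * (-ry * x + rx * y + z)
    return xt, yt, zt
```

Note that in this form a negative scale factor flips all three axes at once.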

This yielded data similar to that from the spline.py process. The z direction is still inverted and is where most of the error comes from; the x and y directions are very good.

For the transformed x values versus the expected x values, the trend line is y = 1.0008x - 265.39, with an R² of 0.9998.

For the y values, y = 0.9998x + 3372.5, also with an R² of 0.9998.

The z coordinates are odd, with a trend line of y = -0.2557x + 68.562 and an R² of 0.1563, which is really bad not only because the data is inverted but because it appears largely uncorrelated.

This transformation resulted in root mean square errors of the distances between actual and predicted bucket locations of 2.354 in the XY plane, 9.045 in the Z direction, and 9.346 overall.

The results I obtained with the spline code had RMSEs of 4.198 for XY, 95.167 for Z, and 95.299 overall. The spline code clearly does a much worse job converting the data in the z direction than the ecogeo code does, but in the xy plane the errors aren't far off.
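The RMSE figures compared here follow the usual definition, split into XY and Z components (illustrative helper, not the analysis script itself):

```python
import math

def rmse_components(predicted, actual):
    """Root mean square error between matched 3D points,
    reported as (XY-plane RMSE, Z RMSE, overall 3D RMSE)."""
    n = len(predicted)
    sq_xy = sum((p[0] - a[0]) ** 2 + (p[1] - a[1]) ** 2
                for p, a in zip(predicted, actual))
    sq_z = sum((p[2] - a[2]) ** 2 for p, a in zip(predicted, actual))
    return (math.sqrt(sq_xy / n),
            math.sqrt(sq_z / n),
            math.sqrt((sq_xy + sq_z) / n))
```

The overall error satisfies overall² = XY² + Z², consistent with the figures above (2.354² + 9.045² ≈ 9.346²).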

Overall, the spline code works almost as well as the ecogeo code on this small data set in the x and y directions, but the z direction remains confused by the inversion.

Dec 17 2011

TLS scanning at UMBC

We have been having an exciting time in New Jersey and Baltimore working with a Terrestrial Laser Scanner (TLS; Riegl VZ-400) to generate high-quality 3D reference datasets for validation of Ecosynth data.  We are in the lab today because of windy conditions, working on post-processing and data management of the large amounts of data collected in New Jersey and in the photo studio at UMBC.  I thought it would be a good time for a short update post.

These pictures are from our test setup of mobile scaffolding that we will use for gaining an elevated perspective on several open grown trees for TLS scanning.  The plan is to set up the scaffolding at each of the 4 orthogonal scan stations with the TLS mounted on the platform as shown.

The tower platform is about 2m above the ground and the TLS scanning head is about 3m off the ground.  The tower can be moved by 3-4 people to each of the scanning positions, after the TLS equipment has been taken down!

We have also configured the TLS for WLAN control, meaning that we can operate scanning and review data wirelessly.  This should be useful for when we attempt TLS scanning from the bucket crane.

Nov 30 2011

Georeferencing Code Updates

Continuing from my last post, I ran the same analysis on the Herbert Run point cloud generated in spring 2011. It turns out that, at first, the GPS data set was not ordered properly, so the spline function didn't work correctly. This yielded the following results:

The x-y-z axes show how the orientation of the data is set up. Ideally, this picture would show an untilted image, as if one were looking down on the campus perpendicularly. This point cloud was given an incorrect set of Helmert parameters due to a poorly constructed spline of the GPS and camera data. Once the problem was fixed and I analyzed the data again, I got much better results.


This point cloud transformation was much better now that the GPS points were in the correct order. The x and y axes appear close to where they should be, and it seems we are looking down onto campus perpendicularly, but there is one glitch this picture does not show: all of the z coordinates appear to have been inverted. The high points in the point cloud are actually the low points, and the low points in the cloud are the real high points. This is indicated in the analysis of the orange field bucket positions in the point cloud versus their actual positions in space when the pictures were taken.

These scatter plots are from this second attempt at transforming the point cloud. The graph shows the X-values of the manually detected buckets in the point cloud versus the actual GPS coordinates of those buckets in the field. The trend line for the x coordinates is y = 0.996x + 1398.7 with R² = 0.9995. The graph of the y-values is not shown but is very similar to the first graph, with a trend line of y = 1.0073x - 31820 and R² = 0.9994. Both graphs show a strong correlation between the two data sets, and both slopes are very close to 1.
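The trend lines and R-squared values quoted in these posts come from ordinary least squares; for reference, a minimal version (hypothetical helper, equivalent to a spreadsheet trend line):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = m*x + b.
    Returns (m, b, r_squared)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    m = sxy / sxx
    b = mean_y - m * mean_x
    # R² = 1 - residual sum of squares / total sum of squares
    ss_res = sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r_squared = 1.0 - ss_res / ss_tot if ss_tot else 1.0
    return m, b, r_squared
```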

The second graph shows the estimated z coordinates of the buckets versus the GPS z coordinates. You can see a correlation between the two in the trend line, but the slope is negative: y = -1.0884x + 187.29 with R² = 0.9872. This negative slope appears tied to the fact that all of the point cloud data had inverted z coordinate values.

Overall, this data is much, much better than the original result. We are currently trying to find a solution to the inverted z-axis; the following is a first attempt to fix the problem.

When the Helmert parameters were compared to those from the original Herbert Run Fall 2010 data set, the fourth parameter, the scale, turned out to be negative for the spring. We wanted to see how the transformed point cloud would react if we forced the scaling constant to be greater than zero. This change results in the following point cloud orientation:

This did exactly what we wanted for the z-axis: all the real-world high points became point cloud high points, and lows became lows. The obvious problem is that it inverted the x and y axes. This "solution" really did not solve much, since it caused the very problem it was attempting to fix, just in different axes. The correlations between the 3 sets of variables changed only in that the slopes of the trend lines flipped sign; the R² values did not change when the scale parameter was altered. Besides this, despite the z axis having the correct orientation, the data seems a little weird: the z coordinates fall in a range of about (-3, 7). I took the differences between the real GPS heights of the buckets and the calculated heights, and there is a consistent offset between the two: the calculated data sits about 50.7 units below the expected GPS heights for each bucket.

I want to see what happens if I alter the applyHelmert code to multiply the z-axis terms by the absolute value of the scale parameter while leaving the x and y multiplications alone. If we can keep the x, y axes from the first attempt with ordered data, and get the z-axis orientation by multiplying only the z components by the absolute value of the scale parameter, the point cloud should be oriented the correct way, just translated down too low by a constant amount (which is something that has not been explained yet).
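Assuming applyHelmert implements the usual small-angle similarity transform, the proposed tweak, using the absolute value of the scale parameter for the z terms only, might look like this hypothetical variant:

```python
def helmert_transform_abs_z(point, tx, ty, tz, scale, rx, ry, rz):
    """Helmert transform variant that applies abs(scale) to the z row only,
    so a negative scale parameter cannot invert the vertical axis,
    while the x and y multiplications are left alone."""
    x, y, z = point
    xt = tx + scale * (x - rz * y + ry * z)
    yt = ty + scale * (rz * x + y - rx * z)
    # z row uses the absolute value of the scale parameter
    zt = tz + abs(scale) * (-ry * x + rx * y + z)
    return xt, yt, zt
```

Whether this preserves the correct x, y behavior in practice depends on how rotation and scale interact in the actual applyHelmert code.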

Jul 15 2011

First Altitude Controlled Hexakopter Flight!!!


This past week I've been working on flashing the new firmware to enable altitude-controlled waypoints. As it turns out, the newest hardware was not needed to run the latest firmware (FC 2.1ME, BL 2.0 required). After working out some compatibility issues with the old version of MKtools, I was finally able to connect to the Hexakopter. Today we were able to do a flight test; check out the video for yourself (best in full-screen HD).

Next week I plan to flash the new firmware onto the 2 remaining Hexakopters.

Why are you reading this? Watch the video!

Apr 05 2011

Digitizing Field Collections

Within the past few weeks we completed digitizing the field collections. This was accomplished by taking Jonathan and Evan's collected field data and creating a georeferenced shapefile over the area of Herbert Run. The work was split between Mariah and me; in the picture above, her points are green and mine are yellow. The data came as hand-drawn grids with estimated point positions within each subplot, numbered and annotated with the species and estimated DBH of each tree.

The method I used to create this representative tree population distribution is fairly straightforward. Each subplot drawing was overlaid with a transparency on which I attempted to partition the cell equally into 25 5x5 meter subcells. A 5x5 meter subgrid polygon file was supplied for the Herbert Run area, and the drawn points were transferred.

One of the main issues with the supplied data, and a possible source of minor error, was trees that split at the trunk. These were denoted on the drawn grids as two dots and were frequently interpreted as two separate trees in very close proximity rather than one tree with two DBHs. This data can be fixed, but its overall effect on canopy data may be negligible.

It's useful to note again that these points are NOT the true locations of the trees; the set gives a representation of where the species occur. The next major step for this dataset will be collecting the heights of the given trees with the laser hypsometer, collecting the data in 5x5 meter cells, or both... we'll see which technique proves best in the coming weeks!