We have moved! Please visit us at ANTHROECOLOGY.ORG. This website is for archival purposes only.


Apr 26 2012

What is Trimble up to?

I am really curious about what Trimble is up to and where the company is headed. Trimble is one of the leading manufacturers of mapping and survey grade GPS equipment and software.  Earlier in the month it was announced that Trimble acquired the company Gatewing, developers of a streamlined UAV / computer vision 3D mapping system, press release here.  Today I found that Trimble is also buying Google SketchUp, SketchUp blog post here and Trimble press release here.

A 3D mapping company and a community-based 3D modeling program/warehouse in one month -- clearly massive 3D surveying and mapping are at the top of the list for Trimble.

I think this is very exciting, but what comes next?  More importantly perhaps, where do trees, vegetation and the non-built parts of local ecosystems fit into this?

Apr 25 2012

Finishing up Herbert Run

Saturday April 21st, Shelby, Dana and I went out to Herbert Run and got a lot done in the field. We set a decent number of control points and made progress on the stake-out, obtaining roughly 8 more points. The points we got were the ones on the far eastern edge of Herbert Run, 168-175. That leaves about 8 points that still need to be staked out, and I am convinced that with Dana's help today and a little help Friday I can complete the survey at Herbert Run either Saturday or Sunday. Then next week I can do the complete survey for the Knoll.

This data collector is helping a lot. At Landesign Inc. the data collector I used was nowhere near as nice as this one. It can connect to the internet, which could make stake-out really convenient. As displayed in the picture on the left, the Trimble actually pulls up a complete map of the survey I will do. The points you see on the screen are the ones I have loaded in. It also shows my control points, visible toward the top left of the screen as point numbers 4004 and 4005; these are my traverse 1 and traverse 2. This is how I will establish my traverse loop, which, because of its small size, should have quite a small amount of error.

I am considering using the existing control at Herbert Run to create control at the Knoll. Since I have specific coordinates for the traverse at Herbert Run, I can use those to run control down to the Knoll. This means I will have two traverse loops. Obviously the one that connects Herbert Run to the Knoll is far larger than the one that is specifically for Herbert Run, which entails more error, but we can distribute that error evenly throughout the loop and it will be fine.
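Distributing the loop's misclosure evenly can be sketched as below. This is a hypothetical index-proportional adjustment for illustration; the standard surveying practice is the distance-weighted compass (Bowditch) rule, and the function and variable names are mine.

```python
def distribute_closure_error(points, misclosure):
    """Distribute a traverse misclosure (dx, dy) across the loop.

    points: ordered list of (x, y) traverse coordinates
    misclosure: (dx, dy) error found when the loop fails to close
    Each point i of n receives a correction proportional to i / n
    (a simplification; the compass rule weights by cumulative distance).
    """
    n = len(points)
    dx, dy = misclosure
    adjusted = []
    for i, (x, y) in enumerate(points, start=1):
        adjusted.append((x - dx * i / n, y - dy * i / n))
    return adjusted

# Example: a 4-point loop that misses closure by (0.04, -0.02) units;
# the final point absorbs the full correction.
loop = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
adj = distribute_closure_error(loop, (0.04, -0.02))
```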

Finally, I am going to need to create the grid points in GIS this week for the Knoll so that I can get started on it Monday.

Dana also had a great idea while we were out in the field: she proposed that we use the GPS to guide us to the points we are trying to survey. Sometimes these points are in the worst locations, such as the points around 170 in Herbert Run, and it can be a major hassle to read the map and decide what direction and how far to pace to the next point. With the GPS we could get a much better rough estimate of where the next point to be staked out is, for those points that are impossible to simply pace to when reading the map and walking through the forest. This is the situation when large fallen trees, big changes in elevation, or streams are in the way.

Apr 11 2012

Herbert Run West Surveying & Keystone Rental -- Updates

Saturday April 7th, Shelby and I went out to Herbert Run to stake out some more points on the western portion. We were able to get seven points, all in the area pictured on the left. A lot of this area was really difficult to get because of the amount of brush in the line of sight, but we were able to get them all from one of the new points that Will had located with RTK GPS.

I went to Keystone Precision this morning to ask about renting data collectors, it turns out that they do rent out data collectors and they also give out software packages for them. The rates seem fairly inexpensive at $42.50/day, but they do not rent out prisms. I explained our situation and the sales representative generously said they would let us borrow a prism if we rented the data collector.

The data collectors are called the Ranger, which runs a software package called Survey Controller, and the TSC2, which runs a software package called Survey Pro.

They also had an option to rent an entire total station plus rod set up for $120/day.

The representative at Keystone said that if I talk to Brian Wagaman, I should be able to get a lot of my questions answered, such as those about the stake-out options of the data collector.  I vaguely remember that the survey company I used to work for, Landesign Inc, were really big fans of Brian Wagaman's help with total station questions and data collector troubleshooting issues; he is really approachable.

I believe this is the best way to proceed and hopefully we can rent the data collector and software for this weekend so that I can finish Herbert Run and the Knoll. I can call Brian tomorrow about the data collector rental and hash out the issues with him.

Dec 26 2011

Ecogeo versus spline codes

There was one last thing that I did for the error analysis. 

Going through the raw ply data set from Herbert Run Spring 2010 in an arbitrary coordinate system, I picked out the locations of 5 buckets that were in the shape of an X on campus:
100, 102, 108, 111, 114.

Using ScanView like before, I was able to pick out each location for these buckets by individually choosing points within the area where a bucket should be that appeared to be part of a clump of orange. I took the average of the x,y,z coordinates for each set of points to obtain an approximate center of where the bucket should be located in the arbitrary coordinate system generated when the point cloud was made. I then paired these coordinates with the referenced locations of where each specific bucket is located in GPS coordinates.
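The center estimate for each bucket is just the mean of the manually picked points; a minimal sketch (the picked coordinates below are made up, not the actual survey data):

```python
def bucket_center(points):
    """Estimate a bucket's center as the mean of manually picked (x, y, z) points."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    return (sx / n, sy / n, sz / n)

# Hypothetical points picked from an orange clump in ScanView
picks = [(1.0, 2.0, 0.5), (1.2, 1.8, 0.7), (0.8, 2.2, 0.6)]
center = bucket_center(picks)
```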

This data was used by a different python code, ecogeo4.py, which is another way of getting the 7 Helmert parameters needed to transform the arbitrary point cloud into the correct GPS coordinate system. This code takes one parameter: a text file in the following format:

arbitraryX arbitraryY arbitraryZ realX realY realZ,

one point per row, separated by spaces, not tabs.
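Parsing that format is straightforward; a sketch follows, where the function name and structure are mine, not the actual ecogeo4.py code:

```python
def read_point_pairs(lines):
    """Parse rows of 'arbX arbY arbZ realX realY realZ' (space-separated)
    into two lists of 3-tuples: arbitrary coordinates and real GPS coordinates."""
    arb, real = [], []
    for line in lines:
        parts = line.split()
        if len(parts) != 6:
            continue  # skip blank or malformed rows
        vals = [float(v) for v in parts]
        arb.append(tuple(vals[:3]))
        real.append(tuple(vals[3:]))
    return arb, real

# Hypothetical two-row file contents
sample = ["1.0 2.0 3.0 100.0 200.0 300.0", "4 5 6 400 500 600"]
arb, real = read_point_pairs(sample)
```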

Using the 5 buckets mentioned before, I ran the ecogeo code to obtain a new set of Helmert parameters. I then used the applyHelmert python code to transform a list of the bucket locations in the raw point cloud, consisting of just 14 points.
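For reference, applying a 7-parameter Helmert transform (3 translations, 1 scale, 3 rotations) can be sketched as below. The parameter names, ordering, and rotation convention here are assumptions for illustration; the project's actual applyHelmert code may define them differently.

```python
import math

def apply_helmert(params, pts):
    """Apply a 7-parameter Helmert transform: translation (tx, ty, tz),
    scale s, and rotations (rx, ry, rz) in radians about each axis.
    A sketch only -- parameter order and rotation convention are assumed."""
    tx, ty, tz, s, rx, ry, rz = params
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    # Full rotation matrix R = Rz * Ry * Rx
    R = [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy, cy * sx, cy * cx],
    ]
    out = []
    for x, y, z in pts:
        xr = R[0][0] * x + R[0][1] * y + R[0][2] * z
        yr = R[1][0] * x + R[1][1] * y + R[1][2] * z
        zr = R[2][0] * x + R[2][1] * y + R[2][2] * z
        out.append((tx + s * xr, ty + s * yr, tz + s * zr))
    return out

# Identity rotation, scale 2, translate x by 10: (1,1,1) -> (12, 2, 2)
result = apply_helmert((10.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0), [(1.0, 1.0, 1.0)])
```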

This yielded data similar to that from the spline.py code. The z direction is still inverted, and it is the coordinate most of the error comes from; the x and y directions are very good.
For the transformed x values versus the expected x values, the trend line is y = 1.0008x - 265.39, with an R² of 0.9998.

For the y values, y = 0.9998x + 3372.5, also with an R² of 0.9998.

The z coordinates are odd, with a trend line of y = -0.2557x + 68.562 and an R² of 0.1563. This is really bad: not only is the data inverted, it also appears largely uncorrelated.

This data resulted in root mean square errors of distances between actual bucket locations and predicted bucket locations of 2.354 in the XY plane, 9.045 in the Z direction and an overall error of 9.346.
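Note that the three RMSE figures relate as overall² = XY² + Z² (2.354² + 9.045² ≈ 9.346²). The computation can be sketched as below; the bucket coordinates are made-up illustrations, not the survey data.

```python
import math

def rmse_components(actual, predicted):
    """RMSE of position error split into the XY plane, the Z direction,
    and overall, for paired lists of (x, y, z) tuples."""
    n = len(actual)
    sq_xy = sum((a[0] - p[0]) ** 2 + (a[1] - p[1]) ** 2
                for a, p in zip(actual, predicted))
    sq_z = sum((a[2] - p[2]) ** 2 for a, p in zip(actual, predicted))
    rmse_xy = math.sqrt(sq_xy / n)
    rmse_z = math.sqrt(sq_z / n)
    rmse_all = math.sqrt((sq_xy + sq_z) / n)  # so all^2 == xy^2 + z^2
    return rmse_xy, rmse_z, rmse_all

# Hypothetical actual vs. predicted bucket locations
actual = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
pred = [(3.0, 4.0, 1.0), (10.0, 0.0, 2.0)]
xy, z, total = rmse_components(actual, pred)
```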

The result I received with the spline code had RMSE values of 4.198 for XY, 95.167 for Z and 95.299 overall. Obviously the spline code does a much worse job converting the data in the z direction than the ecogeo code does, but in the xy plane the errors aren't too far off.

Overall, the spline code seems to work almost as well as the ecogeo code did with this small data set in the x and y directions, but there is still the confusion with the z direction due to inversion.

Nov 30 2011

Georeferencing Code Updates

Continuing from my last post, I did the same analysis on the Herbert Run point cloud generated from spring 2011. It turns out that at first the set of GPS data was not ordered properly, so the spline function didn't work correctly. This yielded the following results:

The x-y-z axes show how the orientation of the data is set up. Ideally, this picture would show an untilted image, as if one were looking down on the campus perpendicularly. This point cloud was given an incorrect set of Helmert parameters, due to a poorly constructed spline of the GPS and camera data. Once this problem was fixed and I analyzed the data again, I got much better results.

 

This point cloud transformation was much better, now that the GPS points were in the correct order. The x and y axes appear close to where they should be, and it seems that we are looking perpendicularly down onto campus, but there is one glitch that this picture does not show: all of the z coordinates appear to have been inverted. The high points in the point cloud are actually the low points, and the low points in the cloud are the real high points. This is indicated in the analysis of the orange field buckets' positions in the point cloud versus their actual positions in space when the pictures were taken.

These scatter plots are for this second attempt at transforming the point cloud. The graph shows the X-values of the manually detected buckets in the point cloud versus the actual GPS coordinates of those buckets in the field. The equation of the trend line for the x coordinates is y = 0.996x + 1398.7 with R-squared = 0.9995. The graph of the y-values is not shown, but is very similar to the first graph, with a trend line of y = 1.0073x - 31820 and R-squared = 0.9994. The graphs of x and y show a strong correlation between the two data sets for each, and both slopes are very close to 1.

The second graph shown is for the values of the estimated z coordinates of the buckets versus the GPS z coordinates. You can see a correlation between the two in the trend line, but the slope is negative: y = -1.0884x + 187.29 with R-squared = 0.9872. This negative slope seems to be tied to the fact that all of the point cloud data had inverted z coordinate values.
Overall, this data is much, much better than the original result. We are currently trying to find a solution to the inverted z-axis, but the following is the first attempt to fix this problem.

When the helmert parameters were compared to the original data set from Herbert Run in Fall 2010, the fourth parameter which was for scaling turned out to be negative for the spring. We wanted to see how the transformed point cloud would react if we forced the scaling constant to be greater than zero. This change results in the following point cloud orientation:

This did exactly what we wanted for the z-axis: all the real-world high points became point cloud high points and lows became lows. The obvious problem is that it inverted the x and y axes. This "solution" really did not solve much, since it caused in different axes the very problem it was attempting to fix. The correlation between the 3 sets of variables only changed in that the slopes of the trend lines flipped sign; the R-squared values did not change when the scale parameter was altered. Besides this, despite the z axis having the correct orientation, the data seems a little weird. The z coordinates fall in a range of about (-3, 7). I took the differences between the real GPS heights of the buckets and the calculated heights, and there is a consistent offset between the two: the calculated data sits about 50.7 units below the expected GPS heights, for each bucket.
I want to see what happens if I alter the applyHelmert code to multiply anything involving the z-axis by the absolute value of the scale parameter while leaving the x and y multiplications alone. If we can keep the x,y axes from the first attempt with ordered data, and use that same z-axis but multiplied only by the absolute value of the scale parameter, the point cloud should be oriented the correct way, just translated down by a constant amount (which is something that has not been explained yet).
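That proposed change amounts to something like the following sketch. Rotation is omitted for clarity, and the parameter layout is a hypothetical simplification of the real applyHelmert code, which also rotates each point.

```python
def apply_helmert_abs_z(params, pts):
    """Variant that multiplies only the z-components by the absolute value
    of the scale parameter, leaving x and y multiplied by the signed scale.
    A sketch of the proposed fix (rotation omitted; real code rotates too)."""
    tx, ty, tz, s = params  # translation + signed scale; rotation omitted
    out = []
    for x, y, z in pts:
        out.append((tx + s * x, ty + s * y, tz + abs(s) * z))
    return out

# With a negative scale, x and y flip while z keeps its orientation
result = apply_helmert_abs_z((0.0, 0.0, 0.0, -2.0), [(1.0, 1.0, 1.0)])
```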

Nov 22 2011

Analyzing the Point Cloud Transformations

This graph represents the data for the Herbert Run site from October 11, 2010. I used ScanView to locate the exact coordinates of the orange buckets in the transformed point cloud that was created with the previously written Helmert code. The values on the X-axis represent the actual GPS values from the georeferencing in the x direction, where higher values are more western, I believe. The values on the Y-axis correspond to the calculated mean of the orange points I extracted with ScanView. The black line is the line of best fit and has a slope of 0.9941, which is quite close to 1; a slope of 1 would indicate an exact correspondence between the two data sets. This is good in two ways: the slope is positive, so there is a positive correlation between the two data sets, and the slope is very close to 1, which means the correlation is strong. The graph for the Y values is very similar, with a positive slope of 1.0079. What makes this really encouraging is the contrast with the results I got before this analysis, with the point cloud of a different data set.
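The trend-line slopes and R-squared values quoted here come from an ordinary least-squares fit, which can be sketched as below (the sample data is illustrative, not the bucket measurements):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b, plus the R^2 of the fit."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    m = sxy / sxx                      # slope of the best-fit line
    b = my - m * mx                    # intercept
    ss_res = sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot         # coefficient of determination
    return m, b, r2

# Hypothetical near-1:1 data, like the x-coordinate comparison
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.0, 2.05, 2.95]
m, b, r2 = fit_line(xs, ys)  # slope near 1 indicates strong agreement
```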

This is for the Knoll site from fall 2010. There is a negative correlation, and the slope is nowhere close to 1, so this means the transformation of this particular point cloud did not turn out well at all. It's possible that I made a mistake running the spline.py code to get the 7 Helmert parameters. The 4th parameter, which is for scaling, was negative, which doesn't seem right, but it also looked like the data wasn't rotated enough. I still have another data set to test, and once that is done I'm going to retry this one to see if it was just a mistake on my part.

A small note about the color-based bucket search: some of the buckets were on top of blue boxes, which seemed to alter the color of the orange points; they looked pretty pink, which was not a color I was searching for. This could be a reason why some of the buckets did not register in my search. Jonathan also pointed out that some of the trees were starting to change color at this point, which could be a small source of extraneous points.

Aug 03 2011

Pentax WG-1 GPS camera–too slow for scanning

I loved the Pentax WG-1 GPS camera when it first arrived.  It looked cool, had a non-extending lens, and offered the potential for GPS tagging our photos during flight – a feature that could be very time-saving for reconstructions.

But out of the box I quickly noted some major drawbacks.  The first was that the GPS only updates every 15 seconds; at the Hexakopter's average speed of 5 m/s, that means GPS logs would be something like 75 m apart!  The unit also has a slower continuous shooting mode than the SD4000, about 1 fps.  The biggest drawback by far, though, was the lag, which I can only assume is a memory write lag.
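The spacing arithmetic here is simply speed times interval:

```python
def log_spacing(speed_mps, interval_s):
    """Distance between successive GPS fixes (or photos) at constant speed."""
    return speed_mps * interval_s

# At the Hexakopter's ~5 m/s: a 15 s GPS refresh puts fixes 75 m apart,
# while ~1 fps continuous shooting puts photos about 5 m apart.
gps_gap = log_spacing(5, 15)
photo_gap = log_spacing(5, 1)
```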

I set up the camera with the maximum image quality settings, in continuous shooting mode, and with the 15 second GPS refresh.  I was using a brand new SanDisk Extreme 16GB memory card, which provides professional grade write speeds.  I strapped down the shutter button by lightly taping a plastic nut over the button and wrapping the unit with a velcro strap, just as we do with the SD4000s.  The Pentax WG-1 would take a continuous stream of about 30 photos and then stop.  It would show the 'number of images remaining' counter ticking down and otherwise just hang.  After sometimes 10-15 seconds it would resume taking photos continuously, then repeat the same behavior after another 30 photos.  The camera was not taking photos for 10-15 seconds at a time while in continuous shooting mode.  At a flying speed of 5 m/s, that means no pictures would be taken for 50-75 meters in the air!

I repeated this test with increasingly lower camera settings until I got down to the lowest possible settings of maximum compression and 640x480 resolution.  This time the camera took many more photos (~100 or so) but still had a long lag with no photos.

It was this that finally made us decide to send the Pentax WG-1 back.

Based on my research this GPS camera has the fastest GPS refresh time of any point-and-shoot style camera, but the continuous shooting lag was a deal breaker.

Jul 27 2011

ArduPilot/ArduCopter Update

As many of you know, our attempt at photographing Elbow Ridge Farm via two EasyStars this past weekend was anything but successful. Although the primary issue of Auto mode being inactive was eventually resolved at the field by trimming the endpoints on the mode toggle switch, we were still unable to fly our missions because the system failed to obtain a GPS lock. Even after that weekend, when the GPS was relocated to a more familiar region, it was unable to obtain a signal. Out of frustration I decided to clear the GPS settings and delete the current firmware. Starting from scratch, I reloaded both the firmware for the GPS and a script that enables it to communicate with the ArduPilot. In doing this I also updated the GPS to a more recent firmware version. Just as it had in the past, the ArduPilot was able to get a GPS lock within 5 minutes of being powered up (as indicated by the solid blue LED on the IMU shown in the above picture). I'm planning to fly a short mission with the EasyStar tomorrow afternoon to make sure everything is working as it should. Hopefully this will better prepare us for our next trip to Elbow Ridge Farm.

This week Jonathan gave me the camera mount and landing gear from one of the old Gaui quads so I could attach it to the new ArduCopter system. I just figured I'd post a picture of the new setup. The landing gear had to be extended by 2 in to provide enough clearance for the camera, so I drew up a CAD model and laser cut a set of extensions out of 1/8'' thick plastic sheet (the triangular support structure on the bottom). I also added a servo to the camera mount and attached it to one of the output ports on the ArduPilot Mega to provide automatic camera stabilization. So far it seems to work great, but we'll need to upgrade the stabilizing servo if we plan to fly missions with camera stabilization activated.

Jul 15 2011

First Altitude Controlled Hexakopter Flight!!!

 

This past week I've been working on flashing the new firmware to fly altitude controlled waypoints. As it turns out, there was no need for the newest hardware to use the latest firmware (FC 2.1ME, BL 2.0 required). After working out some compatibility issues with the old version of MKtools, I was finally able to connect to the Hexakopter. Today we were able to do a flight test; check out the video for yourself (best in full-screen HD).

Next week I plan to flash the new firmware on to the other 2 remaining Hexakopters.

 

Why are you reading this? Watch the video!

Jul 14 2011

Sub-centimeter positioning on mobile phones?

Just came across this today at Slashdot: "Sub-centimeter positioning coming to mobile phones": http://bit.ly/pIvQ0e.

Apparently this is based on a technique called “SLAM”.  From wikipedia: “Simultaneous localization and mapping (SLAM) is a technique used by robots and autonomous vehicles to build up a map within an unknown environment (without a priori knowledge), or to update a map within a known environment (with a priori knowledge from a given map), while at the same time keeping track of their current location.”

I could imagine this becoming VERY interesting for high spatial resolution 3D scanning in Ecosynth, but maybe I am missing some potential limitation?

Your thoughts?