GeoCue Group Inc. has been involved in testing survey-grade instruments on small Unmanned Aerial Systems (sUAS or, more popularly, “drones”) with the California Department of Transportation (Caltrans) for the past several years. We have had the privilege of working with Dr. Riadh Munjy, the lead investigator on this project, on some deep questions concerning the accuracy of these systems. Dr. Munjy is the chair of the Civil & Geomatics Engineering Department at California State University, Fresno and has been a primary contributor to the state of the art in photogrammetry and laser scanning for many years.
As part of this research project, Dr. Munjy’s group at Cal State has laid out a fantastic aerial sensor test site at the San Joaquin Experimental Range (SJER). On my first visit to this range some time ago, I was looking for military weapons! After a bit of research, I realized the “Range” in SJER is “cattle” range! SJER is primarily an agricultural test site jointly managed by Cal State Fresno and a few Federal and State agencies. The sUAS test site comprises about 75 acres and surrounds the SJER facilities cluster, providing some nice planar surfaces (sloped roofs) for examining LIDAR noise and strip-to-strip conformance. The terrain is high desert with some isolated trees and scrub brush, and the relief over the sUAS test area is about 35 meters. The sUAS test range is shown in Figure 1 with the layout of 3D control/check points shown as magenta triangles. I have also overlaid a set of rough 5-foot contours generated from a first-pass ground classification (automatic ground classification and contour generation performed in True View EVO) to give you an idea of the terrain relief – gently rolling.
Steve Riddell (Algorithms lead for True View products, and other duties as assigned!) and I traveled to the range with a True View 410 and a True View 620 in hand during the last week of October. These systems are True View 3D Imaging Sensors (3DIS®) that comprise a LIDAR laser scanner and dual photogrammetric cameras. We brought along a DJI M600 Pro as our flight platform (I like the M600 Pro because, among other reasons, we can carry its batteries onto a plane as cabin luggage).
We flew the True View 410 on October 26th and the True View 620 the next day (our True View 620 spent 24 nail-biting hours making trips around the West as lost luggage!). We flew parallel east-west flight lines with 50% overlap (flight lines clipped to 40° off nadir) and no cross strips, since we were flying a pattern specified by the research group.
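As a rough illustration of what a 50% overlap plan with a 40° off-nadir clip implies geometrically, here is a back-of-the-envelope sketch. The 60 m flying height is a hypothetical value for illustration only; the article does not state the altitude we actually flew.

```python
import math

agl_m = 60.0           # hypothetical flying height above ground (not our actual AGL)
half_angle_deg = 40.0  # usable half-angle after clipping returns at 40° off nadir
overlap = 0.50         # 50% sidelap between adjacent flight lines

usable_swath_m = 2 * agl_m * math.tan(math.radians(half_angle_deg))
line_spacing_m = usable_swath_m * (1 - overlap)

print(f"Usable swath width:  {usable_swath_m:.1f} m")
print(f"Flight line spacing: {line_spacing_m:.1f} m")
```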
We processed the data to a colorized point cloud in our True View EVO post-processing software. EVO drives Applanix POSPac (desktop or cloud) to perform the trajectory solution, so you can stay in our wizard-driven software for the entire post-processing workflow. We processed the data using the Position and Orientation System (POS) solution only – we did not inject any ground control points. We measured both vertical and horizontal accuracies using the tools built into True View EVO for ASPRS-compliant accuracy assessment.
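If you are curious about the arithmetic behind an ASPRS-style vertical check, the sketch below shows the basic statistics computed from check point residuals. This is not True View EVO's internal code, and the residuals array is a hypothetical stand-in for (surveyed check point elevation minus LIDAR surface elevation) values in centimeters.

```python
import numpy as np

# Hypothetical vertical residuals (check point elevation minus LIDAR surface), in cm
residuals = np.array([-3.1, -4.0, -2.8, -3.9, -3.5])

mean_error = residuals.mean()              # bias of the point cloud
std_dev = residuals.std(ddof=1)            # spread of the residuals about the mean
sdom = std_dev / np.sqrt(residuals.size)   # standard deviation of the mean
rmse_z = np.sqrt(np.mean(residuals**2))    # RMSEz, reported as vertical network accuracy

print(f"Mean error: {mean_error:.2f} cm")
print(f"Std dev:    {std_dev:.2f} cm")
print(f"SDOM:       {sdom:.2f} cm")
print(f"RMSEz:      {rmse_z:.2f} cm")
```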
A summary of vertical accuracy is contained in Table 1 below. I will report on horizontal accuracy in a future True View Bulletin. It is important to note the conditions of the accuracy assessment below: the results come from the POS solution alone, with no ground control points and no debiasing applied.
Table 1: Summary of vertical accuracy (error statistics in cm)

Sensor | Check Points | Mean Error of Residuals | Std Dev of the Mean | Std Dev of Residuals | Residual Range | RMSE Network Accuracy
True View 410 | 77 | -3.47 | 0.27 | 2.29 | 10.8 | 4.18
True View 620 | 76 | 0.43 | 0.21 | 1.86 | 8.66 | 1.95
I find these results quite remarkable. We advertise the True View 410 as a 5 cm Network Accuracy system and here we see 4.18 cm RMSE with no debiasing! Similarly, we observe 1.95 cm Network Accuracy with the True View 620, again with no debiasing. As we know, if the standard deviation of the mean (SDOM) is small relative to the mean error, we can justify using check points to debias the data. Since RMSE² = Mean² + StdDev², we can easily determine the accuracy of the above tests should we debias.
Debiased Network Accuracy (in cm): roughly 2.3 for the True View 410 and 1.9 for the True View 620 – once the mean error is removed, the RMSE essentially reduces to the standard deviation of the residuals.
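As a quick sanity check on that identity, the sketch below back-computes the debiased values from the RMSE and mean error published in Table 1 (small differences from the tabulated standard deviations are just rounding):

```python
import math

# RMSE and mean error from Table 1, in cm
systems = {
    "True View 410": {"rmse": 4.18, "mean_error": -3.47},
    "True View 620": {"rmse": 1.95, "mean_error": 0.43},
}

for name, stats in systems.items():
    # RMSE^2 = Mean^2 + StdDev^2, so removing the bias leaves (approximately) the std dev
    debiased = math.sqrt(stats["rmse"] ** 2 - stats["mean_error"] ** 2)
    print(f"{name}: debiased network accuracy ≈ {debiased:.2f} cm")
```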
The other remarkable feature to note is the total residual error range for the two sensors. For example, the True View 410 exhibits a total error range of 10.8 cm. If we assume this range is distributed about the mean, the maximum excursion at any check point is less than 5.4 cm. I find this quite remarkable for data collected from an sUAS!
Now you may be thinking, “the True View 410 data are nearly as accurate as the 620 for a whole lot less money. Why would I get a True View 615 or 620?” The answer involves other important characteristics of the data – primarily hard-surface deviation (“noise”), low-energy return detection (e.g., detecting power lines), and the ability to penetrate vegetation. These factors are all much better with the True View 615/620.
This study once again adds support to the idea of rapid workflows for data production. We literally processed these data from sensor ingest to a contour map (which requires ground classification) in two hours’ time. Talk about high return on investment!
I will follow this article up in the near future with a discussion of horizontal accuracy in True View 3D Imaging Systems. In the meantime, stay safe!