Drone LIDAR mapping is (at least from GeoCue's perspective) the process of collecting topographic data with a drone-mounted sensor at or near "survey" quality and processing those data into a useful derived product.
What do we mean by "at or near survey quality"? The most prominent aspect of drone mapping, when considering survey requirements, is network accuracy. By network accuracy, we mean how "close" the data are to a known reference in a specific spatial reference system (SRS). For example, if we had a National Geodetic Survey (NGS) benchmark/monument, how close is our mapped position of this reference to its known location? (If you are not in the USA, simply substitute your equivalent national mapping agency.) Of course, in a typical drone mapping location, we do not have the luxury of a handy, known, high accuracy reference point. At GeoCue, we typically use the NGS Online Positioning User Service (OPUS) to establish a proxy for an NGS reference. This involves placing a survey-grade Global Navigation Satellite System (GNSS) base station and collecting a static observation at least two hours long. The observation file is then processed in the OPUS system to provide the horizontal and vertical location of the base station. Since the NGS reference frames from CORS96 onward are GNSS-defined, this is a valid procedure. One of the very nice things about OPUS (beyond giving us the reference location) is its detailed accuracy analysis. The report provides the root mean square error (RMSE), and sometimes the peak-to-peak error, for the observation session. Some of you may argue that GNSS is not sufficiently accurate in the vertical and that digital leveling should therefore be used. This is a great topic and will be the subject of a future article. For now, trust me that OPUS is good enough for this application.
Our process for validating accuracy follows traditional photogrammetric procedures as documented in the American Society for Photogrammetry and Remote Sensing (ASPRS) Positional Accuracy Standards. These standards describe a process of comparing the observations under test against known ground truth (ground control/check points, GCPs). At our GeoCue test range, we have a network of signalized GCPs, where "signalized" means they have been marked such that they can be measured in the collected data. We usually employ ceramic tiles on which we either tape an "X" or glue a black/white pattern. A typical target is shown in Figure 1. Once placed, these targets are located in the spatial reference system (SRS) using a Real Time Kinematic (RTK) rover referenced to the previously discussed base station. The rover software provides us with an RMSE estimate of accuracy relative to the base station. At the very short baselines (distance from base to rover) that we customarily see in drone mapping, this error is well below 1 cm.
Figure 1 – GeoCue True View Drone Mapping Target
Once we have flown a drone mapping mission with a sensor such as our True View 410 Imaging/LIDAR sensor, the data are validated in our True View Evo software, included with every True View sensor. This software includes a tool set that provides all of the functions needed to measure vertical and horizontal accuracy relative to the aforementioned GCPs. The True View Evo horizontal and vertical accuracy tools are fully compliant with the previously mentioned ASPRS Positional Accuracy Standards.
Vertical accuracy is assessed by measuring the vertical distance from a GCP (which is at a known location) to the point cloud surface directly above or below the GCP. True View Evo creates a local surface from the point cloud either by constructing a triangle mesh or by using an inverse distance weighted (IDW) interpolation algorithm. The resultant measurement is called a residual. Obviously, we would like all of the residuals to be zero, but this is not likely! The True View Evo software computes the root mean square, mean, and standard deviation of these residuals. Since the residuals represent the error between the known GCPs and the point cloud, the root mean square of the residuals is the Root Mean Square Error (RMSE).
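To make the arithmetic concrete, here is a minimal sketch in Python/NumPy of the two ideas above: estimating the surface elevation at a GCP via IDW interpolation of nearby points, and computing RMSE, mean, and standard deviation of a set of vertical residuals. The function names and sample values are hypothetical illustrations, not the True View Evo implementation.

```python
import numpy as np

def idw_elevation(points, gcp_xy, power=2.0, k=8):
    """Estimate the surface elevation at a GCP's horizontal position by
    inverse-distance-weighted (IDW) interpolation of the k nearest
    LIDAR points. `points` is an (N, 3) array of x, y, z."""
    d = np.hypot(points[:, 0] - gcp_xy[0], points[:, 1] - gcp_xy[1])
    nearest = np.argsort(d)[:k]
    d, z = d[nearest], points[nearest, 2]
    if d[0] < 1e-9:               # a LIDAR point sits exactly on the GCP
        return z[0]
    w = 1.0 / d ** power          # closer points get more weight
    return np.sum(w * z) / np.sum(w)

def vertical_stats(residuals):
    """RMSE, mean, and sample standard deviation of vertical residuals
    (interpolated surface elevation minus known GCP elevation)."""
    r = np.asarray(residuals, dtype=float)
    rmse = np.sqrt(np.mean(r ** 2))
    return rmse, r.mean(), r.std(ddof=1)

# Hypothetical residuals in metres at five check points.
rmse, mean, sd = vertical_stats([0.03, -0.02, 0.04, -0.01, 0.02])
```

Note that the mean of the residuals exposes any systematic vertical bias, while the standard deviation captures the scatter about that bias; the RMSE folds both together.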
Horizontal error is a bit trickier since you cannot really see most photogrammetric targets in LIDAR data. Fortunately, the True View 410 has dual photogrammetric cameras that are physically tied to the Position and Orientation System (POS) used to "geocode" the LIDAR data. In a factory calibration process, we geometrically tie the laser scanner system to the cameras via a set of calibration parameters. The result is that the horizontal accuracy of the cameras can serve as a proxy for the horizontal accuracy of the laser scanner. The tools in True View Evo allow you to measure the horizontal location of the GCPs and compute the RMSE of the X, Y, and combined planimetric residuals. The True View Evo accuracy assessment dialog, superimposed on a GCP, is shown in Figure 2.
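The combined planimetric figure follows the usual ASPRS convention of combining the per-axis RMSE values in quadrature. A minimal sketch, with hypothetical offsets (measured minus known positions at five GCPs):

```python
import numpy as np

def planimetric_rmse(dx, dy):
    """RMSE of the X and Y residuals, plus the combined (radial)
    planimetric RMSE: RMSE_r = sqrt(RMSE_x^2 + RMSE_y^2)."""
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    rmse_x = np.sqrt(np.mean(dx ** 2))
    rmse_y = np.sqrt(np.mean(dy ** 2))
    return rmse_x, rmse_y, np.hypot(rmse_x, rmse_y)

# Hypothetical horizontal offsets in metres at five GCPs.
rx, ry, rr = planimetric_rmse([0.02, -0.03, 0.01, 0.02, -0.01],
                              [-0.02, 0.01, 0.03, -0.01, 0.02])
```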
Figure 2 – Measuring Accuracy in a Drone Mapping Scenario
Another characterization of data collected in drone mapping systems is the "noise" level within the data. We consider noise to be a quantification of the random component by which the LIDAR point cloud's definition of a surface deviates from the true surface. The accepted method for this measurement is to test the deviation of the point cloud from a flat (non-curved) surface. If all you have is basic statistical software, you will have to find a perfectly level hard surface on which to run this test. With our True View Evo drone mapping software, we have a tool that can measure the distance from any flat surface to a point cloud, regardless of the "tilt" of that surface. We do this by running a Principal Component Analysis (PCA) on the surface to find the normal (perpendicular) to the best-fit surface. We then measure the distance, in the direction of this normal, of each point in the test area from the surface. Finally, we compute the standard deviation of these distances. This statistic is the precision of the sensor. We specify this at the 1 standard deviation ("1 sigma") level. An example of precision measurement in a drone mapping LIDAR/Imagery project is shown in Figure 3.
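The PCA step above can be sketched compactly: after centering the patch of points, the direction of least variance (the singular vector with the smallest singular value) is the normal of the best-fit plane, and projecting each centered point onto that normal gives its signed plane distance. This is a minimal illustration with synthetic data, not the True View Evo tool itself.

```python
import numpy as np

def plane_noise_1sigma(points):
    """Fit a best-fit plane to an (N, 3) point array via PCA and return
    the 1-sigma noise: the standard deviation of each point's distance
    from the plane, measured along the plane normal."""
    pts = np.asarray(points, float)
    centered = pts - pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # normal of the least-squares best-fit plane through the centroid.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    distances = centered @ normal       # signed normal distances
    return distances.std(ddof=1)

# Synthetic test patch: a tilted plane plus ~3 cm of random noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(500, 2))
z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + rng.normal(0, 0.03, 500)
sigma = plane_noise_1sigma(np.column_stack([xy, z]))  # roughly 0.03 m
```

Because the distance is measured along the plane normal rather than vertically, the result is unaffected by the tilt of the test surface, which is exactly why a sloped parking lot or roof plane works as well as a level one.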
Figure 3 – Measuring Precision in a Drone Mapping Scenario
Well, obviously the best possible values one could have for accuracy and precision are zero! Of course, this is not physically realizable by any system, regardless of cost. I worked with a Transportation Research Board (TRB) study group that tried to define brackets for different system accuracy levels for mobile mapping systems. While we had some great, in-depth discussions, no final tables were created. Similarly, no definitive accuracy classes have been defined for drone mapping systems. Thus, all I can tell you are our typical values for the True View 410 system.
We specify the True View 410 as achieving better than 5 cm RMSE vertical accuracy. We typically see around 3 cm RMSE after removing any vertical bias (I will discuss vertical bias in LIDAR data in a future article). Is this mapping grade or survey grade? Well, we do not know, since these standards are always discussed but never seem to be specified. A 5 cm vertical RMSE is sufficiently accurate to create 10 cm (1/3 foot) contours. In my experience, this is certainly much better than "mapping grade" data.
On the noise front, we specify better than 5 cm precision at one sigma (one standard deviation). We typically see precision of around 3 to 4 cm. This is a reasonably small noise level, but certainly not as low as you would get from a Riegl mobile mapping system. Then again, a True View 410 is less than $100K, whereas a Riegl mobile mapping system is probably 10x the cost. So, as always, you get (well, sometimes) what you pay for.
The example used in this article is a test flight of a True View 410 at an above ground level (AGL) altitude of 75 meters. For this particular test, the results are:
So for what sort of projects is the True View 410 drone mapping system suited? I will summarize here and provide a more detailed article at a later date. I consider the True View 410 to be the "go to" utility sensor for everyday drone mapping projects. Examples of applications include:
What is a True View 3D Imaging Sensor not so well suited for? These are somewhat specialized tasks that typically require a Riegl-class laser scanner or a ground-based technique such as mobile mapping with a high accuracy/low noise scanner (again, such as a Riegl). Examples include:
In general, you can see that these are tasks that demand minimal vertical noise.
From an accuracy point of view, the True View 410 drone LIDAR/Imaging system is suited to a very wide variety of tasks. In fact, unless you are in a specialty survey/mapping field such as road construction, the True View 410 will handle most of your small area, aerial mapping needs with no problems. The fact that it produces a colorized point cloud along with inspection images is a significant differentiator from other sensors in the “utility” class. Please do not hesitate to contact us for clarifications or questions on this very important topic of accuracy and precision.