My friend and colleague Dr. Gene Roe recently published a column in his LIDAR News blog entitled “UAS Lidar – Not a Simple Upgrade from Photogrammetry” (published May 3, 2020). Now Gene and I usually agree on remote sensing issues, but on this one I have to take strong exception. I think I can safely do this because we fly UAV LIDAR/camera systems on a daily basis and have a pretty good handle on the complexity of both. If you read no further, my observation is that doing accurate UAV LIDAR is considerably easier than doing accurate drone photogrammetry. The operative word here is accurate.
Gene’s article made me think deeply about our own GeoCue experience. Gene tries to make the case that drone-based photogrammetry is pretty easy, but UAV LIDAR is very hard and fraught with risk. The GeoCue True View 410, our current 3D Imaging System (3DIS), fully provides both technologies – photogrammetry from its dual, oblique 20 MP photogrammetric cameras and LIDAR from its Quanergy M8 Ultra scanner (Figure 1). We have flown thousands of missions using drone photogrammetry (particularly with our Loki direct geopositioning system) and hundreds with the True View 410 and other UAV LIDAR systems. In every single flight, we do detailed metric accuracy analysis. We know a lot about this technology.
In really thinking about our experiences with these systems, I have to say that photogrammetry is, without a doubt, more challenging than LIDAR. I think folks who have not used LIDAR might find this statement surprising so allow me to elaborate.
First of all, I am not talking about getting a decent-looking visualization orthophoto from a drone camera system; this is actually quite easy. You will be successful with a Phantom 4 Pro (yes, they are once again available) and a cloud-hosted processing solution. It will look great! What I am talking about are 3D point clouds derived from photogrammetry (the process is often called Structure from Motion, SfM, though this is only partially correct) where you are trying to minimize ground control and will do checks for both planimetric and vertical accuracy. An example of where you will get bitten badly if these things are not exactly right is cut and fill volumetrics. Cut and fill requires flights separated in time, and time-separated flights that are going to be used in differencing operations require exacting attention to network accuracy.
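To see why network accuracy matters so much for differencing, consider a minimal sketch of cut/fill computation from two time-separated surface models. This is an illustrative example, not our production workflow; the grids, cell size, and function name are hypothetical.

```python
import numpy as np

def cut_fill_volumes(dem_before, dem_after, cell_area):
    """Cut and fill volumes (m^3) from two gridded DEMs (elevations in m).

    cell_area is the ground footprint of one grid cell in m^2.
    Positive differences are fill; negative differences are cut.
    """
    diff = dem_after - dem_before
    fill = diff[diff > 0].sum() * cell_area
    cut = -diff[diff < 0].sum() * cell_area
    return cut, fill

# Toy 2x2 site with 1 m^2 cells: one cell cut by 1 m, one filled by 2 m.
before = np.array([[10.0, 10.0], [10.0, 10.0]])
after = np.array([[9.0, 10.0], [10.0, 12.0]])
cut, fill = cut_fill_volumes(before, after, cell_area=1.0)
# A mere 5 cm vertical bias between the two flights would shift every
# cell's difference by 0.05 m, i.e. 0.05 m^3 of spurious volume per m^2
# of site area -- which is why the network accuracy of each epoch matters.
```

The point of the sketch is the last comment: any systematic vertical offset between the two epochs propagates directly, cell by cell, into the volume estimate.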
In both cases (photogrammetry and LIDAR), you need a good reference strategy (e.g. a local base station), so that is a wash in the comparison. In both cases, you are going to need to understand some geodesy so you won’t get embarrassed delivering ellipsoid data when the client wanted geoid. Again, a wash in the comparison. It is also a good idea to fully understand the nuances of Root Mean Square Error (RMSE) and what system factors contribute to its constituents, bias and deviation.
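For readers less familiar with those constituents: RMSE decomposes exactly into a bias term (systematic offset, e.g. a datum or calibration error) and a deviation term (random scatter, e.g. sensor noise). A minimal sketch with made-up check-point numbers:

```python
import numpy as np

def error_stats(measured, truth):
    """RMSE of measured vs. truth, decomposed into bias and deviation.

    The identity RMSE^2 = bias^2 + deviation^2 holds when deviation is
    the population standard deviation of the errors (ddof=0).
    """
    e = np.asarray(measured, dtype=float) - np.asarray(truth, dtype=float)
    rmse = np.sqrt(np.mean(e ** 2))
    bias = np.mean(e)          # systematic component
    dev = np.std(e)            # random (scatter) component
    return rmse, bias, dev

# Hypothetical vertical check shots against a 10.00 m control surface.
measured = [10.1, 10.3, 9.9, 10.3]
truth = [10.0, 10.0, 10.0, 10.0]
rmse, bias, dev = error_stats(measured, truth)
# Here bias = 0.15 m, so most of the RMSE is systematic, not noise --
# a pattern that points at calibration or reference problems, not the sensor.
```

Separating the two matters operationally: a large bias with small deviation usually means a geodesy or calibration problem you can fix, while a large deviation is inherent sensor scatter.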
Cameras not specifically designed for photogrammetry (those in the True View 410 are), such as DSLRs and the various DJI cameras, are very, very difficult to calibrate. In fact, DJI even warns users of its Phantom 4 RTK platform that, while the camera is factory calibrated, it is not a “photogrammetric” calibration. If you use lots of ground control and know what you are doing, self-calibration works and this concern is removed from your plate. However, if you are trying to reduce control or it simply cannot be deployed (e.g. due to site hazards), you are going to have to deal with this problem. Good luck if you are not dealing with a professional aerial survey equipment supplier!
Once you have dealt with camera calibration, the next issue comes in mission planning and data post-processing. With UAV LIDAR, you only need to fly slowly enough and keep the target (usually the ground) within range of the scanner. A good LIDAR system always includes a high accuracy 6 degrees of freedom (6-DOF) positioning system such as an Applanix APX. A well-calibrated system is really just point and shoot (which you would think would be terms we would use for a camera!).
Generating 3D data from 2D images using “SfM” algorithms requires good correlation between the overlapping areas of imagery. A photogrammetric system cannot directly derive 3D; it has to be algorithmically extracted. This means you will have trouble in a lot of scene types, particularly vegetation. The point matching algorithms also need to see a lot of redundant images on which to apply robust statistics. This makes really simple UAV LIDAR tasks such as collecting a pipeline right of way in a forested area quite difficult, if not impossible, with photogrammetry.
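To make the correlation problem concrete, here is a minimal sketch of normalized cross-correlation, one of the simpler patch-matching measures that SfM-style pipelines build on. The patches and function name are hypothetical; the point is what happens on low-texture content like uniform canopy or water.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size image patches.

    Returns a score in [-1, 1]; higher means a better match. A patch
    with no texture (zero variance) carries no signal to match on.
    """
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    if denom == 0:
        return 0.0  # textureless patch: matching is undefined
    return float((a * b).sum() / denom)

rng = np.random.default_rng(0)
textured = rng.random((8, 8))       # stand-in for gravel or bare earth
flat = np.full((8, 8), 0.5)         # stand-in for uniform vegetation canopy

score_good = ncc(textured, textured)   # near 1.0: strong, unambiguous match
score_none = ncc(flat, flat)           # 0.0: nothing to correlate on
```

This is why scene type drives photogrammetric success: a LIDAR pulse returns a range regardless of texture, while image matching has literally nothing to work with on uniform surfaces.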
An in-depth comparison of the two technologies requires a long article to do the workflow justice. Just be aware that we can process True View 410 data from time off sensor to a high accuracy RGB colorized point cloud in about half the flight time. For example, a 16-minute collect will take about 8 minutes of post-processing. Not only that, but there are no ambiguous decisions to make in the processing; just fill in basic data and press the buttons. Photogrammetry, on the other hand, requires extremely careful mission considerations (only a subset of projects will be suited to a photogrammetric solution) and quite a bit of tweaking during processing if you have any sort of adverse terrain considerations. A typical 15-minute flight can require more than 4 hours of processing before you arrive at a usable point cloud.
A final note is Return on Investment (ROI). Yes, LIDAR and 3DIS (dual purpose photogrammetry/LIDAR such as the True View 410) sensors are considerably more expensive than camera-only systems. This extra cost is driven by the laser scanner, of course, and also the high end 6-DOF direct geopositioning system. However, I would counter this with two facts:
Of course, with a True View 410 you can eliminate the ROI issue by first engaging with a subscription model.
Finally, a word on the subject of having an expensive sensor in the air and the loss incurred if you have a crash (which, at some point, most folks do!). This risk is one of the factors that allows you to charge a premium for UAV LIDAR services. Be a stickler for good planning/flight procedures and carry sensor insurance. Be brave!
A final word on this subject. We find the most confusing area for our users, even professionals, is the myriad of spatial reference systems and reference strategies that must be used when doing drone photogrammetry or LIDAR. The requirements here do not differ one whit from doing manned airborne LIDAR and photogrammetry. My advice on this front is to do what it takes to gain some basic knowledge in this area and to work only with companies who specialize in high accuracy metric mapping systems when you start down your drone remote sensing path.