What's New in LP360: April 2019

Published On: April 18, 2019

As you read this, we are preparing the first non-experimental release of LP360 for 2019. 

In this article, I will first provide a bullet list of the major new features and then explore a few in detail.  In the list, I use a code of the format (xy-z) to indicate the LP360 platform and the required licensing level.  In this scheme, xy is the platform, where S is native Windows ("standalone") and A is the ArcGIS extension, and z is the licensing level (Viewer, Basic, Standard, Advanced).  Remember that sUAS is the same as Advanced except for the 4 km² LAS restriction.  Thus, for example, (SA-A) would mean the feature is in both LP360 and LP360 for ArcGIS and the required licensing level is "Advanced."  Note that a few of these features were also delivered in 2018.2 as a Service Pack.


The List

  • Disassemble/Assemble multipart features.  This tool converts multipart features to separate features for Feature Edit operations; the Assemble tool then reconstructs the multipart feature after editing.  This set of tools is especially useful when you are provided multipart hydro features that need updating and you need the high-performance environment of LP360 (S-S)
  • Polygon Union and Intersect tools (S-S)
  • Save/Restore Live View Filters (SA-V)
  • Load corrupt LAS files into LP360 for immediate repair using LAS File Analyst (S-S)
  • Return Count added to the Return tab in Live View.  This is very useful for assessing the multi-return quality of a data set (SA-V)
  • GPS time filter added to Live View (SA-V)
  • Filter by Color added to Live View.  This is a major new addition to Live View.  It allows you to enter color ranges (using a variety of methods, including sampling display areas) and then filter data based on these values.  We automatically convert the normal Red-Green-Blue (RGB) color space of LAS data to Hue-Saturation-Intensity (HSI), since color range selection really does not make sense in RGB space (SA-V)
  • Filter by User Data field added to Live View.  The User Data field is a 1-byte (8-bit) record in every LAS record type that can be freely used by anyone for ad hoc data storage.  It is commonly used in multibeam drone laser scanners (such as Velodyne) to designate a beam.  Being able to filter on this field is extremely valuable for diagnostics when working with these types of data (SA-V)
  • Display by User Data added to all views.  You can now visualize point data by the User Data field.  The display is similar to Point Source ID, allowing you to assign unique colors to each value (SA-V)
  • Set Project-specific class names.  Some projects call for either an extension to the ASPRS class names or a complete replacement (often the case in power line work).  We have added tools to Live View to make this easy to accomplish (SA-V)
  • You can now use a full source filter for the Point Group Tracing task (SA-A)
  • Live View is now used for all Source Filters.  This is a really big deal for two reasons.  The first is that it makes it much easier to set source filters in areas such as Point Cloud Tasks (PCT).  For example, the Live View Class filter is significantly easier to use when you need to combine Class and Class Flags than our previous user interface.  The second and most important is that all the new filtering additions in Live View are now available in every source filter.  This lets you perform operations such as "filter out points more than 40° off nadir as well as those associated with beams 1 and 8"  (SA-B)
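The color-range idea behind the new Filter by Color is easiest to see with a quick conversion sketch.  Below is a minimal, standard geometric RGB-to-HSI conversion in Python; it is not LP360's actual implementation, and the sample colors are hypothetical.  The point is that two visually similar greens with quite different RGB triples end up with nearly the same hue, so a single hue range captures both:

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 8-bit RGB to HSI (hue in degrees, saturation/intensity in [0, 1]).

    Standard geometric conversion; LP360's internal formula may differ.
    """
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    intensity = (r + g + b) / 3.0
    minimum = min(r, g, b)
    saturation = 0.0 if intensity == 0 else 1.0 - minimum / intensity
    # Hue is the angle from the red axis in the chromatic plane
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        hue = 0.0  # achromatic (gray): hue undefined, report 0
    else:
        theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        hue = theta if b <= g else 360.0 - theta
    return hue, saturation, intensity

# Two different "vegetation-like" greens land at nearly the same hue,
# which is why a color-range filter is set up in HSI rather than RGB.
h1, s1, i1 = rgb_to_hsi(60, 160, 70)
h2, s2, i2 = rgb_to_hsi(40, 110, 50)
print(round(h1), round(h2))
```

In RGB space the same two colors differ in all three channels at once, so no simple per-channel range isolates "green" without also catching unrelated colors.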

NIPO - High Performance Data Format

LP360 has always had the advantage of being able to easily deal with very large data sets.  We routinely work with data containing billions of LIDAR points.  However, we are not as efficient as we could be when a project contains a very large number of LAS files (thousands).  To address this issue, we have added an option to use an "Octree" structure in LP360 rather than a simple tiled data structure.  Our implementation of an octree is Nested (meaning no duplicate data in the "reduced resolution data sets") and Indexed (meaning each "leaf" of the tree has a point mapping index for rapid access).  Hence we call our implementation a Nested, Indexed Point Octree or "NIPO."  It is based on the "potree" implementation from Markus Schuetz (GeoCue is a Diamond sponsor of potree).

Unlike many applications that employ octrees only for viewing, the LP360 NIPO supports both reading and writing to the octree-resident point data.  This means that you can use the octree option for LIDAR editing!   We also implemented our octree scheme at the Layer level rather than the Project level.  This allows you to create projects with a mixture of conventional LAS layers as well as the new NIPO layers.  This is especially useful when performing operations such as looking at small areas of new data superimposed on a very large multi-county data set in change detection operations. 
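The "nested" and "indexed" properties can be illustrated with a toy octree: each point is stored at exactly one node (coarser nodes hold a sparse sample, so nothing is duplicated across levels of detail), and every node keeps the indices of its points back into the source data for rapid access.  All names and the capacity threshold here are hypothetical; this is a conceptual sketch, not the LP360 or potree on-disk format:

```python
# Toy "nested, indexed" octree: every point lives at exactly one node
# (no duplication across levels of detail), and each node records the
# indices of its points in the source point array for fast lookup.
import random

MAX_NODE_POINTS = 4  # capacity threshold; real builders use point spacing

class Node:
    def __init__(self, bounds):
        self.bounds = bounds          # (xmin, ymin, zmin, xmax, ymax, zmax)
        self.index = []               # indices into the source point array
        self.children = [None] * 8    # one child per octant

    def insert(self, points, i):
        # "Nested": a point accepted at this level is stored here and nowhere else,
        # so inner nodes naturally hold a coarse sample of their subtree.
        if len(self.index) < MAX_NODE_POINTS:
            self.index.append(i)
            return
        # Otherwise push it down into the matching octant.
        x0, y0, z0, x1, y1, z1 = self.bounds
        cx, cy, cz = (x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2
        x, y, z = points[i]
        octant = (x >= cx) | ((y >= cy) << 1) | ((z >= cz) << 2)
        if self.children[octant] is None:
            nb = (x0 if x < cx else cx, y0 if y < cy else cy, z0 if z < cz else cz,
                  cx if x < cx else x1, cy if y < cy else y1, cz if z < cz else z1)
            self.children[octant] = Node(nb)
        self.children[octant].insert(points, i)

def all_indices(node):
    out = list(node.index)
    for c in node.children:
        if c:
            out.extend(all_indices(c))
    return out

random.seed(1)
pts = [(random.random(), random.random(), random.random()) for _ in range(200)]
root = Node((0, 0, 0, 1, 1, 1))
for i in range(len(pts)):
    root.insert(pts, i)
# Every point index appears exactly once in the tree (the "nested" property)
assert sorted(all_indices(root)) == list(range(200))
```

Because each point has exactly one home node and the index maps it back to its source record, an editing operation (such as reclassifying points) can be written through to the original data, which is what makes a read/write octree possible.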

Before you can use a NIPO layer, you must build it.  The NIPO builder is a component in LP360 (Standalone) and requires a Standard (or sUAS) license.  A new NIPO layer builder tool is accessed from the File menu.  This tool ingests a directory of LAS or LAZ files (they can even be mixed and in sub-folders) and creates a NIPO data structure at the location you designate.  The NIPO comprises many files, but you will not have to deal with any of this complexity.


The LP360 NIPO Builder

Building a NIPO can take some time (on a decent machine, about 2 seconds per 1 million point records), so when you press "Start Processing", a separate task is spun off by LP360.  This allows your instance of LP360 to be available for other work while the NIPO is building.  We have added a nice email notification feature that (if you set an email address in Project Settings) will send you a completion notice when the NIPO build is finished.
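The quoted rate of about 2 seconds per million points makes rough build-time estimates easy.  A hypothetical helper (the rate is machine-dependent, so treat the constant as an assumption rather than a benchmark):

```python
SECONDS_PER_MILLION = 2.0  # rough rate quoted for a "decent machine"; varies widely

def nipo_build_estimate(point_count):
    """Rough NIPO build-time estimate in minutes at ~2 s per million points."""
    return point_count / 1_000_000 * SECONDS_PER_MILLION / 60.0

# A billion-point project comes out to roughly half an hour at that rate,
# which is why the build runs as a separate background task.
print(f"{nipo_build_estimate(1_000_000_000):.0f} min")  # → 33 min
```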

If you are creating a NIPO strictly for viewing operations (for example, a NIPO can be used as a project data source for LIDAR Server), you can specify compression (LAZ) as the NIPO format.  This will compress your LAS data by a factor of 2 to 3 and create a Read Only NIPO.

You might want to use a NIPO on small projects with very dense data (e.g. drone LIDAR or structure from motion data sets) because the NIPO can provide a better distribution of the points in zoomed-out views.  By all means, experiment!

New Filters in Live View and Source Filters

We have added new filters to both Live View and everywhere you apply a Source Filter, such as in Point Cloud Tasks.  This adds really powerful new capabilities to LP360.  I will give a small illustration using LIDAR data from a 32-channel Velodyne VLP-32C flown on a drone.

This particular laser scanner has 32 individual laser beams.  These beams are separated in the in-track direction by 1.25°.  If you are flying at an altitude of 100 m, then the distance from the nadir-pointing beam to its adjacent beams (one forward and one aft) is nearly 2.2 m!  Analyzing the behavior of these multibeam scanners can be very challenging without the proper tools.  Using the new tools in Live View, I was able to set up a filter using both GPS Time and User Data.  I set the GPS time slice to show only 1/10 of a second (one revolution of the scanner, as it was set to a 10 Hz rotational rate) and configured it to show all User Data channels.  Finally, I configured the Map View to display by User Data.  The result is the beautiful image of Figure 2.  This image gives us a direct picture of the spread of multibeam laser data on the ground.

Figure 2: Filter to 1/10 sec and Display by User Data
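The 2.2 m figure follows from simple geometry (footprint spacing ≈ altitude × tan of the beam separation angle), and the GPS time slice is just a 0.1 s window.  Here is a sketch of both, using hypothetical point tuples rather than a real LAS reader:

```python
import math

def beam_ground_spacing(altitude_m, separation_deg):
    """Ground distance between adjacent beam footprints near nadir.

    Small-angle geometry: spacing ~= altitude * tan(separation angle).
    """
    return altitude_m * math.tan(math.radians(separation_deg))

# The numbers from the text: 1.25 deg in-track separation flown at 100 m AGL.
print(f"{beam_ground_spacing(100.0, 1.25):.2f} m")  # → 2.18 m ("nearly 2.2 m")

# A Live View-style GPS time slice as a plain predicate: keep one scanner
# revolution (0.1 s at a 10 Hz rotation rate), all channels.  Points here
# are hypothetical (gps_time, user_data) pairs.
def one_revolution(points, t0, period=0.1):
    return [p for p in points if t0 <= p[0] < t0 + period]

points = [(t / 100.0, t % 32) for t in range(100)]   # 1 s of fake data, 32 channels
window = one_revolution(points, 0.30)
print(len(window))  # → 10 samples fall in the 0.1 s slice
```

Coloring the points in that window by their user_data value is the Map View "Display by User Data" step, which is what produces the beam-spread picture in Figure 2.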


LP360 2019.1 adds some exciting and very useful new features, particularly in the areas of data visualization and analysis.  The ability to filter and view by the User Data field provides very powerful new tools for performing QC and analyzing problems with multibeam LIDAR data.  The NIPO capability really extends the idea of "limitless LIDAR" for LP360!

I hope you spend some time exploring these new features in LP360.  As always, feel free to give feedback on these and to discuss other important features you think we might need.