The Year of LiDAR

2022 is clearly the year of LiDAR. 

At UAS shows across the USA, Mexico, Canada, and the EU, LiDAR has been the hot topic of 2022, and 2023 is shaping up to be more of the same, with significant growth.

LiDAR (Light Detection and Ranging) is a sensor that uses a laser, a position-controlled mirror, an IMU (Inertial Measurement Unit), and internal processing to record geolocated data.

A LiDAR sensor emits a pulse of light toward the target (the ground). The light is reflected from the surface (a point) and returned to the sensor. The receiver detects the returning signal and calculates the distance the light has traveled. Combining that distance with the position of the sensor, the mirror angle, the IMU attitude, and the direction in which the pulse was sent, the 3D position of the reflecting point can be determined. With millions of reflections striking a terrestrial surface and returning to the LiDAR sensor, these contact “points” are used to generate a 3D model or ortho, re-creating the target area in a digital environment.
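
To make the ranging step concrete, here is a minimal sketch of that arithmetic in Python. It assumes a simple time-of-flight model and an already-known pulse direction and sensor position (all values below are invented for illustration); real systems also apply mirror encoder angles, IMU attitude, lever-arm offsets, and calibration corrections.

```python
# Simplified LiDAR ranging and georeferencing sketch (illustrative values only).
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting point: the pulse travels out and back."""
    return C * round_trip_seconds / 2.0

def georeference_point(sensor_position, pulse_direction, distance):
    """Start at the sensor and travel `distance` along the unit direction
    of the emitted pulse (derived from the mirror angle and IMU attitude)."""
    direction = np.asarray(pulse_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(sensor_position, dtype=float) + direction * distance

# Example: a pulse returning after ~333 nanoseconds reflected roughly 50 m away.
dist = range_from_time_of_flight(333e-9)
point = georeference_point([0.0, 0.0, 60.0], [0.1, 0.0, -1.0], dist)
print(f"range ≈ {dist:.1f} m, ground point ≈ {point.round(2)}")
```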

Because LiDAR sensors generate their own signal pulse, they do not depend on illumination from other sources (for example, the sun), and many LiDAR operator/pilots capture data at night. As long as there is nothing interfering between sensor and surface, it is possible to collect data below cloud cover or in the dark. LiDAR can therefore offer extremely flexible access to areas requiring scans, whether flying at night or when cloud cover has a negative impact on a site and photogrammetry is not possible due to lighting conditions.

LiDAR sensors were previously relegated to fixed-wing or rotary manned aircraft due to weight and cost; they are now accessible to virtually any mid- or heavy-lift UAS.

The image above shows the author’s first experience with LiDAR: a Velodyne VLP16 with a Geodetics IMU, mounted to a Yuneec H920 hexacopter.

With ever-increasing flight efficiency coupled with the reduced weight and cost of LiDAR sensors, there are several aircraft and LiDAR systems available at price points to suit virtually any budget. While LiDAR may not yet be for casual pilots, commercial pilots report near-immediate, full ROI with LiDAR due to the current scarcity of complete systems.

Sensors may be purchased as a complete solution with aircraft, support software, and payload, or owners of medium-lift systems may purchase LiDAR sensors separately to mount on whatever aircraft they’re familiar and comfortable with. For example, there are many LiDAR payloads available for the DJI Matrice 300, Inspired Flight, Freefly, Yuneec, Maptek, Microdrones, and other platforms.

LiDAR packages may be stand-alone, combined with separate RGB cameras for photogrammetry, or assembled with both in one housing. For example, the highly popular GeoCue 515 package not only offers a Hesai XT32 LiDAR sensor, it also includes two 20MP RGB cameras for colorizing the pointcloud or for photogrammetry deliverables. Additionally, the system is designed to properly and precisely scale RGB data onto the 3D pointcloud, providing not only a very accurate and precise model, but colorized, photo-realistic data for engineers, surveyors, construction teams, graphic designers, game designers, etc.
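
As an illustration of what draping RGB data onto a pointcloud involves conceptually, the sketch below colorizes points by projecting them into a single camera image with a plain pinhole model. This is a generic illustration, not the GeoCue workflow; the camera pose, focal length, and image used are made-up values.

```python
# Generic pointcloud colorization sketch: project each 3D point into a camera
# image and pick up the color of the pixel it lands on (illustrative only).
import numpy as np

def colorize_points(points_xyz, image_rgb, camera_pose, focal_px, principal_pt):
    """Return an (N, 3) array of RGB colors for points visible to the camera."""
    R, t = camera_pose                      # world-to-camera rotation and camera position
    cam = (points_xyz - t) @ R.T            # points expressed in camera coordinates
    colors = np.zeros((len(points_xyz), 3), dtype=np.uint8)
    in_front = cam[:, 2] > 0                # ignore points behind the camera
    u = focal_px * cam[:, 0] / cam[:, 2] + principal_pt[0]
    v = focal_px * cam[:, 1] / cam[:, 2] + principal_pt[1]
    h, w, _ = image_rgb.shape
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors[visible] = image_rgb[v[visible].astype(int), u[visible].astype(int)]
    return colors

# Toy usage: three points, a 100x100 gray image, camera at the origin looking down +Z.
pts = np.array([[0.0, 0.0, 10.0], [1.0, 1.0, 10.0], [0.0, 0.0, -5.0]])
img = np.full((100, 100, 3), 128, dtype=np.uint8)
print(colorize_points(pts, img, (np.eye(3), np.zeros(3)), 50.0, (50.0, 50.0)))
```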

Pilots, engineers, program managers, and surveyors will want to consider several factors when choosing a LiDAR payload to purchase or rent.

  • Cost
  • Penetration
  • Resolution
  • Software cost/flexibility
  • Difficulty of operation

Different sensors will yield different results. Below are examples from the DJI L1, the Velodyne VLP16 (Microdrones HR), the Hesai Pandar XT32, and the Riegl VUX1 sensors. Profiles/cross sections captured from LP360 illustrate the surface data from the various sensors and are a reliable method of displaying vegetation penetration.
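
For readers unfamiliar with profile views, the idea is simply to keep the points that fall within a thin corridor along a chosen line and look at their elevations. The sketch below shows that concept in its simplest form; it is a generic illustration, not how LP360 builds its profiles, and the point coordinates are invented.

```python
# Minimal profile / cross-section slice: keep points within a thin corridor
# either side of a line, then pair distance-along-line with elevation.
import numpy as np

def profile_slice(points_xyz, line_start, line_end, half_width=0.5):
    """Return (distance_along_line, elevation) for points inside the corridor."""
    p = np.asarray(points_xyz, dtype=float)
    a, b = np.asarray(line_start, float), np.asarray(line_end, float)
    axis = (b - a) / np.linalg.norm(b - a)                       # unit vector along the profile line
    rel = p[:, :2] - a                                           # XY offsets from the line start
    along = rel @ axis                                           # distance along the line
    across = np.abs(rel[:, 0] * axis[1] - rel[:, 1] * axis[0])   # perpendicular offset
    keep = across <= half_width
    return along[keep], p[keep, 2]

# Toy usage: a few synthetic points around a west-east profile line.
pts = [[0, 0.2, 101.0], [5, -0.3, 99.5], [10, 2.0, 98.0]]
d, z = profile_slice(pts, line_start=[0, 0], line_end=[10, 0], half_width=0.5)
print(d, z)   # the third point is 2 m off the line and is excluded
```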

DJI L1

Pictured above, the DJI L1 is incapable of any effective penetration through vegetation or other porous areas. Additionally, strip alignment may be challenging in some scenarios. This data was captured and initially processed in DJI Terra, then finish-processed in GeoCue LP360.

Microdrones MD1000HR (VLP16)

The profile seen here, demonstrates the penetration capabilities of the Microdrones HR VLP16 payload. Note the greater resolution of data below trees, both broadleaf and palm.

GeoCue 515

In this image, there are no gaps beneath the trees. In the center, a uniform depression is visible: the Hesai Pandar XT32 was able to “see” below the shallow water surface. In this case there was approximately 12” of water depth, yet the creek bottom is solid (visible). While the below-water data is not viable for measurement, it does provide additional data for engineering considerations.

RIEGL VUX1

These two illustrations are sourced from the Riegl VUX1 sensor. This sensor provides the highest resolution of the four compared here, with a much higher price tag to match the image quality. Note that in the zoomed-in profile, train rails/tracks are not only visible but accurately measurable. There are no holes in the surface beneath any of the trees, and the tree detail is sufficient to classify tree types.
 

“Penetrating vegetation is a key function of LiDAR sensors; this is why tree profiles/slices have been used to illustrate these challenging scenarios.”

WHAT ABOUT SOLID STATE LiDAR SYSTEMS?

It is worth noting that solid-state LiDAR systems are on the rise, and very much in development for longer range with high density. The technology hasn’t yet improved to a point where solid-state LiDAR is broadly applicable for UAS work, although it has proved promising due to lighter weight, lower power consumption, and speed. Development is heavily focused on autonomous vehicles at present, yet it is fully anticipated we’ll soon see solid-state LiDAR available for aerial applications.

HOW IS LiDAR DIFFERENT FROM PHOTOGRAMMETRY?

Photogrammetry uses multiple images with embedded geodata, matching pixels and data information to create an orthomosaic. Pointclouds can be derived from images with slightly less accuracy, but a significant time commitment. A 50-acre field processed as a pointcloud derived from photos may take up to 12 hours on an average computer, while the same computer will process the LiDAR-sourced pointcloud in under 30 minutes. LiDAR is also significantly faster to fly than UAS designed for photogrammetry, as the need for deep overlap is lessened in a LiDAR workflow.
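
A rough, back-of-envelope illustration of the flight-efficiency point: photogrammetry needs heavy side overlap between image footprints, while LiDAR strips need only modest swath-to-swath overlap, so the same site takes far fewer flight lines. Every number below (site size, footprint, swath, overlap) is an assumption for illustration, not a measurement from the missions described here.

```python
# Back-of-envelope flight-line comparison under assumed overlap requirements.
def flight_lines(site_width_m, swath_m, side_overlap):
    """Number of parallel lines needed to cover a site of a given width."""
    effective = swath_m * (1.0 - side_overlap)   # new ground covered per line
    return int(site_width_m / effective) + 1

site_width = 450.0   # a ~50-acre square site is roughly 450 m on a side (assumed)
photo = flight_lines(site_width, swath_m=90.0, side_overlap=0.70)   # assumed camera footprint
lidar = flight_lines(site_width, swath_m=120.0, side_overlap=0.20)  # assumed LiDAR swath
print(f"photogrammetry lines: {photo}, LiDAR lines: {lidar}")
```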

Additionally, LiDAR may be flown at night (unless colorization is needed) while photogrammetry requires daylight hours. 

On the other hand, photogrammetry missions may be flown while there is water on the ground after a flood or heavy precipitation. LiDAR works best in dry, non-reflective environments. Mirrored windows, water reflecting on leaves, ponds, creeks, etc. will display as blacked-out areas in a LiDAR scan.

In this scan of the Colorado River, areas containing water display as black.

Not all software applications are compatible with all of the different LiDAR sensors. The way trajectories are read/displayed, how data is managed/handled, and even basic features differ greatly between the various software tools available today. For example, until recently, data from DJI’s L1 LiDAR system could only be initially processed in DJI Terra software, which is quite limited and which many feel is “kludgy and slow.” It’s also not a platform known for being stable.

Recently, GeoCue added the DJI L1 to its compatibility platform, enabling DJI users to process L1 data in the LP360 software with great stability, flexibility, and speed.

SOFTWARE

When choosing a LiDAR system, there are many considerations, the greatest of which is how important high resolution and precision at ground level will be to projects/workflows. Budget frequently makes this determination. However, bottom-line and long-term needs are often at odds with each other; it’s wise to spend “up” to a higher-grade LiDAR sensor when customer satisfaction is at the top of the list. Research often requires higher-grade sensors as well.

When choosing a LiDAR system, consider the aircraft carrying the payload, the software required to process the data, and flight times as well. Two hours flying a narrow-beam sensor vs. 30 minutes with a wider throw may make all the difference, particularly when the company has a deep backlog and is focused on efficiency.

Whether an organization is ready for LiDAR now or down the road, there has never been a better time to learn more about LiDAR, pointclouds, and how data processing differs from photogrammetry workflows.

Special thanks to Brady Reisch of KukerRanken for the profile slices of data.


Experts Tested 4 Different Drone Mapping Solutions for Crime Scene Investigation. Here’s What Happened.

At Commercial UAV Expo in Las Vegas, more than 300 drone industry professionals watched as experts tested four different drone mapping solutions for crime scene investigation at night.

Guest post by Douglas Spotted Eagle, Chief Strategy Officer at KukerRanken

Commercial UAV Expo brought UAS professionals, developers, manufacturers, first responders, and related industries under one roof for the first time in nearly two years. Due to the pandemic, the show was less attended than in previous years, yet provided robust live demonstrations, night flight, daytime seminars, panels, and case studies for the relatively large audience. There was a strong buzz amongst the crowd about being at an in-person event and experiencing face-to-face communication for the first time in many months.

In addition to the “Beyond the Cage” Live Drone Demo Day that launched Commercial UAV 2021, produced by Sundance Media Group, Wednesday night provided attendees with a glimpse of how Crime Scene Investigator tools function in the dark hours. Sundance Media Group developed this methodology several years ago at the request of a law enforcement agency and has been presenting it at academies, colleges, universities, and tradeshows since 2017, with a variety of aircraft including the DJI Mavic, Phantom 4, Yuneec H520, Skydio, and Autel EVO series (versions 1 and 2). All successfully output data, except Skydio, which struggles with brightly lit scenes in surrounding darkness.

Presented by FoxFury, Sundance Media Group, Autel, and Pix4D, this event also invited SkyeBrowse to participate in the demonstration, showing the effectiveness and speed of their application.

Testing Drone Mapping Solutions for Crime Scene Investigation: Setting the Scene

With a model covered in moulage, a mock slit throat, and a blood trail on the ground, the demonstration began with the multi-vendor team led by Brady Reisch and Bryan Worthen of Kuker-Ranken, Todd Henderson and Patrick Harris of SMG, and David Martel. The team placed four FoxFury T56 lighting systems at specific, measured points in the scene, supplemented by FoxFury NOW lanterns and Rugo lighting to fill in holes and eliminate shadows.

Douglas Spotted Eagle of SMG and KukerRanken emceed the event through the two flights.

Douglas Spotted Eagle addresses the crowd of 300 persons

SkyeBrowse had the first flight, with its one-button capture. Brady Reisch set up the mission, with input from the SkyeBrowse developer on the camera’s exposure levels for the SkyeBrowse video mission. Once the mission was completed, the captured data was uploaded to the SkyeBrowse website, where results were available approximately 30 minutes following the flight.

Brady Reisch of KukerRanken sets up the SkyeBrowse mission with Bobby Ouyang of SkyeBrowse

The Autel EVO II Pro was programmed on-site for an automated SkyeBrowse mission and the demonstration began. The area is highly congested, with palm trees and buildings enclosing the small rotunda in front of the Mirage Hotel Convention Center.

Brady Reisch flew the second EVO II mission manually, in much the same configuration as though the aircraft had flown a double-grid mission, supplemented by a high-altitude orbit coupled with manually captured orbits and select placements. Because of the crowd, time was a consideration. In an actual homicide scene, more low-placed images would have been captured.

Brady Reisch monitors time as Pix4DReact rapid-renders the scene (60 seconds)

The mission photos were uploaded to Pix4DReact on-scene and rendered while the audience observed, requiring approximately 60 seconds to output an ortho-rectified 2D image, complete with evidence markers/tags and a supplemental PDF report. The photos were also loaded into the Pix4D and Leica Infinity software packages, to be rendered for 3D viewing once the show floor opened on Thursday. Pix4DReact is a two-dimensional, rapid-mapping solution, so there is no 3D view.

The four screen captures tell the rest of the story, and readers can determine for themselves what each software is capable of providing. One point of interest is that there were many claims of “guaranteed 1cm of precision regardless of flight area,” which have yet to be verified. The Kuker-Ranken team will be re-flying a mission with two separate GPS systems (Leica and Emlid) to verify the claims of precision.

Precision is Repeatable

Precision is repeatable. Accuracy is the degree of closeness to the true value; precision is the degree to which an instrument or process will repeat the same value. In other words, accuracy is the degree of veracity while precision is the degree of reproducibility. With a base station, NTRIP, Spydernet, PPK, or RTK workflow, precision is always the goal, well beyond accuracy. This is a relatively new discussion in the use of unmanned aircraft, and although the topic seems simple enough, the complexity holds challenges not easily overcome without education and practice. We are fortunate to have a partner in Kuker-Ranken, which has provided precision tools to the survey, forensic, civil engineering, and AEC industries since 1928. The KR team includes PLSs, EITs, and other accredited precision professionals, rarely found in the UAS industry.
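
The distinction is easy to demonstrate on a handful of repeated measurements: accuracy compares the mean of the shots to the known true value, while precision measures how tightly the shots repeat. The coordinates below are invented for illustration.

```python
# Accuracy vs. precision on repeated measurements (illustrative values only).
import numpy as np

true_easting = 500_000.000                 # "true" value from an assumed control point
shots = np.array([500_000.012, 500_000.015, 500_000.011, 500_000.014])  # repeated fixes

accuracy_error = abs(shots.mean() - true_easting)   # closeness to truth
precision_spread = shots.std(ddof=1)                # repeatability of the instrument

print(f"accuracy error: {accuracy_error*1000:.1f} mm")        # ~13 mm off the true value
print(f"precision (1-sigma): {precision_spread*1000:.1f} mm")  # ~1.8 mm repeatability
```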

Precision is critical for surveyors, civil engineers, forensic analysts and investigators, construction sites, mapping, agriculture, and other verticals in the UAS industry, and this sort of scene is no exception. Being able to properly place a map or model into a coordinate system is necessary for many professional pilots in the UAV field, and while this mission is not precise to a global coordinate, it is precise within itself; in other words, measurements within the image will be accurate, while being imprecise relative to the overall location.

We’ll dive more deeply into precision in a future article. For purposes of this exercise, we’re more interested in accuracy of content in the scene, and all four outputs were similar in accuracy within the scene itself. In other words, distances, volumes, and angles may be measured point to point. Pix4DReact is not as accurate as the other three tools, as it’s not intended to be a deeply accurate application given its speed of output.

Output Results of Drone Mapping Solutions

Output #1: SkyeBrowse (processing time, approximately 35 minutes)

Output #2: Pix4Dreact (processing time, approximately 1 minute)


Output #3: Pix4Dmapper (processing time, approximately 2.5 hours)


Output #4: Leica Infinity (processing time, approximately 2 hours, 50 minutes)


Agencies who would like access to this data are invited to contact Brady Reisch, VDC Specialist at Kuker-Ranken.

Part 91, 101, 103, 105, 107, 137: WHAT’S THE DIFFERENCE?

All these FARs, what’s a drone pilot to do in order to understand them? Do they matter?

YES!

In virtually every aviation pursuit except for sUAS, an understanding of regulations is requisite and part of most testing mechanisms. As a result, many sUAS pilots holding a Remote Pilot Certificate under Part 107 are woefully uninformed, to the detriment of the industry.

Therefore, sUAS pilots would be well-served to inform themselves of how each section of relevant FARs regulate components of aviation.

Let’s start by digging into the intent of each Part.

  • Part 91 regulates General Operating and Flight Rules.
  • Part 101 regulates Moored Balloons, Kites, Amateur Rockets, Unmanned Free Balloons, and some types of Model Aircraft.
  • Public Law 112-95, Section 336 regulates hobby drones as an addendum to Part 101.
  • Part 103 regulates Ultralight Vehicles (single-occupant manned aircraft, powered or unpowered).
  • Part 105 regulates Skydiving.
  • Part 107 regulates sUAS.
  • Part 137 regulates Agricultural Aircraft Operations.

RELEVANT PARTS (Chapters):

Part 91

This portion of the FARs is barely recognized, although certain sections of Part 91 may come into play in the event of an action by the FAA against an sUAS pilot. For example, the most concerning portion of Part 91 is §91.13, “Careless or Reckless Operation.” Nearly every action taken against sUAS pilots in the past (prior to Part 107) has included a charge under §91.13.

Specific to drone enforcement actions, the vast majority of individuals charged have also faced a specific §91.13 charge.

sUAS pilots, whether recreational or commercial, may be charged under §91.13 or the more relevant §107.23 (hazardous operation).

It’s pretty simple: if there are consequences to a pilot’s choices and actions, it’s likely those consequences also involved a disregard for safety or planning; ergo, careless/reckless. The FAA recently initiated an action against Masih Mozayan for flying his aircraft near a helicopter and taking no avoidance action. It has also taken action against Vyacheslav Tantashov for actions that resulted in damage to a military helicopter (without seeing the actual enforcement action, it’s a reasonable assumption that the charge will be a §91.13 or a §107.23 hazardous operation).

Other sections of Part 91 are relevant as well. For example:

  • §91.1   Applicability.

(a) Except as provided in paragraphs (b), (c), (e), and (f) of this section and §§91.701 and 91.703, this part prescribes rules governing the operation of aircraft within the United States, including the waters within 3 nautical miles of the U.S. coast.

The above paragraph includes sUAS.  Additionally, Part 107 does not exclude Part 91. Airmen (including sUAS pilots) should be aware of the freedoms and restrictions granted in Part 91.

§91.3   Responsibility and authority of the pilot in command.

(a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.

(b) In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency.

(c) Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.

§91.7   Civil aircraft airworthiness.

(a) No person may operate a civil aircraft unless it is in an airworthy condition.

(b) The pilot in command of a civil aircraft is responsible for determining whether that aircraft is in condition for safe flight. The pilot in command shall discontinue the flight when unairworthy mechanical, electrical, or structural conditions occur.

§91.15   Dropping objects.

No pilot in command of a civil aircraft may allow any object to be dropped from that aircraft in flight that creates a hazard to persons or property. However, this section does not prohibit the dropping of any object if reasonable precautions are taken to avoid injury or damage to persons or property.

§91.17   Alcohol or drugs.

(a) No person may act or attempt to act as a crewmember of a civil aircraft—

(1) Within 8 hours after the consumption of any alcoholic beverage;

(2) While under the influence of alcohol;

(3) While using any drug that affects the person’s faculties in any way contrary to safety; or

Sound familiar?

Subpart B also carries relevant information/regulation with regard to operations in controlled airspace, operations in areas under a TFR (§91.133), operations in disaster/hazard areas, flights during national events, and lighting (§91.209).

PART 101

Part 101 has a few applicable sections.

Subpart (a) under §101.1 restricts model aircraft and tethered aircraft (balloons). Although subparagraph (a)(4)(iii) is applicable to balloon tethers, there is argument that it also applies to sUAS. Subparagraph (a)(5)(iii) defines recreational flight for sUAS/model aircraft.

Finally, §101.7 re-emphasizes §91.15 with regard to dropping objects (which may not be performed without taking precautions to prevent injury or damage to persons or property). Public Law 112-95 Section 336 (which may be folded into a “107 lite” version) clarifies sections not added to Part 101.

Bear in mind that unless the pilot follows the rules and guidelines of an NCBO such as the AMA, AND the requirements of that NCBO are met, the flight requirements default to Part 107 requirements.

PART 103

Part 103 regulates Ultralight Vehicles (single-occupant manned aircraft, powered or unpowered).

Although no component of Part 103 specifically regulates UAVs, it’s a good read, as Part 103 contains components of regulation found in Part 107.

PART 105

Part 105 regulates Skydiving.

While Part 105 carries no regulation specific to sUAS, an understanding of Part 105 provides great insight into components of Part 107. Part 107 has very few “new” components; most of its components are clipped from other FAR sections.

PART 107

Although many sUAS pilots “have their 107,” very few have actually absorbed the FAR beyond a rapid read-through. Without a thorough understanding of the FAR, it’s difficult to comprehend the foundation of many rules.

PART 137

Part 137 applies specifically to spraying crops via aerial vehicles.

Those looking into crop spraying via sUAS should be familiar with Part 137, particularly with the limitations on who can fly, where they can fly, and how crops may be sprayed.
One area every ag drone pilot should look at is §137.35 and §137.55, regarding limitations and business licenses.

The bottom line is that the more informed a pilot is, the better pilot they can be. While there are many online experts purporting deep knowledge of aviation regulations and how they specifically apply to sUAS, very few are familiar with the specific regulations, and even fewer are informed as to how those regulations are interpreted and enforced by ASIs (Aviation Safety Inspectors). We’ve even had Part 61 pilots insist that the FSDO (Flight Standards District Office) is a “who” and not a “what/where.” Even fewer are aware of what an ASI is and how they relate to the world of sUAS.

FSIMS Volume 16

It is reasonably safe to say that most sUAS pilots are entirely unaware of the Flight Standards Information Management System, aka “FSIMS.” I’ve yet to run across a 107 pilot familiar with the FSIMS, and I was recently, vehemently informed that “there is nothing beyond FAR Part 107 relative to sUAS.” Au contraire…

Familiarity with the FSIMS may enlighten sUAS operator/pilots in how the FAA examines, investigates, and enforces relevant FARs.

Chapter 1 Sections 1, 2  and 4 are a brief, but important read, as is Chapter 2, Section 2.

Chapter 3 Section 1 is informational for those looking to apply for their Part 107 Remote Pilot Certificate.

Chapter 4 Sections 2, 5, 7, 8 are of particular value for commercial pilots operating under Part 107.

Volume 17, although related primarily to manned aviation, also has components related to Part 107, and Chapters 3 and 4 should be read by 107 pilots who want to be informed.

Gaining new information is always beneficial, and even better if the new information is implemented in your workflow and program. Become informed, be the best pilot you can be, and encourage others to recognize the value in being a true professional, informed and aware.