The Year of LiDAR

2022 is clearly the year of LiDAR.

At UAS shows across the USA, Mexico, Canada, and the EU, LiDAR has been the hot topic of 2022, and 2023 is shaping up to be more of the same, with significant growth.

LiDAR (“LIght Detection And Ranging”) is a sensor that uses a laser, a position-controlled mirror, an IMU (Inertial Measurement Unit), and internal processing to record geolocation data.

A LiDAR sensor emits a pulse of light toward the target (the ground). The light reflects off the surface (a point) and returns to the sensor, where the receiver detects the returning signal and calculates the distance the light has traveled. Combining that distance with the position of the sensor, the mirror angle, the IMU attitude, and the direction in which the pulse was sent, the 3D position of the reflecting point can be determined. With millions of reflections striking the terrestrial surface and returning to the LiDAR sensor, these contact “points” are used to generate a 3D model or ortho, re-creating the target area in a digital environment.
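To make the ranging step concrete, here is a minimal sketch of the math described above. The function names are mine, and real sensors perform this internally, per pulse, at enormous rates:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """One-way distance to the reflecting surface: the pulse travels
    out and back, so range is half the round-trip distance."""
    return C * round_trip_s / 2.0

def point_from_range(sensor_xyz, azimuth_deg, elevation_deg, rng_m):
    """Turn a range plus beam direction (derived from the mirror angle
    and IMU attitude) into a 3D point relative to the sensor position."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = sensor_xyz[0] + rng_m * math.cos(el) * math.sin(az)
    y = sensor_xyz[1] + rng_m * math.cos(el) * math.cos(az)
    z = sensor_xyz[2] + rng_m * math.sin(el)
    return (x, y, z)

# A return detected ~666 ns after emission is a surface ~100 m away.
print(range_from_time_of_flight(666e-9))  # ~99.8 m
```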

Because LiDAR sensors generate their own signal pulse, they do not depend on illumination from other sources (for example, the sun), and many LiDAR operator/pilots capture data at night. As long as nothing interferes between sensor and surface, it is possible to collect data below cloud cover or in the dark. LiDAR can therefore offer extremely flexible access to areas requiring scans, whether flying at night or when cloud cover degrades lighting to the point that photogrammetry is not possible.

LiDAR sensors were previously relegated to fixed-wing or rotary aircraft due to weight and cost; they are now accessible to any mid- or heavy-lift UAS.

The above image is the author’s first experience with LiDAR: a Velodyne VLP16 with Geodetics IMU, mounted to a Yuneec H920 hexacopter.

With ever-increasing flight efficiency coupled with the reduced weight and cost of LiDAR sensors, several aircraft and LiDAR systems are now available at price points to suit virtually any budget. While LiDAR may not yet be for casual pilots, commercial pilots report near-immediate full ROI, due in part to the current scarcity of complete systems.

Sensors may be purchased as a complete solution with aircraft, support software, and payload, or owners of medium-lift systems may purchase LiDAR sensors separately to mount on whatever aircraft they are familiar and comfortable with. For example, there are many LiDAR payloads available for the DJI Matrice 300, Inspired Flight, Freefly, Yuneec, Maptek, Microdrones, and other platforms.

LiDAR packages may be stand-alone, combined with separate RGB cameras for photogrammetry, or assembled with both in one housing. For example, the highly popular GeoCue 515 package not only offers a Hesai XT32 LiDAR sensor, it also includes two 20MP RGB cameras for colorizing the pointcloud or for photogrammetry deliverables. Additionally, the system is designed to properly and precisely scale RGB data onto the 3D pointcloud, providing not only a very accurate and precise model, but colorized, photo-realistic data for engineers, surveyors, construction teams, graphic designers, game designers, and more.
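Scaling RGB data onto a pointcloud is, at its core, a camera projection: each LiDAR point is projected into a georeferenced photo and picks up the color of the pixel it lands on. This is not GeoCue’s actual pipeline, just a minimal NumPy sketch of the standard pinhole-projection idea, assuming a known camera rotation R, translation t, and intrinsic matrix K:

```python
import numpy as np

def colorize_points(points_world, R, t, K, image):
    """Assign RGB values to LiDAR points by projecting them into a
    georeferenced photo (simple pinhole model, no lens distortion)."""
    n = len(points_world)
    colors = np.zeros((n, 3), dtype=np.uint8)
    # World -> camera coordinates
    pts_cam = (R @ points_world.T + t.reshape(3, 1)).T
    front = pts_cam[:, 2] > 0.0            # keep points ahead of the camera
    # Camera -> pixel coordinates (perspective divide)
    uvw = (K @ pts_cam[front].T).T
    uv = uvw[:, :2] / uvw[:, 2:3]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    h, w = image.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.flatnonzero(front)[inside]    # map back to original point indices
    colors[idx] = image[v[inside], u[inside]]
    return colors
```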

Pilots, engineers, program managers, and surveyors will want to consider several factors when choosing a LiDAR payload to purchase or rent.

  • Cost
  • Penetration
  • Resolution
  • Software cost/flexibility
  • Difficulty of operation

Different sensors yield different results. Below are examples from the DJI L1, the Velodyne VLP16 (Microdrones HR), the Hesai Pandar XT32, and the Riegl VUX1 sensors. Profiles/cross sections captured in LP360 illustrate the surface data from the various sensors and are a reliable way of displaying vegetation penetration.
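Profile views like the ones below are essentially thin-strip filters through the point cloud. LP360 does this interactively; purely to illustrate the idea, here is a hypothetical NumPy version that keeps only the points near a vertical section plane and returns them as distance-along-line versus elevation:

```python
import numpy as np

def profile_slice(points, p0, p1, width=0.25):
    """Extract a cross-section: keep points within `width` metres of the
    vertical plane through p0 -> p1 (2D x,y endpoints), returning
    (distance along line, elevation) pairs for plotting."""
    points = np.asarray(points, dtype=float)          # N x 3 array of x, y, z
    d = np.asarray(p1, float) - np.asarray(p0, float)
    d /= np.linalg.norm(d)                            # unit direction of section
    rel = points[:, :2] - np.asarray(p0, float)
    along = rel @ d                                   # distance along the line
    across = np.abs(rel[:, 0] * d[1] - rel[:, 1] * d[0])  # perpendicular offset
    keep = across <= width / 2.0
    return along[keep], points[keep, 2]
```

Plotting the result (e.g., with matplotlib) yields exactly the kind of cross section shown in the comparisons that follow.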

DJI L1

Pictured above, the DJI L1 is incapable of effective penetration through vegetation or other porous cover. Additionally, strip alignment may be challenging in some scenarios. This data was captured, initially processed in DJI Terra, and finish-processed in GeoCue LP360.

Microdrones MD1000HR (VLP16)

The profile seen here, demonstrates the penetration capabilities of the Microdrones HR VLP16 payload. Note the greater resolution of data below trees, both broadleaf and palm.

GeoCue 515

In this image, there are no gaps beneath the trees. In the center, a uniform depression is visible: the Hesai Pandar XT32 was able to “see” below the shallow water surface, in this case approximately 12” of water depth, and the creek bottom is solid (visible). While the below-water data is not viable for measurement, it does provide additional data for engineering considerations.

RIEGL VUX1

These two illustrations are sourced from the Riegl VUX1 sensor. This sensor provides the highest resolution of the four compared here, with a much higher price tag to match the image quality. Note that in the zoomed-in profile, train rails/tracks are not only visible, but accurately measurable. There are no holes in the surface beneath any of the trees, and the tree detail is sufficient to classify tree types.
 

“Penetrating vegetation is a key function of LiDAR sensors; this is why tree profiles/slices have been used to illustrate these challenging scenarios.”

WHAT ABOUT SOLID STATE LiDAR SYSTEMS?

It is worth noting that solid-state LiDAR systems are on the rise and very much in development for longer range with high density. The technology has not yet matured to the point where solid-state LiDAR is broadly applicable for UAS work, though it has proved promising due to lighter weight, lower power consumption, and speed. Development is heavily focused on autonomous vehicles at present, yet we fully anticipate seeing solid-state LiDAR available for aerial applications soon.

HOW IS LiDAR DIFFERENT FROM PHOTOGRAMMETRY?

Photogrammetry uses multiple images with embedded geodata, matching pixels and metadata to create an orthomosaic. Pointclouds can be derived from images with slightly less accuracy, but at a significant time cost: a 50-acre field processed as a photo-derived pointcloud may take up to 12 hours on an average computer, while the same computer will process a LiDAR-sourced pointcloud in under 30 minutes. LiDAR is also significantly faster to fly than a photogrammetry mission, as the need for deep overlap is lessened in a LiDAR workflow.

Additionally, LiDAR may be flown at night (unless colorization is needed) while photogrammetry requires daylight hours. 

On the other hand, photogrammetry missions may be flown while there is water on the ground after a flood or heavy precipitation; LiDAR works best in dry, non-reflective environments. Mirrored windows, water reflecting on leaves, ponds, creeks, etc. will display as blacked-out areas in a LiDAR scan.

In this scan of the Colorado River, areas containing water display as black.

Not all software applications are compatible with all LiDAR sensors. The way trajectories are read/displayed, how data is managed, and even basic features differ greatly among the software tools available today. For example, until recently, data from DJI’s L1 LiDAR system could only be initially processed in DJI Terra, which is quite limited and which many feel is “kludgy and slow.” It is also not a platform known for stability.

Recently, GeoCue added the DJI L1 to its compatibility list, enabling DJI users to process L1 data in LP360 with great stability, flexibility, and speed.

SOFTWARE

When choosing a LiDAR system, there are many considerations, the greatest of which is how important high resolution and precision at ground level will be to projects/workflows. Budget frequently makes this determination. However, bottom line and long-term needs are often at odds; it’s wise to spend “up” to a higher-grade LiDAR sensor when customer satisfaction is at the top of the list. Research often requires higher-grade sensors as well.

When choosing a LiDAR system, consider the aircraft carrying the payload, the software required to process the data, and flight times as well. Two hours flying a narrow-beam sensor vs. 30 minutes with a wider throw may make all the difference, particularly when the company has a deep backlog and is focused on efficiency.

Whether an organization is ready for LiDAR now or down the road, there has never been a better time to learn more about LiDAR, pointclouds, and how LiDAR data processing differs from photogrammetry workflows.

Special thanks to Brady Reisch of KukerRanken for the profile slices of data.

A Deep Insider’s Look at a Rugged Terrain Mission to Investigate a Helicopter Crash with Drones

Crash site investigation with drones has emerged as a leading application for unmanned systems in public safety.  Gathering data that can be used by investigators in a courtroom, however, requires careful mission planning.  Here, sUAS expert and industry figure Douglas Spotted Eagle of  KukerRanken provides a detailed insider’s view of a helicopter crash site investigation.

Unmanned aircraft have become proven assets during investigations, offering more than just the ability to reconstruct a scene. When a fine ground sampling distance (GSD) is used, the data may be deeply examined, allowing investigators to find evidence that may have gone unseen, for various reasons, during a site walk-through.

Recently, David Martel, Brady Reisch, and I were called upon to assist in multiple investigations where debris was scattered over a large area and investigators could not safely traverse the rocky, uneven terrain across which high-speed impacts may have spread evidence. In this particular case, a EuroStar 350 aircraft may have experienced a cable wrap around the tail rotor and boom, potentially pulling the tail boom toward the nose of the aircraft and causing a high-speed rotation of the hull prior to impact. Debris was spread over a relatively contained area, with some evidence still unlocated.


Per the FAA investigators:

“The helicopter was on its right side in mountainous densely forested desert terrain at an elevation of 6,741 ft mean sea level (MSL). The steel long line cable impacted the main rotor blades and was also entangled in the separated tail rotor. The tail rotor with one blade attached was 21 ft. from the main wreckage. Approximately 30 ft. of long line and one tail rotor blade were not located. The vertical stabilizer was 365 ft. from the main wreckage.”

With a missing tail rotor blade and the missing long line, unmanned aircraft were called in to provide a high resolution map of the rugged area/terrain, in hopes of locating the missing parts that may or may not aid in the crash investigation.

The terrain was difficult and unimproved, requiring four-wheel drive vehicles for access into the crash site. Due to rising terrain, we elected to launch/land the aircraft from the highest point relevant to the crash search area, which encompassed a total of approximately 70 acres.

Adding to the difficulty of finding missing parts was that the helicopter was partially covered in grey vinyl wrap, along with red and black vinyl wrap, having recently been wrapped for a trade show where the helicopter was displayed.


We arrived on scene armed with pre-loaded Google Earth overheads and an idea of optimal locations to place seven Hoodman GCP discs, which would allow us to capture RTK points for accuracy and Manual Tie Points once the images were loaded into Pix4D. We pre-planned the flight for an extremely fine ground sampling distance (GSD) averaging 0.4 cm per pixel. Due to the mountainous terrain, this GSD would vary from the top to the bottom of the site. We planned to capture the impact location at various GSDs for best image evaluation, averaging as tight as 0.2 cm per pixel. Some of these images would be discarded for the final output and used only for purposes of investigation.
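GSD follows directly from camera geometry and height above ground. As a rough planning sketch, with illustrative 1-inch, 20 MP class sensor numbers rather than the exact camera specs used on this mission:

```python
def gsd_cm_per_px(height_agl_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground footprint of one pixel for a nadir photo, in centimetres:
    GSD = (sensor width x height AGL) / (focal length x image width)."""
    return (sensor_width_mm * height_agl_m * 100.0) / (focal_mm * image_width_px)

# Example: a 13.2 mm wide sensor, 5472 px across, behind an 8.8 mm lens.
# Hitting ~0.4 cm/px requires flying at roughly 15 m AGL.
print(gsd_cm_per_px(15, 8.8, 13.2, 5472))  # ~0.41 cm/px
```

This is also why GSD varies across rising terrain: height above ground changes even when the flight altitude does not.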

Although the overall GSD was finer than necessary, the goal was to be able to zoom in very deep on heavily covered areas, with the ability to distinguish rocks from potential evidence, enabling investigators to view the overall scene via a 3.5 GB GeoTIFF in Google Earth and refer back to the Pix4Dmapper project once rendered/assembled.

The same scene minus initial marker points.

Although working directly in Pix4D provides the best in-depth view of each individual photo, the Google Earth overlay/GeoTIFF enables a reasonably deep examination.

Using two of the recently released Autel EVO II Pro aircraft, we planned the missions so that one aircraft would manage North/South corridors while the other captured East/West corridors. Planning the mission in this manner allows for half the work time while capturing the entire scene. This is the same method we used to capture the MGM festival grounds following the One October shooting in Las Vegas, Nevada. The primary difference is overall size: the Pioche mission covered nearly 70 acres, while the Las Vegas festival ground shooting area is under 20 acres in total.

Similar to the Las Vegas shooting scene, shadow distortion/scene corruption was a concern; flying two aircraft beginning at 11:00 a.m. and finishing by 1:30 p.m. helped avoid issues with shadow.

Temporal and spatial offsets were employed to ensure the EVO II Pro aircraft could not possibly collide: we set off at opposite sides of the area, at different points in time, with a few feet of vertical offset added for an additional cushion of air between the aircraft. We programmed the missions to fly at a lower speed of 11 mph (16 feet per second) to ensure that the fine-GSD, low-altitude images would be crisp and clean. It is possible to fly faster and complete the mission sooner, yet with the 3-hour travel time from Las Vegas to the crash site, we wanted to ensure everything was captured at its best possible resolution with no blur, streaking, or otherwise challenged imagery. Overall, each aircraft emptied five batteries, with our batteries set to exchange notification at 30%.
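The speed choice is easy to sanity-check: motion blur during an exposure is simply ground speed multiplied by shutter time, and it should stay at or below the GSD. A quick sketch (the 1/1000 s shutter is an assumed value for illustration, not the mission’s actual camera setting):

```python
def motion_blur_cm(speed_mph: float, shutter_s: float) -> float:
    """Ground distance the camera travels during one exposure, in cm."""
    speed_ms = speed_mph * 0.44704   # mph -> m/s
    return speed_ms * shutter_s * 100.0

# At 11 mph (~16 ft/s) and a 1/1000 s shutter, the aircraft moves ~0.5 cm
# during the exposure -- about one pixel at the ~0.4 cm GSD flown here.
print(motion_blur_cm(11, 1 / 1000))  # ~0.49 cm
```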

Total mission running time was slightly over 2.5 hours per aircraft, with additional manual flight over the scene of impact requiring another 45 minutes of flight time to capture deep detail. We also captured imagery facing the telecommunications tower at the top of the mountain for line of sight reference, and images facing the last known landing area, again for visual reference to potential lines of sight.


By launching/landing from the highest point in the area to be mapped, we were able to avoid any signal loss across the heavily wooded area. To ensure VLOS was maintained at all times, FoxFury D3060s were mounted in strobing mode for both sets of missions (the FoxFury lighting kit is included with the Autel EVO II Pro and EVO II Dual Rugged Bundle kits).

Once an initial flight to check exposure/camera settings was performed, along with standard controllability checks and other pre-flight tasks, we sent the aircraft on their way.

Capturing over 6,000 images, we checked image quality periodically to ensure consistency. Once the missions were complete, we drove to the site of impact to capture obliques of the specific area in order to create a denser model/map of the actual impact site. We also manually flew a ravine running parallel to the point of impact to determine if any additional debris was present (we did find several small pieces of fuselage, tools assumed to have been cast off at impact, and other debris).

The initial pointcloud took approximately 12 hours to render, generating a high-quality, highly dense initial cloud.


After laying in point controls, marking scale constraints as a check, and re-optimizing the project in Pix4D, the second step was rendered to create the dense point cloud. We were stunned at the quality of the dense point cloud, given the large area.

The dense point cloud is ideal for purposes of measuring. Although this sort of site would typically benefit (visually) from texturing/placing the mesh, it was not necessary due to the high number of points and the deep detail the combination of Pix4D and the Autel EVO II Pro provided. This allowed us to select specific points where we believed evidence might be located, bringing up the high-resolution images relevant to that area. Investigators were able to deep-dive into the area and locate small parts, none of which proved relevant to better understanding the cause of the crash.

“The project generated 38,426,205 2D points and 13,712,897 3D points from a combination of nearly 7,000 images.”


Using this method of reviewing the site allows investigators to see more deeply, with the ability to repeatedly examine areas, identify patterns from an overhead view, and safely search for additional evidence that may not be accessible by vehicle or on foot. Literally every inch of the site may be gone over.


Further, using a variety of computer-aided search tools, investigators may plug in an application to search for specific color parameters. For example, much of the fuselage was red, allowing investigators to search a specific range of red colors. Pieces of fuselage as small as 1” were discovered using this method. Bright white allowed for finding some items, while 0-16 level black allowed for finding other small objects such as stickers, a toolbox, and oil cans.
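Tools for this kind of color-parameter search vary; purely as an illustration of the underlying idea, here is a hypothetical OpenCV sketch that flags red regions. Red wraps around the HSV hue axis, hence the two bands, and the thresholds would need tuning against known fuselage samples:

```python
import cv2

def find_red_debris(image_path: str, min_area_px: int = 20):
    """Return bounding boxes of regions falling in a red hue range.
    Thresholds are illustrative only; tune against known samples."""
    bgr = cv2.imread(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red occupies both ends of the hue axis, so combine two bands
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only candidate debris larger than the minimum pixel area
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area_px]
```

The same pattern, with different thresholds, covers the bright-white and near-black searches mentioned above.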

Using a tool such as the DTResearch 301 to capture RTK geolocation information, we also used the DTResearch ruggedized tablet for a localized pointcloud scan which may be tied into the Pix4Dmapper project. Capturing local scan data from a terrestrial perspective with GCPs in the image allows for extremely deep detail in small environments. This is particularly valuable for construction sites or interior scans, along with uses for OIS, etc.

Primary Considerations When Capturing a Scene Twin

  • GSD. This is critical. There is a balance between altitude and propwash, with all necessary safety considerations.
  • Vertical surfaces. In the event of an OIS where walls have been impacted, the ability to fly vertical surfaces and capture them with a consistent GSD will go a long way toward creating a proper model.
  • Shadow distortion. If the scene is very large, time will naturally fly by and so will the sun. In some conditions, it’s difficult to know the difference between burn marks and shadows. A bit of experience and experimentation will help manage this challenge.
  • Exposure. Checking exposure prior to the mission is very important, particularly if an application like Pix4Dreact isn’t available for rapid mapping to check the data on-site.
  • Angle of sun/time of day. Of course, accidents, incidents, crime, and other scenes happen when they happen. However, if the scene allows for capture in the midday hours, grab the opportunity and be grateful. This is specifically the reason that our team developed night-time CSI/data capture, now copied by several training organizations across the country over recent years.
  • Overcapture. Too much overlap is significantly preferable to undercapture. Ortho and modeling software love images.
  • Obliques. Capture obliques whenever possible. Regardless of intended use, capture the angular views of a scene. When possible, combine with ground-level terrestrial imaging. Sometimes this may be best accomplished by walking the scene perimeter with the UA, capturing as the aircraft is walked. We recommend removing props in these situations to ensure everyone’s safety.

What happens when these points are put aside?

This is a capture of a scene brought to us for “repair,” as the pilot didn’t know what he didn’t know. Although we were able to pull a bit of a scene, the overexposure, too-high altitude (coarse GSD), and lack of obliques made this scene significantly less valuable than it might have been.

Not understanding the proper role or application of the UA in the capture process, the UA pilot created a scene that is difficult to accurately measure, lacks appropriate detail, and is overexposed in a way that creates difficulties laying in the mesh. While this scene is somewhat preserved as a twin, much detail is missing even though the equipment had the necessary specifications and components to capture a terrific twin. Pilot error cannot be fixed. Operating on the “FORD” principle, understanding that FOcus, exposuRe, and Distance (GSD) cannot be rectified or compensated for in post-processing means the scene has to be captured properly the first time. The above scene can’t be properly brought to life due to gross pilot error.

“ALWAYS PUT THE AIRCRAFT OVER THE PRIMARY SCENE LOCATION TO CONFIRM EXPOSURE SETTINGS, KEEPING ISO AS LOW AS POSSIBLE. USE ISO 50-100 IN MOST OUTDOOR SCENARIOS TO OBTAIN THE BEST IMAGE. NEVER USE OVERSATURATED PHOTO SETTINGS OR LOG FORMATS FOR MAPPING.”

Ultimately, the primary responsibility is to go beyond a digital twin of the scene and instead offer deep value to the investigator(s), which may enhance or accelerate their investigations. Regardless of whether it’s a crash scene, insurance capture, energy audit, or other mapping activity, understanding how to set up, fly, process, and export the mission is paramount.

Capturing these sorts of scenes is not for the average run-n’-gun 107 certificate holder. Although newer pilots may feel they are all things to all endeavors benefitting from UA, planning, strategy, and experience all play a role in ensuring qualified, quality captures. Pilots wanting to get into mapping should practice with photogrammetry tools and fly the most challenging environments they can find in order to be prepared for the environmental, temporal, and spatial challenges that may accompany an accident scene. Experience comes through discovery: when it’s cold and batteries expire faster, when satellites are challenged in an RTK or PPK environment, when tablets/devices overheat, when long flight times must be managed on multi-battery missions, or when winds force a crabbing mission vs. a head/tailwind mission. Learning to maintain GSD in wild terrain, or conducting operations amidst outside forces that influence the success or failure of a mission, only comes through practice over time. Having a solid, tried-and-true risk mitigation/SMS program is crucial to success.

We were pleased to close out this highly successful mission and deliver a 3.5 GB GeoTIFF for overlay on Google Earth, while also exporting the project for investigators to view at actual ground height, saving time, providing a safety net in rugged terrain, and creating a digital record/twin of the crash scene that may be used until the accident investigation is closed.

EQUIPMENT USED

●  2X Autel EVOII™ Pro aircraft

●  Autel Mission Planner software

●  FoxFury D3060 lighting

●  DTResearch 301 RTK tablet

●  Seko field mast/legs

●  Seko RTK antenna

●  Hoodman GCP

●  Hoodman Hoods

●  Manfrotto Tripod

●  Dot3D Windows 10 software

●  Pix4DMapper software

●  Luminar 4 software

Part 91, 101, 103, 105, 107, 137: WHAT’S THE DIFFERENCE?

All these FARs, what’s a drone pilot to do in order to understand them? Do they matter?

YES!

In virtually every aviation pursuit except for sUAS, an understanding of regulations is requisite and part of most testing mechanisms. As a result, many sUAS pilots holding a Remote Pilot Certificate under Part 107 are woefully uninformed, to the detriment of the industry.

Therefore, sUAS pilots would be well-served to inform themselves of how each section of relevant FARs regulate components of aviation.

Let’s start by digging into the intent of each Part.

  • Part 91 regulates General Operating and Flight Rules.
  • Part 101 regulates Moored Balloons, Kites, Amateur Rockets, Unmanned Free Balloons, and some types of Model Aircraft.
  • Public Law 112-95, Section 336 regulates hobby drones as an addendum to Part 101.
  • Part 103 regulates Ultralight Vehicles (lightweight, single-occupant manned aircraft).
  • Part 105 regulates Skydiving.
  • Part 107 regulates sUAS.
  • Part 137 regulates agricultural aircraft.

RELEVANT PARTS (Chapters):

Part §91

This portion of the FARs is barely recognized, although certain sections of Part 91 may come into play in the event of an action by the FAA against an sUAS pilot. The most concerning portion of Part 91 is §91.13, “Careless or Reckless Operation.” Prior to Part 107, nearly every action taken against sUAS pilots included a §91.13 charge, and specific to drone actions, the vast majority of individuals charged have faced one since.

sUAS pilots, whether recreational or commercial, may be charged under §91.13 or the more relevant §107.23 (hazardous operation).

It’s pretty simple: if there are consequences to a pilot’s choices and actions, it’s likely those consequences also included a disregard for safety or planning; ergo, careless/reckless. The FAA recently initiated actions against Masih Mozayan for flying his aircraft near a helicopter and taking no avoidance action. It has also taken action against Vyacheslav Tantashov for actions that resulted in damage to a military helicopter (without seeing the actual action, it’s a reasonable assumption that the charge will be a §91.13 or a §107.23 hazardous operation).

Other sections of Part 91 are relevant as well. For example:

  • §91.1   Applicability.

(a) Except as provided in paragraphs (b), (c), (e), and (f) of this section and §§91.701 and 91.703, this part prescribes rules governing the operation of aircraft within the United States, including the waters within 3 nautical miles of the U.S. coast.

The above paragraph includes sUAS.  Additionally, Part 107 does not exclude Part 91. Airmen (including sUAS pilots) should be aware of the freedoms and restrictions granted in Part 91.

§91.3   Responsibility and authority of the pilot in command.

(a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.

(b) In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency.

(c) Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.

§91.7   Civil aircraft airworthiness.

(a) No person may operate a civil aircraft unless it is in an airworthy condition.

(b) The pilot in command of a civil aircraft is responsible for determining whether that aircraft is in condition for safe flight. The pilot in command shall discontinue the flight when unairworthy mechanical, electrical, or structural conditions occur.

§91.15   Dropping objects.

No pilot in command of a civil aircraft may allow any object to be dropped from that aircraft in flight that creates a hazard to persons or property. However, this section does not prohibit the dropping of any object if reasonable precautions are taken to avoid injury or damage to persons or property.

§91.17   Alcohol or drugs.

(a) No person may act or attempt to act as a crewmember of a civil aircraft—

(1) Within 8 hours after the consumption of any alcoholic beverage;

(2) While under the influence of alcohol;

(3) While using any drug that affects the person’s faculties in any way contrary to safety; or

Sound familiar?

Subpart B also carries relevant information/regulation with regard to operation in controlled airspace, operations in areas under TFRs (§91.133), operations in disaster/hazard areas, flights during national events, and lighting (§91.209).

PART 101

Part §101 has a few applicable sections.

Subpart (a) under §101.1 restricts model aircraft and tethered aircraft (balloons). Although §101.1(a)(4)(iv) is applicable to balloon tethers, there is an argument that it also applies to sUAS. §101.1(a)(5)(iii) defines recreational flight for sUAS/model aircraft.

Finally, §101.7 re-emphasizes §91.15 with regard to dropping objects (which may not be performed without taking precautions to prevent injury or damage to persons or property). Public Law 112-95, Section 336 (which may be folded into a “107 lite” version) clarifies sections not added to Part 101.

Bear in mind that unless the pilot follows the rules and guidelines of a NCBO such as the AMA, AND the requirements of that NCBO are met, the flight requirements default to Part 107 requirements.

PART §103

Part §103 regulates Ultralight Vehicles (lightweight, single-occupant manned aircraft).

Although no component of Part §103 specifically regulates UAVs, it’s a good read, as Part 103 contains components of regulation found in Part 107.

PART §105

Part §105 regulates Skydiving.

Although Part §105 carries no regulation specific to sUAS, an understanding of Part 105 provides great insight into components of Part 107. Part 107 has very few “new” components; most are clipped from other FAR sections.

PART §107

Although many sUAS pilots “have their 107,” very few have actually absorbed the FAR beyond a rapid read-through. Without a thorough understanding of the FAR, it’s difficult to comprehend the foundation of many rules.

PART §137

Part 137 applies specifically to spraying crops via aerial vehicles.

Those looking into crop spraying via sUAS should be familiar with Part 137, particularly with the limitations on who can fly, where they can fly, and how crops may be sprayed.
One area every ag drone pilot should look at is §137.35 and §137.55, regarding limitations and business licenses.

The bottom line is that the more informed pilots are, the better pilots they can be. While there are many online experts purporting deep knowledge of aviation regulations and how they specifically apply to sUAS, very few are familiar with the specific regulations, and fewer still are informed as to how those regulations are interpreted and enforced by ASIs (Aviation Safety Inspectors). We’ve even had Part 61 pilots insist that the FSDO is a “who” and not a “what/where.” Even fewer are aware of what an ASI is and how they relate to the world of sUAS.

FSIMS Volume 16

It is reasonably safe to say that most sUAS pilots are entirely unaware of the Flight Standards Information Management System, aka “FSIMS.” I’ve yet to run across a 107 pilot familiar with the FSIMS, and recently I was vehemently informed that “there is nothing beyond FAR Part 107 relative to sUAS.” Au contraire…

Familiarity with the FSIMS may enlighten sUAS operator/pilots as to how the FAA examines, investigates, and enforces relevant FARs.

Chapter 1, Sections 1, 2, and 4 are a brief but important read, as is Chapter 2, Section 2.

Chapter 3, Section 1 is informational for those looking to apply for their Part 107 Remote Pilot Certificate.

Chapter 4, Sections 2, 5, 7, and 8 are of particular value for commercial pilots operating under Part 107.

Volume 17, although primarily related to manned aviation, also has components related to Part 107; Chapters 3 and 4 should be read by 107 pilots who want to be informed.

Gaining new information is always beneficial, and even better if the new information is implemented in your workflow and program. Become informed, be the best pilot you can be, and encourage others to recognize the value in being a true professional, informed and aware.

 

Six ways drones have proven themselves as a tool for the AEC, Surveying, and mapping industries

Drones, or unmanned aircraft, are becoming much more common on today’s project sites, and many companies in the AEC, surveying, and mapping industries are using them daily. So how do drones capture data? What are professionals getting out of that data? What makes a drone a valuable tool rather than a toy?

UAS technology has advanced to a point where the aircraft, while still very sophisticated, are quite simple to operate. They use altimeters, magnetometers, inertial measurement units, GNSS (GPS), and radio transmitters to control flight operations, but the end user would never know it. These sensors and more are all managed behind the scenes so well that an operator can take off from any point; fly a “mission” involving several data collection tasks; avoid collisions with unexpected obstacles; know when there is just enough battery to return home safely; and land, all in a constantly changing environment, 100% autonomously, starting from a single tap. Flying a drone is fun, but unless you’re collecting data it brings no value. Many sensors can be attached to unmanned aircraft, such as LiDAR and gravimeters, but in this article we are primarily going to address cameras and their use in photogrammetry.

Photogrammetry

When you photograph an object from two different angles and add some trigonometry, three-dimensional measurements can be calculated. The entire process is simple and automated. A 3D model from aerial imagery is nothing new: photogrammetry can be summarized as the art, science, and technology of making precise measurements from photos, and it has been around since the mid-1800s.

The whole process works like this: the distance (f) from a camera lens to its sensor is proportional to the distance (h) from the lens to the objects being photographed. This property is written into several equations that photogrammetrists use to calculate things such as the scale of a photo and even the elevation of specific points or pixels in aerial photographs.
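In symbols, these are standard photogrammetry identities, using f and h as defined above plus a physical pixel pitch p introduced here for the example:

```latex
\text{photo scale} = \frac{f}{h}, \qquad \text{GSD} = \frac{p \cdot h}{f}
```

For instance, an 8.8 mm lens flown with h = 100 m gives a scale of about 1:11,400, and a 2.4 µm pixel pitch then yields a ground sampling distance of roughly 2.7 cm per pixel.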

When two overlapping photographs are in correct orientation relative to each other, a stereopair (stereoscopic imagery) exists. This imagery creates perspective on objects within the overlap of the photographs and is the principle behind all forms of 3D viewing.

Stereoscopic imagery

As mentioned above, drone users can pre-program routes to fly over their intended mapping area. Photos are taken with a specific overlap, which is computed based on altitude, speed, and the resolution of the camera sensor. Drones use onboard sensors like GNSS, or even real-time corrected positioning (RTK), both to georeference the photos taken and to control the flight by changing the RPMs of the individual motors. This data is all carried over in the image files, where it is further processed.
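That overlap computation is straightforward. Here is a hypothetical planning sketch with illustrative numbers, not any particular vendor’s mission planner:

```python
def footprint_m(altitude_m, focal_mm, sensor_dim_mm):
    """Ground footprint of one image dimension at a given height AGL."""
    return altitude_m * sensor_dim_mm / focal_mm

def trigger_interval_s(altitude_m, speed_ms, focal_mm,
                       sensor_height_mm, forward_overlap=0.80):
    """Seconds between shots so consecutive photos overlap by the
    target fraction along the flight line."""
    footprint = footprint_m(altitude_m, focal_mm, sensor_height_mm)
    spacing = footprint * (1.0 - forward_overlap)  # metres between exposures
    return spacing / speed_ms

# Example: 80 m AGL, 5 m/s, 8.8 mm lens, 8.8 mm sensor height, 80% overlap
# -> 80 m footprint, 16 m spacing, one photo every ~3.2 s.
print(trigger_interval_s(80, 5, 8.8, 8.8))  # ~3.2
```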

Today’s photogrammetry software uses these mathematical principles to orient, scale, and combine photographs and data. The software will ultimately generate point clouds, orthorectified (measurable) photos, and 3D models with varying output types.

Project output

Drones and unmanned aircraft in AEC and construction: valuable applications for AEC, surveying, and mapping.

Surveying and Mapping. The use of drones and unmanned vehicles in surveying and mapping is almost self-evident. Surveyors and cartographers have used aerial photography dating back about as far as the invention of the airplane. What may not be immediately apparent is that the cost to purchase a survey-quality UAS and required software is a small investment in comparison to traditional surveying equipment, and the man-hours saved easily pay for it. Point clouds and orthometric photos are great for drafting planimetric features and generating TIN surfaces to represent topography. Whether you’re mapping for design data, a feasibility study, or GIS, or performing an ALTA/ACSM survey, using an unmanned vehicle to capture data may be significantly more efficient than traditional means.

Reality Capture is just that: capturing the reality of the current conditions of a project site. This is a great practice for design, bidding, marketing, and simply helping clients “capture the vision.” It may be as simple as viewing an oblique photograph or as involved as combining a designed structure with a 3D mesh and viewing it in VR. I personally get a kick whenever I see an IFC model inserted in a point cloud.

Building Information Modeling (BIM). It would be hard to mention reality capture without mentioning BIM. While flying a drone indoors is doable, it’s not very practical, so that is not what we are referring to here. Many companies today, especially in the design-build world, are utilizing BIM for much more than building modeling; they are integrating models into all of their civil design as well. These departments are already using laser scanning and are familiar with point clouds, so adding a UAS to their tool chest is a natural move. Drones are great for capturing data that can be used for clash detection, QC, and as-built drawings.

Pre-Construction and Takeoffs are a major part of heavy civil construction. When it comes to moving dirt, knowing exactly what must be done can make all the difference in winning a bid, making a profit, or losing your shorts. This happens when companies are bidding on projects, but the same process occurs often in design-builds and any time an RFI or change order comes up. Capturing data that represents the existing site condition is key when building a model and matching existing roadway and other civil tie-in points. Using a drone is a great way to make this happen.

Project output

Project Management. Unmanned aircraft may be utilized for many processes in project management. Creating progress reports and viewing current conditions may be the most basic use, and might just be the most beneficial when it comes to decision making. Billing on some projects is based solely on materials moved and/or installed, which makes tracking linear feet, area, and volumes the bottom line. Other overlooked uses include creating safety plans and incident reports, public involvement, and training.
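Volume tracking from drone data usually reduces to differencing two gridded surfaces. As a minimal sketch of the idea (production packages refine this with TINs, breaklines, and design surfaces):

```python
import numpy as np

def cut_fill_m3(dsm, base, cell_size_m):
    """Cut/fill volumes between a drone-derived surface (DSM grid) and a
    base surface, both aligned 2D arrays of elevations in metres."""
    diff = np.asarray(dsm, float) - np.asarray(base, float)
    cell_area = cell_size_m ** 2
    fill = diff[diff > 0].sum() * cell_area   # material above the base
    cut = -diff[diff < 0].sum() * cell_area   # material below the base
    return fill, cut
```

Multiplying summed elevation differences by cell area converts them to cubic metres; comparing weekly flights then gives moved-material quantities for billing.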

Inspections. Drones are one of the best tools for inspections. Often an environment is not safe for a person, such as inspecting a high wall in an open-pit mine; or the situation may be less efficient for an individual, such as climbing versus flying to inspect bolts on a suspension bridge. When we add infrared/thermal sensors to unmanned aircraft, they are capable of much more. Infrared light is absorbed by water, making it possible to discover moisture that may be invisible to the naked eye, which is great for leak detection among other things. Thermal imaging makes it possible to view and analyze heat signatures, often used to find areas of heat loss in a wide range of applications.

One of the biggest challenges companies in the AEC, surveying, and mapping industries face today is a shortage of manpower, and the only way to overcome it is to innovate. Many choosing to innovate are looking to drones to solve their problems. Two trends I’ve noticed in helping companies develop their UAS applications: they may start with a particular expectation in mind and a single drone, but they always end up using their UAS data more than they anticipated, and they always want to expand their drone fleet. I believe UAS technology is one of the best investments a company in these industries can make. It is very apparent to me that unmanned aircraft are a major focus of developing technology. They are a powerful tool, not a toy.

By Bryan Worthen, Kuker-Ranken SLC

Examples:
https://cloud.pix4d.com/dataset/812780/map?shareToken=30b94ff7-79a2-46e9-822e-0a97dbd26408

https://cloud.pix4d.com/dataset/788626/map?shareToken=38540ee0-e5a4-47e4-ab1a-6fb57ac48142
https://cloud.pix4d.com/dataset/665273/model?shareToken=612c5c7f-e47c-4721-8c2f-53ba80a6e544

Kuker-Ranken has been in business for nearly 100 years; customer service is our top priority, whether for precision instruments, unmanned aircraft/drones, or construction support supplies.
Call us today for pricing on drones, training, and service! (800) 454-1310