
Making Sense of Drone-based Sensors

When drones pop up in casual conversation, the focus is usually on the “sexy” factor. How high and fast can your drone fly? How maneuverable? How much payload can it handle? How POWERFUL is it (insert Tim “The Tool Man” Taylor pig grunts here)?

But the real secret sauce in the drone tech stew – the unsexy side – is found in UAV sensors. Those thing-a-ma-gigs and doo-hickies attached to the bottoms or sides of that sexy drone – the unsung heroes, the workhorses of the drone sector.

A drone without a decent sensor array is often no more than a hobby toy. Commercial drones depend on sensors like LiDAR, thermal and multispectral cameras, plus an array of other “doo-hickies,” to inspect mines and utilities, analyze crop conditions, find missing persons and gather vital scientific data.

A deep dive into the world of aerial sensors can end up in deep waters. Here’s a handy overview.

Passive vs. Active Sensors

Before diving into specific sensors, we need to understand the difference between active and passive sensors. Note: many sensor arrays deploy a combination of both.

Active sensors are – well – active. They emit some type of wave or radiation. Examples include LiDAR, radar or sonar. Using human senses as an analogy, an active sensor resembles your voice. You emit sonic waves when you speak, grunt, burp, laugh or yell (hopefully not at the same time – see a doctor).

Passive sensors detect waves or radiation, receiving information as it occurs in the natural world. Your eyes and ears are passive sensors, receiving light or sonic waves. Examples include infrared/thermal sensors or specialized cameras.

LiDAR

LiDAR (Light Detection and Ranging) technology resembles its sister technology, radar; but instead of emitting and measuring radio waves (the “r” in radar), it produces and measures rapid laser pulses.

The sensor is used to measure the distance from itself to its target area. It does so by measuring the time between the release of a pulse and the moment the reflected pulse is received.
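As a rough back-of-the-envelope illustration (not any vendor's actual firmware), the time-of-flight math behind that calculation looks something like this:

```python
# Minimal sketch of the LiDAR time-of-flight calculation.
# The timing value below is illustrative, not from a real sensor.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def pulse_distance(round_trip_seconds: float) -> float:
    """Distance to the target from one laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A return received about 667 nanoseconds after emission is roughly 100 m away.
print(f"{pulse_distance(667e-9):.1f} m")
```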

When combined with orthomosaic or data-analysis software, LiDAR can create high-resolution digital surface imagery and map terrain and elevation via 3D modeling. The returns can also be collated and analyzed as point-cloud data.

The end game for most LiDAR drone missions is to survey landscapes, sites and terrain to produce a 3D point cloud. The data received can then be crunched by powerful computers to produce the desired output model. The gathering and analysis of such precision data is useful across a number of sectors, including mining, construction, utility inspection and agriculture.
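To make the point-cloud idea concrete, here is a minimal sketch of binning returns into a coarse elevation grid. The points are synthetic and the gridding is deliberately simple; production workflows rely on dedicated point-cloud tools rather than a loop like this.

```python
# Sketch: bin synthetic LiDAR returns (x, y, z) into a coarse elevation grid,
# keeping the highest return per cell (a crude digital surface model).
import numpy as np

rng = np.random.default_rng(0)
points = np.column_stack([
    rng.uniform(0, 100, 10_000),    # x in meters
    rng.uniform(0, 100, 10_000),    # y in meters
    rng.uniform(200, 210, 10_000),  # z (elevation) in meters
])

cell = 10.0  # grid resolution in meters
cols = (points[:, 0] // cell).astype(int)
rows = (points[:, 1] // cell).astype(int)

dsm = np.full((10, 10), np.nan)
for r, c, z in zip(rows, cols, points[:, 2]):
    if np.isnan(dsm[r, c]) or z > dsm[r, c]:
        dsm[r, c] = z

print(dsm.round(1))
```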

LiDAR is one of the more accurate sensor technologies; but with accuracy comes a price – high-end systems can cost between $50,000 and $100,000. As a result, most users find it more cost-effective to hire a drone-services contractor. In addition, many drone companies, like DJI, sell UAVs with LiDAR sensors pre-installed.

As one of the most popular and accurate sensor arrays, LiDAR is big business. A recent report by Mordor Intelligence notes:

“In 2019, the LiDAR drone market was valued at $38.16 million, and is projected to reach a value of $229.55 million by 2025, recording a [compound annual growth rate] of 34.88 percent over the forecast period, 2020-2025. The LiDAR drone market is growing significantly over the forecast period, owing to the increasing use of drones across various end-user verticals, such as aerospace and defense, agriculture and forestry, and natural resources management, mining, and oil and gas exploration, among others.”

Major LiDAR solutions providers include Phoenix Lidar Systems, LidarUSA, Riegl Laser Measurement Systems, DraganFly UAS, Xena OnyxScan Advanced Lidar Systems and Leddartech.

LiDAR rigs require three components in addition to the laser:

  • Scanners and optics determine how quickly images can be scanned and fed into an analysis system, as well as the resolution and range of the image. Scanning methods include dual oscillating plane mirrors, polygonal mirrors, dual-axis scanners and azimuth-and-elevation scanning.
  • The photodetector records and “reads” the returned signals; there are two types – solid-state detectors and photomultipliers.
  • Aerial and other mobile LiDAR systems also require a GPS receiver and Inertial Measurement Unit (IMU) combo to provide accurate geographical and positional readings.
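To illustrate why that GPS/IMU pairing matters, here is a toy georeferencing sketch: a single laser range only becomes a map coordinate once it is combined with where the drone was and how it was oriented. The flat east/north/up frame and yaw-only attitude are simplifying assumptions made purely for this example.

```python
# Sketch: turning one LiDAR range into a ground coordinate using GPS
# position and IMU attitude. Simplified: local east/north/up frame,
# yaw-only rotation. Not a real sensor driver.
import numpy as np

def georeference(gps_enu, yaw_deg, beam_dir_body, range_m):
    """Return the target point in local east/north/up coordinates."""
    direction = np.asarray(beam_dir_body, dtype=float)
    direction /= np.linalg.norm(direction)          # make it a unit vector
    yaw = np.radians(yaw_deg)
    rot = np.array([                                # body frame -> local frame
        [np.cos(yaw), -np.sin(yaw), 0.0],
        [np.sin(yaw),  np.cos(yaw), 0.0],
        [0.0,          0.0,         1.0],
    ])
    return np.asarray(gps_enu) + rot @ (range_m * direction)

drone_position = [500.0, 250.0, 120.0]  # from GPS, meters
drone_yaw = 45.0                        # from the IMU, degrees
beam = [0.2, 0.0, -0.98]                # beam direction in the body frame
print(georeference(drone_position, drone_yaw, beam, 122.0))
```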

To dive deeper into LiDAR tech, check out this free online class offered by NOAA.

Thermal

As the name implies, thermal sensors measure surface temperatures. Every object on earth (except an object at absolute zero) gives off a heat signature. Thermal sensors capture that signature and share the signals with an analyzer to provide vital data across the spectrum.

Thermal insulation inspection of a house with a drone.

Much in the way the brain and the eyes work together to produce a sense of color, thermal sensors – passive sensors – gather and send data to the “hardware brain” of the array, which can express the readings as visible color models for human evaluation.

Most drones couple a special infrared camera with the thermal sensor array. Thermal sensors are widely used in the drone-inspection arena.
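As a toy example of what happens downstream of the sensor, the sketch below converts raw radiometric counts into temperatures and flags hot pixels. The linear scale and threshold are placeholder assumptions; real thermal cameras ship their own calibration data and SDKs.

```python
# Sketch: convert raw radiometric counts to Celsius and flag hot pixels.
# The counts and the linear scale are placeholders, not a real calibration.
import numpy as np

raw = np.random.default_rng(1).integers(27_000, 33_000, size=(120, 160))

# Assumed scale: counts are hundredths of a kelvin.
celsius = raw / 100.0 - 273.15

HOT_THRESHOLD_C = 50.0  # e.g., an overheating electrical component
hot_pixels = np.argwhere(celsius > HOT_THRESHOLD_C)

print(f"max temperature: {celsius.max():.1f} C")
print(f"pixels above {HOT_THRESHOLD_C} C: {len(hot_pixels)}")
```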

Uses include:

  • Determining crop health over hundreds of acres by measuring changes in ambient temperature from crop to crop;
  • Identifying overheating equipment;
  • Providing detailed construction inspections for insurance underwriting;
  • Assessing the magnitude of uncontrolled fires (as well as evaluating damage);
  • Discovering energy inefficiencies;
  • Finding lost people – injured hikers, missing children or seniors – not to mention runaway suspects.

In 2016, a thermal drone saved a man’s life. Using a SkyRanger drone equipped with thermal sensors, the Ontario Provincial Police found a dementia patient who went missing at night in a rural, wooded area. Within an hour of deployment, the sensor detected the man’s heat signature. Rescuers easily located the man in a cornfield and successfully treated him for mild hypothermia.

Check out our recent blog post to learn more about getting started with thermal imaging.

Multispectral

Much like the human retina, standard sensors “perceive” light in three wavelengths – red, green and blue. Multispectral sensors (as the name implies) detect and collect data from both the RGB spectrum and wavelengths invisible to the human eye, such as near-infrared radiation (NIR) and short-wave infrared radiation. Remember the monster’s vision in the film Predator (not Arnold)? Now you’ve got it.

Once a multispectral sensor captures the data, it can be compared against reference measurements taken on the ground. From there, data analyzers can produce orthomosaic maps that can be used to make informed decisions about crop health, tree cover, solar panel efficiency, oil/gas pipeline trouble spots, etc. The sensors can also identify energy inefficiencies in roofs and other structural components such as towers or bridges.
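For crop health in particular, the classic calculation is NDVI (the normalized difference vegetation index), which contrasts the near-infrared and red bands a multispectral sensor captures. A minimal sketch with synthetic reflectance values:

```python
# Sketch: NDVI from a multispectral capture's red and near-infrared bands.
# The arrays are synthetic; in practice they come from exported band rasters.
import numpy as np

rng = np.random.default_rng(2)
red = rng.uniform(0.05, 0.3, size=(100, 100))  # red reflectance
nir = rng.uniform(0.2, 0.8, size=(100, 100))   # near-infrared reflectance

ndvi = (nir - red) / (nir + red)  # ranges from -1 to 1

# Dense, healthy vegetation typically scores well above ~0.5.
healthy_fraction = (ndvi > 0.5).mean()
print(f"mean NDVI: {ndvi.mean():.2f}, healthy fraction: {healthy_fraction:.0%}")
```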

As an added advantage, a multispectral sensor can be “tuned” to meet a user’s specific mission parameters.

Hyperspectral

Multispectral sensors work well within a limited number of energy bands. When precision “on steroids” is required – such as in viticulture, mineral surveys and detailed chemical terrain studies – hyperspectral sensors can provide between 100 and 200 spectral bands of data per pixel.

An ideal example would be forestry. Multispectral bands can distinguish trees from bushes. Hyperspectral can identify exact species of trees. Capturing such dense detail also means hyperspectral data is measured in terabytes rather than gigabytes.
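A quick back-of-the-envelope estimate shows why the data piles up so fast. Every figure below is an illustrative assumption, not a spec for any particular sensor:

```python
# Rough data-volume estimate for one hyperspectral flight.
# All figures are illustrative assumptions.
pixels_per_line = 1_000   # across-track pixels
lines_per_second = 100    # scan rate
bands = 200               # spectral bands per pixel
bytes_per_sample = 2      # 16-bit samples
flight_seconds = 30 * 60  # a 30-minute flight

total_bytes = (pixels_per_line * lines_per_second * bands
               * bytes_per_sample * flight_seconds)
print(f"~{total_bytes / 1e9:.0f} GB for a single flight")
```

At roughly 72 GB per half-hour flight under those assumptions, a multi-flight survey quickly climbs into the terabytes.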

“Hyperspectral sensors are like a scalpel that allows you to dissect exactly what is happening within very narrow bands of spectral content,” PrecisionHawk remote-sensing expert Jason San Souci says. “Multispectral sensors are a broad sword that collects less intensive spectral data.”

Hyperspectral scans deploy two methods – push-broom and whisk-broom.

Push-broom scanning is more comprehensive in that it captures a full line of spectral data at once as the drone flies forward. A line of sensors scans perpendicular to the drone’s heading. As a result, a push-broom array can sense the target area over a longer time period, which enables a denser data set.

The whisk-broom method employs a mirror to scan a specific path such that the light is sent to one detector. The data can then be collected on a per-pixel basis.

Another advantage of hyperspectral scanning is its capability to create a three-dimensional “hypercube.” While the normal spatial dimensions (x and y) are rendered, the cube model adds a third dimension that holds the spectral data. In essence, the user ends up with a cubic slice of the ground area to examine in detail.
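Conceptually, the hypercube is just a three-dimensional array: two spatial axes plus one spectral axis. A small sketch of how such a cube might be sliced (the data here is random, purely to show the shapes):

```python
# Sketch: indexing a hyperspectral "hypercube" (rows x columns x bands).
# The cube is random noise; real cubes are loaded from the sensor's files.
import numpy as np

cube = np.random.default_rng(3).random((256, 256, 200))

spectrum = cube[100, 200, :]   # full spectrum for one ground pixel (200 bands)
band_image = cube[:, :, 120]   # one narrow band across the whole scene

print(spectrum.shape, band_image.shape)
```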

Sensing a Better World

Sensor and drone technology are truly the Reese’s Peanut Butter Cups of the aerial data candy shop. The combination of the two has revolutionized remote-data analysis. Better data means more nutritious, higher-yielding crops; enhanced energy efficiency; more lives saved in precise search-and-rescue ops; and more productive forestry management practices. In short, drone-based sensor technology is making the world a better place.

We hope you found this information useful. If you are interested in dronegenuity training, drone pilot jobs, or if you’d like to get more great drone industry content from dronegenuity, simply follow us on our social media channels: YouTube, Twitter, Facebook, and Instagram, and subscribe to our mailing list for regular updates.

About the Author

Jason Reagan

Jason Reagan is a tech journalist and content-marketing creator. Since 2014, Jason has covered the commercial drone industry. Beginning his career as a journalist in 1996, Jason has since written and edited thousands of engaging news articles, blog posts, press releases and online content. TWITTER:@JasonPReagan | EMAIL: jasonpreagan@outlook.com