
PERSPECTIVES FROM THE FIELD: Remote Sensing from Small Unmanned Platforms: A Paradigm Shift

Published online by Cambridge University Press:  22 September 2015

Christopher D. Lippitt*
Affiliation:
Department of Geography and Environmental Studies, University of New Mexico, Albuquerque, New Mexico, and TerraPan Labs LLC, San Diego, California
*Address correspondence to: Christopher D. Lippitt, Department of Geography and Environmental Studies, Bandelier Hall West Room 215, 1 University of New Mexico, Albuquerque, NM 87131-0001; (phone) 505-277-5485; (fax) 505-277-3614; (e-mail) clippitt@unm.edu.

Type: Points of View

Copyright: © National Association of Environmental Professionals 2015

From a remote sensing perspective, small unmanned aircraft systems (S-UAS) are platforms. The impact of platforms on remote sensing is rarely of consequence. It is the sensor the platform carries, with its detector density, spectral sampling, and radiometric fidelity, that most directly determines what remote sensing techniques can resolve (Strahler, Woodcock, and Smith, 1986). Platforms are utilitarian and, as the nomenclature suggests, exist only to carry the sensor. A Cessna carries a sensor as well as a Learjet does, and a 30-year-old Cessna just as well as a new one, for most applications anyway. Perhaps because of this utilitarian role and the slow pace of platform innovation relative to sensor and computing technology, it is easy to forget that the arrival of new platform technologies has revolutionized remote sensing throughout its history.

Lighter-than-air balloons first made aerial photography possible; fixed-wing aircraft made remote sensing a practical tool for reconnaissance, science, and management; and satellites made remote sensing a synoptic tool for monitoring global and regional processes. Autonomous unmanned platforms, S-UAS in particular, are the next platform technology and, like previous shifts in platform technology, are poised to induce a paradigm shift in remote sensing. The widespread use of S-UAS for remote sensing will change not only the range of applications appropriate for remote sensing but also the practice of remote sensing in myriad ways that have substantial implications for our ability to detect, monitor, and measure our environment. Not since the Corona project, the first satellite remote sensing program, has the platform played such a central role in remote sensing, and not since the launch of the Earth Resources Technology Satellite 1 (later renamed Landsat 1) has the capability of remote sensing for environmental sampling been so dramatically expanded.

Unlike their larger counterparts, whose advantages over manned aircraft are fairly incremental (e.g., longer flight duration, reduced pilot risk), S-UAS represent a disruptive platform technology. Their accessibility from a cost and operations perspective, safe operation at low altitudes above ground level (AGL), and potential for fully autonomous, repeated deployment make for a veritable bonanza of new remote sensing capabilities. While only history will tell exactly how S-UAS will change remote sensing, several implications are readily apparent: higher spatial resolutions due to low-AGL collection, higher temporal resolutions due to autonomous operation from takeoff to landing and reduced deployment cost, lower radiometric resolutions due to payload limitations, and more data than ever before. For sampling the environment, this means the ability to resolve fine-scale features (millimeters to centimeters) at drastically shorter temporal sampling intervals than previously possible. Remote sensing can now be considered a viable sampling technology for, for example, hourly observation of plant respiration at the level of individual plants. The range of novel environmental sampling applications S-UAS will enable is far too broad to enumerate here, but see Colomina and Molina (2014) for a review of what has been done so far.
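
To make the spatial-resolution implication concrete, the short sketch below computes ground sample distance (GSD) for a nadir-pointing frame camera from pixel pitch, focal length, and altitude AGL. The camera parameters are illustrative assumptions rather than values from this article, but they show why a low-AGL S-UAS collection readily reaches centimeter-level GSD while a typical manned-aircraft collection does not.

```python
def gsd_m(pixel_pitch_m: float, focal_length_m: float, altitude_agl_m: float) -> float:
    """Ground sample distance (m/pixel) of a nadir-pointing frame camera."""
    return pixel_pitch_m * altitude_agl_m / focal_length_m

# Assumed parameters, for illustration only:
# a small consumer camera (2.4 um pixels, 8.8 mm lens) on an S-UAS at 100 m AGL...
suas = gsd_m(pixel_pitch_m=2.4e-6, focal_length_m=8.8e-3, altitude_agl_m=100.0)
# ...versus a larger-format mapping camera (6 um pixels, 100 mm lens) at 1,500 m AGL.
manned = gsd_m(pixel_pitch_m=6.0e-6, focal_length_m=100.0e-3, altitude_agl_m=1500.0)

print(f"S-UAS GSD:  {suas * 100:.1f} cm/pixel")    # ~2.7 cm
print(f"Manned GSD: {manned * 100:.1f} cm/pixel")  # ~9.0 cm
```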

Exploitation of hyperspatial remote sensing data with limited spectral wavebands and radiometric quality necessarily requires information extraction methods (e.g., classification, change detection) that privilege the discrimination of targets based on spatial arrangement and pattern over absolute spectral response (brightness values). H-resolution scenes (Strahler, Woodcock, and Smith, 1986), which assume that a single target or class determines the radiance of a group of pixels, become the norm, with a new order of objects (e.g., sprinkler heads, birds) becoming suitable for H-resolution analysis approaches. This presents a challenge to realizing the full potential of S-UAS-acquired remote sensing data for environmental sampling. With the exception of geographic object-based image analysis (GEOBIA) (Blaschke, 2010) and, to some extent, texture analysis (Ferro and Warner, 2002), remote sensing analysis approaches have been developed for L-resolution scenes and ignore the spatial arrangement of brightness values. To realize the potential of S-UAS remote sensing data, GEOBIA and other H-resolution methods will therefore need to advance substantially, particularly in their degree of automation and their scalability to large datasets. Likely the most advanced example of leveraging the insight into the relative position of features provided by S-UAS-acquired image data is its already routine exploitation to create high-resolution digital surface models (Zhang, Xiong, and Hao, 2011). It is the ability to resolve the arrangement of unique features from myriad perspectives that enables the high-density height estimates now being produced by aerial triangulation (AT, also known as structure-from-motion) applied to UAS-acquired image data. As more techniques capable of exploiting the spatial arrangement and characteristics of features become available (e.g., AT, GEOBIA), the utility of S-UAS-acquired image data seems likely to expand proportionally.
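
As a minimal illustration of why such data reward methods that use spatial arrangement, the sketch below (Python with NumPy and SciPy, run on a synthetic scene rather than real S-UAS imagery) contrasts per-pixel brightness with a simple moving-window texture measure: the simulated object and its background have nearly identical mean brightness but very different local texture. It is a conceptual sketch only, not a GEOBIA or AT implementation.

```python
import numpy as np
from scipy import ndimage

def local_stats(band: np.ndarray, window: int = 7):
    """Per-pixel brightness (local mean) plus a simple neighborhood texture
    measure (local standard deviation). An L-resolution workflow would feed a
    classifier the brightness values alone; in an H-resolution scene, the
    spatial statistics carry much of the discriminating information."""
    band = band.astype(float)
    mean = ndimage.uniform_filter(band, size=window)
    sq_mean = ndimage.uniform_filter(band ** 2, size=window)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return mean, std

# Synthetic hyperspatial scene: a smooth background with one textured patch.
rng = np.random.default_rng(0)
scene = np.full((200, 200), 120.0) + rng.normal(0, 2, (200, 200))
scene[80:120, 80:120] += rng.normal(0, 25, (40, 40))   # textured "object"

mean, texture = local_stats(scene)
print("mean brightness, background vs. object:",
      round(mean[:40, :40].mean(), 1), round(mean[90:110, 90:110].mean(), 1))
print("local texture,   background vs. object:",
      round(texture[:40, :40].mean(), 1), round(texture[90:110, 90:110].mean(), 1))
```

Brightness alone barely separates the two regions, while the neighborhood statistic does, which is the essence of why texture, GEOBIA, and AT approaches become central for hyperspatial S-UAS imagery.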

Increased spatial and temporal resolutions produce a readily calculated increase in data volume over current approaches; coupled with the potential for coordinated acquisition from myriad platforms and the democratized acquisition strategies S-UAS enable, it becomes clear that a veritable flood of data is impending. Simply storing and sharing this volume of imagery presents a challenge, and because the H-resolution processing approaches currently available are computationally intensive, new centralized computing architectures (e.g., cloud) will be needed to facilitate the storage and processing of these data and, in turn, the routine use of S-UAS-acquired remotely sensed data for environmental sampling.
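
The scale of that flood is easy to approximate. The sketch below estimates the raw size of a single three-band, 8-bit coverage of a modest 10 km² study area at satellite-, airborne-, and S-UAS-like ground sample distances; all of the numbers are assumptions chosen only to show the orders of magnitude involved.

```python
def coverage_gb(area_km2: float, gsd_m: float, bands: int = 3,
                bytes_per_sample: int = 1) -> float:
    """Approximate size (GB) of one seamless coverage of `area_km2` at ground
    sample distance `gsd_m`; every parameter here is an illustrative assumption."""
    pixels = (area_km2 * 1e6) / (gsd_m ** 2)        # pixels needed to tile the area
    return pixels * bands * bytes_per_sample / 1e9

# One 10 km^2 coverage at three assumed GSDs.
for label, gsd in (("satellite, 30 m", 30.0),
                   ("airborne, 0.5 m", 0.5),
                   ("S-UAS, 0.02 m", 0.02)):
    print(f"{label:>16}: {coverage_gb(10.0, gsd):.3g} GB")

# Raw frame imagery flown with the 70-80% overlap needed for aerial
# triangulation would be several times larger still, before derived products.
```

Repeating the S-UAS case even weekly pushes a single small study area into the terabyte range per year, which is the storage and processing burden that centralized (e.g., cloud) architectures would need to absorb.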

S-UAS are poised to dramatically expand the ability of remote sensing to collect data about aspects of the physical environment that have traditionally been beyond its practical reach. The potential is immense, but so are the challenges to the remote sensing community. Methods for and approaches to data exploitation, both algorithmic and architectural, will have to adapt to these new data and the many novel applications of remote sensing they are likely to spur.

References

Blaschke, T. 2010. Object Based Image Analysis for Remote Sensing. ISPRS Journal of Photogrammetry and Remote Sensing 65(1):2–16. doi: 10.1016/j.isprsjprs.2009.06.004.
Colomina, I., and Molina, P. 2014. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS Journal of Photogrammetry and Remote Sensing 92:79–97.
Ferro, C.J.S., and Warner, T.A. 2002. Scale and Texture in Digital Image Classification. Photogrammetric Engineering and Remote Sensing 68(1):51–63.
Strahler, A.H., Woodcock, C.E., and Smith, J.A. 1986. On the Nature of Models in Remote Sensing. Remote Sensing of Environment 20(2):121–139.
Zhang, Y., Xiong, J., and Hao, L. 2011. Photogrammetric Processing of Low-Altitude Images Acquired by Unpiloted Aerial Vehicles. Photogrammetric Record 26(134):190–211. doi: 10.1111/j.1477-9730.2011.00641.x.