Comparing UAS LiDAR and Structure-from-Motion Photogrammetry for Peatland Mapping and Virtual Reality (VR) Visualization

The mapping of peatland microtopography (e.g., hummocks and hollows) is key for understanding and modeling complex hydrological and biochemical processes. Here we compare unmanned aerial system (UAS) derived structure-from-motion (SfM) photogrammetry and LiDAR point clouds and digital surface models of an ombrotrophic bog, and we assess the utility of these technologies in terms of payload, efficiency, and end product quality (e.g., point density, microform representation, etc.). In addition, given their generally poor accessibility and fragility, peatlands provide an ideal model to test the usability of virtual reality (VR) and augmented reality (AR) visualizations. As an integrated system, the LiDAR implementation was found to be more straightforward, with fewer points of potential failure (e.g., hardware interactions). It was also more efficient for data collection (10 vs. 18 min for 1.17 ha) and produced considerably smaller file sizes (e.g., 51 MB vs. 1 GB). However, SfM provided higher spatial detail of the microforms due to its greater point density (570.4 vs. 19.4 pts/m²). Our VR/AR assessment revealed that the most immersive user experience was achieved from the Oculus Quest 2 compared to Google Cardboard VR viewers or mobile AR, showcasing the potential of VR for natural sciences in different environments. We expect VR implementations in environmental sciences to become more popular, as evaluations such as the one shown in our study are carried out for different ecosystems.

Bibliographic Details
Published in: Drones, Vol. 5, No. 2, Art. 36
Main Authors: Margaret Kalacska, J. Pablo Arroyo-Mora, Oliver Lucanus
Format: Article
Language: English
Published: MDPI AG, 2021-05-01
ISSN: 2504-446X
DOI: 10.3390/drones5020036
Subjects: bog; drone; Oculus Quest 2; Mer Bleue; SfM; UAV
Online Access: https://www.mdpi.com/2504-446X/5/2/36
Author Affiliations
Margaret Kalacska: Applied Remote Sensing Lab, Department of Geography, McGill University, Montreal, QC H3A 0G4, Canada
J. Pablo Arroyo-Mora: Flight Research Laboratory, National Research Council of Canada, 1920 Research Private, Ottawa, ON K1A 0R6, Canada
Oliver Lucanus: Applied Remote Sensing Lab, Department of Geography, McGill University, Montreal, QC H3A 0G4, Canada