A Vision-Based Relative Navigation Approach for Autonomous Multirotor Aircraft

Autonomous flight in unstructured, confined, and unknown GPS-denied environments is a challenging problem. Solutions could be tremendously beneficial for scenarios that require information about areas that are difficult to access and that present a great amount of risk. The goal of this research is to develop a new framework that enables improved solutions to this problem and to validate the approach with experiments using a hardware prototype.

In Chapter 2 we examine the consequences and practical aspects of using an improved dynamic model for multirotor state estimation, using only IMU measurements. The improved model correctly explains the measurements available from the accelerometers on a multirotor. We provide hardware results demonstrating the improved attitude, velocity, and even position estimates that can be achieved through the use of this model.

In Chapter 3 we propose a new architecture to simplify some of the challenges that constrain GPS-denied aerial flight. At the core, the approach combines visual graph-SLAM with a multiplicative extended Kalman filter (MEKF). More importantly, we depart from the common practice of estimating global states and instead keep the position and yaw states of the MEKF relative to the current node in the map. This relative navigation approach provides a tremendous benefit compared to maintaining estimates with respect to a single global coordinate frame. We discuss the architecture of this new system and provide important details for each component, and we verify the approach with goal-directed autonomous flight-test results.

The MEKF is the basis of the new relative navigation approach and is detailed in Chapter 4. We derive the relative filter and show how the states must be augmented and marginalized each time a new node is declared. The relative estimation approach is verified using hardware flight-test results accompanied by comparisons to motion-capture truth. Additionally, flight results with estimates in the control loop are provided.

We believe that the relative, vision-based framework described in this work is an important step in furthering the capabilities of indoor aerial navigation in confined, unknown environments. Current approaches incur challenging problems by requiring globally referenced states. Utilizing a relative approach allows more flexibility, as the critical, real-time processes of localization and control do not depend on computationally demanding optimization and loop-closure processes.
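The Chapter 2 claim that an improved dynamic model "correctly explains the measurements available from the accelerometers" is easier to appreciate with a concrete example. The abstract gives no equations, so the sketch below uses the rotor-drag style of model often discussed in the multirotor estimation literature, in which the body-frame x and y accelerometer channels read a drag force roughly proportional to body-frame velocity; the constants MU and MASS and the helper function are illustrative assumptions, not values or code from the thesis.

# Hypothetical illustration (not code from the thesis): under a rotor-drag
# model, the body-frame x/y accelerometer channels of a multirotor read
# approximately  a_x = -(mu/m) * u  and  a_y = -(mu/m) * v,  where u and v
# are body-frame velocities, mu is a lumped drag coefficient, and m is mass.
# Inverting that relation turns the accelerometer into a velocity
# pseudo-measurement, which is what makes velocity estimation from IMU
# measurements alone possible.

MU = 0.35   # lumped rotor-drag coefficient [N*s/m], assumed for illustration
MASS = 1.2  # vehicle mass [kg], assumed for illustration

def velocity_pseudo_measurement(accel_x, accel_y):
    """Map body-frame accelerometer readings (m/s^2) to the body-frame
    velocities (m/s) implied by the drag model above."""
    u = -MASS / MU * accel_x
    v = -MASS / MU * accel_y
    return u, v

# Example: a reading of -0.29 m/s^2 on the body x axis implies roughly
# 1 m/s of forward velocity with these assumed constants.
print(velocity_pseudo_measurement(-0.29, 0.0))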
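The "multiplicative" in MEKF refers to how attitude error is represented: the filter keeps a unit-quaternion attitude estimate and estimates a small multiplicative error about it rather than the quaternion components directly. The relation below is the standard textbook formulation of that idea, stated here for context rather than quoted from the thesis:

q = \hat{q} \otimes \delta q(\boldsymbol{\theta}), \qquad
\delta q(\boldsymbol{\theta}) \approx \begin{bmatrix} 1 \\ \tfrac{1}{2}\boldsymbol{\theta} \end{bmatrix}

where \hat{q} is the estimated attitude quaternion, \otimes denotes quaternion multiplication, and the three-vector \boldsymbol{\theta} is the small attitude error whose covariance the Kalman filter actually tracks.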
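Chapters 3 and 4 describe keeping the MEKF's position and yaw states relative to the current map node, with an augment-and-marginalize step each time a new node is declared. The Python sketch below illustrates only the bookkeeping behind that idea: the accumulated relative pose becomes an edge handed to the graph-SLAM back end, and the filter restarts at the new node's origin. Names such as RelativeFilterState and declare_new_node, and the covariance handling (reduced here to a simple reset), are illustrative assumptions and not code from the thesis.

# Hypothetical sketch (not the thesis implementation): "relative" states mean
# the filter's position and yaw are always expressed with respect to the
# current map node. When a new node is declared, the current relative pose is
# handed to the graph-SLAM back end as an edge between the old node and the
# new one, and the filter's position/yaw are reset to zero with respect to
# the new node. The thesis does this properly by augmenting the filter state
# with the new-node states and marginalizing the old ones; here that step is
# reduced to a simple block reset purely for illustration.

import numpy as np

class RelativeFilterState:
    def __init__(self):
        # Position (x, y, z) and yaw, all relative to the current node.
        self.rel_pose = np.zeros(4)
        # Covariance of the relative position/yaw block (illustrative only).
        self.rel_cov = np.diag([0.01, 0.01, 0.01, 0.005])

def declare_new_node(state: RelativeFilterState, edges: list) -> None:
    """Declare a new node at the vehicle's current pose.

    The accumulated relative pose (and its uncertainty) becomes a graph edge;
    the filter then starts over at the new node's origin.
    """
    # 1. Emit an edge: the transform from the previous node to the new node.
    edges.append({"delta_pose": state.rel_pose.copy(),
                  "covariance": state.rel_cov.copy()})
    # 2. Reset the relative states: the vehicle sits at the new node's origin.
    state.rel_pose[:] = 0.0
    # 3. Reset the relative uncertainty (stand-in for augment/marginalize).
    state.rel_cov = np.diag([1e-4, 1e-4, 1e-4, 1e-4])

# Usage: after flying about 2 m forward relative to the current node, declare
# a new node and hand the accumulated transform to the back end.
edges = []
state = RelativeFilterState()
state.rel_pose[:] = [2.0, 0.1, -0.05, 0.03]
declare_new_node(state, edges)
print(edges[0]["delta_pose"], state.rel_pose)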


Bibliographic Details
Main Author: Leishman, Robert C.
Format: Others
Published: BYU ScholarsArchive 2013
Subjects: GPS-denied flight; relative navigation; sensor fusion; quadrotor dynamics; MEKF; Mechanical Engineering
Online Access: https://scholarsarchive.byu.edu/etd/3784
https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=4783&context=etd