Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Grabe, Volker | - |
dc.contributor.author | Buelthoff, Heinrich H. | - |
dc.contributor.author | Scaramuzza, Davide | - |
dc.contributor.author | Giordano, Paolo Robuffo | - |
dc.date.accessioned | 2021-09-04T14:54:06Z | - |
dc.date.available | 2021-09-04T14:54:06Z | - |
dc.date.created | 2021-06-16 | - |
dc.date.issued | 2015-07 | - |
dc.identifier.issn | 0278-3649 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/93191 | - |
dc.description.abstract | For the control of unmanned aerial vehicles (UAVs) in GPS-denied environments, cameras have been widely exploited as the main sensory modality for addressing the UAV state estimation problem. However, the use of visual information for ego-motion estimation presents several theoretical and practical difficulties, such as data association, occlusions, and lack of direct metric information when exploiting monocular cameras. In this paper, we address these issues by considering a quadrotor UAV equipped with an onboard monocular camera and an inertial measurement unit (IMU). First, we propose a robust ego-motion estimation algorithm for recovering the UAV scaled linear velocity and angular velocity from optical flow by exploiting the so-called continuous homography constraint in the presence of planar scenes. Then, we address the problem of retrieving the (unknown) metric scale by fusing the visual information with measurements from the onboard IMU. To this end, two different estimation strategies are proposed and critically compared: the first exploits the classical extended Kalman filter (EKF) formulation, while the second is based on a novel nonlinear estimation framework. The main advantage of the latter scheme lies in the possibility of imposing a desired transient response to the estimation error when the camera moves with a constant acceleration norm with respect to the observed plane. We indeed show that, when compared against the EKF on the same trajectory and sensory data, the nonlinear scheme yields considerably superior performance in terms of convergence rate and predictability of the estimation. The paper concludes with an extensive experimental validation, including onboard closed-loop control of a real quadrotor UAV, meant to demonstrate the robustness of our approach in real-world conditions. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SAGE PUBLICATIONS LTD | - |
dc.subject | ABSOLUTE SCALE | - |
dc.subject | AERIAL VEHICLE | - |
dc.subject | PART I | - |
dc.subject | VISION | - |
dc.subject | FUSION | - |
dc.subject | IMU | - |
dc.title | Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Buelthoff, Heinrich H. | - |
dc.identifier.doi | 10.1177/0278364915578646 | - |
dc.identifier.scopusid | 2-s2.0-84935095381 | - |
dc.identifier.wosid | 000358318800002 | - |
dc.identifier.bibliographicCitation | INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, v.34, no.8, pp.1114 - 1135 | - |
dc.relation.isPartOf | INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH | - |
dc.citation.title | INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH | - |
dc.citation.volume | 34 | - |
dc.citation.number | 8 | - |
dc.citation.startPage | 1114 | - |
dc.citation.endPage | 1135 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Robotics | - |
dc.relation.journalWebOfScienceCategory | Robotics | - |
dc.subject.keywordPlus | ABSOLUTE SCALE | - |
dc.subject.keywordPlus | AERIAL VEHICLE | - |
dc.subject.keywordPlus | PART I | - |
dc.subject.keywordPlus | VISION | - |
dc.subject.keywordPlus | FUSION | - |
dc.subject.keywordPlus | IMU | - |
dc.subject.keywordAuthor | Sensor fusion | - |
dc.subject.keywordAuthor | aerial robotics | - |
dc.subject.keywordAuthor | visual-based control | - |
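The abstract's first step, recovering scaled velocity from optical flow via the continuous homography constraint, can be illustrated with a small synthetic sketch. This is not the authors' implementation: all numeric values are hypothetical, and the sketch stops at estimating the continuous homography matrix H = [ω]ₓ + (1/d)·v·nᵀ from planar flow, since its decomposition into ω and v/d (and the IMU-based metric scale recovery) requires the further machinery described in the paper. For a point x = (x₁, x₂, 1) on the plane, the constraint predicts the flow ẋ = −Hx + (e₃ᵀHx)·x, which is linear in the entries of H.

```python
import numpy as np

def skew(w):
    """Cross-product matrix [w]_x such that skew(w) @ p == np.cross(w, p)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

# Hypothetical ground-truth motion and plane parameters (illustrative only).
omega = np.array([0.1, -0.05, 0.2])   # angular velocity [rad/s]
v     = np.array([0.3, 0.1, -0.2])    # linear velocity [m/s]
n     = np.array([0.0, 0.0, 1.0])     # unit normal of the observed plane
d     = 2.0                           # camera-plane distance [m]

# Continuous homography matrix: H = [omega]_x + (1/d) v n^T
H_true = skew(omega) + np.outer(v, n) / d

# Synthetic calibrated image points x = (x1, x2, 1) of features on the plane.
rng = np.random.default_rng(0)
pts = np.hstack([rng.uniform(-0.5, 0.5, size=(20, 2)), np.ones((20, 1))])

# Optical flow predicted by the continuous homography constraint:
#   x_dot = -H x + (e3^T H x) x     (its third component is identically zero)
flows = np.array([-H_true @ x + (H_true[2] @ x) * x for x in pts])

# Each point contributes two linear equations in the 9 entries of H
# (row-major vec(H)); stack them and solve in the least-squares sense.
A_rows, b_vals = [], []
for x, xd in zip(pts, flows):
    for i in range(2):
        row = np.zeros(9)
        row[3 * i:3 * i + 3] -= x      # the -e_i^T H x term
        row[6:9] += x[i] * x           # the +x_i (e3^T H x) term
        A_rows.append(row)
        b_vals.append(xd[i])

h, *_ = np.linalg.lstsq(np.array(A_rows), np.array(b_vals), rcond=None)
H_est = h.reshape(3, 3)

# H is observable from planar flow only up to an additive multiple of the
# identity (H and H + c*I generate identical flow), so compare traceless parts.
traceless = lambda M: M - np.trace(M) / 3.0 * np.eye(3)
print(np.allclose(traceless(H_est), traceless(H_true), atol=1e-8))  # True
```

With noise-free synthetic flow the traceless part of H is recovered exactly; in the paper's setting, further decomposition of H yields the angular velocity and the scaled linear velocity v/d, after which the metric scale is obtained by fusing with the IMU.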
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.