Show simple item record

dc.contributor.author Clawson, Taylor Scott
dc.description 141 pages
dc.description.abstract Recent developments in manufacturing, processing capabilities, and sensor design point to a future in which sensor-equipped robots will assist humans by autonomously performing difficult, hazardous tasks such as visually inspecting utilities and infrastructure, locating missing persons, or monitoring potentially hazardous environmental conditions. To achieve this, perception algorithms must be developed that use data from on-board exteroceptive sensors to gather information on targets while navigating autonomously through unknown environments. These tasks are often best performed by highly maneuverable aerial robots, which can easily avoid obstacles and access unique vantage points but which also require computationally efficient perception algorithms and power-efficient sensors due to their limited power budgets. Insect-scale flapping-wing robots are an extreme, motivating example: although highly maneuverable owing to their extreme reduction in size and weight, they require high-frequency sensing and control loops that operate on just milliwatts of power. Conventional sensing and perception frameworks, however, process too much data to run at the high rates these robots require. Energy-efficient, biologically inspired neuromorphic processors and sensors present a potential solution to this challenge. Neuromorphic chips and their software analog, spiking neural networks (SNNs), can be trained to approximate arbitrary functions while learning and adapting online, and neuromorphic cameras consume only milliwatts of power despite operating with microsecond temporal precision. Existing neuromorphic perception algorithms, however, either collect only sparse information about the environment or do not account for a moving sensor, and are thus inapplicable to flapping-wing robot navigation.
This work presents a framework of computationally efficient methods for neuromorphic perception and control that enable autonomous obstacle avoidance and target detection with highly agile micro aerial vehicles (MAVs). The SNN-based control method is developed using a comprehensive, full-envelope flapping-wing flight dynamics model, also presented in this work. The model describes flapping flight using blade-element theory and is used to determine a broad class of set points, trim conditions, and quasi-steady maneuvers, including coordinated turns. The model, analysis, and stability results are validated experimentally for both stable and unstable modes. The SNN-based controller is shown to be capable of executing maneuvers including takeoff, landing, and coordinated turns throughout the flight envelope while adapting online to unmodeled parameter variations and disturbances. Computationally efficient neuromorphic visual perception techniques for obstacle avoidance and target detection are also developed, comprising methods for dense optical flow estimation, dense monocular depth estimation, and independent motion detection. These methods leverage the high sensing rate of neuromorphic cameras to justify simple linear assumptions, yielding improved accuracy and reduced computational cost compared with existing methods. Together, the methods presented here constitute a computationally efficient framework for target tracking and obstacle avoidance with autonomous micro aerial vehicles.
dc.title Neuromorphic Sensing and Control of Autonomous Micro-Aerial Vehicles
dc.type dissertation or thesis
Doctor of Philosophy (Ph.D.), Aerospace Engineering
dc.contributor.chair Ferrari, Silvia
dc.contributor.committeeMember Ruina, Andy
dc.contributor.committeeMember Wang, Z. Jane
