TY - GEN
T1 - On-board Deep-learning-based Unmanned Aerial Vehicle Fault Cause Detection and Identification
AU - Sadhu, Vidyasagar
AU - Zonouz, Saman
AU - Pompili, Dario
N1 - Funding Information:
Acknowledgements: We thank PhD student Sriharsha Etigowni for his help in the experiments. This work was supported by the ONR YIP Grant No. 11028418 and by the National Science Foundation (NSF).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/5
Y1 - 2020/5
N2 - With the increase in the use of Unmanned Aerial Vehicles (UAVs)/drones, it is important to detect and identify causes of failure in real time for proper recovery from a potential crash-like scenario or for post-incident forensics analysis. The cause of a crash could be a fault in the sensor/actuator system, physical damage or attack, or a cyber attack on the drone's software. In this paper, we propose novel architectures based on deep Convolutional and Long Short-Term Memory Neural Networks (CNNs and LSTMs) to detect (via an Autoencoder) and classify drone mis-operations based on real-time sensor data. The proposed architectures are able to learn high-level features automatically from the raw sensor data and to capture the spatial and temporal dynamics in that data. We validate the proposed deep-learning architectures via simulations and real-world experiments on a drone. Empirical results show that our solution detects drone mis-operations with over 90% accuracy and classifies various types of mis-operations with about 99% accuracy on simulation data and up to 85% accuracy on experimental data.
AB - With the increase in the use of Unmanned Aerial Vehicles (UAVs)/drones, it is important to detect and identify causes of failure in real time for proper recovery from a potential crash-like scenario or for post-incident forensics analysis. The cause of a crash could be a fault in the sensor/actuator system, physical damage or attack, or a cyber attack on the drone's software. In this paper, we propose novel architectures based on deep Convolutional and Long Short-Term Memory Neural Networks (CNNs and LSTMs) to detect (via an Autoencoder) and classify drone mis-operations based on real-time sensor data. The proposed architectures are able to learn high-level features automatically from the raw sensor data and to capture the spatial and temporal dynamics in that data. We validate the proposed deep-learning architectures via simulations and real-world experiments on a drone. Empirical results show that our solution detects drone mis-operations with over 90% accuracy and classifies various types of mis-operations with about 99% accuracy on simulation data and up to 85% accuracy on experimental data.
UR - http://www.scopus.com/inward/record.url?scp=85092743339&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85092743339&partnerID=8YFLogxK
U2 - 10.1109/ICRA40945.2020.9197071
DO - 10.1109/ICRA40945.2020.9197071
M3 - Conference contribution
AN - SCOPUS:85092743339
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 5255
EP - 5261
BT - 2020 IEEE International Conference on Robotics and Automation, ICRA 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Robotics and Automation, ICRA 2020
Y2 - 31 May 2020 through 31 August 2020
ER -