Drone Localization using Global Navigation Satellite System and Separated Feature Visual Odometry Data Fusion
Received: 29 September 2024 | Revised: 22 October 2024 | Accepted: 23 November 2024 | Online: 2 February 2025
Corresponding author: Riza Agung Firmansyah
Abstract
The localization system is the most important part of a drone's overall navigation system. The Global Positioning System (GPS) or, more generally, a Global Navigation Satellite System (GNSS) is the primary device used for drone localization. However, under certain conditions, such as signal jamming or enclosed environments, GPS/GNSS may not function optimally. This paper implements a new approach to address this issue by combining GNSS data with Visual Odometry (VO) through Machine Learning (ML) methods. The process consists of three main stages. First, speed and orientation are estimated using VO. Second, image features are separated into left and right groups to produce a more stable and robust estimate of speed and rotation. Third, the speed and orientation estimates are refined by integrating GNSS data through ML-based data fusion. The proposed method aims to enhance drone localization accuracy even when GNSS signals are disrupted or unavailable. The results indicate that the introduced method significantly reduces the Absolute Translation Error (ATE) compared to using VO or GNSS alone. The average ATE reached 4.38 m and the average orientation error 8.26°, indicating that this data fusion approach provides a significant improvement in drone localization accuracy, making it reliable in operational scenarios with limited GNSS signals.
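The three-stage pipeline in the abstract can be sketched in miniature. The snippet below is only an illustrative NumPy sketch, not the paper's implementation: the split-flow heuristic is an assumption about how separated left/right features decouple rotation from translation, and the fixed `weight` stands in for the confidence that the paper's trained ML fusion model would produce.

```python
import numpy as np

def vo_from_split_features(flow_left, flow_right):
    """Estimate forward speed and yaw rate from optical-flow magnitudes
    measured separately in the left and right image halves. Translation
    moves both halves similarly (common mode), while rotation moves them
    with opposite sign (differential mode) -- an illustrative assumption."""
    left = float(np.mean(flow_left))
    right = float(np.mean(flow_right))
    speed = (left + right) / 2.0      # common-mode component -> translation
    yaw_rate = (right - left) / 2.0   # differential component -> rotation
    return speed, yaw_rate

def fuse_with_gnss(vo_position, gnss_position, gnss_valid, weight=0.7):
    """Blend the VO dead-reckoned position with a GNSS fix when one is
    available; fall back to pure VO when GNSS is jammed or unavailable.
    `weight` is a placeholder for a learned fusion confidence."""
    vo_position = np.asarray(vo_position, dtype=float)
    if not gnss_valid:
        return vo_position
    gnss_position = np.asarray(gnss_position, dtype=float)
    return weight * gnss_position + (1.0 - weight) * vo_position
```

For example, `vo_from_split_features([1.0, 1.2], [1.4, 1.6])` yields a speed of 1.3 and a yaw rate of 0.2, and `fuse_with_gnss([10.0, 0.0], [12.0, 0.0], True)` pulls the VO position toward the GNSS fix, while passing `gnss_valid=False` returns the VO estimate unchanged.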
Keywords:
drone localization, data fusion, machine learning, visual odometry, GNSS
License
Copyright (c) 2024 Riza Agung Firmansyah, Syahri Muharom, Ilmiatul Masfufiah, Ardylan Heri Kisyarangga, Dzichril Fahimatulloh Mandhia Al Farizi Rosyad

This work is licensed under a Creative Commons Attribution 4.0 International License.