Optical Flow Performance in SUAV Flight Speed Estimation Using the Farneback Method
Abstract
This paper evaluates the performance of the Farneback optical flow method for estimating the flight speed of a small unmanned aerial vehicle (SUAV) in a simulated MATLAB-Unreal Engine 3D world environment. Optical flow offers a promising solution for velocity estimation, which is crucial for autonomous navigation. A downward-facing monocular camera was simulated on an SUAV in steady-state, straight flight at 100 m altitude and 25 m/s airspeed. Three simulated flight scenes (forest, city block, and water), representing poor, moderate, and rich texture levels, were used to assess the method's performance. Results show that taking the median of the optical flow field yields accurate velocity estimates in moderately to richly textured scenes. Over the city block and forest scenes, the mean velocity estimation error was 0.6 m/s (σ = 0.2 m/s) and 0.3 m/s (σ = 0.4 m/s), respectively. The effects of camera tilt angle and altitude variation on estimation accuracy were also investigated. Both factors introduced bias: the mean error increased to 1.7 m/s (σ = 0.2 m/s) and 1.9 m/s (σ = 0.2 m/s) for +10° and −10° camera tilt, respectively. Similarly, altitude offsets of +10 m and −10 m increased the mean error to 1.9 m/s (σ = 0.2 m/s) and 4.3 m/s (σ = 0.1 m/s), respectively. This study demonstrates the potential of the Farneback method for determining flight speed with acceptable accuracy under steady, straight flight conditions.
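The velocity estimate described above can be sketched as follows, assuming a pinhole model for the downward-facing camera in level flight: a point moving at ground speed v, seen from altitude Z with focal length f (in pixels) at frame rate fps, produces a per-frame pixel displacement of v·f/(Z·fps), so v is recovered from the median flow magnitude. The focal length, frame rate, and synthetic flow values below are illustrative assumptions, not parameters reported in the paper.

```python
from statistics import median

def ground_speed(flow_px, altitude_m, focal_px, fps):
    """Convert per-frame optical-flow magnitudes (pixels) to ground speed (m/s).

    Inverts the pinhole projection v = median(flow) * Z * fps / f.
    The median suppresses outliers, e.g. from low-texture regions
    where the dense flow field is unreliable.
    """
    return median(flow_px) * altitude_m * fps / focal_px

# Synthetic flow field: most magnitudes consistent with 25 m/s at 100 m
# altitude (25 * 1000 / (100 * 30) ≈ 8.33 px/frame), plus a few outliers.
flow = [8.333] * 50 + [0.1, 30.0, 0.0]
v_est = ground_speed(flow, altitude_m=100.0, focal_px=1000.0, fps=30.0)
print(round(v_est, 2))  # ≈ 25.0 m/s
```

The same relation makes the altitude sensitivity reported in the abstract plausible: the estimate scales with the assumed altitude, so if the SUAV actually flies at 90 m while 100 m is assumed, the speed is overestimated by a factor of 100/90, roughly a 2.8 m/s bias at 25 m/s.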
© Jurnal Nasional Teknik Elektro dan Teknologi Informasi, under the terms of the Creative Commons Attribution-ShareAlike 4.0 International License.