INVys: Indoor Navigation System for Persons with Visual Impairment Using RGB-D Camera

  • Widyawan, Universitas Gadjah Mada
  • Muhamad Risqi Utama Saputra, Monash University, Indonesia
  • Paulus Insap Santosa, Universitas Gadjah Mada
Keywords: Assistive Technology, Image Recognition, Object Detection, Wearable Computers

Abstract

This research presents INVys, a system that addresses indoor navigation for persons with visual impairment by leveraging an RGB-D camera. The system uses the camera's depth stream for micronavigation, that is, sensing and avoiding obstacles in the immediate environment. A novel auto-adaptive double thresholding (AADT) method is proposed to detect obstacles, estimate their distances, and give the user feedback to avoid them. AADT was evaluated against baseline and auto-adaptive thresholding (AAT) methods on four criteria: accuracy, precision, robustness, and execution time. The results indicate that AADT excels in accuracy, precision, and robustness, making it a suitable method for obstacle detection and avoidance in indoor navigation for persons with visual impairment. For macronavigation, which involves recognizing and following navigational markers called optical glyphs, the system uses the camera's color stream together with an automatic glyph binarization method, evaluated on two criteria: accuracy and execution time. The results indicate that the proposed method recognizes optical glyphs accurately and efficiently, making them suitable navigational markers for indoor environments. The study also quantifies how glyph size, the distance to a recognized glyph, and its tilt relate to recognition accuracy; these relationships establish the minimum glyph size that is practical for indoor navigation for persons with visual impairment. Overall, this research offers a promising solution for indoor navigation for persons with visual impairment, combining an RGB-D camera with novel methods for obstacle detection and avoidance and for navigational-marker recognition.
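The abstract does not spell out the AADT pipeline, so the following is only a minimal illustrative sketch of double (depth-band) thresholding for obstacle detection on an RGB-D depth frame. The fixed near/far band, the blob-size cutoff, and the function name detect_obstacles are placeholder assumptions; the paper's AADT method adapts its two thresholds to each frame automatically.

```python
import numpy as np
import cv2

def detect_obstacles(depth_mm, near_mm=500, far_mm=2000, min_area=2000):
    """Return distances (metres) to obstacle-sized blobs in a depth frame.

    depth_mm : HxW uint16 depth image in millimetres (0 = no reading),
               as delivered by Kinect-class RGB-D sensors.
    near_mm, far_mm : the depth band searched for obstacles. These are
               fixed placeholders; the paper's AADT adapts its two
               thresholds automatically.
    """
    # Double thresholding: keep only pixels inside the [near, far] band.
    band = ((depth_mm >= near_mm) & (depth_mm <= far_mm)).astype(np.uint8) * 255

    # Morphological opening removes speckle noise typical of depth images.
    band = cv2.morphologyEx(band, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Group surviving pixels into connected blobs; large blobs are
    # treated as obstacles and reported with their closest valid depth.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(band)
    distances = []
    for i in range(1, n):                      # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue
        readings = depth_mm[labels == i]
        readings = readings[readings > 0]      # discard missing readings
        if readings.size:
            distances.append(float(readings.min()) / 1000.0)
    return sorted(distances)                   # nearest obstacle first
```

In a wearable pipeline, the returned list would drive audio or vibrotactile feedback, announcing the nearest obstacle first.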
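Likewise, here is a minimal sketch of one common way to read an optical glyph from the color stream: automatic (Otsu) binarization, quadrilateral contour detection, perspective rectification, and cell sampling. The 5x5 grid size, the corner-ordering shortcut, and the name read_glyph are assumptions; the paper's automatic glyph binarization method may differ in its details.

```python
import numpy as np
import cv2

def read_glyph(bgr, cells=5, size=100):
    """Locate a square optical glyph in a colour frame and return its
    black/white cell pattern as a cells x cells 0/1 matrix, or None."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    # Automatic binarization: Otsu picks the threshold from the histogram.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        # Glyph candidates are large quadrilaterals.
        approx = cv2.approxPolyDP(c, 0.03 * cv2.arcLength(c, True), True)
        if len(approx) != 4 or cv2.contourArea(approx) < 1000:
            continue
        # Rectify the quadrilateral to a top-down square view. Corners are
        # taken in contour order; a robust reader would sort them first.
        src = approx.reshape(4, 2).astype(np.float32)
        dst = np.array([[0, 0], [size, 0], [size, size], [0, size]],
                       dtype=np.float32)
        warp = cv2.warpPerspective(
            gray, cv2.getPerspectiveTransform(src, dst), (size, size))
        _, warp_bin = cv2.threshold(warp, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # Sample the centre of each cell to recover the glyph pattern.
        step = size // cells
        pattern = np.zeros((cells, cells), dtype=np.uint8)
        for row in range(cells):
            for col in range(cells):
                y = row * step + step // 2
                x = col * step + step // 2
                pattern[row, col] = 1 if warp_bin[y, x] > 127 else 0
        return pattern
    return None
```

The recovered pattern would then be matched against a table of known glyphs, each mapped to a location or direction cue in the building.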


Published: 2023-11-28

How to Cite:
Widyawan, Muhamad Risqi Utama Saputra, & Paulus Insap Santosa. (2023). INVys: Indoor Navigation System for Persons with Visual Impairment Using RGB-D Camera. Jurnal Nasional Teknik Elektro dan Teknologi Informasi, 12(4), 293–302. https://doi.org/10.22146/jnteti.v12i4.6372