Kunal More, Gargi Pawar, Aditya Singh, Aditya Rathi, Prof. Shilpa Dhopte
School of Computing, MIT Art, Design and Technology University, Pune, Maharashtra
Abstract
Visually impaired individuals face significant challenges in safe and independent navigation, particularly indoors and in unfamiliar environments, due to limitations in existing mobility aids. This paper presents a Visually Impaired Assistance AI Model that integrates advanced artificial intelligence and multi-sensor fusion to enhance navigational autonomy. The system employs real-time object detection using YOLOv8, depth estimation via MiDaS, and sensor data from cameras, GPS, IMU, microphones, and light sensors to deliver accurate obstacle detection and hazard prediction. Combining computer vision, Visual SLAM, and AR marker-based indoor positioning, the system supports context-aware path planning and micro-routing. The platform provides adaptive audio, haptic, and bone-conduction feedback, ensuring accessible and effective user interaction. Experimental results demonstrate significant improvements in navigational accuracy, user confidence, and safety, outperforming traditional aids in diverse scenarios. The system's scalable design supports integration with smart city infrastructure and remote caregiver monitoring, establishing a foundation for future assistive technologies that empower visually impaired users toward greater independence and quality of life.
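To make the detection-plus-depth stage of the pipeline concrete, the following is a minimal sketch, not the authors' implementation, assuming the public ultralytics YOLOv8 package and the intel-isl/MiDaS torch.hub model; the near_obstacles helper, the yolov8n.pt weights, and the warn_level threshold are illustrative assumptions, since the abstract does not specify these details.

    # Sketch: fuse YOLOv8 detections with MiDaS relative depth to flag
    # near obstacles for downstream audio/haptic feedback. Model names
    # and the warning threshold are assumptions for illustration only.
    import cv2
    import torch
    from ultralytics import YOLO

    detector = YOLO("yolov8n.pt")  # assumed YOLOv8 nano weights
    midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    midas_tf = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
    midas.eval()

    def near_obstacles(frame_bgr, warn_level=0.6):
        """Return (label, closeness) pairs for detected objects whose median
        relative depth exceeds a threshold (MiDaS predicts inverse depth,
        so larger values mean closer)."""
        rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
        with torch.no_grad():
            inv_depth = midas(midas_tf(rgb))
            inv_depth = torch.nn.functional.interpolate(
                inv_depth.unsqueeze(1), size=rgb.shape[:2],
                mode="bicubic", align_corners=False).squeeze()
        # Normalise to [0, 1] so the threshold is scale independent.
        d = (inv_depth - inv_depth.min()) / (inv_depth.max() - inv_depth.min() + 1e-6)
        warnings = []
        for box in detector(frame_bgr, verbose=False)[0].boxes:
            x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
            if x2 <= x1 or y2 <= y1:
                continue  # skip degenerate boxes
            closeness = d[y1:y2, x1:x2].median().item()
            if closeness > warn_level:
                label = detector.names[int(box.cls[0])]
                warnings.append((label, closeness))  # feed to TTS / haptics
        return warnings

Since MiDaS yields relative rather than metric depth, a deployed system would calibrate the threshold per camera or fuse the estimate with other range cues; this sketch only illustrates how the two models' outputs can be combined per detection.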
Journal Name: EPRA International Journal of Research & Development (IJRD)

Published on: 2026-04-26
Vol: 11, Issue: 4 (April 2026)