Gandham, Rishith

Title: SMARTGUIDE: Revolutionizing the Depth and Dependability of Vision-Impaired Navigation
Type: Thesis (ETD)
Identifier: vt_gsexam:42232
URI: https://hdl.handle.net/10919/123896
Dates: 2025-01-04; 2025-01-04; 2025-01-03
Language: en
Rights: In Copyright
Subjects: Indoor Navigation; Visually Impaired; Computer Vision

Abstract: Globally, over 2.2 billion people face vision impairment, necessitating innovative solutions for safe, independent navigation. Traditional aids like canes, guide dogs, and GPS offer basic support but lack the sophistication to provide contextual understanding, precise navigation, or real-time hazard alerts. This project presents SmartGuide, a mobile app designed to enhance the independence of visually impaired users through AI-driven features. SmartGuide offers three main functions: (1) Smart Vision, using the GPT-4 Vision API to deliver spoken feedback about surroundings; (2) Navigation, combining QR code detection via YOLO with ZoeDepth for depth estimation, guiding users to destinations through the shortest path calculated by Dijkstra's algorithm; and (3) Obstacle Detection and Alerts, where YOLO identifies obstacles and ZoeDepth estimates their distance to inform users of potential hazards. By adapting its responses based on user feedback, SmartGuide provides personalized, reliable guidance that empowers visually impaired individuals to navigate with confidence and safety, advancing the field of accessible technology.
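The abstract states that the Navigation feature routes users along the shortest path computed by Dijkstra's algorithm over QR-code waypoints. A minimal sketch of that routing step is below; the waypoint names, graph layout, and edge distances are illustrative assumptions, not taken from the thesis.

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from `start` over a weighted undirected graph.
    graph: dict mapping node -> list of (neighbor, distance) pairs."""
    dist = {start: 0}
    heap = [(0, start)]  # (distance-so-far, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for nbr, w in graph[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Hypothetical indoor map: QR-code waypoints with corridor lengths in meters.
waypoints = {
    "entrance": [("hall_A", 5), ("hall_B", 9)],
    "hall_A":   [("entrance", 5), ("room_101", 3)],
    "hall_B":   [("entrance", 9), ("room_101", 2)],
    "room_101": [("hall_A", 3), ("hall_B", 2)],
}
print(dijkstra(waypoints, "entrance"))
```

In a system like the one described, each scanned QR code would identify the user's current waypoint, and the resulting distance map (or the corresponding predecessor chain) would drive turn-by-turn spoken directions to the destination.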