
SeeWay: Vision-Language Assistive Navigation for the Visually Impaired

Publication
IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2022, DOI: 10.1109/SMC53654.2022.9945087

Abstract: Assistive navigation for blind or visually impaired (BVI) individuals is significant for extending their mobility and safety while traveling, enhancing their employment opportunities, and fostering personal fulfillment. Conventional research is mainly based on robotic navigation approaches built on localization, mapping, and path planning frameworks, which require heavy manual annotation of semantic information in maps and its alignment with sensor mapping. Inspired by the fact that humans naturally rely on language instructions and visual scene understanding to navigate unfamiliar environments, this paper proposes a novel vision-language model-based approach for BVI navigation. It does not need heavily labeled indoor maps and provides a Safe and Efficient E-Wayfinding (SeeWay) assistive solution for BVI individuals. The system consists of a scene-graph map construction module, a navigation path generation module for global path inference by vision-language navigation (VLN), and a navigation-with-obstacle-avoidance module for real-time local navigation. The SeeWay system was deployed on portable iPhone devices with cloud computing assistance for VLN model inference. Field tests show the effectiveness of the VLN global path finding and local path re-planning. Experiments and quantitative results reveal that heuristic-style instructions outperform direction- and detailed-style instructions in VLN success rate (SR), and that SR decreases as the navigation length increases.
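To make the three-module structure described above concrete, the following is a minimal Python sketch of such a pipeline: a scene-graph map, global path inference by a VLN model, and real-time local navigation with obstacle avoidance. All class and function names, as well as the toy logic, are illustrative assumptions rather than the authors' implementation, and the cloud-hosted VLN model is replaced by a stub.

```python
# Hypothetical sketch of the three-module pipeline from the abstract.
# Names and logic are illustrative; the cloud-hosted VLN model is mocked out.

from dataclasses import dataclass, field


@dataclass
class SceneGraphMap:
    """Scene-graph map: nodes are places/landmarks, edges are traversable links."""
    nodes: dict = field(default_factory=dict)   # node_id -> semantic label
    edges: dict = field(default_factory=dict)   # node_id -> list of neighbor ids

    def add_place(self, node_id: str, label: str, neighbors: list[str]) -> None:
        self.nodes[node_id] = label
        self.edges[node_id] = neighbors


def vln_global_path(scene_graph: SceneGraphMap, instruction: str) -> list[str]:
    """Global path inference by a (cloud-hosted) VLN model.

    Placeholder: a real system would send the instruction and scene graph to the
    VLN model and receive back a node sequence; here we simply return all nodes.
    """
    return list(scene_graph.nodes)


def local_step(current: str, path: list[str], obstacle_ahead: bool) -> str:
    """Real-time local navigation with obstacle avoidance.

    Placeholder logic: hold position (re-plan locally) when an obstacle is
    detected, otherwise advance to the next node on the global path.
    """
    if obstacle_ahead:
        return current  # a real system would re-plan a local detour here
    idx = path.index(current)
    return path[min(idx + 1, len(path) - 1)]


if __name__ == "__main__":
    # Build a toy indoor scene graph and run one navigation step.
    graph = SceneGraphMap()
    graph.add_place("lobby", "building lobby", ["corridor"])
    graph.add_place("corridor", "main corridor", ["lobby", "office_312"])
    graph.add_place("office_312", "office 312", ["corridor"])

    path = vln_global_path(graph, "Walk down the corridor and stop at office 312.")
    print("global path:", path)
    print("next node:", local_step("lobby", path, obstacle_ahead=False))
```

In the deployed system, the VLN inference runs in the cloud while the phone handles sensing and local navigation; the stubbed `vln_global_path` above marks where that remote call would sit in this sketch.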
