Human-Centered Computing

Predicting Wheelchair Stability While Crossing a Curb Using RGB-Depth Vision

Abstract: People with disabilities often rely heavily on assistive technologies such as wheelchairs, which are designed to give the user greater independence. In the development of autonomous wheelchairs, it is imperative that the wheelchair maintain appropriate stability for the user in outdoor urban environments.

Driver Drowsiness Behavior Detection and Analysis Using Vision-Based Multimodal Features for Driving Safety

Abstract: Driving inattention caused by drowsiness is a significant cause of vehicle crashes, and there is a critical need to improve driving safety by monitoring driver drowsiness behaviors. For real-time drowsy driving awareness, we propose a vision-based driver drowsiness monitoring system (DDMS) for driver drowsiness behavior recognition and analysis.

Vision-Based Mobile Indoor Assistive Navigation Aid for Blind People

Abstract: This paper presents a new holistic vision-based mobile assistive navigation system to help blind and visually impaired people with indoor independent travel. The system detects dynamic obstacles and adjusts path planning in real-time to improve navigation safety.

An Adaptive Stair-ascending Gait Generation Approach Based on Depth Camera for Lower Limb Exoskeleton (Review of Scientific Instruments)

Abstract: Mobility on stairways is a daily challenge for seniors and people with dyskinesia. Lower limb exoskeletons can be effective assistive devices for improving their quality of life. In this paper, we present an adaptive stair-ascending gait generation algorithm based on a depth camera for lower limb exoskeletons.
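The abstract describes generating stair-ascending gaits from depth-camera data. As a minimal illustrative sketch (not the paper's algorithm), one sub-step is estimating stair geometry — riser height and tread depth — from an elevation profile recovered from the depth image; the function name, bin size, and input format below are assumptions for illustration:

```python
import numpy as np

def estimate_stair_params(profile, spacing, bin_size=0.02):
    """Estimate riser height and tread depth (meters) from a 1-D
    elevation profile sampled at uniform horizontal spacing.

    Treads appear as flat runs at distinct elevations: the gap
    between consecutive elevation clusters approximates the riser
    height, and the run length at each elevation approximates the
    tread depth.
    """
    # Quantize elevations so points on the same tread fall in one bin.
    levels = np.round(np.asarray(profile) / bin_size) * bin_size
    unique_levels = np.unique(levels)
    if len(unique_levels) < 2:
        return None  # flat ground: no step detected
    riser_height = float(np.mean(np.diff(unique_levels)))
    # Tread depth: typical number of samples per level times spacing.
    samples_per_level = [int(np.sum(levels == lv)) for lv in unique_levels]
    tread_depth = float(np.median(samples_per_level)) * spacing
    return riser_height, tread_depth
```

An exoskeleton controller could then parameterize the swing trajectory of each step from the recovered riser height and tread depth.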

An Assistive Indoor Navigation System for the Visually Impaired in Multi-Floor Environments (Best Conference Paper Award)

Abstract: This paper presents an innovative wearable system to assist visually impaired people in navigating indoors in real time. Our proposed system incorporates state-of-the-art handheld devices from Google’s Project Tango and integrates path-planning and obstacle-avoidance submodules, as well as human-computer interaction techniques, to provide assistance to the user.

CCNY Smart Cane

Abstract: This paper presents SmartCane, the CCNY Smart Cane system: a robotic white cane with mobile-device navigation software for visually impaired people. The software, running on Google Tango devices, uses simultaneous localization and mapping (SLAM) to plan a path and guide a visually impaired user to waypoints within indoor environments.
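Several of these systems guide a user along SLAM-planned waypoints. A minimal sketch of that guidance loop, under assumptions not taken from the papers (2-D pose from SLAM, a fixed reach radius, and a hypothetical `guidance_step` helper), might look like:

```python
import math

def guidance_step(pose, waypoints, idx, reach_radius=0.5):
    """One guidance update: given the user's SLAM pose (x, y, heading
    in radians) and a planned waypoint list, return the heading
    correction to convey to the user and the index of the active
    waypoint, advancing it once the current waypoint is reached."""
    x, y, heading = pose
    wx, wy = waypoints[idx]
    if math.hypot(wx - x, wy - y) < reach_radius and idx + 1 < len(waypoints):
        idx += 1
        wx, wy = waypoints[idx]
    bearing = math.atan2(wy - y, wx - x)
    # Wrap the correction to the shortest signed angle, so the
    # interface can announce a minimal "turn left/right" cue.
    turn = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return turn, idx
```

In a real system this update would run each frame, with the turn angle rendered as haptic or audio feedback rather than returned as a number.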

Guided Text Spotting for Assistive Blind Navigation in Unfamiliar Indoor Environments

Abstract: Scene text in indoor environments often conveys important contextual information that can significantly enhance the independent travel of blind and visually impaired people. In this paper, we present an assistive text-spotting navigation system based on an RGB-D mobile device for blind or severely visually impaired people.

A Wearable Indoor Navigation System with Context-based Decision Making For Visually Impaired

Abstract: This paper presents a wearable indoor navigation system that helps visually impaired users navigate indoors. The system takes advantage of simultaneous localization and mapping (SLAM) and semantic path planning to accomplish localization and navigation tasks while collaborating with the visually impaired user.

ISANA: Wearable Context-Aware Indoor Assistive Navigation with Obstacle Avoidance for the Blind

Abstract: This paper presents a novel mobile wearable context-aware indoor mapping and navigation system with obstacle avoidance for the blind. The system includes an indoor map editor and an app for Tango devices comprising multiple modules.