TLDR: NavVI is a new telerobotic simulation system that helps visually impaired individuals navigate complex warehouse environments. It uses a mobile robot controlled via a game controller, providing synchronized haptic (vibrations for obstacles), auditory (voice cues, clock-based directions), and visual (high-contrast paths) feedback. The system dynamically plans paths to avoid both static and moving obstacles, offering a safe and repeatable testbed for accessible teleoperation research and promoting inclusive workforce participation.
Industrial warehouses are bustling, complex environments filled with moving forklifts, shelves, and personnel. For individuals who are blind or have low vision (BLV), navigating such spaces can be incredibly challenging and risky. Despite the growing emphasis on workplace inclusion, BLV individuals remain underrepresented in these industrial settings, largely due to the lack of suitable tools that allow them to independently control or navigate these dynamic spaces.
Most existing assistive technologies, like smart canes or wearable devices, are not designed for the intricate demands of a large-scale warehouse where safety, responsiveness, and detailed environmental awareness are crucial. Current telerobotic systems, which allow remote exploration and maneuvering through robotic control, often rely heavily on visual cues, thus limiting their accessibility for BLV users.
Introducing NavVI: A Multimodal Approach to Accessible Navigation
To address these significant gaps, researchers have introduced NavVI (Navigation Assistant for Vision-Impaired Users), a groundbreaking simulation-based telerobotic interface. Developed in Unity, a popular game development platform, NavVI enables BLV users to control a mobile robot within a highly realistic warehouse environment. What makes NavVI unique is its integration of synchronized multimodal feedback: visual, auditory, and haptic (touch-based) cues.
The system allows users to control the robot using a Sony DualSense controller. This controller provides real-time haptic feedback, which means users feel vibrations that indicate the proximity and location of obstacles. For instance, if an obstacle is to the left, the left side of the controller vibrates; if it’s in the center, both sides vibrate. The intensity of the vibration changes based on how close the obstacle is, providing an intuitive sense of distance.
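The mapping described above, side of vibration from obstacle direction and intensity from distance, can be sketched as a small function. The 3 m range and the ±15° "centre" band are illustrative assumptions, not NavVI's actual parameters:

```python
def haptic_feedback(bearing_deg, distance_m, max_range_m=3.0):
    """Map an obstacle's relative bearing (degrees, 0 = straight ahead,
    positive to the right) and distance to (left, right) rumble
    intensities in [0, 1]. Thresholds are illustrative only."""
    if distance_m >= max_range_m:
        return 0.0, 0.0  # obstacle out of range: no vibration
    # Closer obstacles produce stronger vibration.
    intensity = 1.0 - distance_m / max_range_m
    if bearing_deg < -15:        # obstacle on the left side
        return intensity, 0.0
    elif bearing_deg > 15:       # obstacle on the right side
        return 0.0, intensity
    else:                        # roughly centred: both sides vibrate
        return intensity, intensity
```

For example, an obstacle 1.5 m away at a bearing of -30° would drive only the left motor at half strength.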
Auditory feedback is another critical component. NavVI uses text-to-speech technology to deliver clear navigational instructions. These instructions often use clock-based directions (e.g., “3 o’clock” for a right turn) to make spatial understanding easier for visually impaired users. Audio cues also alert users when they are too close to shelves or when they have successfully reached their destination.
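Clock-based cues like these are typically derived from the angle between the robot's heading and the target. A minimal sketch of that conversion (NavVI's exact mapping may differ):

```python
def clock_direction(bearing_deg):
    """Convert a relative bearing (degrees, 0 = straight ahead,
    positive clockwise) into a clock-face cue such as "3 o'clock".
    Each hour on the clock face spans 30 degrees."""
    hour = round((bearing_deg % 360) / 30) % 12
    if hour == 0:
        hour = 12  # straight ahead reads as 12 o'clock, not 0
    return f"{hour} o'clock"
```

A bearing of 90° (directly to the right) yields "3 o'clock", matching the right-turn example above.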
For users with residual vision, NavVI incorporates high-contrast visual indicators. This includes a brightly rendered path line guiding the robot towards its destination, and vivid colors for objects like boxes on shelves and the goal itself, making them easier to distinguish.
Smart Navigation and Obstacle Avoidance
NavVI’s simulation environment is built on a detailed warehouse layout, complete with shelves, forklifts, workers, and pallet robots. The system uses a navigation mesh (NavMesh) for path planning: the NavMesh identifies walkable areas and marks static objects like walls and shelves as non-traversable. Crucially, NavVI recalculates the robot’s optimal path every two seconds to adapt to the dynamic environment, so the robot can navigate safely around both static and moving obstacles such as forklifts and human avatars. If an obstacle blocks the current route, the system re-plans it in real time and the robot keeps moving toward its goal.
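The re-planning idea can be illustrated with a grid as a stand-in for the NavMesh: whenever the obstacle map changes, recompute the shortest path over the currently walkable cells. NavVI does this on Unity's NavMesh every two seconds; the 4-connected grid and Dijkstra search below are a deliberate simplification:

```python
import heapq

def plan_path(grid, start, goal):
    """Dijkstra over a 4-connected grid; cells marked 1 are blocked
    (a stand-in for the NavMesh's non-traversable regions). Returns
    the list of cells from start to goal, or None if no route exists.
    Re-running this after the grid changes models NavVI's re-planning."""
    rows, cols = len(grid), len(grid[0])
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal != start and goal not in prev:
        return None  # goal unreachable with current obstacles
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Calling `plan_path` again after marking a cell as blocked yields a detour around it, mirroring how the robot routes around a forklift that has moved into its way.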
A Testbed for Inclusive Telerobotics
NavVI offers a robust and repeatable testbed for research in accessible teleoperation. By using a simulation, it significantly reduces the risks associated with developing and testing prototypes with physical robots in real-world industrial settings. The design principles of NavVI are also aligned with commercial hardware, making it easier to adapt the system to real robots in the future. This research paves the way for rapid feasibility studies and the deployment of inclusive telerobotic tools in actual warehouses, ultimately enhancing the opportunities for BLV individuals to participate in the logistics workforce.
While NavVI presents a promising solution, future work will focus on refining the haptic feedback, improving the detection of low-height obstacles, and conducting comprehensive user studies with BLV participants to assess its usability and impact. For more details, you can refer to the full research paper.


