Visually impaired people find it very difficult to get their bearings in space, and thus to orient themselves while moving, especially in outdoor urban environments. This restricted access to space and places reduces their autonomy. It can also be a source of stress and apprehension when traveling to an unfamiliar destination.

The arrival of GPS raised great hopes in the visually impaired (VI) community, since such a system is meant to locate and guide a pedestrian.
Unfortunately, existing systems present three major difficulties:

  • Poor positioning accuracy: nominally about 10 m, but often much worse in cities because of the urban-canyon effect.
  • Inadequate Geographic Information Systems (GIS): commercially available GIS were designed for motorists and lack information useful to pedestrians, and to visually impaired people in particular, such as sidewalks, pedestrian crossings, and street furniture.
  • Route-calculation and guidance primitives unsuited to visually impaired pedestrians: the system searches for the shortest path, whereas the difficulty of the route (number of crossings, sidewalk width, shared zones, etc.) is more relevant for a visually impaired person.

By integrating several recent technologies (GPS, on-board vision, inertial measurement units, augmented reality, etc.), this project pursues several objectives:

  • Coupling the GPS system with an on-board vision device makes it possible to detect geolocated visual targets (for example, the facade of a building). Recognizing one of these targets allows the system to refine the user's position.
  • On-board vision also makes it possible to recognize and locate many objects in the environment (for example, a bus stop or an ATM). The identity and location of these targets are provided on request, so the system can partially describe the surrounding space.
  • The project enriches and refines the GIS spatial database by adding the elements needed for pedestrian travel, such as sidewalks, pedestrian crossings, points of interest, and landmarks.
  • Several route calculation primitives specific to visually impaired pedestrians have been developed.
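The idea behind these route-calculation primitives can be sketched as a shortest-path search over a cost that mixes distance with accessibility penalties. The penalty values and graph below are purely illustrative assumptions, not the project's actual weightings:

```python
import heapq

# Hypothetical difficulty weights; the real primitives and their
# weightings are not specified in the project description.
CROSSING_PENALTY = 50.0         # each pedestrian crossing adds perceived cost
NARROW_SIDEWALK_PENALTY = 20.0  # narrow sidewalk along the segment
SHARED_ZONE_PENALTY = 40.0      # zone shared with vehicles or cyclists

def edge_cost(length_m, crossings=0, narrow=False, shared=False):
    """Combine segment length with accessibility penalties into one cost."""
    cost = float(length_m)
    cost += crossings * CROSSING_PENALTY
    if narrow:
        cost += NARROW_SIDEWALK_PENALTY
    if shared:
        cost += SHARED_ZONE_PENALTY
    return cost

def easiest_route(graph, start, goal):
    """Dijkstra's algorithm; graph maps node -> [(neighbor, cost)]."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(queue, (nd, neighbor))
    # Reconstruct the path from goal back to start.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Two candidate routes A -> D: a shorter one with a pedestrian
# crossing, and a slightly longer crossing-free one.
graph = {
    "A": [("B", edge_cost(100, crossings=1)), ("C", edge_cost(120))],
    "B": [("D", edge_cost(100))],
    "C": [("D", edge_cost(100))],
}
path, cost = easiest_route(graph, "A", "D")
print(path, cost)  # ['A', 'C', 'D'] 220.0 -- the crossing-free route wins
```

With pure distance, A→B→D (200 m) would win; once the crossing penalty is counted, the system prefers the 220 m crossing-free route, which is the behavior the project's primitives aim for.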

The project also aims to design an audio augmented reality system. Information about a building (its identity, but also its opening hours, for example) can be offered. Thanks to augmented reality, the user hears this information as if it came from the building itself, which helps to locate it.
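The core of such spatialized audio is computing the building's bearing relative to the user's heading, then rendering the sound from that direction. A minimal sketch, assuming a flat local coordinate frame and a simple constant-power stereo pan (a real system would use richer 3D audio rendering):

```python
import math

def azimuth_to(user_xy, user_heading_deg, target_xy):
    """Bearing of the target relative to the user's facing direction,
    in degrees: 0 = straight ahead, positive = to the right."""
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = north (+y axis)
    # Normalize the relative angle into (-180, 180].
    return (bearing - user_heading_deg + 180.0) % 360.0 - 180.0

def pan_gains(rel_deg):
    """Constant-power stereo gains for a relative azimuth (clamped to +/-90 deg)."""
    a = max(-90.0, min(90.0, rel_deg))
    theta = math.radians((a + 90.0) / 2.0)   # maps -90..90 deg to 0..90 deg
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# A building due east of a user facing north should sound fully on the right.
rel = azimuth_to((0.0, 0.0), 0.0, (10.0, 0.0))
left, right = pan_gains(rel)
print(round(rel), round(left, 3), round(right, 3))  # 90 0.0 1.0
```

The constant-power pan keeps perceived loudness roughly stable as the source moves across the stereo field; binaural rendering over headphones would replace `pan_gains` in a deployed system.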

Have we aroused your curiosity? Visit the project website for more information.

Consortia and partners

The funders of this project are:

  • French National Research Agency (ANR), TecSan 2008 call: partnership research in technologies for health and autonomy.
  • Midi-Pyrénées region

The project brought together several partners:

  • IRIT: Toulouse Institute for Computer Science Research
  • CerCo: Brain and Cognition Research Center
  • LIMSI: Laboratory of Computer Science for Mechanics and Engineering Sciences
  • NavoCap: designer of on-board electronic equipment for aids to navigation
  • SpikeNet Technology: supplier of artificial vision technologies
  • Greater Toulouse: intermunicipal cooperation bringing together 37 municipalities around Toulouse
