There is a continuing need for a portable, practical, and highly functional navigation aid for people with vision loss. This includes temporary loss, such as firefighters in a smoke-filled building, as well as long-term or permanent blindness. In either case, the user needs to move from place to place, avoid obstacles, and learn the details of the environment.
The core system is a small computer--either a lightweight laptop or an even smaller handheld device--equipped with a variety of location- and orientation-tracking technologies, including GPS, inertial sensors, a pedometer, RFID tags, RF sensors, and a compass. Sophisticated sensor fusion combines these inputs to produce the best estimate of the user's location and which way she is facing. See the SWAN architecture figure.
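As one illustration of the kind of sensor fusion involved, consider estimating heading: a gyroscope is accurate over short intervals but drifts, while a compass is drift-free but noisy. A complementary filter blends the two. This is only a minimal sketch of the general technique, not SWAN's actual fusion algorithm, and the function name and parameters are hypothetical:

```python
def fuse_heading(gyro_rate, dt, compass_heading, prev_estimate, alpha=0.98):
    """Complementary filter for heading, in degrees.

    Integrates the gyro rate (deg/s) for short-term accuracy, then nudges
    the result toward the compass reading to correct long-term drift.
    """
    predicted = (prev_estimate + gyro_rate * dt) % 360.0
    # Take the shortest angular distance so 359 -> 1 does not swing through 180.
    error = (compass_heading - predicted + 180.0) % 360.0 - 180.0
    return (predicted + (1.0 - alpha) * error) % 360.0
```

With `alpha` near 1, the gyro dominates moment to moment while the compass slowly pulls the estimate back on course; a full system would fuse position sources (GPS, pedometer, RFID fixes) in a similar weighted fashion.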
Once the user's location and heading are determined, SWAN uses an audio-only interface (essentially a series of non-speech sounds called "beacons") to guide the listener along a path, while at the same time indicating the location of other important features in the environment (see below). SWAN includes sounds for the following purposes:
- Navigation Beacon sounds guide the listener along a predetermined path, from a start point, through a series of waypoints, to the listener's destination.
- Object Sounds indicate the location and type of objects around the listener, such as furniture, fountains, doorways, etc.
- Surface Transition sounds signify a change in the walking surface, such as sidewalk to grass, carpet to tile, level corridor to descending stairway, curb cuts, etc.
- Location sounds mark places such as offices, classrooms, shops, buildings, and bus stops.
- Annotations are brief speech messages recorded by users that provide additional details about the environment. For example, "Deep puddle here when it rains."
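The cue types above can be pictured as a small taxonomy that a spatial-audio engine renders relative to the listener's heading. The sketch below is purely illustrative, assuming a hypothetical rendering API; the sound names, panning model, and distance attenuation are invented for the example and are not SWAN's actual design:

```python
import math
from dataclasses import dataclass
from enum import Enum, auto

class CueType(Enum):
    NAVIGATION_BEACON = auto()
    OBJECT = auto()
    SURFACE_TRANSITION = auto()
    LOCATION = auto()
    ANNOTATION = auto()

@dataclass
class Cue:
    cue_type: CueType
    bearing_deg: float    # direction relative to the listener's heading
    distance_m: float
    speech_text: str = "" # only used for ANNOTATION cues

def render_params(cue):
    """Map a cue to (sound_name, pan, gain) for a spatial-audio engine.

    Pan runs from -1 (hard left) to +1 (hard right); gain falls off
    with distance so nearer features sound louder.
    """
    pan = math.sin(math.radians(cue.bearing_deg))
    gain = 1.0 / (1.0 + cue.distance_m)  # simple distance attenuation
    sound = {
        CueType.NAVIGATION_BEACON: "beacon_tick",
        CueType.OBJECT: "object_ping",
        CueType.SURFACE_TRANSITION: "surface_swish",
        CueType.LOCATION: "location_chime",
        CueType.ANNOTATION: "speech",
    }[cue.cue_type]
    return sound, pan, gain
```

For example, an object one meter away directly to the listener's right (`bearing_deg=90`) would pan fully right at half gain, so the listener hears both what the feature is and where it lies.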