UCSB Personal Guidance System (PGS)

Brief Summary



This project ran from 1985 to 2008 and developed and evaluated GPS-based navigation systems for visually impaired people. It began in 1985 with a concept paper by Jack Loomis (Professor of Psychology at UCSB), who directed the project. The late Reginald Golledge (deceased 2009), Professor of Geography at UCSB, and Roberta Klatzky (Professor of Psychology, first at UCSB and then at Carnegie Mellon University) were the two other principals on the project. We publicly demonstrated the first version of the PGS in 1993 using a bulky prototype carried in a backpack. After 1993 we created several other versions of the PGS, one of which was carried in a small pack worn at the waist. Our project focused mostly on designing the user interface and the Geographic Information System (GIS) component (e.g., the spatial database and route-finding software).
Several wearable systems are now commercially available (notably BrailleNote GPS from Pulse Data and Trekker from VisuAide). These systems provide verbal guidance and environmental information via speech and Braille displays. Because our survey research confirmed our longstanding belief that visually impaired people sometimes want direct perceptual information about the environment, just as drivers and pilots want pictorial information from their navigation systems, our R&D from 1995 to 2008 concentrated on "spatial displays" for such systems. Dr. James Marston, a Postdoctoral Researcher in Geography, and Dr. Nicholas Giudice, a Postdoctoral Researcher in Psychology, contributed greatly to this work.
Our R&D dealt with several types of spatial display. The first is a virtual acoustic display, which provides auditory information to the user via earphones (as originally proposed in the 1985 concept paper). With this display, the user hears important environmental locations, such as turn points along the route and points of interest. The labels of these locations are converted to synthetic speech and then displayed using auditory direction and distance cues, so that the spoken labels appear to come from their actual locations in the auditory space of the user. A user wishing to go toward some environmental location that is being displayed simply turns to face the spoken label and then begins walking toward it. The intensity of the displayed information increases as the person approaches the location.
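To make the underlying geometry concrete, the following Python sketch shows the kind of computation such a display must perform: given the user's position and heading and a waypoint from the spatial database, it derives the relative bearing, the distance, and a distance-dependent gain for the spoken label. This is an illustration only, not the PGS code; all function and parameter names are invented, and a real virtual acoustic display would render the result binaurally rather than simply reporting these numbers.

import math

def render_cue(user_lat, user_lon, user_heading_deg, poi_lat, poi_lon):
    """Illustrative sketch: where should a point of interest (POI) appear
    in the listener's auditory space, and how loud should it be?"""
    # Local flat-earth approximation; adequate over walking distances.
    d_lat = math.radians(poi_lat - user_lat)
    d_lon = math.radians(poi_lon - user_lon) * math.cos(math.radians(user_lat))
    east = d_lon * 6371000.0      # metres east of the user
    north = d_lat * 6371000.0     # metres north of the user

    distance_m = math.hypot(east, north)
    bearing_deg = math.degrees(math.atan2(east, north)) % 360.0
    # Bearing relative to where the user is facing:
    # 0 = straight ahead, positive = to the right, negative = to the left.
    rel_bearing_deg = (bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0

    # Simple inverse-distance gain so the label grows louder on approach
    # (the 10 m reference distance is an arbitrary choice for the example).
    gain = min(1.0, 10.0 / max(distance_m, 1.0))
    return rel_bearing_deg, distance_m, gain

# Example: user facing north, turn point roughly 46 m to the east,
# so the spoken label should be heard about 90 degrees to the right.
print(render_cue(34.4140, -119.8489, 0.0, 34.4140, -119.8484))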
A second type of display, which we called the Haptic Pointer Interface (HPI), emulates the function of an RIAS (Remote Infrared Audible Signage) receiver. The user holds a small block to which an electronic compass and a small loudspeaker or vibrator are attached. When the hand is pointing toward some location represented in the computer database, the user hears a tone or feels a vibration. Supplementary verbal information can be provided by synthetic speech. The user moves toward the desired location by aligning the body with the hand while maintaining the "on-course" auditory or vibratory signal. Other variants of this second type of spatial display involve putting the compass on the body or head and turning the body or head until the on-course signal is perceived.
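The logic at the heart of this interface can be sketched as a simple on-course test: compare the hand-held compass reading with the bearing from the user to a location in the database, and trigger the tone or vibration when the two agree within a small angular window. The Python below is a hypothetical illustration of that idea; the names and the 10-degree tolerance are assumptions, not values taken from the PGS.

import math

def on_course(hand_heading_deg, user_lat, user_lon, target_lat, target_lon,
              tolerance_deg=10.0):
    """Return True when the hand-held compass points within a small angular
    window of the bearing to a target location, at which point the device
    would emit its on-course tone or vibration."""
    # Flat-earth approximation of the bearing from user to target.
    d_lat = math.radians(target_lat - user_lat)
    d_lon = math.radians(target_lon - user_lon) * math.cos(math.radians(user_lat))
    bearing_deg = math.degrees(math.atan2(d_lon, d_lat)) % 360.0

    # Smallest signed angular difference between where the hand points
    # and the true bearing to the target.
    error = (hand_heading_deg - bearing_deg + 180.0) % 360.0 - 180.0
    return abs(error) <= tolerance_deg

# Example: target roughly north-east of the user; pointing the hand at
# about 42 degrees keeps the on-course signal active.
print(on_course(42.0, 34.4140, -119.8489, 34.4145, -119.8484))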
Six formal route-guidance studies evaluating the different spatial displays indicated that they provide effective route guidance and are well liked by visually impaired users. In particular, our studies indicated that the virtual sound display is preferred by visually impaired users, generally leads to faster travel times than speech, and demands less attention than speech. Our research suggests that spatial displays ought to be available as optional alternatives to synthetic speech on commercial navigation systems for the visually impaired.


