User Experience Designer - San Francisco, CA

University of Idaho Portfolio

An Evaluation of Google Maps AR Using Engineering Psychology Principles


Google Maps’ Live View Augmented Reality feature redefines egocentric navigation in outdoor and indoor spaces. This paper evaluates the strengths and limitations of Live View using established principles from engineering psychology, as presented in Engineering Psychology and Human Performance by Wickens et al. (2013). The key concepts I focus on are augmented reality, displays, maps, navigation, and environment visualization. I explore how spatial cognition and navigation concepts are leveraged in the experience and note how Live View solves known challenges in the augmented reality space.

Navigation and Spatial Cognition

Live View integrates real-time navigation with well-established information processing principles. It uses a combination of location data, Street View, and the phone's camera to give users a clear understanding of their surroundings. The interface combines a 2D map with a camera view overlaid with helpful AR elements. This approach follows the four stages of information processing: perception, attention and selection, decision-making, and action. Live View reduces mental strain by using an egocentric perspective similar to a "you are here" map. The camera display acts as an extension of the user's own viewpoint, minimizing confusion between the phone and the real world. Additionally, Live View enforces useful constraints: users must be in walking mode and choose a destination on the map to use AR features. Finally, users take action by walking toward their chosen location. This well-designed system promotes clear navigation and reduces the cognitive load on users.

Inspired by film editing techniques, Live View uses "visual momentum" to enhance navigation. This means users experience a smooth transition between the 2D map and the camera view. The blue cursor acts as an anchor point, guiding the user's gaze from the map to the real world seen through the camera. While this approach works well for wayfinding, it has limitations.


Challenges of Live View:
Display Issues: Lag or inconsistency between the map and camera view can disrupt information processing and increase users' cognitive load.
Keyhole Effect: The camera's limited view can create a "keyhole effect" where AR elements clutter the display, potentially causing users to miss important cues in their surroundings. This can lead to attention tunneling, fixation on the phone, and even obstructed views, increasing safety risks, especially for pedestrians.

Solving Augmented Reality Challenges

Tracking & Lag
Live View tackles the limitations of traditional phone navigation by addressing issues like tracking and lag. It uses global localization, combining data from Google Maps, Street View, and the phone's camera to create a seamless AR experience. Previously, relying solely on a blue dot on a 2D map required mental effort to match it with the real world. Live View solves this by directly displaying the user's environment on the screen. As the user moves their phone, the 2D map and camera view transition smoothly together, providing a clear and continuous understanding of their location and surroundings.

Depth Perception
Mobile augmented reality faces a hurdle in depth perception due to single cameras and screens. This creates a flat 2D view that clashes with our natural 3D perception. To address this issue, Live View uses traditional depth cues like occlusion (farther objects being obscured by closer ones). It achieves this by keeping the camera view distinct from the augmented reality elements (map and direction indicators). These elements are placed on top and designed to be visually distinct (bright, colored, with drop shadows) to grab the user's attention and provide depth cues within the limited 2D display.

Live View utilizes some effective engineering psychology concepts:
Relative size: AR elements like directional arrows get larger as you approach the target, indicating urgency. This is especially helpful when other depth cues like atmospheric perspective are unreliable (fog, night).
Depth perception: Keeping the camera view separate from the bright and colored augmented reality elements helps them stand out and provides some depth perception within the limitations of a 2D display.
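The relative-size cue above can be illustrated with a small sketch. The function below maps remaining distance to a display scale factor so an arrow grows as the user nears the target; the function name, distance band, and scale range are all illustrative assumptions, not values from Google Maps.

```python
def arrow_scale(distance_m: float,
                near_m: float = 5.0,
                far_m: float = 100.0,
                min_scale: float = 0.5,
                max_scale: float = 2.0) -> float:
    """Return a display scale for an AR arrow: larger as the user nears the target.

    distance_m -- remaining distance to the waypoint, in meters.
    All thresholds are hypothetical, chosen only to show the relative-size cue.
    """
    # Clamp the distance into the [near_m, far_m] band.
    d = max(near_m, min(far_m, distance_m))
    # Linearly interpolate: far away -> min_scale, close by -> max_scale.
    t = (far_m - d) / (far_m - near_m)
    return min_scale + t * (max_scale - min_scale)
```

Because the scale rises monotonically as distance falls, the arrow conveys approach and urgency even when atmospheric cues are unreliable.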

Display Accuracy
Visual cues play a significant role in enhancing directional accuracy, particularly in augmented reality interfaces. Cues such as arrows that change direction, size, and angle are used in Live View to aid wayfinding. For instance, directional arrows dynamically alter their orientation to guide users effectively. Haptic feedback is used as a reinforcing modality to inform users of a particular event; for example, the phone subtly vibrates when an arrow or marker appears on the screen. Such nuanced interaction is critical, especially when an auditory display may not be ideal due to surrounding noise.

User Safety
To mitigate object-based attentional tunneling, a known safety issue for augmented reality applications, Google Maps makes clear that Live View is intended for use while stationary. The feature detects when the user is moving with the device held upright; when this happens, the UI changes and a notification appears on screen, prompting the user to stop using Live View while walking. Live View also adapts its interface to the phone's angle, smoothly transitioning into a 2D map view when the phone faces downward.
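The safety behavior described above amounts to choosing a display mode from two coarse signals: whether the user is walking and how the phone is tilted. The sketch below is a minimal, hypothetical version of that decision logic; the mode names and the pitch threshold are my own assumptions, not Google's implementation.

```python
def display_mode(is_walking: bool, pitch_deg: float) -> str:
    """Pick a UI mode from motion state and phone pitch.

    pitch_deg -- angle of the phone from horizontal: ~90 when held
    upright (camera facing forward), near 0 when tilted down or flat.
    The 30-degree cutoff is an illustrative assumption.
    """
    if pitch_deg < 30:
        # Phone tilted down: fall back to the familiar 2D map.
        return "map_2d"
    if is_walking:
        # Upright and moving: keep AR but warn the user to stop walking.
        return "ar_with_walking_warning"
    # Upright and stationary: full Live View AR experience.
    return "ar_live_view"
```

Structuring the logic this way makes the safe default (the 2D map) the first branch, so AR is only shown when the phone's pose suggests the user intends to use it.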

Nina Rumbines