a natural language GPS system
Charlie is a natural language navigation system that gives multimodal feedback to the driver.
In this project, we investigated how voice interaction could improve the driving experience. We explored how multimodal interaction could help drivers navigate during their ride without distracting or confusing them.
Rather than treating self-driving technology as the answer to driving challenges, our design focused on improving the interaction and communication between the driver and the car. Even in self-driving cars, the handoff between automated mode and driver mode remains an interaction problem that has yet to be solved. Instead of lessening driver engagement, our solution aims to create a better information loop between the car and the driver by capitalizing on multi-sensory input.
We began our work by conducting secondary research and outlining the current interaction challenges in the GPS navigation experience.
Navigation Applications Don't Adapt Well to Changes
For example, when a driver deviates from the intended route, the system often responds with insufficient feedback or cryptic, unhelpful commands. Even when drivers deviate intentionally, the navigation system keeps issuing disruptive and distracting feedback.
Communicating Intentions is Difficult
As mentioned above, drivers sometimes prefer to follow some navigation instructions and intentionally deviate from others. However, they have no way of communicating their intentions to the system. Often this results in drivers fumbling with a touchscreen while driving, which is dangerous.
Information Exchange Loops are Inadequate
Currently, navigation systems operate as essentially a one-way information exchange. First, drivers input information into the system. Then, while driving, they receive information from the system, with few options to interact with it safely. Current systems do not focus enough on interactivity with the driver, especially in ways that leverage multimodal interactions to create a safer experience.
Based on these challenges, our design aimed to augment the multimodal capabilities of navigation applications by doing the following:
Implicitly sensing driving behavior
Providing actionable feedback
Adapting intelligently to changes
Empowering drivers to make safer decisions
Charlie is a phone-based navigation assistant that uses natural language to give the driver directions. Because it lives on the user's phone, it can be used in any car the user drives, providing customized navigation on the go.
While driving, users can simply talk to the system or use a double-tap gesture on their steering wheel to engage with it.
The system has three modes (Guided, Learning, and Assisted) that can be enabled via voice commands. It also automatically senses changes in the driving context to predict and suggest these modes to the driver.
In guided mode, Charlie prompts the driver for trip information, such as the destination, and preferences, such as avoiding tolls.
In learning mode, the driver can tell Charlie, "Follow my lead." This command indicates that the driver is intentionally deviating from the given route, and Charlie remembers the preference for future trips. At any time, the driver can verbally engage with Charlie again to resume navigation.
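Learning mode's core behavior can be sketched as a small state machine: while the driver is leading, route deviations are logged as preferences instead of triggering reroute prompts. The class and method names below are hypothetical, invented purely to illustrate the interaction:

```python
class LearningSession:
    """Hypothetical sketch of Charlie's learning-mode behavior."""

    def __init__(self) -> None:
        self.following_lead = False
        self.preferred_segments: list[str] = []

    def on_command(self, utterance: str) -> None:
        # "Follow my lead" suppresses rerouting; a resume command restores it.
        text = utterance.lower()
        if "follow my lead" in text:
            self.following_lead = True
        elif "resume navigation" in text:
            self.following_lead = False

    def on_deviation(self, segment_id: str) -> str:
        # While the driver leads, record the segment silently as a
        # preference; otherwise fall back to the usual reroute prompt.
        if self.following_lead:
            self.preferred_segments.append(segment_id)
            return "silent"
        return "reroute"
```

The key design choice this illustrates is that deviation handling depends on driver intent, so the same event produces silence in one state and a prompt in the other.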
When other passengers enter the car, Charlie proposes switching to assisted mode, receding into the background so a passenger can help the driver navigate instead. In this mode, the passenger assists by reading the directions displayed on the screen, and Charlie speaks up only if the passenger forgets to read off an instruction. Charlie also learns the conversational cues and lingo the driver prefers, to provide better directions in the future.