
Haptic steering wheel and other cloud-based services showcased by AT&T researchers

by: Staff, May 1, 2012

[Photo: Turn when the steering wheel tells you]

What is the best way to convey navigation information to drivers?

Probably not with GPS screens, which draw the eyes away from the road. GPS voice commands may be an improvement in this regard, but they still place a cognitive load on the driver, who must interpret them, especially when the directions are confusing. Does “turn right in 200 feet” refer to the first or second of two closely spaced turns?

A better way may be through haptics, or touch-based systems that use vibrations, heat, motion, and other tactile sensations to convey information. With drivers already inundated with visual and audio cues, tactile-based signals might more easily cut through existing clutter to deliver navigation information.

A haptic-enabled steering wheel, for example, might indicate a right or left turn by creating motion in the direction of the turn—clockwise for a right turn and counterclockwise for a left turn. The motion becomes faster and stronger as the turn gets closer, thereby avoiding the confusion of “turn right in 200 feet.” The eyes never have to leave the road.

A prototype of just such a steering wheel has been built by a team of AT&T and Carnegie Mellon University (CMU) researchers. Linked to a GPS program, the wheel is embedded with 20 motors that vibrate one after the other in close succession to simulate motion in a counterclockwise or clockwise direction. Someone holding the wheel feels naturally inclined to turn the wheel in the direction of the motion.

[Photo: the haptic steering wheel prototype]

Interacting with less effort

The motion is simulated: nothing on the wheel actually moves. Instead, each motor vibrates independently in a carefully timed pattern, one firing and then the next. The brain perceives this pattern as continuous motion with a direction.
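
To make that timing pattern concrete, here is a minimal sketch in Python of how such a sequential sweep might be scheduled. The pulse() motor-driver call and the pulse durations are assumptions for illustration; only the count of 20 motors comes from the prototype.

    import time

    NUM_MOTORS = 20  # the prototype embeds 20 motors around the rim

    def pulse(motor: int, duration: float) -> None:
        """Stand-in for a real motor-driver call (hypothetical)."""
        print(f"motor {motor:2d} vibrating for {duration * 1000:.0f} ms")
        time.sleep(duration)

    def sweep(clockwise: bool, urgency: float) -> None:
        """Fire the motors one after another so the brain perceives
        continuous rotation (apparent tactile motion)."""
        # Higher urgency -> shorter pulses -> faster perceived motion,
        # so the sweep quickens as the turn approaches (values assumed).
        duration = 0.12 - 0.08 * urgency  # 120 ms down to 40 ms per motor
        motors = range(NUM_MOTORS)
        for motor in (motors if clockwise else reversed(motors)):
            pulse(motor, duration)

    # A right turn 200 feet ahead: clockwise sweeps that quicken as the car nears it.
    for distance_ft in (200, 100, 50):
        sweep(clockwise=True, urgency=1 - distance_ft / 200)

Firing adjacent motors in quick succession is what creates the illusion: the brain fills in continuous motion between the discrete vibration points.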

Kevin Li, the AT&T researcher who designed the prototype, is taking advantage of the way the human brain processes tactile stimuli, using this knowledge to work with (or “trick”) the brain into perceiving motion. The idea is to create tactile-based signals that already have some kind of prelearned meaning in the brain so the user “just gets it.”

This is contrary to the traditional approach of creating a language of symbols mapped to specific associations, and then having users learn those associations. The use of icons in graphical user interfaces—the current standard for intuitiveness—still requires users to look at the icons and learn the function of each one. It’s simple to do, but there’s an initial, conscious effort to map each icon to a function.

Haptic interfaces aim to skip the conscious learning step entirely by triggering responses the brain already understands, so the user simply feels what to do, without the cognitive effort required when being shown or told what to do.

Tactile feedback is also a more direct way for humans to interact with devices and the digital world than passively looking or listening. Think of a child’s first encounter with a new object. Looking isn’t enough; the object has to be touched, felt, and directly interacted with. Device users get some sense of this when they tap or swipe on a touch screen or gesture with a game device. In these cases, though, the transfer of information is one way.

Haptics provides the possibility for a two-way conversation, with the user relying on natural gestures to relay information, and the devices taking digital-world content and presenting it in a form humans readily absorb with little effort.

This ability to bridge the physical and digital worlds gives haptic interfaces enormous potential in any number of applications and services, and it’s why AT&T Research is actively investigating the field. As computers, devices, and sensors inhabit every aspect of daily life, it is important that the interactions be as seamless as possible; otherwise the effort to continually interpret symbols and learn new associations becomes overwhelming and annoying.

In the automobile, any interaction that requires effort to decipher the meaning of a visual or audio cue becomes a safety concern. AT&T’s Emerging Devices Organization (EDO), which works directly with car manufacturers, is also working with researchers on a number of projects to find the best ways to integrate future services and devices into the car. The haptic steering wheel would seem to offer a more natural, less disruptive way to relay navigation information.

 

User studies under way

But does it work? Since this is the first use of a steering wheel to deliver navigation information, there’s little existing knowledge for evaluating how well the wheel will work. (Haptics is being used in some cars now but only to alert drivers to potential hazards, not to convey information.)

User studies for the navigational steering wheel are now being conducted by CMU researchers (led by SeungJun Kim), who are trying to determine its effectiveness while also evaluating the cognitive load imposed by the wheel. But how to accurately measure cognitive load in a car?

The usual way—presenting a simple secondary task to users while they are performing a primary task—doesn’t work in a car if the secondary task requires drivers to look away from the road. (The secondary task is frequently adding a list of numbers.) Instead, CMU researchers chose to measure biometric responses, such as sweating (galvanic skin response), using standard off-the-shelf sensors; these measurements not only cause no disruption to the primary task, but deliver data in real time.
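
As a rough illustration of why such signals are attractive (and not the CMU team’s actual methodology), a real-time measure might flag moments when skin conductance rises well above a rolling baseline. The sensor stream, sampling rate, and threshold below are invented for the example.

    import random
    from collections import deque
    from statistics import mean

    def gsr_samples():
        """Stand-in generator for skin-conductance readings in microsiemens;
        a real study would stream these from an off-the-shelf sensor."""
        while True:
            yield 2.0 + random.gauss(0, 0.1)

    window = deque(maxlen=120)  # rolling baseline: ~2 minutes at 1 Hz (assumed)
    for second, sample in zip(range(600), gsr_samples()):
        window.append(sample)
        baseline = mean(window)
        # A reading well above the rolling baseline suggests elevated arousal,
        # a crude real-time proxy for increased cognitive load.
        if len(window) == window.maxlen and sample > 1.1 * baseline:
            print(f"t={second}s: {sample:.2f} uS vs. baseline {baseline:.2f} uS")

Because the driver never has to respond to anything, the measurement itself adds no secondary task.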

Other studies are looking to see if the wheel actually helps convey navigation information with less distraction. The early indications are promising, though they involve only 33 drivers, roughly half of them in their 20s and the other half over 65. Younger drivers using the haptic steering wheel in combination with normal visual and audio cues kept their eyes on the road 3.1 percent longer than when using visual and auditory instructions alone (a camera captured their eye movements). Drivers over 65 showed a 4-percent increase in attentiveness while using the haptic-enabled wheel and receiving audio (but not visual) instructions at the same time.

These early studies were conducted with the steering wheel attached to a driving simulator, which may resemble a driving game more than actual driving conditions. More realistic data may have to wait for planned future studies that will put the steering wheel in an actual car. Researchers also want to evaluate how performance changes over time as drivers get used to this kind of technology.

Before any new studies, Li is redesigning the wheel so it better isolates vibrations (the current foam becomes compacted over time, limiting its ability to do so). Another issue is making the system accommodate the many ways people place their hands on the steering wheel.

More motors may also allow Li to convey more information, such as the sharpness of a turn, or higher-level instructions like a left turn followed immediately by a right one.

Detailed information on the results of the completed studies will be published in June at the Pervasive 2012 conference.

 

Press coverage of Living the Networked Life

 

The haptic-enabled steering wheel was only one of several technologies showcased at AT&T's Living the Networked Life event on April 19 in New York City. The following is a selection of web articles generated by the event:

 

Gizmodo: Bio-Acoustics, Haptic Steering Wheels, RFID Everywhere: This is the Future According to AT&T

Bloomberg (video): What's Next: The Virtual Butler

Computerworld: AT&T brings the network closer to the cloud

CNNMoney: At AT&T Labs, universal translators and wearable keys

PCWorld: AT&T Envisions a Smarter Cloud

The Verge: ShadowPuppets prototype lets you pinch-to-zoom, click, and scroll with shadow gestures

All Things D: AT&T Aims to Avoid Opening Can of Worms as It Opens Up Its Network

Engadget (video): QNX's Watson-connected Porsche 911, hands-on

 

Haptic-enabled steering wheel

Wall Street Journal (blog): Driving, With Feeling

ABC News, Tech Bits: The Next GPS: Vibrating Steering Wheel

PCWorld: Researchers Turn to Vibrating Steering Wheel to Shake Bad Driving Habits

 

AT&T WATSON℠ APIs

Technology Review: AT&T Wants to Put Your Voice in Charge of Apps

CNET: AT&T hopes to make WATSON key element in mobile apps

TechCrunch: AT&T Opening Watson Speech Recognition To Developers With New APIs In June