GM — Texting+Driving

Using a harm-reduction approach to design a solution to distracted driving

Overview

2.739 Product Design and Development is a graduate product design class wherein teams of ten students from the Rhode Island School of Design (RISD), the MIT Sloan School of Management, and MIT's undergraduate and graduate programs take a product from idea to alpha prototype through the lenses of engineering, design, and business.

My team and I took on a design challenge sponsored by GM to address the safety risks of texting and driving. In a survey of over 180 participants, we found that 80% of respondents text and drive regularly. Existing products on the market attempt to prevent texting and driving by restricting data or even access to the driver's phone while driving. The outright solution would be fully autonomous cars, an ambition outside the scope of the class.

Our assumption was that restrictive technologies suffer from low adoption rates, so our approach was to create a hardware product that could enable safe texting and driving. As project manager and technology lead, I designed and prototyped a novel input device for the steering wheel and a GUI for a heads-up display (HUD) projected onto the car windshield.

Role

Chief Design Officer

Timeline

Spring 2015 (one semester)

Deliverables

Research, UI/UX design, Alpha prototype

MIT 2.739 Product Design and Development / Spring 2015

User Needs

Product Research

Our first step was to fully understand the problem space by interviewing drivers and investigating their texting and driving habits.

Across over 180 responses, we found that 89% of respondents were between 14 and 30 years old, 80% of whom regularly text and drive.

We also conducted observational studies with survey respondents who text and drive to identify the most dangerous behaviors they engage in behind the wheel.

After studying several drivers' texting-and-driving habits, we concluded that our product could enable safe texting and driving only if three needs were met: the driver's eyes stay on the road, the driver's hands stay on the wheel, and the driver can complete the task at hand (e.g., sending a text, changing the music, taking a call) as quickly as possible.

Display + Input

Product Concept

After ideation, we converged on a product concept we believed could address the user needs outlined above. The product would consist of a HUD and capacitive touch pads on the steering wheel that afford typing and other touch gestures. Tethered to a smartphone, the touch pads enable interaction while keeping the driver's hands on the wheel, and the HUD displays information at a focal distance of 40 ft, keeping the driver's eyes on the road.

The product would also incorporate safety features that already exist on the market, like lane-departure alerts and imminent-crash detection. Access to the touch pads and HUD would be allowed only when it is safe: during turns or at excessive speeds, the HUD turns off and the touch pads stop registering input.
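As a rough sketch of that gating logic (the signal names, thresholds, and HUD/touch pad stand-ins below are illustrative assumptions, not values from our prototype):

```ts
// Hypothetical sketch of the safety gate described above.
interface VehicleState {
  speedMph: number;          // e.g. read from the car's OBD-II port
  steeringAngleDeg: number;  // nonzero while turning
}

const MAX_SAFE_SPEED_MPH = 65;     // assumed "excessive speed" cutoff
const MAX_STEERING_ANGLE_DEG = 10; // assumed "turning" cutoff

// Input is allowed only while driving straight at a reasonable speed.
function inputAllowed(s: VehicleState): boolean {
  return s.speedMph <= MAX_SAFE_SPEED_MPH &&
         Math.abs(s.steeringAngleDeg) <= MAX_STEERING_ANGLE_DEG;
}

// Stand-ins for the HUD renderer and touch pad input layer.
const hud = { show: () => {}, hide: () => {} };
const touchpads = { enable: () => {}, disable: () => {} };

// Run on every vehicle-state update.
function onVehicleState(s: VehicleState): void {
  if (inputAllowed(s)) {
    hud.show();
    touchpads.enable();
  } else {
    hud.hide();          // HUD turns off during turns or high speed
    touchpads.disable(); // pads stop registering input
  }
}
```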

QWERTY backlights on the touch pads serve only as a reminder of the key layout—the lights dim while the car is moving to discourage the driver from looking down at the pads.

UI Prototype

UI Design + Hardware Prototyping

This is a hardware demo of the touch pads and UI in action, taken right after I got everything working :) I bought two Adafruit trackpads, removed their enclosures, and placed the click buttons underneath the pads to enable a push-down click like a modern MacBook trackpad.

The stock Adafruit Arduino libraries for the trackpads could only read data from one trackpad at a time, so I heavily modified them to read both pads simultaneously via asynchronous (non-blocking) functions.

Then I implemented my WebKit-based UI as a Node-WebKit app so that I could access the trackpad data over a serial port, which a normal web app is not allowed to do.
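Here's a minimal sketch of that serial hookup, assuming the `serialport` npm package and an Arduino that streams both pads' data as one comma-separated line per frame. The frame format, port path, and baud rate are illustrative, not our exact protocol:

```ts
import { SerialPort } from 'serialport';
import { ReadlineParser } from '@serialport/parser-readline';

// One CSV line per frame from the Arduino, e.g. "12,-3,0,5,8,1":
// left dx, left dy, left pressed, right dx, right dy, right pressed.
interface PadFrame {
  left: { dx: number; dy: number; pressed: boolean };
  right: { dx: number; dy: number; pressed: boolean };
}

function parseFrame(line: string): PadFrame | null {
  const v = line.trim().split(',').map(Number);
  if (v.length !== 6 || v.some(Number.isNaN)) return null; // drop garbled frames
  return {
    left: { dx: v[0], dy: v[1], pressed: v[2] !== 0 },
    right: { dx: v[3], dy: v[4], pressed: v[5] !== 0 },
  };
}

// Port path and baud rate are machine-specific assumptions.
const port = new SerialPort({ path: '/dev/tty.usbmodem1411', baudRate: 115200 });
const lines = port.pipe(new ReadlineParser({ delimiter: '\n' }));

lines.on('data', (line: string) => {
  const frame = parseFrame(line);
  if (frame) {
    // Feed the frame into the UI layer (cursor movement, clicks, etc.).
    console.log(frame);
  }
});
```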

In the video, I'm moving my thumbs over the trackpads, which moves a white cursor over the keys on screen. When I'm hovering over the key I want to type, I press down for a deep click and the key registers. Tap-to-type was built soon after this video was made, enabling fast typing from muscle memory, similar to typing on a touchscreen without looking.
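Under the hood, the hover-then-click behavior reduces to hit-testing the cursor against a key grid and registering a key on the press edge of the click. A simplified sketch, where the key grid, sizes, and function names are illustrative assumptions:

```ts
// One half of the split QWERTY layout, driven by one thumb/trackpad.
interface Key { char: string; x: number; y: number; w: number; h: number }

const KEY_W = 40, KEY_H = 40;
const LEFT_ROWS = ['qwert', 'asdfg', 'zxcvb'];
const leftKeys: Key[] = LEFT_ROWS.flatMap((row, r) =>
  [...row].map((char, c) => ({ char, x: c * KEY_W, y: r * KEY_H, w: KEY_W, h: KEY_H }))
);

// Return the key under the cursor, if any.
function hitTest(keys: Key[], cx: number, cy: number): Key | null {
  return keys.find(k => cx >= k.x && cx < k.x + k.w &&
                        cy >= k.y && cy < k.y + k.h) ?? null;
}

let cursorX = 0, cursorY = 0;
let wasPressed = false;

// Called on every trackpad frame: move the cursor by the pad's deltas,
// and register the hovered key on the press edge of a deep click.
function onPadFrame(dx: number, dy: number, pressed: boolean, type: (ch: string) => void) {
  cursorX += dx;
  cursorY += dy;
  if (pressed && !wasPressed) {          // rising edge = deep click
    const key = hitTest(leftKeys, cursorX, cursorY);
    if (key) type(key.char);
  }
  wasPressed = pressed;
}
```

Tap-to-type would presumably replace the press-edge check with a tap gesture on the pad, registering the key under the thumb at touch-down, so practiced users never have to steer the cursor at all.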

HUD UI for texting. A split QWERTY layout gives visual feedback for key touches on the touch pads without the driver having to look down, while text entry, car speed, and distance to the next car sit in the center of view, showing all the essential information at once.

This is a UI mockup displayed over stock driving footage. The idle UI shows driving information like the driver's speed and distance to the car ahead. The main UI is triggered by a two-thumb swipe up. In this demo, the driver sends a message to Cameron, receives a reply, and then closes the main UI to get back to driving.