i3 | May 20, 2017

Humanizing the 3rd Space

by Robert E. Calem
Man reading a book while in a self-driving car

Self-driving cars promise greater mobility for people who can’t drive, a pleasurable experience for those who hate to drive, and the potential to disappoint anyone who enjoys time behind the wheel. So, automakers, their technology partners and university researchers are focused on how to develop self-driving vehicles that satisfy all of these audiences.

They are pondering how to make the vehicles better at interacting with the people riding inside and those they encounter outside, and how to train them to drive less like robots — in essence, humanizing the driverless vehicle.

Personalization is Central

“When we talk about new human interactions with self-driving vehicles, we’re talking about a new playing field,” says Thor Lewis, director of UX (user experience) and HMI (human-machine interface) at Toyota Research Institute (TRI) in Los Altos, CA. “We’re taking a very gentle approach to this topic inside the vehicle. The output from the car needs to be very cognizant of the user’s tolerance for sounds, haptics, lights and voice communication. Too much and it will become annoying very quickly. Too little and the user could miss something important. It’s really about communicating in the right way at the right time. We are planning on the car learning, over time, how best to interact with its passengers.”

Moreover, Lewis adds, “Something else to consider is vehicle to pedestrian and other driver communication. Drivers do quite a bit of non-verbal communication to pedestrians and other drivers. It is how we give the right of way at four-way stops, or gesture to a pedestrian it’s OK to cross.” Driverless cars will communicate in the same way, he says. “I love the idea of UX designers and researchers from different manufacturers collaborating on a simple and consistent new physical language which works across all our brands.”

Additionally, self-driving cars will have to accommodate various driving styles. Lewis says, “Traditionally, in HRI (human-robot interaction), there’s this methodology called Sense-Plan-Act. It’s about gathering information, planning what to do and acting accordingly.” Toyota Research Institute has gone a step further, devising “Sense-Plan-Act-Learn, where we add a learn stage at the end to find out if the outcome was optimal. With received feedback from the user, the system adjusts to make things better.”

For example, many cars today offer a sport mode setting that provides faster acceleration and more dynamic handling. This may still be available in a self-driving car, and in TRI’s imagination, “if the acceleration is too much for a user to handle around a certain corner, [he or she] can ask [the car] to go a bit slower next time. The car will then learn and remember,” Lewis explains.
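For illustration, here is a minimal sketch of the Sense-Plan-Act-Learn loop Lewis describes, applied to the cornering example above. Everything in it (the ComfortProfile class, the feedback signal, the 10 percent speed reduction) is an invented assumption, not TRI's implementation.

# Hypothetical sketch of a Sense-Plan-Act-Learn loop (not TRI code).
# A per-passenger comfort profile lowers the planned cornering speed
# whenever the rider reports that the last maneuver felt too aggressive.
from dataclasses import dataclass

@dataclass
class ComfortProfile:
    corner_speed_scale: float = 1.0  # learned multiplier on planned corner speed

def sense():
    # Sense: gather the current driving context (placeholder values).
    return {"upcoming_corner": True, "posted_limit_mps": 20.0}

def plan(state, profile):
    # Plan: pick a target speed, scaled by what this passenger has tolerated before.
    return {"target_speed_mps": state["posted_limit_mps"] * profile.corner_speed_scale}

def act(command):
    # Act: hand the command to the vehicle controller (stubbed as a print).
    print(f"Cornering at {command['target_speed_mps']:.1f} m/s")

def learn(profile, feedback):
    # Learn: fold explicit passenger feedback back into the profile.
    if feedback == "too fast":
        profile.corner_speed_scale = max(0.5, profile.corner_speed_scale * 0.9)
    return profile

profile = ComfortProfile()
act(plan(sense(), profile))                    # first corner, default behavior
profile = learn(profile, feedback="too fast")  # rider asks to "go a bit slower next time"
act(plan(sense(), profile))                    # next corner is taken about 10 percent slower

The point of the added learn() stage is that plan() consults a per-passenger profile the previous trip has already adjusted, which is the "learn and remember" behavior Lewis describes.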

“It has to do with learning from whoever is driving today in order to provide the best experience for the passenger of tomorrow,” says Manuela Papadopol, Seattle-based director of business development and communications at Elektrobit, a global supplier of embedded and connected software tools and engineering services for the automotive industry and a wholly owned subsidiary of Continental AG.

“If I get into a fully self-driving car and go on a ski trip, I want the vehicle to give me information about the slopes and stop at Starbucks without me asking, because [the car] knows that was my behavior when I used to drive,” Papadopol says. “Right now, we still study the operator, the driver, because we want to teach the car — but we also want to get the data to provide the best experience when full autonomy is achieved.”

“Personalization will be key. With autonomous driving becoming maybe mandatory in the near future, the car makers will need to rethink the way they prioritize what goes into a car — software and hardware,” Papadopol says. “This will be where the brand differentiation will happen for the car makers...The personalized experience will be far greater in a self-driving car than it is today.” For instance, she predicts, beyond 2050, windshields will function as displays tailored to that experience, such as being used for videoconferencing or other productivity-oriented tasks.

Nevertheless, data security remains a challenge, Papadopol says. Therefore, Elektrobit has partnered with Argus Cyber Security, the world’s largest independent automotive cybersecurity company, headquartered in Tel Aviv, Israel. The alliance, announced at CES 2017, marries the Argus automotive intrusion detection and prevention technology to Elektrobit’s software and tools for developing secure automotive ECUs (electronic control units). Its aim is to boost cybersecurity features in highly connected, autonomous vehicles.

Information Presented Differently

“A lot of HMI development has been focusing on the semi-autonomous use-case,” via augmented reality displays, eye tracking and other driver monitoring systems, says James Hodgson, industry analyst for smart mobility and automotive at ABI Research in Wellingborough, U.K. “But it’s pretty clear that the tide of investment and strategic direction from players outside the ecosystem and [automakers] alike more recently has been towards fully autonomous,” Hodgson says. In this context, passenger cabins have been dubbed “the third space,” characterizing the vehicle as a living space or a workspace.

But often overlooked is the effect that fully driverless technology and changes to business models have on HMIs and vehicle interiors. He suggests that when shared driverless vehicles are mass-deployed (a la Uber), HMIs should appeal to disparate tastes. Thus, he says, possibilities include rich multimedia displays (because driver distraction is not a concern), physical reconfigurability of the cabin, and embedded keyboards.

Similar notions were shown in various concept vehicles at CES and the 2017 North American International Auto Show in Detroit, including the Rinspeed Oasis, Volkswagen I.D. Buzz and Chrysler Portal.

When asked what they would like to do in a car that drives itself, people typically say watch a movie or work, says Tim Smith, principal of design and head of ustwo Auto, a London (U.K.)-based digital production agency that helps automakers create in-vehicle user experiences, with specialties in contextual HMI, connectivity, human factors for smart mobility, and humanizing autonomy.

“We found there are some nuanced human factors to consider with self-driving cars over semi-autonomous modes of transport such as trains and planes,” Smith says. “Just because you can turn around and watch The Lion King and eat your pizza [in a self-driving car] doesn’t mean people will want that.” Creating an HMI for a driverless car is “about designing for fundamental human needs and not what is technically possible,” Smith adds.

“Where autonomous is interesting is in the service design, and how it integrates in your life. There’s nothing necessarily visual or tangible about that design. It’s just whatever is most appropriate for the user,” he says. “It’s interrupting or providing information to someone. It could be physical, it could be digital, it could be audio, it could be haptics.”

Ustwo Auto’s research and development partners include the University of Washington in Seattle and University College London. The firm’s clients include BMW, Ford, Jaguar Land Rover and Nissan.

Driverless car HMIs are going to be very different from those in today’s vehicles, says Erik Coelingh, senior technical leader at Volvo Cars in Gothenburg, Sweden. “The type of information we show is what we think is relevant for a rider in a self-driving car,” Coelingh says. This can be a prediction of what the car will do, as a way of building confidence, or perhaps a display of “remaining AV time” — how much longer the vehicle can operate autonomously before the driver must retake control.

“The car will not just suddenly do a lane change. It will tell you in advance, so that when the car is moving sideways [the passenger knows] this is not a mistake,” and depending on how long the car will drive itself, “you can decide on what to do. If it’s five minutes you can check some e-mail on your phone. Maybe if the car says it’s half an hour you decide to read a newspaper.”
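Below is a hypothetical illustration of the rider-facing logic Coelingh describes: advance notice of a maneuver, plus an activity suggestion keyed to the remaining self-driving time. The thresholds and message wording are invented for the example and are not Volvo's design.

# Hypothetical sketch of the messages Coelingh describes: advance notice of a
# maneuver, plus a suggestion keyed to remaining autonomous-drive time.
# Thresholds and wording are invented for illustration, not Volvo's design.

def announce_maneuver(maneuver: str, seconds_ahead: int) -> str:
    # Tell the rider what the car is about to do, so the motion isn't read as a mistake.
    return f"In {seconds_ahead} seconds the car will {maneuver}."

def suggest_activity(remaining_av_minutes: int) -> str:
    # Pair the remaining self-driving time with something the rider could do.
    if remaining_av_minutes >= 30:
        return "Roughly half an hour of self-driving left: time to read a newspaper."
    if remaining_av_minutes >= 5:
        return "A few minutes of self-driving left: enough to check email."
    return "Please prepare to take back control of the vehicle."

print(announce_maneuver("change into the left lane", 10))
print(suggest_activity(32))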

And in Volvo’s vision, watching a movie is not unforeseeable, either. In 2015, Volvo introduced Concept 26, a prototype self-driving car interior that integrates a large screen for watching movies plus lounge seats. “Once we solve the technical details for fully autonomous self-driving, then Concept 26 is not so far away. A large portion of it could be implemented in the first fully self-driving vehicle that we will develop,” Coelingh declares. That is expected as early as 2020.

Intel is also studying how to convey information and build passenger confidence in a driverless vehicle, says Kathy Winter, vice president and general manager of the company’s automated driving solutions division in Santa Clara, CA. But Intel has found this becomes a virtuous cycle: as trust is built, passengers pay less attention to the information, she says. “You can almost envision repurposing that display or that screen for something else, maybe give them the choice of turning it on or off once they’re used to it,” she proposes.

At CES 2017, the company debuted Intel GO, a driverless platform for automakers that comprises computing hardware, software development tools, 5G-ready cellular connectivity, artificial intelligence (AI) technology and a cloud-based data center.

“The user experience can evolve dramatically inside the car; I don’t think that’s going to be roadblocked by any regulation,” says Danny Shapiro, senior director of automotive at NVIDIA Corp. in Santa Clara, CA. He imagines a “multimodal” HMI in autonomous vehicles that combines facial recognition, gesture recognition, natural language processing and location awareness — so the user experience may be adapted to a particular passenger, or a passenger may point at something outside, ask “what’s that?” and receive an answer from the car, Shapiro says.

At CES 2017 NVIDIA introduced AI co-pilot technology, which acts as an in-vehicle assistant, monitoring the driver and 360 degrees around the vehicle. CEO Jen-Hsun Huang explained it at the CES opening keynote.

Teaching the Car

Of course, a self-driving car drives in the safest manner, consistently. But that does not always translate to passenger comfort. Although the vehicle may know it’s in no danger of sideswiping a truck in the adjacent lane, the people involved may perceive the situation differently. So it’s the vehicle that should adjust to the human’s sensibilities, not vice versa.

“If we jump way into the future where every car is driven autonomously, you can [anticipate] there are no street lights. Cars will just go through intersections because they’ll all be managed and communicate with each other [to avoid collisions]. But until that point,” Shapiro says, “we absolutely need to build the self-driving car to drive like a human.”

Because every driving situation is different, there is no way to hard-code this human-centric behavior into a self-driving vehicle. Instead, AI and “deep learning” enable the car to be trained as it goes.

“Self-driving cars in the early years will be conservative drivers, but not fully human,” says Volvo’s Coelingh. “Right now we collect lots of statistics on how people drive to find the driving style of the safest drivers on the road. How do they select their speed, how do they choose their accelerations…that kind of driving style we will mimic.”

To that end, Volvo’s Drive Me project is field-testing a fleet of autonomous-capable XC90 SUVs, placed with 100 families in Gothenburg. The automaker will draw direct comparisons between the way vehicles are manually driven and how they perform in self-driving mode, Coelingh says.

Driving like a human can be as simple as avoiding a pothole by moving over a lane rather than staying on course, even if maintaining the driving path is safe, notes Chris Schreiner, director of the user experience and innovation practice at Strategy Analytics in Newton, MA.

Schreiner says the research firm has tested 13 features on highway self-driving vehicles, with consumers in the U.S. and Germany, examining the transfer of control, trust issues, the “naturalness of the driving experience” and other “fun, nitty gritty UX things that go into autonomous cars at the state that they’re in right now.”

Certainly, automakers cannot let their self-driving vehicles acquire the unsafe habits of an average driver. “But for the consumer to accept a self-driving car, it has to feel natural,” Schreiner concludes. “The sweet spot is there, but it’s going to take a lot of effort and resources to find where that spot is.”
