Automakers and their suppliers are rethinking mobility for the 21st century and are revising car interiors to add personalization features while removing distractions. Prominent displays and app integration are key components in this endeavor. But now, those are complemented by more inconspicuous tech, including in-cabin sensors for detecting a driver’s or passenger’s mood and human-machine interfaces (HMIs) that stay hidden until needed. Moreover, the coming proliferation of battery electric vehicles (BEVs) and self-driving cars will propel this evolution of car interiors, giving occupants new ways to interact with the vehicle as well as with each other.
Some of these innovations were seen at CES in recent years. Others are debuting at CES 2020.
Up to now, consumers have not had a compelling reason to adopt BEVs. The auto industry has focused on emphasizing increased range or charging speeds, says Kristin Kolodge, executive director of driver interaction and human machine interface research at J.D. Power in Troy, MI. However, with regard to interiors, Kolodge asserts, “these new mobility options” like BEVs and self-driving cars present “a potential white space opportunity to rewrite the script.” Self-driving cars open time spent in the vehicle to “different type experiences,” rendering usefulness even more important, she says.
“For technology that is on the road today, we see consumers making a very clear delineation with respect to what technology is actually useful — worth the money they spent on it — and that quickly catapults itself into what technology do I want tomorrow,” Kolodge says.
J.D. Power’s 2019 U.S. Tech Experience Index (TXI) Study, released in August, indicates apps built into cars are not meeting users’ expectations. The attribute for “ease of using built-in apps” ranked lowest (7.63 on a 10-point scale) in the entertainment and connectivity category, and 29% of vehicle owners have stopped using their cars’ built-in apps altogether, the report says. The study also found consumer concern about self-driving car technology, plus annoyance with current advanced driver assistance systems (ADAS). Among owners of vehicles equipped with lane-keeping and centering systems, on average 23% consider the associated alerts annoying or bothersome, according to the study. Of those respondents, 63% want the feature in their next vehicle, versus 91% of respondents who weren’t put off by the alerts, J.D. Power says.
So, the auto industry “will be at a crossroads going forward with self-driving cars,” needing to hold customers’ attention with better in-cabin technology — not a brought-in device — once they’re no longer piloting, Kolodge maintains. Displays will stay relevant in future car interiors, but their roles will change, she says. Instead of being dedicated to one context, they’ll enable personalization through interchangeable purposes, she explains.
“Maybe you can’t make your car a living room on wheels yet, but you can still repurpose a lot of the smart home experience within, say, a semi-autonomous vehicle,” says James Hodgson, principal analyst at ABI Research in Wellingborough, U.K. This accounts for the emergence of digital assistants like Amazon Alexa and Google Assistant in autos, he says. Going forward, in-cabin cameras and radars installed for driver and passenger monitoring systems will be leveraged for “emotional intelligence,” too, enabling the car to proactively respond to the user’s mood by adjusting lighting or the music that’s playing. He also foresees biometric technology allowing the car to identify everyone inside it, giving each person access to his or her own content, cloud-based services or settings.
At CES 2020 Hodgson anticipates more focus on this technology in cars. “The automotive industry is scrambling to show things of value in the short and medium term rather than moonshot ideas,” he declares.
Indeed, automotive emotion recognition through artificial intelligence (AI) is “literally the premise on which we developed our technology starting in 2014,” says Modar Alaoui, founder and CEO of Eyeris, based in Palo Alto, CA. Employing multiple in-cabin cameras and an AI chip from various hardware partners, the company’s technology features 10 computer vision neural networks running in parallel to analyze human behavior — upper body, face and activity — in relation to objects and surfaces in real time, Alaoui explains. The addition of object recognition and surface classification neural networks to Eyeris’s existing portfolio of human behavior understanding algorithms results in “in-vehicle, scene-understanding AI” that enables the car to adapt to what’s going on inside it, he says. This is a “renaissance of the in-cabin space, which never before generated any type of meaningful data that could optimize safety and enhance comfort and convenience.”
Eyeris announced its human behavior understanding technology at CES 2018 and recently introduced its first fully integrated system built around combination RGB-IR cameras. At CES 2020 it will demonstrate the system in a Tesla Model S concept car outfitted with eight RGB-IR cameras that, in production, will be smaller than a cell phone camera.
Eyeris has worked with automakers Honda, Jaguar Land Rover, Mitsubishi and Toyota, and with auto industry Tier 1 suppliers including Bosch. Two more partners will be revealed at CES 2020, and the first production vehicles with Eyeris’ technology built in will roll off assembly lines starting in 2022, Alaoui says.
From steering wheels to seats, car interiors have remained essentially unchanged, as has their basic use case: a driver plus passengers or cargo. Looking ahead, “there’s a paradigm shift,” says Alexander van Laack, director of the “cockpit of the future” platform at Faurecia North America, based in Auburn Hills, MI. “What people want to do inside the car will drive how the interior will be structured, and it will enable experiences based on that.”
So, when cars are entirely self-driving, van Laack suggests, the interior could be turned into a complete gaming console, where air conditioning, seating vibrations and fragrance infusion all become part of the game environment and stimulate perceptions. Or, because humans are “multimodal,” stimulating senses may turn the interior into a sort of wellness spa, he says. Faurecia demonstrated this at CES 2019 using seat vibrations combined with music, and the company is also highlighting this at CES 2020.
Seats are evolving, too. At CES 2019, the company showed how they can move, pivot and recline in novel ways while remaining safe, with seatbelts affixed to the seat structure and integrated motors that bring a person upright in the event of an impending collision.
“We’re looking at seats as the cradle of experience in the car,” says John Absmeier, chief technology officer at Lear Corp., headquartered in Southfield, MI. Thus, he says, Lear is exploring “the new SaaS business model: Seat as a Service,” which brings content and functionality that can be delivered on demand, provided as an option at the time of use instead of at the time of sale.
“If you think about the content that we put in seats today, like massage, heating and cooling and connectivity — most of those things are a one-time sale” when the vehicle is ordered and built, Absmeier says. “In the future, it would make sense to put all of the content in these shared vehicles — that are going to be used for different use cases, for different consumers — and then it can be monetized in an as-a-service business model,” he says. “You might have landed in Arizona and it’s very hot so you might be willing to pay a little bit of an upgrade charge to ride in a car with a cooled seat.”
Lear’s SoundZone technology integrates Wi-Fi and Bluetooth in a seat for connecting a personal device and using that to listen to streamed music, have a private conversation or watch a video.
A coming Lear seat enhancement adds sensors to the car seat that allow it to adjust itself in response to the user’s perceived needs. “It’s understanding your muscles, and your movements, and potential discomfort situations,” Absmeier explains. “It’s also understanding driving conditions,” including whether the road is bumpy or there are curves, and how long the driver has been sitting and the implications this has for blood circulation. Consequently, he says, the seat may alter its own lumbar support or side bolster, or even recline or raise its backrest “to make you remain comfortable, in many cases before you go to adjust it yourself.”
This ties into another concept for future car interiors: smart surfaces that can act as controllers or sensors. Those coming proactive comfort sensors may be radio frequency-based heart rate or respiration monitors built into a seat.
Similarly, Absmeier says Lear is working on ways to integrate lighting in seats that would help to alleviate motion sickness in a car.
Lear is a dominant player in the automotive seating business. It generates $16 billion of annual revenue in the category by providing everything from structure through foam, trim and surface material to final assembly.
Lighting and smart surfaces are subsumed under another trend overtaking car interiors — namely, 3D user experiences in touch, sound and visuals, says Tamara Snow, head of the systems technology group for interiors, North America, at Continental Automotive in Troy, MI.
At CES 2018 Continental displayed 3D surround sound with actuators built into vehicle surfaces, as well as 3D surfaces used as displays (which won a CES Best of Innovation award). At CES 2019, the company showed an augmented reality (AR) head-up display (HUD) and morphing surfaces, on which 3D controls light up and pop up when a user’s hand approaches — the latter a form of “shy tech” that appears only when necessary. It also debuted a 3D Lightfield center stack display, which will be at CES 2020, Snow says. In July, Continental partnered with Leia Inc., a Silicon Valley startup, to produce a Natural 3D Lightfield Instrument Cluster, slated to be in cars for sale by 2022.
The 3D Lightfield technology is relevant for future self-driving robotaxis because it can provide a 3D user experience for all passengers in a vehicle, not just the person who’s sitting in the traditional driver’s seat, Snow says. And when car cabins are reconfigured for all passengers, it accommodates moving the content from one 3D display zone to another, she adds. It can also become a holographic display for navigation or parking assistance.
Also at CES 2020, Continental is showing displays inside the cabin combined with surround-view camera systems looking outside, for “clear visual perception of spatial relations.” Its Transparent Hood technology gives a virtual view through the hood of the car, for which the company won a CES 2020 Innovation Award. It’s now available in the 2020 Range Rover SUV from Land Rover.
Transparent Hood is related to another Continental 3D display technology that hasn’t yet been put into a production vehicle, the Virtual A-Pillar. Using a flexible OLED display, head tracking and surround-view cameras, Virtual A-Pillar visually cuts out the front side support structures, thus eliminating another common forward blind spot.
Bosch is also showcasing some of its display developments at CES 2020, on a large wall that represents a car’s dashboard. The purpose is to illustrate how different technologies can be interactive from an end user’s point of view.
Bosch also envisions 3D displays in the car cockpit of the future, says Robert Finger, director of product management for car multimedia at Robert Bosch GmbH in Stuttgart, Germany. They are better at emphasizing the importance of warnings and providing depth perception for navigation, and can serve as a mirror replacement when paired with a 3D camera.
In fact, displays or touch surfaces can replace almost all hard buttons in a car’s cabin, Finger says. They can be woven materials and placed in a seat or armrest. And in the realm of in-cabin monitoring, the company agrees about the utility of RGB-IR cameras for obtaining a better contextual view and analysis, Finger says. But it also sees other use cases, including videoconferencing from the car.
Continental’s Snow declares, “The car becomes much more than just your mode of transportation. The transportation almost becomes the background of what’s happening — it’s foundational — and then you can do a lot more on top of that.”
Mercedes-Benz will showcase a groundbreaking concept car that incorporates a completely new form of interaction between humans, technology and nature — presented in a keynote by Ola Källenius, chairman of the Board of Management of Daimler AG and head of Mercedes-Benz. Källenius is carrying on a tradition started in 2007 by Alan Mulally, then CEO of Ford Motor Co., who used the CES keynote stage to debut SYNC, an unprecedented infotainment system co-developed with Microsoft.
CES is now a major auto show — having been named one of USA Today’s 10 Best Auto Shows — and hosts more than 140 exhibitors representing the entire vehicle technology ecosystem. These include 10 major automakers, top automotive industry suppliers, self-driving car tech companies and automotive software companies.
BYTON is unveiling a production-ready version of its semi-autonomous battery-electric SUV, the M-Byte, first shown as a concept at CES 2018. The interior’s technology fittings remain entirely true to the original plan. But some surprises are to be revealed, too, promises Andre Nitze-Nelson, director of future digital product experience at BYTON.
“The idea is to build and develop a platform in two ways. One is the platform as a car,” Nitze-Nelson says, and the other is a “digital product.” He says, “It must be scalable and designed in a way that it will work in the future without redesigning the entire architecture.” To that end, he says, the vehicle’s interior and HMI were designed around the notion that “everyone in the car is an equal user,” and that the “driver” will eventually be relegated to the dustbin of history.
The front seats rotate inward 12 degrees to facilitate conversations — a pioneering feature in the auto industry that required collaboration with the seat supplier, Nitze-Nelson notes. The unique 48-inch wide curved Shared Experience Display (SED) sits atop the dashboard and is composed of three segments for consuming content. It’s not a touchscreen; its controller is an eight-inch touchpad mounted in the center console between the front seats, which is easily accessible to the front passengers.
Of course, there are driver-centric features and controls, including a seven-inch steering wheel-mounted tablet that can be navigated with thumb swipes. It functions as the driver’s SED controller and provides access to the vehicle’s infotainment system but can also double as a “consumption display” when parked, Nitze-Nelson says.
BYTON ID is a cloud-based account that lets anyone bring their personal vehicle settings to any BYTON vehicle anywhere — a concept that is tailored to BYTON vehicles used by carsharing or ride-hailing services.
“Imagine you have access to something which tracks your movement or activities,” he says. “Now your two accounts and your BYTON ID allow full access even in a car you do not own. You jump in a car, we have facial recognition, we can recognize you, and if so, your user profile unfolds in that very moment.”
The vehicle on display at BYTON’s booth is “a representation” of the model that will go on sale in the U.S. next year, but it is 99.9% accurate, and what may change won’t be noticeable to a casual observer, Nitze-Nelson says. Visitors can interact with both the actual vehicle and a “buck” in which they can sit and explore the SED’s functions for themselves.