On the Next Auto User Interface

[Image: Cluster_future.JPG]

June 2019

We are just starting to see what happens at the intersection of functionally and aesthetically evocative physical and digital design – where pixels meet textures, colors and form as they dance through space and time together to elevate meaning, stimulate emotions and say something unique about your brand.  Now add cinematic effects, new audio vocabularies, customizable user interface frameworks and maybe some new simple input modalities (beyond touch) that are learned and used with some frequency over time.

To date, we have seen these differentiating techniques applied most often to Drive Modes (the behavioral modes of your vehicle – Sport, Eco, Mud, Snow, etc.), as switching between them and driving in them becomes more unique and dynamic. Other elements of customization and personalization enable you to “make it yours,” as seen here. However, these are just the beginning of the many choreographed, digitally-driven visual, functional and multi-sensory interactions that we will experience in the vehicles to come.

The reality, though, is that vehicle owners do not fundamentally understand most of what their vehicles are capable of doing. They do not understand all of the features and functions of the vehicle, and when they attempt to use a function, they do not always understand what the controls mean or do – or, said another way, “what the car is saying.” States like “Standby” or “Eco-Coach” or “Auto” imply or are interpreted one way, but often work another way. At the most basic signal level, drivers do not understand what an icon means, what a word on a button or a panel means, or whether pushing a button turns the feature On or Off. These are fundamental core design challenges, similar to those that have existed for decades in other product categories with complex systems. Does anyone remember the days of the multi-function device (MFD) office machines that faxed, scanned, copied, printed and emailed all in one? Over time, user-informed, iterative, human-centered design processes and methods were used to solve those challenges.

At the same time, constant business pressures are being applied. OEMs are looking for more and more opportunities to reuse components across vehicle lines. They are also establishing pricing based on features – the more features, the higher the price. What is the value of a feature that you think you need but never really understand how to use, or one whose default setting was Off and you never turned it On? Even everyday features like listening to music, navigating from A to B, and making and receiving calls could use some improvement.

[Image: Future_Past.JPG]

These same OEMs are now considering whether more features should be digital (located in or on the screens) in order to save tooling costs and enable over-the-air updates. The consumer learning curve can be significant once physical controls (like door locks, drive modes, sunroof adjustment or all of your climate controls) are placed deep into a screen information architecture, as sketched below. Remember that not too long ago, even our phones and maps were analogue. Many internal HMI and UX teams are establishing guidelines or design systems. Should they – or are there already instinctual rules that humans automatically assume, like expecting changes I can feel to be triggered by physical, manual interactions (adjusting or turning a knob)? Some of these might include adjusting a seat, controlling the fan speed or heating up my seat.
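As a loose illustration (all names and numbers below are hypothetical, not any OEM’s actual menu structure), one way an HMI team might quantify that learning-curve cost is to count the steps needed to reach a control once it moves from a dedicated physical switch into a nested screen menu:

```python
# Hypothetical sketch: estimate how many driver actions it takes to reach a control,
# comparing a dedicated physical control with one buried in a screen menu hierarchy.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MenuNode:
    name: str
    parent: Optional["MenuNode"] = None

@dataclass
class Control:
    name: str
    menu_location: Optional[MenuNode] = None  # None = dedicated physical control

def steps_to_reach(control: Control) -> int:
    """Physical control: one action. Screen control: one tap per menu level plus the adjustment itself."""
    if control.menu_location is None:
        return 1
    steps, node = 1, control.menu_location
    while node is not None:
        steps += 1
        node = node.parent
    return steps

# Example: fan speed as a knob vs. nested under Home > Settings > Climate > Airflow.
home = MenuNode("Home")
settings = MenuNode("Settings", parent=home)
climate = MenuNode("Climate", parent=settings)
airflow = MenuNode("Airflow", parent=climate)

print(steps_to_reach(Control("Fan speed (knob)")))             # 1
print(steps_to_reach(Control("Fan speed (screen)", airflow)))  # 5
```

Even this crude tap count makes the trade-off visible: the on-screen version of the same control costs several interactions where the knob cost one.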

Earlier this year, I counted the number of feature or function choices that one can make while sitting in the driver’s seat of the 2018 Porsche Panamera Turbo S E-Hybrid. Between switch packs on the doors, the steering wheel, the seats and the center console, the many functions of three stalks, the touch screen (on Navigation), and other dispersed buttons, switches and toggles – I stopped counting at 100 features within reach. Good design dictates that all of these features, functions and controls be clearly recognized and understood, fit for their purpose, and thoroughly useful and usable during each day’s journeys – including while cruising at 70-75 MPH.

The automotive industry is due for a dose of creativity, simplicity and clarity as it approaches the end of the Internal Combustion Engine (ICE) age and pixels and data collide and reform to serve up emotionally captivating, sensory-rich and contextually relevant experiences – hopefully mostly on electrified drivetrains with really long ranges. I recently read in “UX Matters” that “… the future of computing will be invisible – and integrated. Instead of having a notebook computer and a smartphone and a tablet and a smartwatch, we’ll have computing capabilities that fit organically and seamlessly into the contexts of our lives.” And further: “Yes, there will also be computing environments in which we create, which require high-fidelity inputs and outputs, as well as more processing power. But even these will get better as their screens blend into our environments and improve in resolution, the combination of artificial intelligence and more natural input paradigms reduce the need for typing and clicking, and transitions from our creative platform to mobile devices happen seamlessly.” It seems as if all of this is coming at a very slow pace.

[Image: Future_cockpit.JPG]

Interior automobile cockpit design is still (with only one or two exceptions) a showcase of what can be engineered across components that are combined to create an immersive space, rather than a connected system designed around human needs, activities and desires. These products are the experience, and these experiences are the company. Whether I am a passenger working in the back seat, a front passenger watching a movie on a long trip, or I simply missed a meal and need to eat and drink before my next meeting, the space around me will likely become more flexible, modular, extensible and accommodating to my needs. This will be a direct result of the learnings from building diverse vehicle types on the same vehicle platforms and from designing vehicles that can be adjusted and retrofitted from moving people to moving goods (or both at the same time).

Our interaction with the world around us will continue to get more useful and convenient as vehicles adopt plug-and-play (or pair-and-play) access to unlimited content, function and services. Consumers will soon expect new vehicles to support contextually relevant transaction services (like coffee & fuel & parking payment), access to favorite media and cost savings from subscriptions and memberships. I can Pause and Mute all of my content in my living room, so why can’t I do that in my vehicle?
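A minimal sketch of what that parity could look like under the hood – a hypothetical, simplified interface, not any production infotainment API – is a single cabin-level contract that every paired or embedded media source implements, so “Pause” and “Mute” mean the same thing regardless of what is playing:

```python
# Hypothetical, simplified media-control contract for a vehicle cabin.
from abc import ABC, abstractmethod

class MediaSource(ABC):
    @abstractmethod
    def pause(self) -> None: ...
    @abstractmethod
    def mute(self) -> None: ...

class PairedPhoneAudio(MediaSource):
    def pause(self) -> None:
        print("Phone audio paused")
    def mute(self) -> None:
        print("Phone audio muted")

class EmbeddedStreamingApp(MediaSource):
    def pause(self) -> None:
        print("Streaming app paused")
    def mute(self) -> None:
        print("Streaming app muted")

class CabinMediaController:
    """One steering-wheel button or voice command, fanned out to whatever is active."""
    def __init__(self, sources: list[MediaSource]) -> None:
        self.sources = sources
    def pause_all(self) -> None:
        for source in self.sources:
            source.pause()
    def mute_all(self) -> None:
        for source in self.sources:
            source.mute()

controller = CabinMediaController([PairedPhoneAudio(), EmbeddedStreamingApp()])
controller.pause_all()
controller.mute_all()
```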

Then there is awareness: awareness of what is happening in the vehicle, what is happening around the vehicle, and what is happening between the vehicle’s current location and its destination. The vehicle can identify who is on board, and maybe even how many devices are on board and of what kind. The driver becomes more aware of what is around the vehicle, what is approaching the vehicle, and what is going on with traffic, upcoming signals and other conditions from now through the end of the journey. Visibility and awareness are features well worth investing in, and perhaps the next democratized features will be the enhancement of mirrors and visibility with cameras, or the awareness of and automatic maneuvering around potholes. I am betting on the tech beating the road crews <grin>. And when we consider ride sharing or whole-vehicle sharing, the daily agenda shifts, and awareness and planning may expand to the full day, multiple participants and multiple jobs to be done.
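Sketched as data (all field names are hypothetical), the awareness described above amounts to one shared context model spanning the cabin, the surroundings and the remainder of the journey, which any feature could then draw on:

```python
# Hypothetical context model for in-vehicle awareness (field names are illustrative only).
from dataclasses import dataclass, field

@dataclass
class CabinContext:
    occupants: list[str] = field(default_factory=list)       # recognized driver profile, passengers
    paired_devices: list[str] = field(default_factory=list)  # phones, tablets detected on board

@dataclass
class SurroundingsContext:
    nearby_hazards: list[str] = field(default_factory=list)  # e.g. "pothole ahead", "cyclist on left"
    approaching_signals: list[str] = field(default_factory=list)

@dataclass
class JourneyContext:
    destination: str = ""
    traffic_incidents: list[str] = field(default_factory=list)

@dataclass
class VehicleAwareness:
    cabin: CabinContext = field(default_factory=CabinContext)
    surroundings: SurroundingsContext = field(default_factory=SurroundingsContext)
    journey: JourneyContext = field(default_factory=JourneyContext)

state = VehicleAwareness()
state.cabin.occupants.append("driver profile: Alex")
state.surroundings.nearby_hazards.append("pothole ahead")
print(state.journey.destination or "no destination set")
```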

And if you choose to converse, then your smart digital assistant will play a significant role in this evolved human-machine dialogue. An existential question remains: when can we say that the vehicle or system is actually aware, versus merely learning dependencies and relationships from data sets? These systems are quickly evolving from supervised learners to unsupervised learners, with a hunger to gain reinforcement by maximizing reward while minimizing risk. They will quickly evolve beyond human intelligence and, at the same time, enhance our abilities as humans. In the meantime, Alexa or Siri or Google Assistant is happy to fill your drive time with happy banter.
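For readers less familiar with the terminology, the “maximizing reward while minimizing risk” idea can be illustrated with a deliberately tiny, generic reinforcement-learning sketch – an epsilon-greedy learner choosing among assistant suggestions. This is illustrative only, not how any in-vehicle assistant is actually implemented:

```python
# Generic epsilon-greedy sketch: learn which suggestion drivers respond to,
# mostly exploiting the best-known option while occasionally exploring.
import random

actions = ["suggest coffee stop", "suggest faster route", "stay quiet"]
estimated_reward = {a: 0.0 for a in actions}
counts = {a: 0 for a in actions}
epsilon = 0.1  # small exploration rate keeps the "risk" of an unwanted suggestion low

def choose_action() -> str:
    if random.random() < epsilon:
        return random.choice(actions)                            # explore occasionally
    return max(actions, key=lambda a: estimated_reward[a])       # otherwise exploit

def update(action: str, reward: float) -> None:
    """Incrementally update the running average reward for the chosen action."""
    counts[action] += 1
    estimated_reward[action] += (reward - estimated_reward[action]) / counts[action]

# Simulated driver feedback: accepting a suggestion yields reward 1, ignoring it 0.
for _ in range(1000):
    action = choose_action()
    reward = 1.0 if (action == "suggest coffee stop" and random.random() < 0.6) else 0.0
    update(action, reward)

print(max(estimated_reward, key=estimated_reward.get))  # likely "suggest coffee stop"
```

The small epsilon keeps exploration rare while the running reward estimates steer the learner toward whatever drivers actually respond to.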

Admittedly, these are complex topics about complex systems of systems that come to life via creative and precise design and engineering. But maybe more importantly, they are topics related to internal organizational structure, strategic planning processes, the governance of decision making, continuous feedback loops and a culture and enterprise mindset of constant learning, experimentation and adherence to the belief that “the human is at the center of everything that we do” as is often heard these days.

[Image: Quote_on_Screens_3.JPG]

movotiv

Product, Service & Emotional System - Design Consulting & Collaboration