The automotive industry is becoming more complex every day. All companies, whether OEMs or Tier suppliers, operate in a VUCA (Volatile, Uncertain, Complex and Ambiguous) world. The systems that we build...
In the early days, HMI in a car basically came down to buttons. For each function there was a button, a knob, or a toggle switch. The more features your car offered, the more its interior looked like the cockpit of an A350 airliner. And to keep an overview, it certainly helped to have a good, colourful manual at hand.
HMI design has changed dramatically since then. Not only does minimalistic design prevail today; drivers may not even need to touch anything to fire up the engine. Most cockpits and infotainment systems today are operated via touch screens and touchpads, but you'll find more and more cars equipped with voice control. A simple "Hey Mercedes" or "Hey Porsche" is enough, and a voice assistant is activated.
Although voice control is today limited to a few entertainment functions, this will change: according to UI/UX designer Kaveh Shirdel, voice control will gain massively in importance. There is, however, still homework to be done. The detection rates of in-car voice control systems reach between 95 and 98 percent, which also means: there are still "conversations" between driver and car that end in frustration. Especially in critical situations, it must be ensured that the car remains under the driver's full control.
What about gesture control? HMI expert Peter Rössger (from the consultancy Beyond HMI) sees little potential, because gesture control requires the driver to learn yet another interaction language: one must know exactly which gesture to perform in front of which visual sensor. Rössger does, however, see one use case: the driver points at a switch and asks "What is that?", and the speech assistant provides the answer.
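The "point and ask" use case can be sketched as a simple fusion of two inputs: a gesture recognizer that resolves the pointing target to a cabin component, and a voice query that triggers the lookup. This is a minimal illustration; the component names, descriptions, and function name are invented for the example, not taken from any real vehicle API.

```python
# Hypothetical "point and ask" fusion: a pointing gesture resolves to a
# cabin component ID, and a "What is that?" voice query triggers a lookup.
# All component names and descriptions below are invented for illustration.

COMPONENT_INFO = {
    "defrost_switch": "Rear-window defroster: heats the rear glass to clear fog or ice.",
    "esc_button": "Electronic Stability Control toggle: helps prevent skidding.",
}

def answer_point_and_ask(pointed_component: str, query: str) -> str:
    """Combine the gesture target and the voice query into a spoken answer."""
    if query.strip().lower() != "what is that?":
        return "Sorry, I did not understand the question."
    description = COMPONENT_INFO.get(pointed_component)
    if description is None:
        return "I don't have information about that control."
    return description
```

The point of the design is that neither channel alone is enough: the gesture supplies the referent ("that"), the voice query supplies the intent.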
Last but not least: we shouldn't think of HMI as one-way communication. The car of the future will know (and learn) a driver's characteristics: acceleration behaviour, braking style and, more generally, a risk-taking or risk-avoiding profile. The car will adapt to the individual driver: the instrument cluster might look different, and the presets of the entertainment system will differ. The car of the future will also monitor the driver, for example his or her attention level. If it detects signs of drowsiness (via eye tracking), the car might emit a warning or, if required, even bring itself to a halt. HMI becomes a two-way interaction.
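The eye-tracking-based drowsiness warning mentioned above can be sketched with PERCLOS, a commonly used metric: the percentage of time the eyes are closed over a sliding window. The sketch below assumes a per-frame eye-openness signal in [0.0, 1.0] from an upstream eye tracker; the class name and thresholds are illustrative assumptions, not a production calibration.

```python
# Minimal sketch of a PERCLOS-style drowsiness monitor.
# Assumes an upstream eye tracker delivers one eye-openness value per frame
# (0.0 = fully closed, 1.0 = fully open); thresholds are illustrative only.

from collections import deque

class DrowsinessMonitor:
    def __init__(self, window_size=90, closed_threshold=0.2, perclos_limit=0.4):
        self.samples = deque(maxlen=window_size)   # recent eye-openness values
        self.closed_threshold = closed_threshold   # below this, eye counts as closed
        self.perclos_limit = perclos_limit         # warn above this closure ratio

    def update(self, eye_openness: float) -> bool:
        """Record one frame; return True if a drowsiness warning should fire."""
        self.samples.append(eye_openness)
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough history for a stable estimate yet
        closed = sum(1 for s in self.samples if s < self.closed_threshold)
        perclos = closed / len(self.samples)
        return perclos > self.perclos_limit
```

In a real vehicle this boolean would feed an escalation chain (visual warning, audible warning, and only then an intervention such as bringing the car to a halt), rather than triggering an action directly.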
Did you know? — Codelab has more than 10 years of expertise in the area of state-of-the-art HMI for cars. Learn more: https://codelab.eu/embedded-human-machine-interface/