In the early days, HMI in a car basically came down to buttons. For each functionality there was a button, a knob, or a toggle switch. The more features your car offered, the more it resembled the cockpit of an A350. And to keep track of it all, a good, colourful manual at hand certainly helped.

HMI design has changed dramatically since. Not only does minimalistic design prevail today; drivers may not even need to touch anything to fire up the engine. True, most cockpits and infotainment systems today are operated via touch screens and touchpads. But you'll find more and more cars equipped with voice control. A simple "Hey Mercedes" or "Hey Porsche" is enough, and a voice assistant is activated.

Although voice control is today limited to a few entertainment functions, this will change: according to UI/UX designer Kaveh Shirdel, voice control will gain massively in importance. There is, however, still homework to be done. Detection rates of in-car voice control systems reach between 95 and 98 percent, but that also means there are still "conversations" between driver and car that end in frustration. Especially in critical situations, it must be ensured that the car remains under the driver's full control.

What about gesture control? HMI expert Peter Rössger (from the consultancy Beyond HMI) sees little potential, because gesture control requires the driver to learn yet another language of interaction: one must know exactly which gesture to perform in front of which visual sensor. Rössger does, however, see the following use case: the driver could point to a switch and ask "What is that?", and the voice assistant then provides the answer.

Last but not least: we shouldn't think of HMI as one-way communication. The car of the future will know (and learn) a driver's characteristics, for example acceleration behaviour, braking characteristics and, more generally, a risk-taking or risk-avoiding profile. And the car will adapt to the individual driver: the instrument cluster might look different, the presets of the entertainment system will differ. The car of the future will also monitor the driver, for instance his or her attention level. If it detects signs of drowsiness (via eye tracking), the car might emit a warning or, if required, even bring itself to a halt. HMI becomes a two-way interaction.
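
To make this two-way idea a bit more concrete, here is a minimal sketch of such an escalation logic in Python. The function names (classify_alertness, react), the PERCLOS-style eye-closure metric and all thresholds are illustrative assumptions for this post, not the logic of any real driver-monitoring system.

```python
from enum import Enum


class Alertness(Enum):
    ALERT = 0
    DROWSY = 1
    UNRESPONSIVE = 2


def classify_alertness(eye_closure_ratio: float, seconds_eyes_closed: float) -> Alertness:
    """Roughly classify the driver's state from eye-tracking signals.

    eye_closure_ratio: fraction of the last minute the eyes were closed
    (a PERCLOS-like metric); seconds_eyes_closed: current continuous closure.
    Thresholds here are purely illustrative.
    """
    if seconds_eyes_closed > 2.0:
        return Alertness.UNRESPONSIVE
    if eye_closure_ratio > 0.4:
        return Alertness.DROWSY
    return Alertness.ALERT


def react(state: Alertness) -> str:
    """Escalate gradually: warn the driver first, intervene only if needed."""
    if state is Alertness.UNRESPONSIVE:
        return "initiate safe stop"
    if state is Alertness.DROWSY:
        return "emit acoustic and visual warning"
    return "no action"


if __name__ == "__main__":
    # Example: eyes closed 55% of the last minute, but currently open again.
    print(react(classify_alertness(eye_closure_ratio=0.55, seconds_eyes_closed=0.8)))
    # -> emit acoustic and visual warning
```

The point of the sketch is the staged response: the system talks back to the driver before it ever takes control, which is exactly what turns HMI into a dialogue rather than a one-way channel.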

Did you know? Codelab has more than 10 years of expertise in the area of state-of-the-art HMI for cars. Learn more: https://codelab.eu/embedded-human-machine-interface/

Sebastian Zang is the managing director of Codelab's German entity. He reads a lot of books and is an avid blogger. Please visit his German-language blog https://bytesforbusiness.com or connect with Sebastian on LinkedIn to see more of his published articles.