In the early days, HMI in a car basically came down to buttons. For each function there was a button, a knob, or a toggle switch. The more features your car offered, the more it looked like the cockpit of an A350. And to keep an overview, it certainly helped to have a good, colourful manual at hand.

HMI design has changed dramatically since. Not only does minimalistic design prevail today; drivers may not even need to touch anything to fire up the engine. True, most cockpits and infotainment systems today are operated via touch screens and touchpads. But you'll find more and more cars equipped with voice control. A simple "Hey Mercedes" or "Hey Porsche" is enough, and a voice assistant is activated.

Although voice control is limited today to a few entertainment functions, this will change: according to UI/UX designer Kaveh Shirdel, voice control will gain massively in importance. There is, however, still a lot of homework to be done. Although the detection rates of voice control systems in cars reach between 95 and 98 percent, this also means that there are still "conversations" between driver and car that end in frustration. Especially in critical situations, it must be ensured that the car remains under the driver's full control.

What about gesture control? HMI expert Peter Rössger (from the consultancy Beyond HMI) sees little potential, because gesture control requires the driver to learn yet another language of interaction: one must know exactly which gesture to perform in front of which visual sensor. Rössger does, however, see the following use case: the driver could point to a switch and ask "What is that?" – and the speech assistant then provides the answer.

Last but not least: we shouldn't think of HMI as one-way communication. The car of the future will know (and learn) a driver's characteristics: acceleration behaviour, braking characteristics and, more generally, a risk-taking or risk-avoiding profile. The car will adapt to the individual driver: the instrument cluster might look different, the presets of the entertainment system will differ. The car of the future will also monitor the driver, his attention level for example. If it detects signs of drowsiness (via eye tracking), the car might emit a warning or, if required, even bring the car to a halt. HMI becomes a two-way interaction.
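The escalation described above (monitor attention, warn, then intervene) can be sketched in a few lines. This is a hypothetical illustration, not any manufacturer's actual system: the `DrowsinessMonitor` class, its thresholds, and the window size are all invented for the example. It uses a PERCLOS-style idea, the fraction of recent eye-tracking samples in which the eyes were closed.

```python
from collections import deque

class DrowsinessMonitor:
    """Illustrative sketch: escalate from no action, to a warning,
    to a safe stop, based on a rolling eye-closure ratio.
    Thresholds and window size are assumptions, not real tuning values."""

    def __init__(self, window_size=100, warn_level=0.3, stop_level=0.6):
        self.samples = deque(maxlen=window_size)  # 1 = eyes closed, 0 = open
        self.warn_level = warn_level
        self.stop_level = stop_level

    def update(self, eyes_closed: bool) -> str:
        """Record one eye-tracking sample and return the required action."""
        self.samples.append(1 if eyes_closed else 0)
        closure_ratio = sum(self.samples) / len(self.samples)
        if closure_ratio >= self.stop_level:
            return "safe_stop"    # bring the car to a halt
        if closure_ratio >= self.warn_level:
            return "warn_driver"  # e.g. audible alert, seat vibration
        return "ok"

monitor = DrowsinessMonitor(window_size=10)
actions = [monitor.update(closed) for closed in [False] * 6 + [True] * 4]
print(actions[-1])  # 4 of the last 10 samples closed → "warn_driver"
```

A real system would of course fuse several signals (steering micro-corrections, lane position, time of day) rather than rely on a single ratio, but the two-way nature of the interaction is the same: the car observes the driver and acts on what it sees.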

Did you know? Codelab has more than 10 years of expertise in state-of-the-art HMI for cars. Learn more: https://codelab.eu/embedded-human-machine-interface/

Sebastian Zang is the managing director of Codelab's German entity. He reads a lot of books and is an avid blogger. Please visit his German-language blog https://bytesforbusiness.com or connect with Sebastian on LinkedIn to see more of his published articles.