What will the Human Machine Interface look like in the Car of the Future?


Sebastian Zang

In the early days, HMI in a car basically came down to buttons. For each function there was a button, a knob, or a toggle switch. The more features your car offered, the more it looked like the cockpit of an A350 airplane. And to keep an overview, it certainly helped to have a good, colourful manual at hand.

HMI design has changed dramatically since. Not only does minimalistic design prevail today; drivers may not even need to touch anything to fire up the engine. Sure, most cockpits and infotainment systems today are operated via touch screens and touchpads. But you'll find more and more cars equipped with voice control. A simple "Hey Mercedes" or "Hey Porsche" is enough, and a voice assistant is activated.

Although voice control is limited today to a few entertainment functions, this will change: according to UI/UX designer Kaveh Shirdel, voice control will gain massively in importance. There is, however, still a lot of homework to be done. Although the detection rates of voice control systems in cars reach between 95 and 98 percent, this also means that there are still "conversations" between driver and car that end in frustration. Especially in critical situations, it must be ensured that the car is under the full control of the driver.

What about gesture control? HMI expert Peter Rössger (from the consultancy Beyond HMI) sees little potential, because gesture control requires the driver to learn another language of interaction: one must know exactly which gesture must be performed in front of which visual sensor. Rössger does, however, see the following use case: the driver could point to a switch and ask "What is that?" – and the speech assistant then provides the answer.

Last but not least: we shouldn't think of HMI as one-way communication. The car of the future will know (and learn) a driver's characteristics: acceleration behaviour, braking characteristics and, more generally, a risk-taking or risk-avoiding profile. And the car will adapt to the individual driver: the instrument cluster might look different, the presettings of the entertainment system will be different. The car of the future will also monitor the driver, his attention level, for example. If it detects signs of drowsiness (via eye tracking), the car might emit a warning or, if required, even bring itself to a halt. HMI becomes a two-way interaction.
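To make the drowsiness-warning idea concrete, here is a minimal sketch of how such monitoring logic might look. It assumes an eye tracker delivers a per-frame "eyes closed" flag and uses a rolling eye-closure ratio (the basis of the well-known PERCLOS drowsiness metric); the class name, window size, and threshold are illustrative assumptions, not a description of any carmaker's actual system.

```python
from collections import deque

class DrowsinessMonitor:
    """Toy sketch: warn when the fraction of recent frames with
    closed eyes (a PERCLOS-style ratio) exceeds a threshold.
    All parameter values here are illustrative assumptions."""

    def __init__(self, window_frames=900, warn_threshold=0.15):
        # e.g. 900 frames is roughly 30 s at 30 fps
        self.frames = deque(maxlen=window_frames)
        self.warn_threshold = warn_threshold

    def update(self, eyes_closed: bool) -> str:
        """Feed one camera frame's result; return 'ok' or 'warn'."""
        self.frames.append(1 if eyes_closed else 0)
        closure_ratio = sum(self.frames) / len(self.frames)
        return "warn" if closure_ratio >= self.warn_threshold else "ok"
```

A production system would of course fuse several signals (blink duration, head pose, steering behaviour) and escalate from a warning to intervention, but the rolling-window pattern above captures the core idea of continuous driver-state monitoring.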

Did you know? – Codelab has more than 10 years of expertise in the area of state-of-the-art HMI for cars. Learn more: https://codelab.eu/embedded-human-machine-interface/

Sebastian Zang is the managing director of Codelab's German entity. He reads a lot of books and is an avid blogger. Please visit his German-language blog https://bytesforbusiness.com or connect with Sebastian on LinkedIn to see more of his published articles.
