The Human Machine Interface

Arval Mobility Observatory 5 Nov 2020

Paris, 5th of November 2020


The idea of the Human Machine Interface (HMI) is as old as the Industrial Revolution.

It describes how humans interact with tools, whether through buttons, voice commands or other interactive means.


The automobile is an interesting illustration of the HMI's evolution throughout the years. Today, AI assistants and near-holographic displays coexist with the traditional pedals, steering wheel and gear lever, present since the first cars.

Some things have changed beyond recognition. Buttons and knobs, for years responsible for climate and entertainment control, are being replaced by touch displays. In 2019, 85% of cars sold worldwide had a touchscreen, a big step from 53% in 2014.

These interactive displays control most, if not all, onboard systems. But does this application of the HMI protect the driver from distraction?

The AAA Foundation for Traffic Safety reports that it can take up to 43 seconds of a driver’s attention to make a music selection on some of the most distracting touch infotainment systems.

Even with the best ones, such an operation can take 22 seconds plus an additional 27 seconds to get the driver’s full attention after completing the task.

One of the limitations of the touchscreen is the absence of tactile feedback to help the driver navigate controls without taking eyes off the road.

Legislation and standardisation of in-car multimedia are sparse, with the EU only going as far as to ban watching movies or TV while driving. Luckily, new screen technologies may ensure that road users stay safe.

Car manufacturers are looking to counteract the impact of distractions with some futuristic-sounding tech. One start-up has developed touch displays you can feel. Technology from Tanvas allows touch interfaces that appear textured to the fingers. Their simulations have demonstrated a 20-30% increase in eyes-on-road time. One can feel clicks, different virtual buttons and their edges, providing the functionality of hundreds of old-school knobs and buttons.

The Heads-Up Display, or HUD, is an idea lifted from sci-fi that made it to the aeroplane cockpit and more recently, cars. Its simple but effective implementation allows the driver to get essential information such as speed or navigation right before the eyes.

An image projected onto a film on the windshield looks like it's hovering in mid-air. With camera tracking and other technology, there is a possibility for Augmented Reality (AR) to be used as part of the HUD package.

Combine this feature with ever-smarter AI voice assistants that can helpfully reply to natural commands, and the vehicle of the future seems not so distant.

Human-machine interaction goes beyond the personally owned car, however. Increasing pressure from urbanisation and sustainability challenges is driving a trend toward more shared transportation, including carsharing.

The car-sharing experience requires new functions for operators, such as GPS tracking and remote locking and unlocking, via an app or a call centre.

Shared fleets of the future will likely include user preferences as part of the experience with custom lighting, music or seat positions when using the vehicle.

Or, perhaps, you prefer to be driven?

Autonomous Vehicles (AVs) are coming. OEMs and start-ups around the world are working on the interfaces that will help us interact and communicate with the increasingly intelligent machines.

Studies have shown the importance of such interfaces as human adoption is seen as the largest barrier to AV proliferation.

As a result, HMI designers are working on systems that show AV passengers not only what the vehicle is doing but also what it intends to do at any given moment.

What the vehicle is planning to do must also be clear to people outside the autonomous machine.

There is almost as much tech being developed to show passengers and pedestrians where and how the AV is moving, as there is autonomous hardware and software moving the vehicle.

Non-verbal communication cues such as a flash of the headlights or a driver's gesture disappear when there is no driver. External display panels, sounds and lights are being researched and designed to fill this gap, although these solutions are yet to be validated in dense urban environments.

In the future, large tactile multi-touch displays placed all over the vehicle will show personal profiles, mood lighting and music ready to stream into the wireless headphones, creating custom experiences even in shared vehicles.

Until the AV transition happens, one can still go and enjoy their car's brilliantly designed and timeless HMI pieces - like buttons and knobs - if it still has them.

Do you consider the HMI when using a vehicle?

