The Quest to Simplify Human-Machine Interfaces



Taking a “clean sheet” approach to HMI.

We’ve come a long way from the earliest days of computing technology, when physical switches provided data inputs and lamps were used as output indicators. The following era saw punch cards, paper tapes, Teletype keyboards, and printers, all still very clunky and hardly suited to novice users. The advent of the visual display unit and keyboard, coupled with a command-line operating system, certainly represented a significant step forward in interface technology, but it wasn’t until the introduction of WIMP (windows, icons, menus, pointer) technology in the 1980s that we saw anything resembling the graphical user interface (GUI) we would recognize today.

Alongside computers, the human-machine interface (HMI) for industrial, commercial, and consumer systems has evolved from switches and indicator lights to use similar computer-style technology. This now includes advances such as touch screens and the voice activation we’ve become familiar with from our smartphones, while gesture recognition, haptic feedback, and head-up displays are increasingly being deployed, not just in high-end medical or military equipment but also in automotive applications and home appliances.

The goal of HMI is to provide an intuitive solution to a problem, but unfortunately many hardware and software designs fall short of this ideal and don’t always align with the nuances of human behavior. Intuitive means instinctive, natural, and untaught, so any equipment design that needs an accompanying instruction manual to operate has already failed that challenge. That said, the use of a keyboard, computer mouse, or touchscreen are all learnt experiences that HMI systems can reasonably build on. More often the confusion comes from experience with many subtly different systems that condition our behavior to expect something to work in a particular way.

Simplifying HMIs to reduce the number of choices available and the means of responding is one approach, but this is often thwarted by the increasing requirements of modern systems. This complexity is illustrated in Figure 1, which shows the HMI deployed in the cockpit of one of the world’s most advanced commercial aircraft. By contrast, the HMI in the driver’s cabin of an inter-city train, Figure 2, belies the undoubted complexity that lies behind its controls.

Figure 1: Despite Boeing’s 787 Dreamliner being one of the most advanced commercial aircraft in the world, its HMI shows how complex our world has become. (Source: Alex Beltyukov)

Modern HMI design is no longer the sole province of hardware engineers and software developers. Rather, it demands an understanding of behavioral psychology and cognitive neuroscience, and it is with input from people working in these fields, along with new methodologies and tools, that the challenges of achieving truly simple and intuitive HMI solutions are being overcome.

Figure 2: The modern and surprisingly simple HMI in the driver's cabin of a German Intercity-Express High-Speed Train.

Many leading companies are championing this space, some focused on particular interface technologies while others build on their expertise in key user applications or market segments. Companies like Texas Instruments offer an entire ecosystem for HMI creation that streamlines the development of a variety of interfaces by providing virtually all the components, software, and support required. Its portfolio of products for HMI is shown in Figure 3 and includes I/O devices, processors with accelerators and graphics engines, audio capabilities, power solutions including Power over Ethernet to simplify wiring, plus support for wireless connectivity.

Figure 3: Texas Instruments has a portfolio of products capable of meeting the HMI development requirements for many different use-case scenarios.

The Right Tool for the Job


As electronics equipment evolved from largely analog to largely digital systems, early digital designs were characterized by push buttons replacing toggle switches and numeric displays replacing dials. Subsequently we’ve seen what some might regard as retro styling, i.e. the mimicking of analog controls with underlying digital technology. In many cases, however, retro designs recognize the superior ergonomics of a dial control, often with the added benefit of a push-button action for turn-and-click operation, especially when the result is presented visually on a display.
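
To make the turn-and-click idea concrete, the sketch below (in Python, purely for illustration) shows one way firmware might decode a quadrature rotary encoder with an integrated push button into “turn” and “click” events. The read_pins() callback is a hypothetical stand-in for whatever GPIO access the target platform actually provides.

```python
# Minimal sketch: decoding a quadrature rotary encoder with an integrated
# push button, the classic "turn and click" control. The read_pins() callable
# is a hypothetical stand-in for platform-specific GPIO access.

# Transition table: (previous AB state, new AB state) -> step direction
_QUAD_STEPS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class TurnClickKnob:
    def __init__(self, read_pins):
        self.read_pins = read_pins          # returns (a, b, button) as 0/1 ints
        a, b, self._button = read_pins()
        self._state = (a << 1) | b
        self.position = 0

    def poll(self):
        """Sample the pins once; return ('turn', +/-1), ('click', None) or None."""
        a, b, button = self.read_pins()
        new_state = (a << 1) | b
        step = _QUAD_STEPS.get((self._state, new_state), 0)
        self._state = new_state
        pressed = button and not self._button   # rising edge counts as a click
        self._button = button
        if step:
            self.position += step
            return ("turn", step)
        if pressed:
            return ("click", None)
        return None
```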

This simplicity is lost in poorer HMI designs when such a control is used like a mouse to navigate nested menus to reach the desired function. That may be acceptable for computer-savvy or technically literate users in applications that have their full attention, but it isn’t necessarily ideal in a more demanding environment, e.g. controlling an automotive infotainment or satellite navigation system while driving. Hence it is vital to understand how and where the end equipment will be used. What works for a consumer on their smartphone won’t necessarily suit a surgeon in the operating theatre or a worker on a factory production line.
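
To see why nested menus erode that simplicity, here is a minimal sketch that feeds the turn/click events from the previous example into an illustrative menu tree. The menu contents are invented, but the number of knob events needed to reach a deeply buried function makes the point.

```python
# Minimal sketch: navigating a nested menu tree with ("turn", +/-1) and
# ("click", None) events. The menu contents are purely illustrative; the path
# stack could also support a "back" action, omitted here for brevity.

MENU = {
    "Audio": {"Volume": None, "Balance": None, "Equalizer": {"Bass": None, "Treble": None}},
    "Navigation": {"Destination": None, "Map view": None},
    "Phone": {"Contacts": None, "Recent calls": None},
}

class MenuNavigator:
    def __init__(self, tree):
        self.path = []                     # stack of (tree, selected index)
        self.tree, self.index = tree, 0

    def handle(self, event):
        kind, value = event
        items = list(self.tree)
        if kind == "turn":
            self.index = (self.index + value) % len(items)
        elif kind == "click":
            selected = items[self.index]
            child = self.tree[selected]
            if child is None:
                return selected            # leaf reached: invoke the function
            self.path.append((self.tree, self.index))
            self.tree, self.index = child, 0
        return None

# Reaching "Treble" from the top already needs six knob events:
nav = MenuNavigator(MENU)
for ev in [("click", None), ("turn", +1), ("turn", +1), ("click", None), ("turn", +1), ("click", None)]:
    result = nav.handle(ev)
print(result)   # -> "Treble"
```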

Nevertheless, touchscreens have rapidly become the HMI of choice in a multitude of commercial and industrial applications, from banking and point-of-sale terminals to machine and process control systems. Most of these use capacitive touch technology, which allows precise control of individually displayed buttons as well as enabling features such as swiping and multi-finger scrolling. The MTCH6303 from Microchip Technology is a capable touchscreen controller that can be readily integrated into a host system.
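
As a rough illustration of what that integration typically involves, the sketch below polls a capacitive touch controller over I2C from an embedded Linux host using the smbus2 package. The I2C address, register offsets, and packet layout shown are placeholders only; the MTCH6303’s actual memory map and report format are defined in its datasheet.

```python
# Minimal sketch: polling a capacitive touch controller over I2C. The 7-bit
# address and the 6-byte report layout below are placeholders for illustration,
# not the controller's actual register map.
from smbus2 import SMBus, i2c_msg

TOUCH_ADDR = 0x25          # placeholder 7-bit I2C address

def read_touch(bus):
    """Read one hypothetical 6-byte touch report: status, ID, X low/high, Y low/high."""
    msg = i2c_msg.read(TOUCH_ADDR, 6)
    bus.i2c_rdwr(msg)
    status, touch_id, xl, xh, yl, yh = list(msg)
    if not (status & 0x01):            # placeholder "touch present" flag
        return None
    return {"id": touch_id, "x": (xh << 8) | xl, "y": (yh << 8) | yl}

with SMBus(1) as bus:                  # I2C bus 1 is typical on many single-board computers
    report = read_touch(bus)
    if report:
        print(f"touch {report['id']} at ({report['x']}, {report['y']})")
```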

While very versatile, capacitive touchscreens have their limitations: they can’t generally be operated with gloved hands and are affected by water or other contaminants on the screen. Gesture control, where hand gestures made in the air in front of the display are captured by a camera and used to provide command inputs, is fast becoming a practical reality. This is particularly attractive for automotive applications, where camera-on-a-chip sensors can capture high-quality VGA-resolution images. Such sensors use active-pixel CMOS technology to provide high sensitivity and include color recovery, programmable gamma correction, sharpening, auto exposure, and many other functions.
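
The sketch below gives a flavor of how camera frames can be turned into command inputs: simple frame differencing and centroid tracking convert a left-or-right hand sweep into a “swipe” event. It is a toy pipeline using OpenCV for illustration only; production gesture systems rely on trained hand or pose models, often backed by depth or infrared sensing.

```python
# Toy gesture pipeline: frame differencing plus centroid tracking to turn a
# horizontal hand sweep into a swipe command. Thresholds are illustrative.
import cv2

def motion_centroid(prev_gray, gray, threshold=30):
    """Return the x coordinate of moving pixels' centroid, or None if the scene is still."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask)
    if moments["m00"] < 1e4:           # too little motion to be a gesture
        return None
    return moments["m10"] / moments["m00"]

cap = cv2.VideoCapture(0)
prev_gray, track = None, []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        cx = motion_centroid(prev_gray, gray)
        track = track + [cx] if cx is not None else []
        # A sustained horizontal sweep across roughly a third of the frame counts as a swipe.
        if len(track) > 5 and abs(track[-1] - track[0]) > frame.shape[1] / 3:
            print("swipe right" if track[-1] > track[0] else "swipe left")
            track = []
    prev_gray = gray
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) == 27:           # Esc to quit
        break
cap.release()
```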

Another HMI alternative that provides hands-free operation is voice control. While voice recognition has suffered in the past from poor accuracy and the need to learn a user’s voice and vocabulary, advances in artificial intelligence, digital signal processing, and high-performance processors mean that today’s voice recognition systems are surprisingly accurate and no longer limited in the number of functions they can support. Nuance’s speech recognition technology, long familiar to journalists and writers through its Dragon NaturallySpeaking software, was adopted by the Ford Motor Company for its SYNC 2 infotainment system and underpins Apple’s Siri.
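
As a minimal illustration of the idea (not of Nuance’s technology), the sketch below uses the third-party SpeechRecognition Python package to capture a phrase from the default microphone, pass it to a cloud recognizer, and match the result against a small, invented command table.

```python
# Minimal voice-command sketch using the SpeechRecognition package. The command
# table is invented; real systems use on-device wake words and far more robust
# natural-language handling.
import speech_recognition as sr

COMMANDS = {                      # illustrative phrase fragments -> actions
    "lights on": lambda: print("turning lights on"),
    "lights off": lambda: print("turning lights off"),
    "temperature": lambda: print("reading back temperature"),
}

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    print("listening...")
    audio = recognizer.listen(source, phrase_time_limit=5)

try:
    text = recognizer.recognize_google(audio).lower()
    print("heard:", text)
    for phrase, action in COMMANDS.items():
        if phrase in text:
            action()
            break
except sr.UnknownValueError:
    print("speech not understood")
except sr.RequestError:
    print("recognizer unavailable")
```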

Redefining HMI

As discussed already, it’s almost become a convention for modern HMI technology to mimic traditional controls. In many instances this is justified as the previous approaches are tried, tested and familiar to users, and so ensure ready acceptance. Sometimes though it is worth starting over and dismissing preconceived ideas in order to find a better way of implementing functions so that they don’t become trapped in the complexity of nested menus.

Mission-critical systems are amongst those most in need of fresh ideas in order to improve clarity of operation and avoid the uncertain or slow user response that a confusing interface can cause. The requirements of automotive systems, which mustn’t distract a driver, have already been touched on (no pun intended), and yet, sadly, the trend of adding more and more functionality to the driver console and/or touchscreen seems to be heading away from the simplicity HMI designers ought to be striving for.

To be fair though, alongside merging the functions of entertainment and navigation systems with all the conventional vehicle status displays and control functions, we are now seeing the addition of many safety features, such as lane departure, blind-spot and forward collision warnings, and cameras for assisted or even fully autonomous parking. And all this needs to be achieved within the confines of a relatively small driver cockpit space, although as we move to driverless cars many of these requirements will either go away or become purely informational.

The challenge with the “clean sheet” approach to HMI is achieving something that satisfies all users, whether novice or expert. One touchscreen user interface, created by San Francisco-based designer Matthaeus Krenn, is perhaps best appreciated by viewing his video. From an initially blank screen, the function invoked depends on the number of fingers used to touch the screen. Then the distance between the fingers determines whether an option is being selected or a setting adjusted.
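
The sketch below is a loose interpretation of that interaction, written only to make the logic concrete: the number of fingers placed on a blank screen selects a function, and the change in spread between those fingers then adjusts it. The function names and thresholds are invented; this is not Krenn’s actual implementation.

```python
# Loose interpretation of a blank-screen, finger-count UI: finger count picks
# the function, changing finger spread adjusts it. All names are illustrative.
import math

FUNCTIONS = {2: "volume", 3: "fan speed", 4: "temperature", 5: "source"}

def spread(points):
    """Largest distance between any two touch points."""
    return max(math.dist(a, b) for a in points for b in points) if len(points) > 1 else 0.0

class CleanSheetUI:
    def __init__(self):
        self.function = None
        self.initial_spread = 0.0

    def on_touch(self, points):
        """points: list of (x, y) finger positions for the current frame."""
        if self.function is None:
            self.function = FUNCTIONS.get(len(points))
            self.initial_spread = spread(points)
            return f"selected: {self.function}"
        # Widening or narrowing the finger spread adjusts the chosen setting.
        delta = spread(points) - self.initial_spread
        return f"adjust {self.function} by {delta:+.0f}"

    def on_release(self):
        self.function = None

ui = CleanSheetUI()
print(ui.on_touch([(100, 200), (180, 210), (140, 300)]))   # 3 fingers -> fan speed
print(ui.on_touch([(80, 190), (200, 215), (140, 330)]))    # spreading fingers -> increase
ui.on_release()
```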

Haptic feedback is now being used to enhance interaction with a touchscreen. Walt Disney’s research division, for example, has developed a rendering algorithm that simulates textures on a touchscreen. It does this by applying an electrical field across the screen that creates friction detected by receptors in the skin when a finger is moved across the surface. By modulating this friction sensation it is possible to produce the illusion of a 3D bump on the surface, which could help a user locate a button by touch rather than having to look for it.
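
One common way such effects are rendered, sketched below in simplified form, is to modulate the friction level in proportion to the slope of a virtual height profile under the moving finger, so the finger feels resistance while climbing the virtual bump and release while coming down. The Gaussian profile and constants here are illustrative and are not Disney’s published parameters.

```python
# Simplified friction-modulation sketch for an electrostatic haptic surface:
# drive level follows the slope of a virtual bump under the finger.
import math

BUMP_CENTER, BUMP_WIDTH, BUMP_HEIGHT = 0.5, 0.08, 1.0   # normalized screen units

def bump_height(x):
    """Virtual height of the bump at horizontal position x (0..1)."""
    return BUMP_HEIGHT * math.exp(-((x - BUMP_CENTER) ** 2) / (2 * BUMP_WIDTH ** 2))

def friction_command(x, finger_velocity, gain=0.1):
    """Drive level (0..1) for the friction actuator at finger position x.

    The rendered force is proportional to the surface slope along the direction
    of motion; only the resisting component is applied.
    """
    dx = 1e-4
    slope = (bump_height(x + dx) - bump_height(x - dx)) / (2 * dx)
    resisting = max(0.0, slope * math.copysign(1.0, finger_velocity))
    return min(1.0, gain * resisting)

# Sweep a finger left to right across the bump and print the drive levels.
for i in range(11):
    x = i / 10
    print(f"x={x:.1f}  friction={friction_command(x, finger_velocity=+1.0):.2f}")
```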

A head-up display (HUD) is another way of reducing the distraction of a car driver having to look away from the road to a console display. HUDs have long been used in military aircraft, with origins going back to World War II; they are now standard equipment and often include synthetic vision systems (see Figure 4). HUDs are also appearing in luxury cars and, as with most automotive technology, are likely to permeate down through the model ranges over time.

Figure 4: Synthetic vision systems complement head-up displays in military aircraft. (Source: Honeywell)

The automotive market already addresses driver fatigue with various systems used to detect alertness and provide appropriate warnings, but again typically only on higher-end models. In industries such as mining, detecting operator fatigue and distraction can be even more crucial in preventing accidents. Caterpillar Global Mining is working with Australian company Seeing Machines Limited to monitor operator alertness through eye movement, facial tracking, and head position using computer vision algorithms coupled with other vehicle-mounted sensors. Seeing Machines has also developed solutions for automotive Advanced Driver Assistance Systems (ADAS), avionics, and rail transportation systems.
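
One widely used drowsiness metric that such systems can build on is PERCLOS, the fraction of time over a sliding window during which the eyes are substantially closed. The sketch below shows the bookkeeping involved; the thresholds and window length are illustrative and are not Seeing Machines’ algorithm, and the per-frame eye-openness values would come from a camera-based eye tracker.

```python
# Illustrative PERCLOS monitor: fraction of recent frames in which the eyes
# count as closed. Thresholds and window length are invented for the sketch.
from collections import deque

class PerclosMonitor:
    def __init__(self, frame_rate=30, window_seconds=60,
                 closed_threshold=0.2, alarm_level=0.15):
        self.samples = deque(maxlen=frame_rate * window_seconds)
        self.closed_threshold = closed_threshold   # openness below this counts as "closed"
        self.alarm_level = alarm_level             # PERCLOS above this triggers a warning

    def update(self, eye_openness):
        """eye_openness: 0.0 (fully closed) .. 1.0 (fully open) for one video frame."""
        self.samples.append(1 if eye_openness < self.closed_threshold else 0)
        perclos = sum(self.samples) / len(self.samples)
        return perclos, perclos > self.alarm_level

# Example: simulate a driver whose eyes start staying shut for long stretches.
monitor = PerclosMonitor()
stream = [0.9] * 1500 + [0.05] * 400 + [0.9] * 100
for openness in stream:
    perclos, drowsy = monitor.update(openness)
if drowsy:
    print(f"drowsiness warning, PERCLOS = {perclos:.2f}")
```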

Like the detection of hand gestures, face detection is a recent addition to the HMI arsenal. Omron Electronic Components’ B5T HVC Face Detection Sensor Module uses various algorithms to perform facial recognition and to estimate and evaluate expressions based on the company’s OKAO™ vision image sensing technology. It can achieve this with high precision in as little as 1.1s, and it can measure blink rate and even estimate age and gender.

“Horses for Courses”

Simplifying HMI applications by updating them to take better account of human psychology and cognitive function will not provide a single solution that suits every requirement. Rather each case will need to recognize what’s most important to its users, taking advantage of the latest interface technologies—touchscreen, voice recognition, haptic feedback, head-up displays, gesture and facial recognition—to provide information that is quickly assimilated and controls that are intuitive to use.

Not all systems require a large display with a single, multi-function control knob; sometimes a minimal display with a few buttons may provide a more innovative solution, for example in wearable devices such as fitness bands. Similarly, applying leading-edge technology isn’t always the answer, which is where employing design teams that aren’t made up solely of hardware and software engineers can pay dividends. Some of the advances in HMI may be most apparent in the auto industry, but they aren’t necessarily the best, so it’s important to learn from these experiences and pay more attention to the end user.

______________________________________________________________________________________________

Paul Golata is Senior Technical Content Specialist, Mouser Electronics. He joined Mouser Electronics in 2011. Golata is responsible for contributing to the strategic leadership, tactical execution, and overall product line and marketing direction for advanced technology related products. He holds a BSEET from DeVry Institute of Technology – Chicago, IL; an MBA from Pepperdine University – Malibu, CA; and an MDiv w/BL from Southwestern Baptist Theological Seminary – Fort Worth, TX. Golata may be reached at paul.golata@mouser.com.
