
Display Design Principles – Tactile Displays

General requirements

Tactile feelings and feedback keep us in touch with our environment. Our sense of touch derives from a range of receptors in the skin that register pressure, vibration, texture, temperature, pain and the relative position of the limbs; their signals travel through the nervous system to the brain. In general, vibrating stimuli are easier to detect than punctate stimuli. Similar to auditory perception, which shows equal-loudness contours in the range of 100 Hz to 10 kHz (across varying sound pressure levels), tactile perception shows high vibration sensitivity for frequencies in the range of 200 to 400 Hz (across varying stimulus intensity). Tactile perception with tactile displays refers to every type of sensation related to the sense of touch, be it cutaneous (pressure, vibration, temperature), kinesthetic (limb movement) or proprioceptive (position of the body).

Tactile and kinesthetic perceptions usually go hand in hand. They are often considered together and termed haptic or tactile feelings. This applies not only when operators receive direct tactile stimulation but also when they perceive indirect stimulation, e.g., when handling tools or wearing clothing, gloves or other material on the skin or body. Examples of tactile displays are the tactile bumps on the home keys (“F”, “J”) of computer keyboards, the tactile bump on numerical keypads (see the dot on key “5” in Figure 1) and the Braille alphabet for the blind. The tactile sense can also provide subtle information such as temperature or surface condition (wet, sticky, slimy, round, pointed, etc.).

Tactile displays can be designed for active or passive touch. In active touch, operators perceive objects by moving the skin or a body part to pick up surface information from the interface, e.g., identifying whether a butterfly valve is open or closed by touching the lid, or identifying the current gear by touching the gear lever. This is quite similar to grasping or manipulating objects for identification. In passive touch, operators perceive pressure on the skin: the skin and body parts remain largely stationary while an external pressure stimulus is applied, e.g., when identifying vibration signals or identifying product output by sensing product friction or movement.

Tactile display functions will become more important in adaptive assistance systems. Since the human tactile system is less sensitive in stimulus detection than visual and auditory perception, care shall be taken when choosing tactile displays. In general, tactile display functions shall be used to extend or augment task-relevant sensory perception, or in special situations such as quality-control tasks or areas not comfortably accessible by sight or hearing but accessible by touch.


Specific requirements

A highly developed tactile system, which still challenges current human-computer interfaces, can reproduce the tactile parameters of an object, such as shape, texture, roughness and temperature. It shall always complement or substitute for a visual or auditory presentation of information.

Several characteristics can be designed to introduce haptic feedback. These are considered haptic displays even though the application refers to control actuators.

  • Directing force and perceiving resistance indicate impact when stroking or pressing keys or buttons.
  • Overcoming a pressure point or clearance provides feedback that a state change has been initiated, e.g., in membrane keys or levers.
  • Reaching a latching point, introduced by stroke or force, indicates a state change, e.g., in latching keys.
  • Touching the angle to a latching point or stop informs about the position of, e.g., a toggle or rotary switch.
  • Perceiving the existence of detents and the distance between them indicates the use of a rotary or linear control switch.
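As a simple illustration of the last item, detent feedback on a rotary control can be modelled by snapping a rotation angle to the nearest detent position (the detent count and angles below are hypothetical example values, not from a standard):

```python
def snap_to_detent(angle_deg: float, n_detents: int) -> int:
    """Return the index of the nearest detent on a rotary control
    with n_detents evenly spaced positions over 360 degrees."""
    spacing = 360.0 / n_detents
    return round(angle_deg / spacing) % n_detents

# A hypothetical 12-detent rotary switch (30 degrees between detents):
print(snap_to_detent(44.0, 12))   # nearest detent index: 1 (30 degrees)
print(snap_to_detent(46.0, 12))   # nearest detent index: 2 (60 degrees)
```

The operator perceives each snap as a discrete state, which is why detent spacing must be large enough to be felt reliably.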

Haptic displays shall be developed according to the task and shall take the specific technology used into account. Typical technology examples are:

  • Vibro-tactile displays: no more than nine different frequency levels shall be used. Level differences should be at least 20 %.
  • Coding information by temporal patterns: the temporal sensitivity of the skin is high, but the time between signals shall be at least 10 ms.
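The two numeric limits above can be sketched in a few lines of Python (the base frequency and onset times are illustrative assumptions, not values from a standard): a geometric series with a factor of 1.2 keeps adjacent frequency levels at least 20 % apart, and consecutive signal onsets are checked against the 10 ms minimum gap.

```python
def frequency_levels(base_hz: float, n_levels: int, step: float = 1.2) -> list[float]:
    """Generate vibro-tactile frequency levels as a geometric series,
    so each level is at least 20 % above the previous one."""
    if n_levels > 9:
        raise ValueError("no more than nine frequency levels shall be used")
    return [round(base_hz * step ** i, 1) for i in range(n_levels)]

def valid_temporal_pattern(onsets_ms: list[float], min_gap_ms: float = 10.0) -> bool:
    """Check that consecutive signal onsets are at least 10 ms apart."""
    return all(b - a >= min_gap_ms for a, b in zip(onsets_ms, onsets_ms[1:]))

print(frequency_levels(50.0, 5))              # [50.0, 60.0, 72.0, 86.4, 103.7]
print(valid_temporal_pattern([0.0, 12.0, 30.0]))  # True
print(valid_temporal_pattern([0.0, 5.0, 30.0]))   # False: 5 ms gap too short
```

Using a ratio rather than a fixed step in hertz reflects that the 20 % requirement is relative, so levels spread out as the frequency rises.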

Various technological methods are available to create tactile stimulation of a surface:

  • mechanical needles actuated by electromagnetic technologies
  • electrorheological fluids (ERF)
  • magnetorheological fluids (MRF)

Tactile interfaces can be classified into mechanical, electrical, and thermal.

Wearable tactile displays have potential for development, as the skin is the largest organ of the human body. Tactile and/or haptic devices can represent the real world or virtual or mixed realities (e.g., in minimally invasive surgery or when interacting with exoskeletons, wearables or force-feedback devices).



Recommendations for the design of tactile displays are as follows:

  • careful choice of the body location for stimulation, since sensitivity varies widely and is affected by, e.g., clothing and body posture
  • careful choice of the method of contact with the body, since startling reactions may be detrimental to safety
  • choosing the lowest magnitude of stimulation that still allows for reliable indication
  • careful choice of the control actuator providing feedback, as, e.g., membrane keyboards vary in tactile feedback
  • design of control actuators specific to the displayed use, e.g., as in aviation, where landing gear controls look and feel like wheels and flap controls look and feel like rectangular flaps
  • consideration of the context of use, for example when PPE is required, as gloves call for specific design to maintain tactile feedback or allow for display input
  • opting for tactile displays in addition to visual and auditory ones, especially in environments characterised by ample sights and sounds
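Purely as an illustration, a design review might walk through such a checklist with a simple record of the display's properties; all field names, the checks and the example values below are hypothetical, not taken from any standard:

```python
from dataclasses import dataclass

@dataclass
class TactileDisplaySpec:
    """Hypothetical record of a tactile display design, used to flag
    checklist items from the recommendations above."""
    body_location: str                 # sensitivity varies with location, clothing, posture
    contact_method: str                # avoid methods that may startle the operator
    magnitude: float                   # chosen stimulus magnitude (device units)
    lowest_effective: float            # lowest magnitude still reliably detected
    works_with_gloves: bool            # context of use: PPE such as gloves
    redundant_modalities: tuple = ()   # e.g. ("visual", "auditory")

    def review_notes(self) -> list[str]:
        """Return checklist items that need attention."""
        notes = []
        if self.magnitude > self.lowest_effective:
            notes.append("Reduce magnitude towards the lowest effective level.")
        if not self.works_with_gloves:
            notes.append("Check operation with required PPE (gloves).")
        if not self.redundant_modalities:
            notes.append("Consider pairing with a visual or auditory display.")
        return notes

spec = TactileDisplaySpec("wrist", "vibration motor",
                          magnitude=0.8, lowest_effective=0.5,
                          works_with_gloves=False)
for note in spec.review_notes():
    print("-", note)
```

The qualitative recommendations (choice of body location, startle avoidance) resist simple automation and would remain judgment calls in such a review.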

For combinations of visual, auditory and tactile displays, see also the section on display design with its specific sub-sections.


