Published: 19 Feb 2019

Hing-yu So, CQP FCQI, Services Director of Quality and Safety at New Territories East Cluster, Hong Kong Hospital Authority, explains how a focus on human interaction can improve patient safety.

In healthcare, I see a shift from focusing on system design to human elements. At Hong Kong Hospital Authority, we operate a sociotechnical system that recognises the interaction between people and technology. This means we have to learn more about human interactions.

A report released by the Institute of Medicine in 1999, 'To Err is Human: Building a Safer Health System', highlighted the importance of safety in healthcare. The report revealed that an estimated 44,000 to 98,000 people die annually in the US from medical errors.

Don Berwick, a former Administrator of the Centers for Medicare and Medicaid Services, said: “Expecting perfection in human action, or simply telling doctors and nurses to ‘try harder’… has nothing at all to do with our eventual success in improving safety… The remedy is in changing systems of work. The remedy is in design.”

Over the past two decades, there have been multiple examples of changes to management systems, often involving the introduction of technology. The use of information technology, such as computerised physician order entry to improve medication safety, is a common example. Another is the use of checklists: the WHO Surgical Safety Checklist was developed after extensive consultation, with the aim of decreasing errors and adverse events and increasing teamwork and communication in surgery.


An article in The Lancet about checklists and safety is sobering: “The mistake of the ‘simple checklist’ story is in the assumption that a technical solution (checklists) can solve an adaptive (sociocultural) problem. To improve safety, health care needs to get the technical and adaptive work right.”

Many errors involving technology arise not from failure of the technology itself, but from the way humans interact with it.

In my hospital, a patient was given an overdose of heparin. When we inspected the infusion pump, we found that the knob for selecting which parameter to adjust sat next to the indicators for infusion rate and volume. A nurse could easily confuse the position for setting the infusion rate with that for setting the volume to be infused. This would result in an overdose.

If we had included this consideration in the usability test, we could have identified the issue before choosing the pump. Better yet, the manufacturer could have discovered it while designing the pump. Human factors engineering is the scientific discipline concerned with understanding the interactions among humans and other elements of a system, and there are many ways healthcare safety can benefit from a better understanding of human factors principles.

Crew Resource Management (CRM), a set of training procedures used by flight crews to enhance the safety of every flight, is another good example, which we borrowed from the aviation industry. It is concerned with cognitive and interpersonal skills such as communication, leadership and decision-making – all of which can contribute to improving patient safety.

Healthcare systems are complicated sociotechnical systems in which humans are always an important element. We therefore need to know more about how humans adapt to the other elements within these systems.