
Think Like a Human Factors Engineer: Five Principles for Healthcare Leaders
The strongest safety improvements begin with better questions. Here are five principles that can reshape how you lead change.
When something goes wrong in healthcare, the instinct is often to blame the person closest to the error. A nurse mislabels a specimen. A physician overrides an alert. A technician skips a checklist step. But often the real issue lies not with the individual but with the system around them.
That's the premise of clinically informed human factors engineering (HFE), and it's a critical mindset shift for healthcare leaders responsible for quality, safety, and performance. It's been said that "Every system is perfectly designed to achieve exactly the results it gets." If we want different outcomes, we need different systems.
This blog introduces five core principles drawn from human factors thinking, each designed to help leadership teams rethink how they assess, design, and improve the environments in which care occurs. They are practical shifts in perspective that can change how your organization handles incident reviews, safety interventions, and improvement planning.
Let's get into them.
Principle 1: Assume Everyone Came to Work to Do Their Best
It's tempting to look at a mistake and ask, "Who messed up?" But the more productive question is often, "What made this error possible in the first place?"
HFE starts from a basic premise: People don't show up intending to fail. If someone takes a shortcut or skips a step, it's usually because the system around them isn't supporting the work as intended. In other words, behavior makes sense within a given context, even when it produces an undesirable outcome.
This principle focuses on understanding mistakes, not excusing them. When quality and safety teams assume positive intent, they can shift the focus from blame to design. Was the process overly complex? Did the environment create unnecessary pressure? Was the technology intuitive to use?
Beginning with this mindset during incident reviews or system redesigns allows healthcare leaders to create room for deeper learning and more durable fixes.
Principle 2: Consider the Principle of Local Rationality
When someone skips a step or breaks protocol, the usual response is, "What were they thinking?" But a better question is, "Why did that choice make sense to them at the time?"
This is the essence of local rationality. People act in ways that seem logical based on the information, pressures, and tools available to them in the moment. That logic may not be obvious from the outside, especially after an incident, but it was real. And if it made sense once, it could make sense again to someone else. That's why it is important to speak both with the person who made the error and with others in the same role, to understand the depth of the system challenge behind it.
Understanding local rationality helps explain errors, which is critical if the goal is to improve systems. Asking why a deviation seemed reasonable in context helps identify design flaws, communication breakdowns, or environmental pressures that need attention.
The path to smarter systems begins with assuming every behavior has a reason and making that reason visible.
Principle 3: Encourage Ideas That Are Outside the Norm
When addressing challenges, it is common to fall back on the same solutions because they are familiar and easy to implement. Instead, bring together frontline staff and other representative stakeholders and begin solution development by saying, “No idea is off the table.” What’s something that could work, even if it isn’t by the book?
After following Principles 1 and 2 to better understand your system and its challenges, ask the solution development team to brainstorm ideas to solve each challenge. Then, discuss those ideas and, with each person contributing, narrow the ideas down to a solution set.
We often say, "Begin from the perspective of plenty of resources"—not because you have money and time to burn, but because when you free people to think outside the box, they do! The reality is that meaningful improvement often starts with an uncomfortable idea; a collaborative team of invested people can then hone those ideas into something realistic and manageable. The result is a solution set that delivers a strong return on the time and dollars invested and provides effective, sustainable solutions supported by the people who do the work.
Ideas outside the norm don't have to be accepted wholesale, but they deserve serious consideration. The gap between policy and practice is often where the next improvement lives.
Principle 4: Follow the Hierarchy of Intervention Effectiveness
When a safety event occurs, the default fix is often retraining. It's fast, familiar, and reinforces accountability. But from an HFE perspective, it is rarely the best approach.
Training and policies are at the bottom of the hierarchy of intervention effectiveness. They depend on people doing the right thing, the same way, every time, regardless of distractions, workload, poorly designed equipment, cumbersome processes, or other system shortcomings. Systems that rely on perfect human behavior are vulnerable to adverse events.
The most effective interventions reduce reliance on memory or vigilance. For example, a forcing function is a design that compels the correct action, such as a tubing connector that prevents misconnection. This type of design is far more reliable than another round of reminders.
Healthcare leaders reviewing incidents or designing processes should ask: Are we trying to change the people or the system around them? When possible, move upstream. Make the right action the easiest one to take.
Principle 5: Remember That Everything in the System Is Interconnected
In healthcare, change does not happen in a vacuum. If you shift one step in a process, what other steps will be affected?
That's why this final principle is important. Making changes to intake forms or staffing schedules will likely influence documentation, billing, throughput, and even clinical outcomes. Seemingly simple fixes can introduce delays, confusion, or rework elsewhere in the system.
Effective teams apply a systems mindset from the start. Before launching an initiative, they ask: What's upstream of this step? What's downstream? Who else is involved in this process?
Small, thoughtful changes can lead to meaningful results, but only if you understand how they ripple through the rest of the system. With that understanding, you can implement seemingly simple changes that have a positive impact across your organization.
Where Will You Start?
Healthcare improves not by asking people to try harder, but by building systems that support better outcomes.
These five principles offer leaders a practical way to start making the shift from retraining to design, from reaction to reflection. Assume good intent. Understand behavior in context. Encourage unconventional ideas. Choose stronger interventions. Employ systems thinking.
Each principle is simple, but none are easy. They require leaders to ask harder questions and listen longer to the answers.
How can these ideas change the way your team thinks about improvement?
ECRI works with healthcare organizations to put these principles into practice. As your organization works to enhance safety for patients and staff, consider partnering with human factors engineers from ECRI. Learn how we can help you take a truly human-centered, total-system approach to safety.
Author:
Vicki R. Lewis, PhD
Senior Manager of Human Factors, ECRI

Dr. Lewis recently joined ECRI after working as a healthcare consultant with hospitals, outpatient facilities, and home care organizations. She has extensive experience applying the safety science of HFE to complex work systems in the healthcare domain. Her established research portfolio includes directing, collaborating on, managing, and consulting for projects on topics such as healthcare-associated infections, usability of health information technology, evaluation of RCA effectiveness, and safety process improvement. She is practiced in facilitating sentinel event analyses to identify root causes of adverse events and to develop effective, sustainable, system-based solutions. Dr. Lewis received her PhD in Industrial and Systems Engineering with an HFE option from Virginia Tech in Blacksburg, Virginia.