A guest column by the American Society of Anesthesiologists, exclusive to KevinMD.com.

A nurse is assisting the attending physician in placing a central line in the intensive care unit. It’s 3 a.m., and the attending is clearly tired, having admitted eight patients so far on her shift. The nurse notices the physician pushing her eyeglasses up her nose – after donning her sterile gloves. She then moves to touch the central line tray. “Wait,” the nurse says, “you’ve contaminated yourself.” “No, I didn’t. Let’s get this done quickly,” the attending snaps. The nurse begins to doubt what they saw and does not speak up for the rest of the procedure.

This is a disturbingly common scenario in medicine — someone witnesses an unsafe event and is either afraid to speak up, or tries to speak up and is ignored. Unfortunately, when clinicians do not speak up, patient harm may result. In the medical setting, speaking up means raising concerns about patient safety when an actual or potential risk is present. Rather than being an attempt to place blame on a health care provider, it is intended to refocus attention on the patient and their wellbeing.

While many clinicians intuitively recognize the importance of speaking up for safety, doing so can be challenging, and many barriers stand in the way. One pervasive barrier is the medical hierarchy among physicians (for instance, the power differential between medical students and residents, and between residents and attendings) and across clinician roles (such as between nurses and physicians). If the person witnessing the safety event is perceived to hold a “lower” hierarchical standing than the person committing the error, it can be especially difficult to go against established hierarchies and say something. Gender and cultural differences can present additional barriers, especially if “saving face” is highly valued by the individual.
Interpersonal relationships also affect whether a person feels comfortable speaking up. In the example above, given the attending physician’s negative reaction, it is unlikely that the nurse will speak up with this physician in the future. Other sources of hesitation include fear that speaking up will damage existing relationships (one might think, “I work with this person every day; will they treat me differently if I say something?”) and fear of retaliation from hospital leadership against the person who spoke up. Even worse, passive or outright dismissal of the safety issue being raised can create an environment that discourages individuals from raising further issues, and a lack of action on reported issues can send the message that hospital leadership does not prioritize patient safety.

Many of these barriers are characteristic of production-focused organizations, which emphasize optimization and standardization but can be resistant to change and are error-prone as a result. Such organizations treat adverse events as anomalies and often respond by blaming the providers involved. In contrast, high-reliability organizations (HROs) prioritize safety. They build redundant systems that are adaptable and flexible, treat adverse events as valuable information about potential system dysfunctions, and reward the messengers who speak out. In other words, hospitals that are HROs make it easier for individuals to speak up.

Institutions that facilitate speaking up also tend to provide practical means for reporting safety issues through hospital reporting systems that are accessible and easy to use. Clinicians are free to report both actual incidents and “near misses” – situations where an adverse event nearly occurred. Safety leaders take reported issues seriously and provide feedback to the reporter, reinforcing for those who speak up that their actions create real change.
In the case of reported adverse events, safety leaders do their best to view incidents with an eye toward improving systems-based practice. In contrast to the “blame and shame” ideology, which assumes that the individual who committed the error must simply have been deviant, systems-based practice recognizes that humans are imperfect and work in imperfect settings: more often than not, it is a faulty system that allowed the error to occur. Using principles of human factors engineering, these institutions then optimize systems and processes to prevent individuals from making errors, thereby improving quality and safety.

These positive feedback loops are a key component of psychological safety, a concept developed by Amy Edmondson at Harvard Business School. In psychologically safe environments, individuals feel comfortable speaking up, knowing they will be taken seriously and will not be blamed. Individuals work as a team, aiming to improve safety for the benefit of all. They also apply principles of “just culture,” supporting individuals who made an error despite trying to perform their best, while holding reckless individuals accountable for knowingly acting outside of safe practices.

How does an institution develop a culture of safety? It requires hospital systems, departments, and department leaders to implement and maintain an environment in which all providers feel safe and secure when reporting and analyzing medical errors. At the same time, individual clinicians are key stakeholders and must also be engaged in the process. Maintaining open lines of communication is essential, and both clinicians and leaders should be prepared to speak clearly and listen to one another. A variety of communication tools have been developed specifically for hospitals by TeamSTEPPS, a teamwork system for health care professionals.
One is the “two-challenge rule,” in which the person who witnessed the safety event voices their concern and repeats it if the concern is not addressed. For the two-challenge rule to be effective, the team member being challenged must acknowledge that they have heard the concern; if they do not, the person speaking up should be empowered to take a stronger course of action or seek help from a supervisor. TeamSTEPPS also provides phrases that help facilitate communication, such as “CUS,” which stands for “I am Concerned! I am Uncomfortable! This is a Safety Issue!” A number of other frameworks exist as well, including “DESC” (Describe, Express, Suggest, Consequences) and “PACE” (Probe, Alert, Concern, Escalate).

Environments that promote psychological safety and encourage speaking up are critical for maintaining patient safety. High-reliability organizations recognize that humans make mistakes, acknowledge that errors and risk will continue to present challenges to patient care, and provide conditions that help identify, manage, and mitigate risk. Overall, a robust culture of safety must be continuously fostered – by individual clinicians as well as department and hospital leaders – so that all health care providers can speak up for their patients.