Culture by Design is Now The Leader Factor

Psychological Safety in Healthcare

In this episode of Culture by Design, Tim and Junior discuss the importance of psychological safety in healthcare. They highlight the significant issue of medical errors in the industry and propose creating a culture of rewarded vulnerability to overcome the fear of speaking up. They also discuss the barriers to psychological safety in healthcare and the ultimate impact of psychological safety, which is to improve patient outcomes and reduce preventable medical errors, ultimately saving lives.


Episode Show Notes


The 4 Stages of Psychological Safety offer strategies for measuring and improving the fear of speaking up at all levels of the organization. By prioritizing psychological safety, healthcare leaders can create a better work environment and improve the quality of care for patients.

Important Links and References

World Health Organization. (2017). Global Priorities for Patient Safety Research. Retrieved from https://apps.who.int/iris/bitstream/handle/10665/258881/WHO-IER-PSP-2017.11-eng.pdf?sequence=1

Institute of Medicine. (1999). To Err is Human: Building a Safer Health System. Washington, DC: National Academies Press. Retrieved from https://www.nap.edu/catalog/9728/to-err-is-human-building-a-safer-health-system

Achieving Physical Safety Through Psychological Safety
https://www.leaderfactor.com/podcast/achieving-physical-safety-through-psychological-safety

Magill, S. S., Edwards, J. R., Bamberg, W., et al. (2014). Multistate point-prevalence survey of health care-associated infections. New England Journal of Medicine, 370(13), 1198-1208. doi:10.1056/NEJMoa1306801

Allegranzi, B., Bagheri Nejad, S., Combescure, C., Graafmans, W., Attar, H., Donaldson, L., & Pittet, D. (2011). Burden of endemic health-care-associated infection in developing countries: systematic review and meta-analysis. The Lancet, 377(9761), 228-241. doi:10.1016/S0140-6736(10)61458-4

Zimlichman, E., Henderson, D., Tamir, O., et al. (2013). Health care-associated infections: A meta-analysis of costs and financial impact on the US health care system. JAMA Internal Medicine, 173(22), 2039-2046. doi:10.1001/jamainternmed.2013.9763

Episode Transcript

[music]

0:00:02.6 Producer: Welcome back Culture by Design listeners. It's Freddy, one of the producers of the podcast, and in today's episode we'll explore psychological safety in healthcare. Tim and Junior will discuss the barriers to psychological safety in healthcare and what we hope is the ultimate impact of psychological safety, and that is to improve patient outcomes and reduce preventable medical errors, ultimately saving lives. It's a fascinating episode, even if you are not working in a healthcare-related field. As always, today's episode show notes can be found at leaderfactor.com/podcast. Thanks again for listening. Enjoy today's episode on psychological safety in healthcare.

0:00:45.2 Junior: Welcome back everyone to Culture by Design. I'm Junior and I'm here with Dr. Tim Clark. And today we'll be discussing psychological safety in healthcare. Tim, are you ready for today? 

0:01:01.9 Tim: This is a very serious, important high stakes topic. I'm ready.

0:01:06.8 Junior: It is. We're gonna do it justice.

0:01:09.1 Tim: Let's do it.

0:01:10.0 Junior: It's a topic that we run into fairly frequently. A good portion of our client base here at LeaderFactor is in healthcare, and we're gonna dive into some of the challenges, some of the obstacles in healthcare, and what we believe to be some of the most poignant solutions. If you're not in healthcare, stick around, because the principles that we're gonna talk about today have very broad application. We wanted to start out by sharing some numbers with you. These are some pretty compelling data. The World Health Organization estimated that medical errors may be responsible for as many as 2.6 million deaths each year, and that they may cost the global economy trillions of dollars annually. At fully burdened numbers, trillions isn't outside the realm of reason. Medical errors are a leading cause of death in the United States, with estimates suggesting that they may be responsible for as many as 250,000 deaths a year in the US. If you break that down even further, medication errors alone account for 7,000 deaths a year.

0:02:10.5 Junior: Medical errors cost the US healthcare system between $17 billion and $29 billion a year, and the majority of these medical errors are preventable. We're gonna talk about that today. These errors come in the form of surgical errors, communication errors, diagnostic errors, and HAIs, healthcare-associated infections. On HAIs specifically: globally, around 7.6% of hospitalized patients get an HAI, and they cost the US healthcare system between $28 and $45 billion annually, the CDC estimates. The CDC also estimates that approximately one in 31 hospitalized patients has at least one healthcare-associated infection. That's amazing. It could be a surgical site infection, a bloodstream infection, it could be pneumonia, which is fairly common. So these are massive, massive numbers.

0:03:06.2 Tim: Junior, we just need to pause for a minute to absorb what you just said. 2.6 million deaths each year. The order of magnitude is something that takes a minute to grasp: 250,000 deaths per year in the United States alone. These are numbers that are hard to comprehend.

0:03:28.1 Junior: They're very big numbers.

0:03:30.8 Tim: Big numbers.

0:03:31.7 Junior: And part of what's interesting about those numbers is that they are attributable to, as we said, medical errors and the premise that it's an error means that it's preventable. So if it's preventable, what's the upside? Well, the upside is the opposite of what we said. It's saving the lives of 2.6 million people a year and trillions of dollars. So if those are the stakes, which they are, then what have we done so far to combat those medical errors? There are a few things that institutions have tried and that we've become quite good at or much better at over time. The first is technology. We've done a whole host of things here. We've got computerized physician order entry, electronic health records, bar coding for medication administration, and then we've got standardized protocols and processes. Process improvement has been much better. Communication, training, ongoing professional education. You look at all of these things that we've done and it's not enough. You could be at the forefront of all of the aforementioned things and still experience a high error rate and bad outcomes. So all of those things, they're tools, they're scaffolding, but the root of the issue is something different. And that's the crux of our conversation today. The issue is interpersonal, the issue is cultural. So what do we do? 

0:04:53.0 Tim: Yeah, Junior, let me just make a comment. We have made enormous strides, as you say, in the solutions that we've pursued with technology, with standardization of processes and systems and protocols and training and education. But it gets to a point where we reach diminishing returns. So we acknowledge the enormous strides that we've made since Florence Nightingale organized the hospital in the Crimea. We've come a long ways.

0:05:24.0 Junior: We've come a long way.

0:05:25.5 Tim: But we keep running into, bumping up against, this issue of communication and cultural barriers again and again and again. We've gotta dig into this. That's what we're going to do.

0:05:37.7 Junior: So the title today, Psychological Safety in Healthcare. We're gonna be talking about psychological safety. So what is psychological safety? We define it as a culture of rewarded vulnerability, and we wanna apply that lens specifically to healthcare. So, rewarded vulnerability. What are some acts of vulnerability that take place in healthcare? It's constant. Healthcare providers are navigating patient care, interpersonal relationships internal to the hospital, professional growth, and this is broader than just hospitals, healthcare generally. But asking a question, that's an act of vulnerability. Reporting someone else's error, that's a vulnerable activity. Admitting a personal mistake, challenging a physician or someone higher in the hierarchy, seeking help. Maybe you're faced with a difficult case, an unfamiliar procedure, there's some sort of knowledge gap and you have to raise your hand and say, "I need some help." Advocating for a patient, that's a vulnerable activity. Participating in interdisciplinary collaboration, what does that require? 

0:06:39.7 Junior: Acknowledgement of the expertise of other people. You don't know everything, that's vulnerable. Engaging in professional development, even more general than that, that's vulnerable. Admitting you don't know everything, that there's something for you to learn. Cultural and professional humility, being aware of your own biases and assumptions, being open to learning about and respecting the values and beliefs of other people. All those things are acts of vulnerability, and what it boils down to is how we treat them. If those acts of vulnerability are rewarded or if they're punished, that's the fork in the road. That's the thing on which this all hinges, regardless of your area, regardless of your sector or industry. This is the governing principle, and that's why we define psychological safety the way that we do. If you follow that logic, areas of rewarded vulnerability are areas of high psychological safety, areas of punished vulnerability are areas of low psychological safety. Gets at the root cause, doesn't it? 

0:07:44.8 Tim: It does. And let's go back to our working definition of culture. We define culture as the way we interact. Now that's an operationalized definition of culture that we use. It becomes very, very important in a healthcare setting when we're talking about clinicians and the way that they interact. We also acknowledge a corollary principle: that human interaction is a vulnerable activity. So culture is the way we interact, but human interaction is a vulnerable activity. Psychological safety, as you just said Junior, is a culture of rewarded vulnerability. That means that we've created a pattern, we've created a prevailing norm where we model and reward each other's vulnerable behavior. And you just listed several examples of vulnerable behavior that go on constantly in a healthcare environment. These acts of vulnerable behavior, they will be rewarded or they will be punished. There's not a neutral response to these things. So the way that we interact is the regulator. It dictates the safety and the effectiveness and the reliability of the care that is given to the patients. We'll talk more about how that works.

0:09:09.8 Junior: We also wanted to share with you some data that we've gathered from an instrument that's called the Ladder of Vulnerability. Some of you may be familiar with this, some of you may have taken it even. It's a self-assessment in which you rank order a list of acts of vulnerability similar to raising your hand and answering a question or asking a question. Tim, you wanna share some of that data? I think this is powerful.

0:09:34.3 Tim: I would. So Junior, as you say, this is a research project that is ongoing for us. We house the world's largest normative database on psychological safety. We also have a research project going on that we call the Ladder of Vulnerability. And what we try to do is systematically measure the level of risk that is associated with different acts of vulnerability. We have a set of 20 of the most common acts of vulnerability, and we measure those based on risk across the world with people in all kinds of different organizations. And so we have empirical data to help us understand the relative risk associated with different acts of vulnerability. Because what do we know? All acts of vulnerability are not created equal. There are different levels of risk, but it's an individual thing. Yes, we perceive different levels of risk, but we also can look at the empirical patterns based on the mean scores associated with each of these acts of vulnerability.

0:10:35.9 Tim: So let me give you an example. Well first of all, before I give you the data, let me just point out the issue. In a healthcare organization, one of the most crucial norms that must be solidified is a norm where we can challenge the status quo, we can challenge each other, we can point out errors. We have to be able to do that. But people often don't understand what that means and what that requires. And so they go to the organization, it could be a hospital, and they say we're gonna have a speak-up culture, we need a speak-up culture. And they understand why, because the stakes are high and the margin of error is low. Here we are, patient lives are at risk here. So we need a speak-up culture, we need psychological safety.

0:11:25.8 Tim: It's easy to say. Let me give you a little data. So the highest risk act of vulnerability, based on our empirical research worldwide, is giving an incorrect answer. That is the highest risk act of vulnerability. Number two is making a mistake. Number four, I'm gonna skip down to number four. Number four is pointing out a mistake. This is number four in the list of 20. So that gives us a perspective. It gives us context to understand the relative level of risk associated with pointing out a mistake. Let's add to this that a healthcare environment is an environment where the power distance index is incredibly high. So we are deeply socialized to defer to authority. And so we develop a very deep authority bias in a healthcare organization, where everyone is credentialed and we have a very clear hierarchical structure. So on the one hand, we have deep socialization to defer to authority, but paradoxically, we need a norm in which we can challenge authority on a very regular basis without fear of retribution.

0:12:50.4 Tim: Here's the problem. Fear breaks the feedback loop in any organization. It breaks the feedback loop. So if there is fear, if there is deep authority bias, if there's exaggerated deference to the chain of command, who is going to speak up? Very few people will. The reason I share these empirical data, these findings, is because I hope that listeners will understand the magnitude of what we are talking about. To establish a speak-up culture is an incredibly difficult thing to do, and it hinges on, Junior, as you said, it hinges on whether the leaders of that team or department or functional area or therapeutic unit, it could be on the clinical or the non-clinical side, it doesn't matter. It hinges on whether the leaders model and reward vulnerability, up to and including challenging the status quo and pointing out an error. If they don't do that, all bets are off. It doesn't matter how much you communicate, it doesn't matter how much you train, it doesn't matter what metrics you track. All of that will be secondary, and it will not overcome the modeling behavior of the leaders. That's how this works. It goes back to the mechanism, the central mechanism, of modeling and rewarding vulnerability. So there's a little empirical research to support what we're saying, Junior.

0:14:29.5 Junior: The data's fascinating, and the application to healthcare is particularly interesting to me, because there are some things about healthcare that work against us, that work against the modeling and rewarding of vulnerability. So what is it about healthcare that's unique? Well, the hierarchy is more clear in healthcare than in most other places. This is not a fuzzy, ambiguous, gray hierarchy. We all know what the hierarchy is, especially on the clinical side. So at the institutional level, that can work against us. And I was thinking about those Ladder of Vulnerability items through the lens of healthcare. And there are some things, from very early on in academic and professional training, that work against us in healthcare. Admitting a mistake or giving an incorrect answer. Think about that on the academic side, that gets pressured and squeezed out of you for years, because the whole system is based on giving the correct answer.

0:15:28.8 Junior: That's fascinating to me. So the institution, the academic institution, can sometimes work against that. So we have hierarchy at the institutional level. We have, potentially, not that it's unique to healthcare, authoritarian leadership at the team level. If you have a real domineering leader, that can work against you because of the punished vulnerability, that's what it comes down to. And then you have other things, like personality. Dominant personalities overpowering conversations. Maybe you have shy team members that don't speak up, and that's not unique to healthcare, but it's something we all deal with. So what happens when those acts of vulnerability that we mentioned are punished? What actually happens? It's safe to say that those acts of vulnerability would happen less frequently than they would otherwise. I think that's pretty safe to say.

0:16:22.1 Tim: Oh, it shuts you down, Junior, it thrusts you into a defensive mode of performance. So you're retreating, you're withdrawing, you're recoiling, you're managing personal risk. It's about loss avoidance, it's self-preservation. So think about the cascade of consequences. It's unbelievable in a healthcare environment, maybe you could point out some specific examples.

0:16:46.7 Junior: Let's say you ask a question, which is an act of vulnerability, and that act of vulnerability is punished. So let's say you are asking, "Hey, I'm not quite sure about this thing. What does this do? What does this mean?" And someone says, "You should know that. What kind of a question is that?" Sometimes the punishment is very overt, it's very on the nose, and sometimes it's subtle. But what's going to happen to the number of questions that you ask? It's gonna go down. And what happens to the number of questions that get asked by the team around you? That's gonna go down too, 'cause they're watching the whole thing and they're saying, "Okay, so that's how it goes around here. I'm not gonna ask the question that I had." So the result is fewer questions. And how does that play out over time? 

0:17:32.3 Junior: And that's bad news. You'd get fewer error reports. Let's say that you reported an error, as an act of vulnerability, and that got punished. Are you gonna do more or less of it? Probably less, on average and in general. Blanket statement across healthcare: you're gonna get fewer error reports, fewer critiques, less seeking for help, less patient advocacy. You're going to get less interdisciplinary collaboration, you're going to get less professional development, you're going to get less cultural and professional humility. And all of those things, you'll get fewer of them and less of them in the short term, and you'll get fewer of them and less of them in the long term. Those things compound, for better or for worse. And in this case, if it's punished, it's for worse. And so what happens? You get a very quiet, deferential organization, and that feedback loop gets broken. The local knowledge does not get circulated, and we end up in a pretty bad scenario.

0:18:31.0 Junior: And it affects three things, at least as far as I can tell in a general sense, at least three things. Patient safety is the first one. Identification and resolution of errors. What happens to the healthcare team, let's say, that has the same number of errors as the team next to it, but has double the rate of reporting and double the rate of resolution? That compounds. Give the team six months, a year, and they will be light years ahead of the team next to them in learning behavior. Here's an interesting point. This was published in the Journal of Organizational Behavior. Amy Edmondson found that psychological safety was associated with higher levels of learning behavior in healthcare teams. So, talking specifically about healthcare, this means that teams with high psychological safety were more likely to engage in activities such as discussing errors and seeking feedback, which help them to learn and improve quality of care. Quality of care is the second thing. And then job satisfaction. Job satisfaction is one that doesn't get talked about nearly enough. We talk often about patients and patient care, patient outcomes, but what about the people providing that care? Are they going to be satisfied enough to stick around, to stay and develop? This was published in the Journal of Nursing Care Quality.

0:19:39.0 Junior: They found that psychological safety was positively associated with nurse empowerment and job satisfaction, which in turn were associated with better patient outcomes, such as fewer falls and medication errors. Furthermore, there's evidence of an inverse correlation between psychological safety and nurse burnout, and we certainly have seen that over the last few years. So as those three things decrease, patient safety, quality of care, and job satisfaction, everything becomes worse, and as those three things increase, everything becomes better. So that's what it boils down to. If we consistently punish vulnerability, then patient safety will go down, quality of care will go down, job satisfaction will go down, and all of those will contribute to some of the numbers that we talked about on the front end that we're trying to improve.

0:20:28.8 Tim: That's really true, Junior. Let's go back to mistakes or errors. If you're doing threat detection, you're a member of the clinical team and you're lower in the hierarchy, let's just put yourself there. Okay, let's just say you're a nurse and you're working as a member of a clinical team, and you do threat detection and you observe and you listen and you watch and you perceive, and you see that there's a consistent pattern of being punished for your vulnerable acts, your vulnerable behaviors. If you commit an error, if you make a mistake, what are you strongly tempted to do with that mistake? 

0:21:04.3 Junior: You're gonna hide that, I think.

0:21:04.3 Tim: Hide it. You're gonna bury it, it goes underground, because the risk of repercussions, reprisals, retribution, negative consequences is so clear and present that you're dissuaded from putting it on the table. And yet the mistake, the error, is clinical material to learn and get better. We can't do root cause on a problem that we don't even know about, we can't do corrective action on a problem that we don't even know about. So if we're driving errors and mistakes underground, we are inflicting, these are self-inflicted wounds. That's what's happening. And as you said, Junior, the team, the organization, loses its adaptive capacity because it is not circulating local knowledge, it is not reporting errors and mistakes and learning from those. That's the most important clinical material that that team has.

0:22:05.8 Junior: I wanna call out the difference between a system or a process or a protocol and the behavior of the team, because those two things are often very different. The reason I bring that up is because an organization may be tempted to say, "Well, we have a process for error reporting and it's anonymous," or "We've already solved for that," but what's actually happening at the interpersonal level? What is the leader of that team modeling? Are they modeling acts of vulnerability, are they rewarding those acts of vulnerability? Because that's eventually what it boils down to. It's insufficient to tell the organization, "We here at this place report errors. It's a safe place." One of the things that we say that psychological safety is not is rhetorical reassurance, and so it's easy for some leaders and organizations to say, "We've already solved for that because we've sent out communications, we've done a big awareness campaign, and we have systems in place." All of those things are wonderful things, they are things that we should be doing, but you have to look at the interpersonal relationships, what's going on at that level, and if you're not measuring at that level, which we'll also talk about, then you could be missing a huge part of what's going on inside the organization.

0:23:29.9 Tim: That's very true. Junior, let me give one example. Let's go back to the pandemic for a minute.

0:23:37.9 Junior: Let's not. [laughter]

0:23:40.5 Tim: Yeah. Let's not. But to cite an example, I was speaking with leaders at the National Health Service in the UK during the pandemic, and they said, "Tim, we have discovered something very interesting. In some of our hospitals, obviously, we've been overloaded with patients, we're just beyond capacity, and we have also had to throw together clinical teams at a moment's notice. And so we put doctors and nurses and other clinicians together as teams at the drop of a hat. These are people that don't know each other. They've never worked together a day before, have no knowledge of each other." And they said that, in some cases, these teams that were put together, kind of flash-point teams, demonstrated an incredibly high level of psychological safety on day one. I asked, "Oh, that's amazing. How did they do that?" They said, "In every case, there was a clearly discernible pattern. In every case, the leader of that clinical team was explicit about the terms of engagement, the ground rules, the way we were going to interact, it goes back to culture, and that leader did model and reward vulnerability."

0:24:56.6 Tim: It's very interesting that when humans get together and they start interacting, norms appear instantaneously, culture formation begins immediately. And they would set up what we call at LeaderFactor a tent culture. A tent culture is a culture that is created through continuous interaction at a moment's notice. So you can set up a tent culture in a day with a new clinical team and then take it down at the end of the day, and maybe those teams never get together again. They may never. But for the period of time that they interacted, they created a tent culture of extremely high psychological safety. So that was possible. So they have case studies, they have proof of concept that you can create psychological safety almost instantaneously if you're explicit about your terms of engagement, and if you model and reward vulnerability. That was a beautiful, beautiful example and finding that came out of the pandemic environment. But we understand that that can happen immediately, so that is the bright side.

0:26:08.6 Junior: It's an amazing example. Thank you for sharing that. So let's break psychological safety down a level further. Many of you have probably heard of the four stages of psychological safety, and we'll just walk through that for new listeners. So psychological safety isn't binary, it's not that you have it or you don't. It occurs on a spectrum, and it's built through four stages. Stage one is the very beginning. We're asking ourselves, "Can I be my authentic self? Do I feel included? Do I feel a sense of belonging?" If that's not true and that's not there, the subsequent stages become much more difficult, especially over a long time period. So stage two, we have learner safety, can I learn and grow? 

0:26:51.1 Junior: Stage three, contributor safety: can I create value through meaningful contribution? And stage four, challenger safety: can I be candid about change? So think of the application of those questions in the healthcare environment. Can people show up and be themselves? Can they learn and grow? Can they create value through meaningful contribution, with some level of autonomy? And can they be candid about change? Those four questions are very piercing, penetrating questions as it relates to healthcare. Particularly challenger safety, because that's what we're talking about when it comes to error reporting and some of those things that are even more difficult and more risky interpersonally.

0:27:30.6 Tim: Yeah, what we're saying is that error prevention does not only require psychological safety, it requires stage four, the culminating stage, challenger safety. Unless you feel that you can challenge the status quo and point out an error and be rewarded for that behavior, chances are you're not going to do it. So it's not just psychological safety, but it's the highest stage of psychological safety, which is challenger safety. We need to put that in perspective.

0:28:02.2 Junior: We do, and if you think about it relative to the Ladder of Vulnerability, it gets really interesting. So let's say that those top acts of vulnerability that you mentioned are the most vulnerable, which is what they are. They would require a level of psychological safety commensurate with the risk. So we're saying that if it's most risky to point out a mistake or give an incorrect answer, then we would need a level of psychological safety that's high enough to support us in that act of vulnerability. That point is appreciable, because if we don't have that and we just make the blanket statement, "We need psychological safety," it probably isn't, or may not be, enough. We may not be talking about the same types of risk that are contributing to some of the problems. It may be a learner safety issue, it may be a contributor safety issue or a challenger safety issue. So I think looking through the lens of the four stages helps us by giving us a road map and helps peel back the layers. It helps us differentiate and become more specific about what the problem is, and also about what the solution might be. Measurement and improvement, let's go there for a moment.

0:29:21.4 Junior: Measurement is interesting, and one of the pieces of measurement that we wanted to talk about briefly is HCAHPS. For those of you who are not familiar with the HCAHPS survey, HCAHPS stands for the Hospital Consumer Assessment of Healthcare Providers and Systems. This is an instrument very important to the healthcare space: it determines hospital reimbursement rates from CMS, the Centers for Medicare & Medicaid Services, and the level of reimbursement is based to some degree on the HCAHPS scores and how patients are rating the care that they received across a variety of categories. So it's 29 questions, and those 29 questions are targeting a few specific things: communication with nurses, communication with doctors, responsiveness of hospital staff, pain management, communication about medications, cleanliness and quietness of the hospital environment, and discharge information and care transition. So those are the areas that we're getting at. Through those 29 items, patients rate the care, and that rolls up into the institution's HCAHPS score.

0:30:36.2 Tim: Yeah, think about it, this is a federal instrument determining reimbursement. I mean, think about the incentives surrounding this instrument and the scores. So, just to communicate the gravity of its importance, let's dig into this a little bit, Junior.

0:30:52.5 Junior: Oh we've thought about this particular instrument for years, we've spent a good chunk of time thinking about what it is from a construct perspective too.

0:31:02.2 Tim: Well and analyzing the data, right? 

0:31:04.8 Junior: And analyzing the data. And part of what strikes us, and there's a lot that we could say, is that we're looking at outcomes and symptoms, that's what we're measuring through those items, not inputs, and that can be problematic. There are a lot of things that the instrument does well, but some things that it doesn't show, some things that it leaves out that we think affect the very outcomes that it's measuring.

0:31:35.5 Tim: That's right.

0:31:35.6 Junior: And so there's this issue of sequence and this issue of order of consequence, and that is a really interesting thing. So when we go into institutions, we measure psychological safety, and we do that because we believe that to be the most upstream variable, the one that affects everything beneath it. So if you look at each one of those categories, communication with nurses, is that an input or an outcome relative to psychological safety? It's an outcome. What about communication with doctors? An outcome of psychological safety. What about responsiveness of hospital staff? What about communication about medications? All of these other categories are arguably tied to psychological safety.

0:32:19.3 Tim: That's true, Junior.

0:32:21.2 Junior: And some not even arguably, but obviously. And so the logic would be that as psychological safety increases, these outcomes would improve, because so many of them are based on human interaction. So as the quality of those interactions improves, all of these outcomes improve as a consequence.

0:33:28.1 Tim: Yeah, these are lag measures, as you rightly point out. And so if we really want to address the root cause and do corrective action, let's go to the root cause, that means go upstream, that means address psychological safety, the way we interact as members of the team, that is the regulator. It's the same logic as EX drives CX, right? The employee experience drives the customer experience. So we've got to go upstream. Now, this is a valuable instrument, it's valuable data. But you can't solve the problem with lag indicator data, you've got to go upstream and measure psychological safety as an independent variable that occurs earlier in the causal chain.

0:33:52.1 Junior: We need to be measuring the dynamics, the norms of the clinical staff. Another thing that you don't get from HCAHPS is the rich, qualitative data that you want regarding the interaction of the internal members of that team. That's one of the things that we lean on very heavily as a lens to look through as we're analyzing the quantitative data that we also gather.

0:33:52.1 Tim: That's right.

0:33:52.6 Junior: And so when you apply the four stages methodology, it also allows you to see what the norms are as they relate to each of those stages, because the difference between the four is very important. Are we talking about an inclusion issue? And how is that showing up in the way that we interact with each other? Is it a learner issue, or a contributor issue, is it a stage four issue? Looking through that lens gives us a much clearer picture of the quality of interaction. If we're not looking at the quality of interaction, which is most upstream, then we're missing the boat. And we need to back up and ask, "Okay, what actually are the lead measures, and what are the outcomes that they're tied to?" It's fascinating. So psychological safety clearly lies at the heart of what we can do to improve cultures in healthcare. We measure psychological safety and then create sustainable interventions that change behavior at all levels of the organization. And to a previous point that I made, it's not an awareness campaign, that's not what we're doing here, it eventually needs to translate back to behaviors person to person. The awareness is not enough, and it won't magically translate into behavior change.

0:35:11.4 Junior: Just because someone is aware that they need to be more inclusive, aware that they need to challenge the status quo, does not translate necessarily into the behaviors needed to do those things. It needs to show up in the modeling behavior of the leaders, it needs to show up in rewarded vulnerability, top down, bottom up and peer to peer. If you don't get even that lateral reward person to person, "Hey, good question. Yeah, I appreciate you asking that question. That was something that I was thinking about too." Those types of behaviors need to show up consistently in the organization over time to change the culture. If they don't, you will not get there.

0:35:48.7 Tim: Yeah, that's really true, Junior. It becomes a process of self-discovery. Yes, we measure it at a team or an institutional level, but the reality is that it's a process of self-discovery, it's immersive, it's experiential, you have to go do it, you have to create the confirming evidence that what you're doing is right and it works, and that becomes the reinforcing mechanism. And everyone, again, as we said before, is doing their own detection. In the environment, they're looking around, they're trying to figure out what the norms are, they're doing a risk-reward calculation in their heads as they perceive that environment, and it does go back to the leaders who set the tone and model the behavior and then reward the vulnerability.

0:36:35.1 Junior: I wanna talk for a second about our allocation of attention when it comes to these issues and these problems relative to technology. We talked at the beginning about how we had made great strides in the technology that we use, the systems, the protocols, and that's true. It's also true that most of the technology that we use is going to become obsolete at some point. And you may choose the wrong technology, you may choose the wrong system, you may choose the wrong protocol, we may find out later, "Hey, that actually wasn't the right choice." So there's some big downside risk as we're choosing our technologies, as we're choosing our systems and our protocols, you could get something wrong. But what is the downside risk of improving the quality of interaction in your organization? Show me the downside risk. There's none that I can think of, yet the upside, the dividends, the ROI is incalculable. So we have to be very careful as we look at the allocation of our attention as it's spread across those things: technology, systems, protocols, culture. Culture is something that should be getting a lot of our attention, it should be getting the lion's share.

0:37:51.3 Junior: Because everything else is downstream. If we're evaluating a technology, we need to be able to debate the technology. We need to be able to have a candid conversation. People need to be able to express their views. So everything else is downstream. And if you think about it that way, if you think about the order of operations, I think it becomes very clear. The beginning of an organization is two people, that's the beginning of any organization. "Hey, do you wanna work together?" "Yes." And eventually, the organization grows and proliferates and becomes dozens and hundreds and sometimes tens of thousands of people. But it all goes back to the quality of interaction between those 10,000 people. And I'll also insert that the tone at the top, the behavior of the leader, is what dictates those patterns more than almost anything else. Everyone's watching all the time, they're taking their cues from the people above them in the hierarchy, like it or not, because that shows them what the path of incentives is. How do you grow in this organization? Is it based exclusively on performance and technical competence, or is there a cultural element? What happens to the people who are culturally tone-deaf in this organization, do they get promoted, do they stick around, or are they managed out?

0:39:14.1 Junior: All those things are breadcrumbs for people to look at and follow to pick up on what it is that's actually going on inside the organization. They're looking at all of that data and then they're making some decisions. And some of that data would lead them to believe that it's in their best interest to act out of self-preservation and to be quiet and to not report that error, to not acknowledge that mistake, to not go out and learn something new, to not call the other department, and pretty soon we end up with the big problem that we talked about at the beginning, of more than two million people dying every year from these types of errors. And I don't mean to be doom and gloom, but those really are the stakes when we think about healthcare, and part of our job, our institutional mission at LeaderFactor, is to influence the world for good at scale. So part of the reason we wanted to take on this conversation is to bring awareness to the issue, help people understand our point of view that largely, this is a cultural issue, there's something that we can do about it, and if we do, the consequences will be positive and far-reaching.

0:40:20.3 Tim: Junior, I wanna make a comment about how medical science advances. Medical science advances through the falsification of what we know today, through the falsification of current knowledge, through the falsification of the standard of care. Let me give you an example. Years ago, I had my first knee surgery. [chuckle] Over 30 years ago. What did they do? They performed the surgery, they put a big cast on my knee, they immobilized me for three months and they gave me crutches. That was the standard of care. Unbelievable. My last knee surgery, what did they do? They performed the surgery, they had me up, I think it was two hours after the surgery, yes, I had crutches, but I was weight-bearing and I was walking down the hall in the hospital. The standard of care had been completely turned on its head. This is how we advance, but the social psychology of this advancing process requires that we have psychological safety, that is the accelerator. It regulates the speed of discovery, it's what allows us to identify, surface, and solve problems, report and solve errors and mistakes. It is crucial to everything that happens in healthcare.

0:41:50.0 Junior: So psychological safety in healthcare is something that we should be talking about constantly. If we can create this culture of rewarded vulnerability in healthcare organizations, we're gonna achieve a lot of things: open communication, learning from mistakes, improved collaboration and teamwork, enhanced innovation and problem solving, increased reporting of safety concerns and arguably their resolution, greater employee engagement and well-being, and better adaptation to change to equip us for the future ahead. We don't know what it's gonna be like, but we know that we will need good interaction to do well. So what a conversation, Tim. I appreciate you taking the time today to go through this. A lot of very interesting points.

0:42:32.9 Tim: I'll just make one other point before we wrap up, Junior, and that is that we do also acknowledge the economic pressures that are on healthcare institutions. You take a hospital, the pressure associated with bed management, the pressure associated with the revenue cycle, the pressure associated with the entire cost structure, we acknowledge that, and we understand that the conditions within which we are asked to perform are incredibly difficult. And yet in spite of all of that, we still need psychological safety. Remember the fundamental principle that we talked about at the beginning: fear breaks the feedback loop, and yet the feedback loop is the single most important thing in our adaptive capacity, in helping us to move forward and to reduce error rates and to protect people. Protect patients.

0:43:31.4 Junior: Well said. So thank you everyone for your time and attention, and we appreciate your listenership. If you're in healthcare and would like to talk to us about bringing these principles to your organization, do not hesitate to reach out. And to all of you, we're thankful for the work that you do in the world, and we're here to support you. It's why we exist as an organization. As always, we appreciate your likes, your reviews and your shares, so if you found value in today's episode, send it to someone who might find it valuable. Take care, everyone, and we will see you next episode. Bye-bye.

[music]

0:44:09.4 Producer: Hey Culture by Design listeners. You made it to the end of today's episode. Thank you again for listening and for making culture something that you do by design and not by default. If you've enjoyed today's episode, please be so kind to leave us a review, it helps us reach a wider audience and accomplish our mission of influencing the world for good at scale. Today's episode show notes and other relevant resources related to today's topic can be found at leaderfactor.com/resources. And with that, we'll see you next episode.

