Introduction & Objectives | Building a Culture of Safety in IR
A Lesson in Clinical & Cultural Responsibility | Building a Culture of Safety in IR
Tools to Improve Safety in IR | Building a Culture of Safety in IR
Medical Error & the Second Victim | Building a Culture of Safety in IR

- So I get the great pleasure of introducing our next speaker, Dr. Dixon, who's been a staunch supporter of the AVIR for a number of years. Over the last half a dozen years or so I've gotten to know him. He's such a gem of a gentleman

and really just an incredible person and teacher. He wants to do something new beyond radiation safety for us and sorta share other things that he's workin' on, important things that mean a lot to us, obviously, as his supporters in the lab, as technologists. Just a little bit about Dr. Dixon,

he got his medical training in upstate New York. From there he moved to North Carolina, Chapel Hill, where he did his vascular and interventional fellowship. He spent some time as an emergency physician, which is pretty unique.

He's currently the residency and fellowship director at his facility. He has won numerous awards including the prime award for teaching, and he is a really dynamic, exciting guy who likes to move around a lot, as you'll see.

(audience laughs) So with that let's welcome Dr. Dixon, a long time friend of ours. (audience applauding) (man coughs) - Here's a little bit.

And I may move around so I'm torturing the crew here. They're trying to make sure I have this mic live. So, if you can't hear me I'll just start speaking louder but give me some feedback if you can't hear what I'm sayin'. So, we're gonna shift gears a little bit. (microphone screeching)

We're gonna shift gears a little bit from a very technical talk to kind of a broad wide open topic that you could apply no matter what you do. I have come and spoken to you for many years about radiation safety.

Last year I spoke about burnout, which is something that anybody can suffer from. What I thought I'd do this year, instead of talking about radiation safety, I thought I'd talk about how to build

a culture of safety in IR. How can you make it even a little bit safer? And so, when you start looking at this you can get overwhelmed with all the information and some of this stuff you learn is very hard to implement. But I'm gonna talk about a few things

I think you could implement next week when you go back. I'm sorry about the feedback. Can you hear me now? Okay. So there was an editorial that Dr. Ziv Haskal wrote last year

in JVIR and the quote from it is that, "We must evolve from cowboys to legionnaires." And the idea was, is that better than this? Uh oh. We're swappin' out. How am I gonna do all this?

Swap mics, keep talkin'. - [Man] Anybody can do what you did. - Alright. So, the idea, he was talkin' to the new trainees right? So I think you probably have heard a little bit from (laughs)...

(audience laughs) Yeah I don't really like to move, I just like to see other people move. (audience laughs) You may have picked up a little bit during this meeting that the whole,

and you may know from back home that the whole training paradigm, uh oh it is live. The whole training paradigm has changed. And the residents are now going to be exposed to essentially two years of training in IR.

(mic rustling) Probably better on this side when it's facing me. (mic rustling) Don't make it squeak again. Okay. So two years with us,

but we're not just gonna teach 'em really technical stuff. We also want them to be spending time taking care of patients before and after the procedure. Whole new paradigm. And in the origin of this (audio cuts out) we were cowboys. We figured out how to make do with

some duct tape and fix everything. We just MacGyvered it. But now Dr. Haskal's point is that we should have good research. We should try to make things safer and not be just cowboys.

We need the (audio cuts out) but we need to have some foundation to work on. It used to work. There. So objectives. We're gonna talk about culture.

I wanna talk about culture because that really impacts how we communicate and how we perceive each other and then give you a couple tools like I said that you could maybe take back and use next week. And then the other thing that I learned about

not too long ago was this concept of a second victim and I just wanted to bring that to your attention because if you do want to build a true culture of safety, you need to know about this second victim concept. All of us can be second victims and the fallout of being a second victim

can be quite significant.

Okay so let's talk about culture. Yeah. There's two ways you could look at this. The clinical side of it where we have a lot of stuff in place.

You can go right now and read about radiation safety guidelines. You can go to two websites, Image Gently and Image Wisely, and learn all about how to optimize your dose to people. There are pre-procedure check lists

just like a pilot has, we have pre-procedure check lists. Do we have this, this, this and this in the room? And then if you really wanna get into it, you can go to the Agency for Healthcare Research and Quality, or AHRQ, and they have thousands of talks, videos, cue cards, worksheets,

and you can learn all about how to do this. Now many of you already know all of this so I don't wanna belabor it, but there's a link there if you wanna go. Now what about cultural responsibility? That requires buy-in.

Buy-in from the team, the team on the ground doing the work but also buy-in from your leadership. They have to support. They have to provide money. They have to provide education. They have to support you when times

get kind of bumpy and rough. But the nice thing is that if you yourself start to adopt some of these things, the workforce can influence leadership. The workforce can influence the culture of safety. So you can make a difference.

Sometimes it's hard to get our leaders to back us. Okay so I'm gonna talk about a flight. This is a famous flight that pilots who are training study. And you may know about this flight. This is Avianca Flight 052 they call it. And this flight teaches pilots

what can go wrong and how to avoid it and we learn from this flight too. So these guys were flying from Bogota, Colombia to New York City. It is not a short flight. When a plane crashes how many things

do you think have to go wrong? One error? A dozen errors? 25 errors? Somebody? How many? - One.

- One. (thuds) In fact, it's not one. It's seven. Usually seven things go wrong before a plane ends up on the ground.

And what are the errors like? If it's one you would think it's a very technical error. The pilot didn't properly bank. He was at a low, it was something to do with the technical skill of the pilot.

Wrong. It almost always is very very very simple stuff. There's a pilot and a co-pilot, and the system is built that way because we make mistakes. So somebody just doesn't notice something. Or somebody notices, the co-pilot notices something,

but doesn't tell the pilot hey, there is a problem with our auto pilot. You can fly without an auto pilot so maybe he feels like he doesn't have to let him know. Now the three things that are most commonly present is that the weather's bad.

Not terrible. Not unflyable but just enough that it puts the pilot on edge. The pilot doesn't feel comfortable. Just a little edgy. What else?

Most commonly the captain has been up for more than 12 hours so the pilots are tired. So you have a tired pilot, who's not talking to his co-pilot and things start going wrong and that's exactly what happened to these guys.

So the weather wasn't terrible but there was a nor'easter up in the Northeast. And that made them get delayed three times. So they're flying from Colombia to New York City and they get stopped and told to circle over Virginia Beach. And they fly and they circle over Philadelphia.

They fly. They had, they circle again. So they weren't gettin' where they were goin'. They were stayin' in the air and they were flyin' in circles. What happens when you fly an airplane

in circles for a long time? - [Woman] Run out of fuel. - You run out of fuel. (clears throat) So, what kind of plane were they flyin'? They were flyin' a 707.

I am not a pilot so all the pilots in the audience, I apologize if I'm using wrong terms and stuff but this 707 is an older plane that requires a lot of manual pushing and pulling. New planes you can fly with one finger. This one, and they were flying into that nor'easter

and it would be, they had to really gun it and all of a sudden they had to really drop back. So with one hand the pilot's doin' this with the gas, and then he's tryin' to manhandle it and fly. He was getting exhausted. So exhausted that he started to ask

the New York air traffic controllers to speak in Spanish. He was just getting fatigued. His co-pilot should have noticed a whole bunch of stuff, should have told air traffic control we're running out of fuel, he should have said there is an emergency,

we are running out of fuel. We can't make it. And he didn't use those words. So the rough, tough, almost bullying air traffic control at New York JFK, when the co-pilot said okay, we came in.

We didn't make the landing. We're gonna go around, he just said casually, we're gonna go around. And we're runnin' out of fuel. That's how he said it. The air traffic controllers think everybody coming

into JFK is running out of fuel 'cause they're at the end of the flight. And he didn't say yo, I'm runnin' out of fuel. It's an emergency I have to land now. He didn't use those words. What's that called?

Mitigated speech. That's like if you're in a restaurant and you're choking, and you say I'd like some more coffee and by the way I'm choking. The waiter won't believe you. So air traffic control didn't believe

they were really running out of fuel. And the pilot who was tired said did you let 'em know? And the co-pilot said yeah, I let 'em know. They know. And so he said alright, 180 on the heading, and we'll try one more time, and we're runnin' out of fuel.

That's how he said it. And instead of makin' 'em just loop and come right back they sent 'em all the way out over Long Island. Way out and they're loopin' back, loopin' back. That's what happened.

They ended up in John McEnroe's father's backyard. Crashed. In a little tiny town. Not everybody was killed but a lot of people were killed. Where else might you be in a place where communication is really important

and it doesn't happen the way it should? So one thing goin' on, in addition to everybody bein' tired and the weather was bad, it was an old machine, they could've landed in Philadelphia but they didn't, they flew right over Philadelphia, is that in some cultures you're

not supposed to speak up to your superior. You might say, oh the instruments are really good instruments. They can measure the gas fuel really well. You won't say to the superior we're runnin' out of fuel. But the superior if he's in that culture

will recognize that as oh, he's trying to tell me we're out of fuel, and he'll respond appropriately. Well New York is not like that, so when the guy from Colombia said by the way we're runnin' out of fuel, they didn't believe him.

So, they crashed. Where else can you find a culture where people will not raise their hand and say hey you've been irradiating this person to death or hey maybe you shouldn't approach it from that way you should go this way, it's in our lab.

So this is a shot from the radiation safety course that I gave just a couple days ago. Keck Hospital USC was kind enough to let us go over there. We had a little workshop and right there, workin' with my good friend Kyle Jones

I asked the participants do you let your physician know when something's going wrong? Do you raise your hand and say hey Dr. Dixon, you have delivered 7,000 milligray and maybe we should stop? Do you say hey we've used all this contrast and we should stop?

How many in this audience feel comfortable speaking to all their physicians about that kind of thing? That's pretty darn good. I'm very proud of you. That's the way it should be. It should be a culture where you

can speak to us and we shouldn't respond by saying shut up, I'm almost there. Go away. I don't care. We should say thanks. So what's the alternative approach?

So here's another flight story. A big huge long flight from Dubai all the way to New York. All of a sudden a passenger in the back starts having a seizure and throwing up all over the place. They think that she's having a stroke. They could land in Russia but the couple,

they have no money and the pilot knows that if I land in Moscow they're not gonna get what they need. So he decides to go to Helsinki, and he's got so much fuel he's 60 tons overweight. But what he does is he's very good,

he's good at making decisions. He quickly educated himself about the runways at Helsinki. Where's the wind goin'? What's goin' on? He communicated with all those people there including his boss back home

and then he told air traffic control there when they said come in this way he said no I'm heavy. I'm really heavy I have to fly into the wind and they never have anybody come in that way but he did it and he communicated with everybody.

So that's the key. You have to be clear, completely transparent, encourage, cajole, bribe, push, make things happen, and you have to stay calm and, if you need to, negotiate. Now, this is an interesting thing.

In the middle of the last century, in World War II, that is a P-47 Thunderbolt. It was a flying tank. It was very very sturdy and it was very good for fighting wars. And the folks who flew that,

the pilots who flew that were very good and the aviation philosophy for years was we will just build better pilots. That is the hernia operation going on in 1941. The same thing in medicine. We will build better and better and better surgeons

and we'll make fewer mistakes. There is a series of articles coming out, and I know you heard about it because Kevin Dickey spoke to you just a couple days ago I believe, and in that issue of Techniques in VIR

there will be an article by Bane Sellby talking about the differences between aviation and medicine, and his point is that in aviation nowadays you don't just have really really good pilots, you have a system that looks for errors,

and in 2017 in this country there were no fatalities in the aviation world on commercial airlines. So that's very impressive. And his point is that medicine, even with really nice machines, might be a little bit behind. Now I said that to a surgeon the other day.

I said maybe we're just behind. Look at this. The pilot nowadays will spend a long time, a long time, looking at the weather, where they're going, is the machine okay? You know that we go alright,

we're gonna embolize the liver, let's go. And a lot of times we don't do a lot of preparation. And Dr. Sellby's point is that we should maybe prepare a little bit more. The surgeon I spoke to, he's a surgical oncologist, he said I am so tired of being compared to pilots.

The pilot is not flying as many missions as we are flying each day. That's true. We do 45 cases a day in our place. There's no way I could give each of those cases the kind of attention

that pilots give to their one flight. So some people would argue we're really not like pilots but I would argue that if you look at the things that put planes on the ground, when they're not trying to land when they crash, it is very simple things that we could adopt.

So communicate. Okay tools.

This I think most of you know. So AHRQ has this thing called Comprehensive Unit-based Safety Programs, and there's the website there.

And the idea is that it's unit based. So it's related to the floor or the ITU or the IR department that you're working in to try to build a better place. You can go to this website and this is where the thousands and thousands

of resources are available for you. CUSP, that's a unit-based safety program, was developed at Johns Hopkins long ago. They had a terrible tragic case with an 18-month-old and they shifted gears and built a new culture. Other people have used it to reduce line infections.

The AHA was the force that helped roll it out, and it's a blueprint for building a safety culture. And you can use this material to make your place safer. TeamSTEPPS, many of you know about TeamSTEPPS. That too is a well known, very well run program.

But the one thing that I think we should take away and this is what pilots do, is have a little brief before. Maybe during the procedure, huddle. How are we doing? How's the flight going?

And then at the end debrief. And this is something I think that you can do right now. Next week. Go home. The problem is you might have to do it with the physicians you feel you can speak freely with

but what I do even after simple cases nowadays is stop and say okay, how'd we do? And the person might say well, maybe we should've done this with GA, or I could've sedated better, or I coulda had a better IV, and maybe the tech will say I coulda had the room set up,

and maybe I will say well, I probably shouldn't have done X I could've done Y. So we actually talk about it a little bit and then you learn. And we have some crazy near misses. And if you talk about it right there people will learn

and I think it goes a long way. So if you look at all these acronyms and all these safety programs and what not, just talk about the case right afterwards. Doesn't have to be really in depth. Now what's the difference between us and the aviation world?

When the aviation world sits down and talks about a crash, they go through that thing with a fine tooth comb. And everybody admits freely what they did wrong and the pilot who might be responsible gets to really lay it out there without any repercussion. We worry about getting sued.

We worry about our reputation. We worry about all sorts of stuff and that's, it's a different culture. But if we can sort of change that a little bit by making it more open, I think that will go a long way.

This is a famous acronym, CUS. You can use it in the room, when my nurses say I'm a little concerned, or when they say I'm very uncomfortable. I haven't had too many people say to me stop. But I do have some very good nurses

and very good techs who will say, Dr. Dixon I just want to let you know bum bum bum. And you may be really under fire and you don't want to stop but that should nudge you. That should just make you think am I just way way off from where I should be?

Or ask them for some ideas. Sometimes the person on the outside watching can give you good feedback.

Okay. Second victim. What the heck is a second victim?

So (clears throat) imagine that you're an intern on service in a hospital late at night and you go to see the patient and he's having a lot of trouble breathing and the nurse says his blood pressure's a little bit low. And you're tired and you miss the fact

that on physical exam he has these three things. He has JVD, jugular venous distention. His EJ, you might even see it, his veins are distended in his neck. If you were to listen in with a stethoscope, you couldn't hear the heart.

It would sound like it's far away. It's muffled. And his blood pressure is soft. That's Beck's Triad named after Claude Beck and described in 1935. Claude Beck was a, he's not a famous rock 'n roll guitarist,

he's not the other Beck, he was a thoracic surgeon. And if you're an intern late at night you might miss those physical findings and just give them some diuretic and go to sleep. Well that's what happened and the patient ended up having to go to

the operating room to decompress it. It's a sign of tamponade, a pericardial effusion and a bad tamponade. And the next day everybody's in the hospital talking about it. Oh my God.

That intern Smith missed it. How? Can you believe it? He missed Beck's Triad and that guy had to go to the OR. If he'd picked it up, maybe we'd have done a pericardiocentesis and he wouldn't have had to go.

What a knucklehead. Oh yeah, what a knucklehead. See then you get all the Monday morning quarterbacks sayin' stuff, and then you may go to M and M and they're like did you notice this? Did you?

It can be like the Spanish Inquisition. Then what does that intern do? That intern, he's gonna be devastated and if he doesn't have somebody to talk to he becomes a second victim. He's traumatized by the event.

So this concept of the second victim was described a long time ago by Wu in 2000. It can impact all of us and it's under supported and it's under recognized. This is the definition. Sue Scott has done a lot of work on this

but the idea is that health care workers involved in an unanticipated event and things go south and then it impacts them and they feel responsible and then they start to self doubt and be very critical. There was a survey that was done. 3,000 physicians who had been involved

in unexpected bad events. And most of 'em were worried about how they were gonna handle it in the future. That intern was wonderin' oh my God. Am I ever gonna be able to salvage my career? Will I know Beck's Triad when I see it again?

Should I be a physician? What an idiot I am. So he loses, he or she loses confidence. He can't sleep at night. Wakes up in the middle of the night, I should have seen that.

We could've saved him from going to the OR, and then you start saying what am I doin' this job for? Maybe I should be a greeter at Walmart. And you're worried about your reputation. So the response,

the natural progression of what a second victim goes through is actually very predictable. There are a few consistent stages. At first especially in our world, things may go south very fast and you may be trying to manage the untoward event

and also try to straighten out in your head how did I do that? Pneumothorax during a line insertion. The patient's bleeding no matter what. They have an MI on your table, and then everybody descends right?

The code team comes in, everything goes, or any event where you lose control that you don't have a handle on it, you're gonna be going through this in your brain. That's the very first step. The second step after the dust settles

and the patient either has severe morbidity or you lose the patient, you're gonna keep asking yourself why didn't I see Beck's Triad? Why didn't I do this? I shoulda done that.

You may wake up in the middle of the night thinking about it. It may not go away like every single day you are thinking about this. What painting's that? Scream.

Who did it? Munch, Munch, a long time ago. Now here's a question. How many people think that that figure is a male? And how many think it's a female? So male?

Female? Yeah I think it's a female too but I've had a lot of women tell me no way that's a man (laughs). (audience laughs) I love that painting but it's how you feel.

You're like oh my God what did I do wrong? And then you try to, after awhile you try to restore yourself. How can I get back on track? The only way you're gonna do it is by talking to either a close colleague

or a family member maybe a manager. Maybe a physician who looks after me. My boss. But you have to be very careful and the lawyers will tell you don't talk to anybody and I will tell you you should talk to somebody.

If you keep that inside you're gonna feel like that painting forever. Then you have to go through the M and M. All the inquisitions. Why didn't you do this? Why didn't you do that?

Did you get rest? How many people with Beck's Triad have you seen? And you feel like you're being just punished by the Spanish Inquisition. If you have somebody that you can talk to eventually you'll sort of achieve

some emotional first aid. You might even, I mean, there's a stigma and that's why we don't do it but maybe you need to go talk to a professional about it. Maybe you're even thinkin' that I shouldn't be alive anymore.

I'm terrible. So the last stage is moving on and there's three potential pathways in this moving on phase. (sighs) You could drop out. You could say well after this year

of internal medicine I'm done. I don't know how to spot Beck's Triad so I shouldn't be a physician I'm gonna leave. Or maybe I don't wanna be a physician at UNC. I'm gonna move someplace else. I just gotta get out of this town.

Everybody knows I made that mistake. Or maybe go be that Walmart greeter. You may just say I'm leaving medicine. I just can't do it anymore. Another thing that happens is people get not even quite back to baseline.

At baseline you were a happy, hard-working, boy-I-love-my-job kind of healthcare provider, and now every single day you think about it, but you're worn down. It's like, okay. I made a mistake.

Let's go to work. And so you're not really (whistles). You're just kind of gettin' by. The best place is to come out of it. Think about what you could learn from that situation and thrive.

Maybe you can teach other interns how not to miss it. Maybe you can go and speak nationally about what I've been through and maybe you can just make yourself available and when somebody has a bad event, you can say hey, I bet that made you feel bad.

What happened? And just let them talk about it. Don't be critical. Don't say you knucklehead. JVD, come on. The muffled heart, you didn't?

You're not critical. You just listen. All of us are involved in that stuff. I don't think there's anybody in this room who hasn't seen a case go south super fast. Oh sorry this is, I don't know.

I don't know who put this together. It's gonna make me have a seizure. Okay. There's some programs in place. There's a program. It's not a typo.

It's called forYou with no space between those words down at University of Missouri and they have done a great, a huge amount of work. The RISE program at Johns Hopkins. They've also written about their program.

Resilience In Stressful Events, somethin' like that. Both those two programs have been around a long time and there's lots of literature out from those two places, and at our place we have a peer support program

that's just sort of in the initial stages. And when I speak to people at our place, the number one thing I hear is that we're not really utilizing it to its full capacity. But it's there, and maybe there's something like this at your place too.

So we talked about culture. It's a little important. Where you're from, how you communicate, and the idea that you should make your workplace a place where there's not a hierarchical setup, but where everybody can talk this way.

Talk like this. Some tools. TeamSTEPPS and the CUSP words and things like that. And now you know about second victim and I don't think you can build a good safety culture, a good culture of safety

without considering what happens to the provider when something goes wrong. And I think that's it. I didn't move as much as I was gonna move, and thanks to the technical folks for hookin' me up there, and thanks for your time.

(audience applauding)
