Looking for Risk in All the Right Places: The Anatomy of Errors in Healthcare

Overview of Diagnostic Errors
Key Patient Safety Influencers
Human Factors Engineering: What Is It?
Human Factors That Reduce Situational Awareness
Theories on Accident Causation
The Anatomy of Errors in Health Care
Education Strategies to Reduce Human Errors
Just Culture Concept
Q&A: Risk in All the Right Places

Thank you for joining me this morning as we talk about patient safety and risk management. We're going to touch on a number of different things, but for starters, can I see how many of you are bedside or procedural room nurses? CT, MR, or procedural room? Excellent, okay.

Any leaders, charge nurses, directors? Awesome. By chance, are there any physicians in the crowd? Okay, welcome, and thanks again for coming. Just to note, I have no financial or educational conflicts of

interest.

All right, so today we're going to be discussing some key patient safety influencers in health care. We're going to take a look at something called human factors engineering, we're going to look at educational and global human error reduction strategies, and we're going to take a look at the just culture concept and its impact on patient safety event reporting.

So, according to some statistics from this year for Patient Safety Awareness Week, which was March 10th through March 16th, just a few days ago: there are a great many deaths that occur due to adverse events, and in 10 to 20% of medical examiner cases they find that there has been some misdiagnosis. Arriving at an accurate diagnosis is fundamental to the practice of medicine, yet according to the 2015 Institute of Medicine report, most patients will experience at least one diagnostic error in their lifetime. That report also notes that diagnostic errors contribute to about 10% of patient deaths and account for up to 17% of adverse events during hospitalizations.

Currently, about 41% of Americans say they've experienced a medical error, either in their own care or that of a loved one or a friend. And the National Academy of Medicine (just a word about the name: the IOM changed its name to the National Academy of Medicine in July of 2015, so this statistic comes post July 2015) suggests that 5% of US adults who seek care in outpatient settings experience a diagnostic error. So that's the reason why we're here today. So when we're

looking for risk, we find it in about 11 areas, and they're in an order that represents those that are highest in risk for lawsuits. Obstetrics tends to be the highest because of its dollar value: there are nurses called life care planners who use different charts and statistics from the government and from the CDC on disease processes and birth errors, and they come up with a dollar amount for the care of that child, generally for the life expectancy predicted from government charts, and that's why those end up being in the millions of dollars. And as we look, we see that radiology is actually eighth on this list, and the driver behind this is missed or erroneous diagnoses.

So let's talk about a few of our key patient safety influencers. The Institute of Medicine came out with a landmark report called To Err Is Human: Building a Safer Health System. In that report they stated that health care in the U.S. is not as safe as it could be or should be, and that at that time there were 44,000 to 98,000 deaths occurring due to medical error. They defined medical error as the failure of a planned action to be completed as intended, or the use of a wrong plan to achieve an aim. The number comes from a 1984 Harvard patient care study that the IOM quantified. They also estimated that the reporting of preventable adverse events is underreported. These are our near misses, these are our opportunities for change, and those are underreported. But the report did support a collaborative relationship with quality improvement, and just to note, not all errors result in death. Okay, so as you could

well imagine, this report met with much scrutiny and drew a lot of comments. The next report is The Future of Nursing: Leading Change, Advancing Health, which came out in 2010. In it, the IOM called for nurses to take an active role in preventing patient safety errors, and asked us to step up by practicing to the fullest extent of our education and training. From that time we saw a lot of academic advancement to terminal degrees: a lot of doctorates, the DNP, started coming out, and we saw a lot of nurses advancing from associate's to BSN, to master's, and so on up to a terminal degree.

Our next influencer is the Committee on Quality of Health Care in America, which had a project focused on the 2001 IOM report Crossing the Quality Chasm: A New Health System for the 21st Century. A lot of things came out of this report; the government latched on to this one with some of our advances for quality care. It has some specific objectives for improving healthcare delivery, revolving around six aims: safe, effective, patient-centered, timely, efficient, and equitable provision of healthcare. But they noted that the biggest challenge was going to be establishing a culture that encourages reporting of patient safety events that may result in either actual or potential harm to our patients or others, and this culture is known as the just culture, which we're going to discuss a little further on.

Next we have the National

Patient Safety Foundation. They're a central voice for patient safety, and they also promoted the Patient Safety Awareness Week that I spoke about. The NPSF enhances patient safety awareness through educational programs, research project grants, and awareness campaigns, and they encourage patient and family involvement. Many of your organizations may have patient or family liaison groups that work with your nursing administration or your leadership, looking at the patient care experience during a hospital stay and how it can be improved; and as it relates to patient safety events or near misses that may happen, those may also be discussed with patients and families to get their perspective on how we could improve.

The Leapfrog Group is a coalition of Fortune 500 companies, and what they do is buy health care; but in order to get them to buy your health care, you have to meet certain patient safety standards that they have, such as CPOE (computerized physician order entry) and ICU physician staffing standards, and there are a number of other things they look at as well.

Next is the Agency for Healthcare Research and Quality, the AHRQ: this is where we in patient safety and risk live. A lot of the patient safety event reporting systems that are out there are based on AHRQ fundamentals, so that we can benchmark across the nation on how we're doing with patient safety events.

The last one is the Joint Commission, with their national patient safety standards. All right, we're going to

get into human factors engineering. James Reason, who we're going to talk about a little more, said we can't change the human condition, but we can change the conditions under which humans work, and that is the discipline of human factors engineering. It's a discipline that takes into account human strengths and limitations in the design of interactive systems, involving people, tools and technology, and work environments, to ensure safety.

Most notably, or most recently, human factors engineers were involved in the restructuring, or re-engineering, of anesthesiology machines. You may remember a time when anesthesiology machines didn't even have a patient monitor on them; they were just the machine, with the bellows on top that would go with the patient's respirations. Since that time they've been re-engineered to include a patient monitor. Further re-engineering took in human factors, which is the human condition (I'm going to get into some of those human factors a little bit later). They took a look at the monitor to make it more clear and intuitive for the anesthesiologist; they also took a look at the alarms on these anesthesia machines to make sure they were accurate and had a better range, to prevent alarm fatigue; and lastly, they put some automation into those machines.

Human factors engineering is also used outside of healthcare, in aviation and automobiles: the backup camera that we enjoy in our cars came from human factors engineering. And at Three Mile Island, they used human factors engineering when they looked at that nuclear accident that took place those many years ago, and they used what they learned from Three Mile Island to design much safer nuclear power plants. So it's not only used in healthcare but throughout all of our industries.

There are some issues that impact human performance, and you can substitute "human factors" for "issues." Some of these factors are present before an action takes place: fatigue, stress and boredom, dehydration and hunger. There are some factors directly related to how we make a decision: our perception, our memory, our attention, our reasoning, and our judgment. And lastly, there are factors that directly influence how a decision is executed: communication, and our ability to carry out the intended action.

A little bit about fatigue. Fatigue is the easiest human factor to overcome, but it's also probably the greatest human factor interfering with our performance and our personality. Fatigue affects us by reducing our decision-making ability; it prolongs response time, increases lapses in attention, negatively affects short-term memory, lessens the ability to multitask, increases irritability, moodiness, and depression, and decreases our ability to communicate.

Part of what's built into us are mental shortcuts. We rely on these shortcuts in reasoning to minimize delay, cost, or anxiety in our clinical performance. You can imagine that if every time you had to run fluid through an IV line you had to think back to when you were in nursing school, how you had to open the package, take the line out, straighten it out, put it into the solution, and run it through; if you had to think about those steps every time you did something, or if you're in an IR procedure room and had to think through every step each time you set up your patient, it would take you a very long time. So we rely on mental shortcuts.

However, there are some things in our human condition, our human factors, that interfere with that, and these are cognitive bias and hindsight bias. Cognitive bias is a mistake in reasoning or evaluation, and it often occurs as a result of holding on to preferences or beliefs regardless of the information in front of us; it's actually an attempt by our brain to simplify information processing. Hindsight bias works similarly, but it calls upon experiences we've had in our past, and we use them moving forward. What that does is give us cognitive dispositions to respond, and those are: jumping to conclusions; seeing what's already expected, whether it's actually there or not; a bias toward action versus non-action (you may want to do something, but really the better patient safety choice may be non-action: to wait, collect more information, and get more of an assessment); and then there's overconfidence bias. These cognitive factors can contribute to diagnostic error in about 74 percent of cases.

So we have some human factors that reduce situational awareness. Situational awareness is our mental model of the world around us. I'm sure you're all very familiar with your interventional radiology rooms, your CT rooms, your MRI rooms, yet because of different human factors that are going on, many of which I have listed here, it may take you a little while to realize that perhaps over a weekend a blue wall got painted beige. Some of these factors are insufficient communication, fatigue and stress, task overload, task underload, group mindset, a press-on-regardless mentality, and degraded operating conditions. Have you ever had that press-on-regardless mentality from some of your docs in the IR room? You've got three cases to go, it's getting late, you know your staff have been there for a while, and it's "let's push on, we've got to get these cases done." We're really opening ourselves up for error there. So again, here's that action versus non-action: we could really use some of that non-action, and maybe reassess those patients and see if we can't have them wait until the next day, when it's a little bit safer to do those procedures.

So I have a little test ready. All right, this is actually a commercial that came out of

the UK. The UK was using it to heighten drivers' awareness of motorcyclists being on the road, but what it shows is a kind of Clue-esque setting, where we have our trench coat detective and a lineup of suspects for the murder of Lord Smythe, who unfortunately is there on the floor. The detective is going to go through his lineup, ask them questions, and name the culprit. But this is about the world around you: I want you to pay attention not only to what's going on, but to things that are happening and changing in that environment, and I'd like you to see how many you notice while you're watching our detective go through his inquiry.

[Video] "Clearly somebody in this room murdered Lord Smythe, who at precisely 3:34 this afternoon was brutally bludgeoned to death with a blunt instrument. I want each of you to tell me your whereabouts at precisely the time that this dastardly deed took place." "I was polishing the brass in the master bedroom." "I was buttering his Lordship's scones below stairs." "I was planting my petunias in the potting shed." "Constable, arrest Lady Smythe!" "But how did you know?" "Madam, as any horticulturist will tell you, one does not plant petunias until May is out. Take her away. It's just a matter of observation. The real question is, how observant were you?"

All right, so how many changes did you happen to see? Would it surprise you (I hit stop in time) that there were 21 changes during this little clip? Yeah.

Right, so how many did you catch? About five? Right. So that's why communication is important, and it is often one of those human factors where we don't pay attention to how key communication is in preventing patient safety errors. So let's take a look at what we did or didn't see.

[Video replay] "Clearly somebody in this room murdered Lord Smythe, who at precisely 3:34 this afternoon was brutally bludgeoned to death with a blunt instrument. I want each of you to tell me your whereabouts at precisely the time that this dastardly deed took place." "I was polishing the brass." "I was buttering his Lordship's scones below stairs." "I was planting my petunias in the potting shed." "Constable, arrest Lady Smythe!"

Right, so originally, yes, the purpose was to increase situational awareness where you've got motorcycles coming in from the sides, or in front or behind you, coming from all different directions; that's what it was originally made for. But there are a lot of those situational videos out there. Probably the most famous is the one with the gorilla, where you've got, I don't know, ten people with a basketball, in different shirts, and the task is to count the number of times the white shirts versus the black shirts catch the ball. In the middle of it comes this dancing gorilla, and most people miss the dancing gorilla because they're so focused on watching the ball. The same thing here: you're so busy watching our trench coat detective's interview, waiting to get to the end and find out who did it (because you know they're going to tell you who did it), that you miss all those things occurring around you.

The reason I showed this is because it involves a lot of situational awareness, and situational awareness is around us every day when we're taking care of our patients. It's those little things we see, those changes in the monitor of our patient, those little things that happen in the room. Maybe they're doing some reconstruction in your IR lab or your MRI suite and you've got to do a little workaround; well, that's not what you're used to (we're going to cover this a little bit later with James Reason), and so your situational awareness changes. If you don't realize what's going on, you may miss something, and that something may be very significant for your patient. That's where those human factors come in: task overload, underload, communication factors, that press-on-regardless mentality, and how dangerous that can actually be. So James

Reason comes to us and talks about some theories on why we make mistakes. We're going to cover these, and then we're going to cover the Swiss cheese model, which many of you may be aware of.

Slips tend to occur in situations that are so routine they've become rote. An example of a slip could be selecting the wrong drug from a drop-down. Slips and lapses occur when the correct plan is made but executed incorrectly: we have that drop-down of drugs, but we just select the wrong one; that's a slip. A lapse is generally not visible because it reflects a memory failure. For instance, we may have a patient who forgets to take their medications, or a prescriber who forgets to take a drug off of a med rec. Those are examples of slips and lapses.

Mistakes are judgment failures. They're more subtle and complex than slips, they can go undetected for a period of time, and they're often chalked up to a difference of opinion: "Well, I don't do it the same way that Mary does it, who doesn't do it the same way that Sue does it." Those are mistakes, and they're knowledge-based: we know the right thing to do, but because there are outside situations occurring, we may have to do some workarounds, and those workarounds aren't always safe (we're going to get into this as part of the anatomy a little bit later). And often mistakes are rule-based: we know the rules, we know what we're supposed to do, but for factors that are out of our control we bypass them, and that's when mistakes can happen.

Active failures are highly visible errors, and we usually see these because they have immediate consequences. Latent failures are processes that are under the radar; they come from not following

policies, and there may be a good reason why we're not following policies, but oftentimes we hear "we've always done it that way," and that means they're rooted in culture. That's where the just culture comes into play.

All right, the Swiss cheese model. This is probably a graphic that's very familiar to a lot of people, but it really is at the basis of a patient safety error. Organizations have defenses; those are the slices of cheese. Now, although we'd like those defenses to be solid, they're oftentimes not; they're filled with holes because of human factors, the human condition, those active and latent failures, the slips, lapses, and mistakes that happen to all of us. It's a part of us. So some of those defenses get penetrated, but then another defense stops the error.

Let's take identifying a patient, for example. A patient comes in, and maybe they're not English-speaking, they may be Spanish-speaking, and so we call a name and they answer yes, because it's close enough, right? They come up, and we don't check anything; we don't verify their name and their date of birth. We pass them on to our prep recovery room and get them ready, because we have confidence that Jane at our front desk doesn't make errors; she always identifies the right patient. So we have a high level of confidence in Jane. That's not a bad thing per se, but here again we're not doing what's in our policy, so it's rule-based; and we're not doing what we know is the right thing to do, so it's knowledge-based. It becomes a mistake that we're not checking our patient's identity and date of birth. That patient gets back to, let's say, the interventional room, and boom, we stop, because now we're doing a timeout and we identify that we have the wrong patient for our procedure, and it stops there. But sometimes these errors line up, the holes line up, it's just one of those days, and we end up with a patient safety event at the end.

So now we come to the

anatomy of errors in healthcare. This is according to ASHRM, the American Society for Health Care Risk Management, which is my professional organization. What they say is that in the anatomy of patient safety errors, we have a blunt end of the system and a sharp end of the system. At the blunt end of the system you'll find your organizational factors, like your culture, your policies, your procedures, and regulations. You pass through those things we know all too well, the environmental factors, which have to do with our equipment, our staffing, our resources, and our constraints. And lastly, we come to the sharp end of the system, which is you and I at the bedside. This is where our human factors come into play: we have a lot of task overload or underload, communication is not quite where it needs to be, we're fatigued or stressed, we're thinking about other things, our blue wall that we like suddenly turned beige. Clinical competency comes into play here, also the skill set of the person (so for your managers out there, it's very important to know the skill set of your people), and of course our communication skills.

Patient safety used to focus on the sharp end. You may remember a time when a patient safety event happened and you got called into a room with a group of people; the event was on PowerPoint slides, you'd go through the event, and then you'd get asked a lot of questions from this lineup of people, and it felt very much like the Inquisition. Well, what we found was that that wasn't very helpful at getting to the root of the cause, because oftentimes it's not the person; it's what is happening in the organization that leads to that event, the workarounds we have to do because we know a policy and procedure really needs to be tweaked, because it no longer represents the practice that we do. So risk management now focuses on the system: what was the system error, and how can we fix it? We do that now by interviewing the people who were involved with the event, and one of my favorite questions is to ask: if you had to do it all over again, what would you do differently? That's my favorite question, because it really lets me know what needs to be changed from an organizational standpoint. And then that's when the leaders come in: we sit down with the leaders and effect some real change for our staff at the bedside.

All right, education

strategies. Some things that we have in place right now are peer review, grand rounds, and CPOE. This is one of my favorite process improvements: making the right thing the easiest thing, and you do that through standardization of processes, so that's standard work. That's your order sets; that's pop-ups (although you don't want to get into pop-up fatigue, pop-ups give our providers little gentle reminders to guide them to what's right for the patient and to cover everything we need to cover to ensure the safety of our patient).

So recently, in the fall of last year, we had a tPA administration error. It involved a 69-year-old patient who two weeks prior had had some stenting in her

right SFA. She presented to one of our clinics with some heaviness in her leg and some pain, and when she was looked at with ultrasound, it was determined that her stents were thrombosed. So she was immediately taken to the cath lab, and after angiography did indeed show that there was clot inside those stents, they started catheter-directed thrombolysis. They also started concurrent heparin, which is oftentimes done with CDT.

What's usual for our institution is that we have templates that pull in the active problem list for a patient. In this case the templated H&P with the active problem list was not used. Had they used the templated H&P, they would have found that the second active problem on this patient's list was a cerebral aneurysm. Some physicians, some IR docs, will tell you that's an absolute contraindication for tPA; however, the SIR actually lists it as a relative contraindication. Usually, when you start a thrombolysis case, you know you're going to be coming in every 24 hours to check on that patient. In this case we started the CDT on a Thursday, and the intent was to bring her back on Monday. As for the heparin, many IR nurses will know that we run it at a low rate, usually 500 units an hour, and we keep the patient subtherapeutic on their PTT; although current literature will show you that concurrent heparin can also be nurse-managed, keeping the patient therapeutic on their PTT, which is what was done in this case.

So the course progression of this patient: remember, we started on Thursday. On Saturday she regained her distal pulses in her right leg; no imaging. Sunday she lost her DP pulse; it was thought that a piece of the clot in the stent had embolized distally. So the decision was made with the performing physician, whom they consulted, to increase the tPA from one milligram an hour to two milligrams. By Sunday afternoon the patient had an altered mental status. She went to the CT scanner, which showed a large cerebral hemorrhage. She was intubated to protect her airway, and by Monday we were compassionately extubating her, because she had become brain-dead.

In the law there's something called the "but for" argument: the argument can be made that this patient would not have died but for the tPA that we gave her, in a condition for which she should not have had tPA, namely that aneurysm. This shows how standard work can be very important in the care of our patients, and how standard work drives us down the right path, making the easiest thing the safest thing. Since that time we've had a process improvement group that established an order set specifically for use in thrombolysis from a peripheral standpoint, and we also put together a guideline that was not in place. So it's some of that Swiss cheese: we didn't have a care set, we didn't have a guideline, we didn't use our template, so all those holes lined up and we ended up with a very serious patient safety event.

Global human error reduction strategies (oops, sorry, let's go back): these are listed from weaker to stronger, and some of what we're using from that case includes checklists. We developed a checklist that needs to be completed to cover the absolute contraindications as well as the relative ones, and it's embedded in the alteplase order, so the physician has to review that checklist for those contraindications. The physician also receives a phone call from pharmacy, just to double-check and make sure that they have indeed reviewed it, that it's not somebody just checking it off; so we have a verbal backup. So, the just

culture concept. The single greatest impediment to error prevention in the medical industry is that we punish people for making a mistake; we should learn instead, right? We should really learn. So what comes to mind when you think about the term "just culture"? Right: being able to report something without punitive actions coming from it. Blame-free, fair, open and honest, trustworthy, supportive, a nice place to work. Yes.

So, two nurses select the wrong medication from a dispensing system. One dose reaches the patient, causing him to go into cardiac arrest, and the other is caught at the bedside before causing harm. Do we treat these nurses in the same way? No. We should, but oftentimes we don't, right? An active failure versus more of a latent failure. Upon further investigation, it turned out that the two vials these nurses pulled were very similar; the vials looked very much like those of a different medication. So we needed some separation from pharmacy, a little systems intervention in our Omnicells. And maybe there were some human factors involved there too, such that one nurse caught it and the other didn't. But rather than punishing, we need to work on consoling and supporting, look at the system, and find what's happening, what's going on, what's the root cause.

A nurse loses custody of an unlabeled specimen but chooses not to report the incident out of fear of discipline. Do we forgive the breach given the nurse's fear? No, we really can't, but we shouldn't come down on her, or on them, like a hammer either. This can actually become a sentinel event: if you have to go back and get another sample, that's a Joint Commission never event, so that's not a good thing. Plus, it's an extreme inconvenience to the patient, and we're also opening that patient up to further harm because we have to get another sample. So you have to ask: why did the nurse behave this way? Why did she choose not to report it?

Honest disclosure without fear of retribution, that's an important characteristic of the just culture. Hmm, yes it is, isn't it? That's an excellent point, thank you very much for

sharing that excellent point. Certainly. She said that you also have to look at leadership, because a lot of times leadership has favoritism, so you've got to work on the favoritism. It has to be fair, and that's also part

of the just culture, and that's a very good point as a learning experience; we're going to cover some of that too. So: we have a radiology team that defends skipping the timeout on the basis that no adverse event occurred.

Do we condone this? No, we don't condone it, and it is a Joint Commission requirement. And although this incident didn't end in an adverse event, we could certainly see where it might. So again we need to engage our

leadership, we need to engage people at the bedside, including our physicians, as to, you know, why we blew right through the timeout. So a fair and just culture is a culture that refers to a values-supportive model with shared

accountability. It's also an integrated pattern of individual and organizational behaviors, based upon shared beliefs and values, that continuously seeks to minimize patient harm that may result from the processes

of care delivery. So culture is the outcome of how our organization responds; it's the outcome. If we have a just culture, we will have people who will report those events, those near misses, and who will work on them and not hide them and do

what's right. That's why we need it: because if we don't have it, only two to three percent of errors would be reported, most hospitals would be unaware of what errors they had, and health care workers would report only what they

could not hide. And errors, as viewed by hospital workers and the media, are indicators of carelessness, which is not true; in fact it's the furthest thing from the truth.

If we had less blame, we'd have more patient event reporting, and from there

we can do more to look at our systems and our organizations and really affect patient care. It helps: increased reporting helps to prevent future patient harm, and it provides an indication of human and system performance. So again,

we're working on the system, not the person. It guides performance improvement, and it also provides an opportunity to identify risks. It also builds a culture of safety, so we go from blaming the equipment and the other person to

looking at owning some of our own error, and then ultimately, when we have not followed a policy as we should have, we know that our leadership is going to look at that and find out why we didn't do what we

were supposed to according to policy. Sometimes those policies are written by people who aren't at the bedside, or they're so old that they're not up to date and don't have best practice in them, and so we need to be conscious

of those. So, you know, every three years we're supposed to be updating our policies and procedures, and that includes our departmental ones too, and we need to be looking at best practice and listening to our staff to really

prevent patient safety errors. So if you look at your system design and behavioral choices, if you spend 80% of your time there, you could really reduce your human errors and your adverse events. That's

what a just culture and a culture of safety bring you, but in order to do that you have to have organizational trust. So management needs to be trusted, management needs to trust the staff, and staff needs to trust management. All

that is cyclical, and in a just culture you would have that. So how do we get there? I

think we mentioned some of them. We need our huddles, right? We need huddles, and we need our department meetings to include a patient safety event, or it

doesn't have to be an event, but a patient safety item; you can wrap all kinds of things around that for your department meetings. You know, you can make it a contest for your staff, for which patient safety item you

want to bring forward for your staff meeting. You know, invite suggestions, involve staff directly in

problem-solving processes, and really take a look at your current disciplinary processes, not only from leadership, department-wise, but organizationally. You've really got to be a voice; it's got to start somewhere, so

you've got to be a voice. So risk is everywhere; it can be perception, it can be absolute, and it's not essentially bad. If you remember, most of us are taught to drive at 10:00 and 2:00, but we slide our hands down to that 9 and 3 position.

My car was in the repair shop and I had a loaner, and this loaner had a steering wheel warmer, and my hands definitely went down to nine and three, because it was right during the heart of that Chicago winter that

we have. But normally I'm driving at 8:00 and coffee. Ah, but I'll tell you, during a blizzard I'm not holding that coffee; my hands are back up at ten and two,

because I understand the risk. So there's risk in many things that we do; health care is fraught with risk, but we do it because, and you hear the physicians talk about that risk-benefit, we weigh that risk and

benefit. So we do a lot of risky things in health care, but it's okay because there are a lot of good things that come from that in taking care of our patients. But we need to recognize those human factors that come into play, we need to

be aware of our surroundings, that situational awareness when we're taking care of our patients, and affect patient safety. Manage and support the values of our organization and each other.

Questions? A question? A comment? I'm

Canadian, I work in a Canadian hospital, and I would say my hospital has an excellent just culture; this is it in practice. So the other day we had a bunch of unusual things happen to begin with, and I made the first error, and it was a

medication error: I forgot to order chemotherapy. The patient went into the room, they filled out their interventional procedure safety checklist, and someone checked off all the equipment needed for this procedure as present, checked it off.

He did a timeout in the room, completed it, the doctor started the case, and when he got the catheter in the right place, that's when they discovered there was no chemo, because I had forgotten to order the chemo. That was the first mistake, and

so we have an RLS, a reporting and learning system. I filled it out, etc., and my manager was 100% supportive. The Swiss cheese lined up, and you know, the three things that should have caught it did not. So the safety procedure

checklist failed, and so did the timeout. But the ultimate one, in my opinion, and I wrote this in my report, was that the doctor should never have started a case if he didn't know everything was ready. And my

organization was extremely supportive of everything I did, but that doctor still thinks it's my fault that we didn't do the case. And you know, I'm not a new grad, obviously, and he's wrong, and I don't care.

I fully own my mistake, but he's wrong in that the whole thing was my fault. So sometimes your organization will 100% support you, but you might have people that are not in the just culture part, and they're just looking to blame you. So

you know, I feel like I've done my thing, I've learned, I've reported it, and I'm changing the situation. So it's important to remember that part of your just culture, and not focus on the people who are trying to say it's your fault

to stop you from reporting in the future. Not really a question, sorry. No, that's all right, that's great, because I think that illustrates that anatomy of the error in health care, with that blunt end of the system and the sharp end. So he's

kind of stuck in the sharp end, isn't he? He's blaming you. Thank you very much. Because errors are made, and they're devastating not just for the patients; they're devastating to a practitioner. So I think we have to look beyond the just

culture, and there's something called second victim, and you need second victim support. I'm actually trying, where I work, to have a program instituted, sort of like a rapid response team when an error is made, so that you

can have the support. It goes beyond changing a policy or procedure or doing a root cause analysis; you need emotional and psychological support for the practitioner that made that error and came forth to report that error. So

I'm just wondering also how many people have a second victim support service at their institution. I see, very few. I think we have to really look at that and look forward and maybe implement something like that. I

agree, and we are one of those institutions that have the second victim program, and that in itself is kind of a topic, but absolutely, that does go hand in hand with that just culture support, because it is very

devastating when you have that error, and depending on the patient safety event that occurs, you know, if it results in a patient death, that really sticks with you. And we don't just stop there with ours; we have a

psychologist on board who talks with our physicians, and then we have a liaison in our Employee Assistance Program, also psychology-based, for our staff, so that they can have further follow-up. But even if it's a

devastating event where there wasn't anything that was done wrong, it's just that we weren't going to stop that train that was rolling with this patient. You know how devastating sepsis can be; you just sometimes aren't going to stop

that train, and the patient is going to pass, but the practitioners that were involved in that care are moved by that. Most recently we had a three-year-old who passed; they were septic, had a cleft palate, and

an abscess had formed after the surgery. So, you know, that can be very devastating. We do pull our practitioners into that, and from risk management we're able to initiate that. So we absolutely ask, when a serious

patient safety event occurs, or one where we can pick up that there's a lot of emotion wrapped around it, we'll ask them how they're doing. A person can also self-refer, and then EAP will reach out to the staff member, and

our psychologist will reach out to the physician if we really feel that they need just a little helping hand. So yeah, it's a good program. Kudos to you for getting that started. Thank you so much for your talk today. I just wanted to reach back

to you and ask how your organization, or other organizations, support the exposure of those events within your hospitals. I happen to be from Vanderbilt University Medical Center, and we have been in the national news recently, and

so there's been a lot of conversation with my staff, and you know, you pull your team together and you have conversations, and the event occurred in 2017 and I'm facing them in 2018, 2019, and they're like, how come we don't know

these things happen in our organizations? And you know, there's a lot to learn from patient error and sentinel events, and I'm just curious to learn from you: how do you expose your nurses within your organization to those very private

things that go through risk management? Can you share with me? Sure, sure, thank you, that's a great question. So out of patient safety and risk management we do Grand Rounds, once a quarter, and in that way

we will touch on some sensitive issues, though some of it can be wrapped up in legal if there are lawsuits pending, so you really can't share some of that. And I think that might be a little bit of the disconnect that

staff have, because they may or may not know the players involved, which gets to be a little tricky. So time helps, but we do let them know that the event happened here, and that's the title of our

Grand Rounds. We bring those patient safety events forward, but we'll de-identify them quite a bit and change some of that to protect the practitioners involved, and also to focus more on the patient safety event, and again

focus on the system, on the blunt end, rather than so much on that sharp end, because that sharp end is sharp for a reason and it can hurt; it can hurt our clinicians. Describe to me who all is involved in your Grand

Rounds and where that takes place. So we have a couple of different venues in our hospital, depending on how large we anticipate it to be. We actually have an auditorium with auditorium seating, because we're an

academic medical center, so we have that luxury. We also have some smaller venues, depending on what's happening. So depending on what the event is, we may have outside people come in and talk about that. In fact, one of the

big things that we're working on right now is the BERT, the behavioral emergency response team, and awareness wrapped around that. So one of the things that we're actually looking at is bringing in the nurse who

speaks from the Delnor event to come to the hospital and speak about those issues. She presented very well, very strategically, just to kind of heighten that behavioral awareness, because we don't want our nurses to be, you know,

subjected to that. So depending on what's happening we may pull in outside speakers, but most of the time we will involve people from our own departments throughout the hospital, depending on what the event is. So we've had some: we had a wrong

patient that had a procedure done, well, not a wrong patient; the wrong procedure was done on the right patient, and we actually brought in people from ultrasound and from IR, including the

physician involved with the case, and then a risk management person, and made up a panel. We presented and then fielded questions, and that actually went really well; it was standing-room-only. So okay, that was good,

so that's some of the strategy that we use. Thank you. You're welcome. Because we have a computerized reporting and learning system, our system sends out a monthly report on just the trends. So if we're seeing a rise in a certain

thing, and sometimes it's just, you know, falls, so remember to look at your falls or whatever, but sometimes it's more specific: there has been, you know, a mix-up between this drug and that drug, and pharmacy is doing this to try and

alleviate that. And while it's not everything, and it's obviously not anything that's in legal proceedings, it does give you a sort of month-to-month overview of what kinds of things people are doing wrong, and the best part about it is these were all

reported independently, so it's showing us as people that someone listened to our report and that something's being done about it. Right, very good point, and that's some of what we hear too: these systems allow

you to anonymously submit a report, which is fine; we're interested in the event, we want to hear the event. It's helpful when we have a name, because if I, as the manager looking at this report, have a question, I'd like to

go back to the person who put the report in to find some more information out, but it is not necessary, and like I said, we're more interested in the event. But we do send out a report that kind of aggregates our involvement

and what our top five reports are for the month. But we hear a lot of disconnect, that our staff don't hear about what's happening with the report: I put that record in and I don't hear anything about it. Well, did

you give us a name? Because the unit manager also sees that, and that's why we encourage our managers to use some of those reports that they're seeing as a patient safety item during their

department meetings, and get the word out about what they're doing about it, because leadership does effect some change, but staff might not realize it's connected to the event that they turned in. I was just curious, amongst us

all, when you get new hires or new employees, who talks about what to do, you know, if there's an error, or just the whole process of that? Because I know at the facilities that I worked at, nobody has ever done it until the time that it's

happened. So what education are we providing from the get-go that might change practice further down? Absolutely, so for us, risk management speaks at nursing orientation. Anyone else? Do they have... oh sorry, that's

okay. Dartmouth Hitchcock Medical Center, up in New England: we have an error prevention training class that's required for all new staff, but we have also made a huge push that all veteran staff have to go as well, and we're

at about 90%. It's a two-hour training, and it talks about all different types of error prevention, and then it also talks about our reporting system. We're also starting to look at Code Lavender, if anyone's heard about that; that's the

second victim support, so we're supporting our nurses through errors, and doctors, you know, technicians, technologists. But we have a really great just culture. I mean, sometimes of course it's thought to be punitive, but I actually, as a nurse

manager, do all of the reporting systems for quality and safety for the whole department, and at 9:30 we have a daily safety brief, and everyone from the hospital, every department, comes and reports out any safety issues, and then

oftentimes in real time we're actually getting together with the different parties to say, okay, what can we do, what was the failure? We also have a very robust RCA, a root cause analysis, where when we

have something that goes to a report that's pretty serious, we will have that. We get a lot of people in the room, including the people that were involved in it, and it's to look at: where is the systems failure, where is that? And then

after that, oftentimes we'll do a CAP, a corrective action plan, so we'll grab a group to work together to say we need to change our policy or change the standards in which we're working, because it's not ever error-proof. That's fabulous, thank you so much for

sharing that. Okay, thank you all very much, I appreciate you coming. [Applause]
