
Raining on the BCMA parade (again) Part III

06/26/2009

Another fantastic analysis from Dennis Tribble, CPO and CTO, ForHealthTechnology.

John Poikonen, Pharm.D. | UMass Memorial Health Care | john.poikonen@umassmemorial.org | 508-334-1159 | 978-501-4887 mobile


From: Dennis Tribble
Subject: RE: Raining on the BCMA parade (again)

My thoughts around BCMA are as follows:

1) Human interaction with the environment is subject to cognitive bias; this bias is necessary for human perception to deal with an inherently chaotic environment. It is how our minds “fill in the blanks” to deal with and understand an environment that otherwise presents messages that are incomplete or contradictory. http://www.12manage.com/description_cognitive_bias.html

2) The more chaotic our environment is, and the more hectic it is, the more we tend to rely on our biases to help us triage inputs and manage what is most important.

3) Cognitive bias grows through experience; the more often that bias works, the more heavily we tend to lean on it.

4) Errors occur when our cognitive biases lead us astray, when they literally cause us to see things that aren’t there.

a. When I literally check through hundreds or thousands of orders or fulfillments and they are all correct, I develop the expectation that what I am looking at will be correct, no matter how many times I tell myself that I need to start from the presumption that everything I look at is wrong.

b. If I expect to see a 5, and I see a 5 (even if it is attached to a 0 to make 50), I tend to see it as correct.

c. If I perceive someone as competent, I tend to check their work less carefully because my expectation is that they don’t make mistakes.

d. If I am checking someone else’s work, and I, too, am overburdened and distracted (that is, if my primary role is not that of checker), then I will lean on my bias toward that person as part of triaging the chaos around me.

e. One of the defense mechanisms we have in the absence of good tools to catch and prevent errors from cognitive bias is yet one more bias… the “I never make mistakes” bias. The nurse who says “I have been taking care of patients for 35 years and have never made an error”… the pharmacist who castigates peers who are reported to have made errors as “careless”… the technician who says “Other people make compounding errors; I don’t!” (believe me, I have known them all…)

5) Second-checking by a human is subject to cognitive bias; it has a known and reported failure rate. That rate was quoted at the IV Safety Summit as 10%. In my opinion, solving a medication error problem by asking another overburdened individual to interrupt their thought process and work flow to double-check someone else is a fundamentally bankrupt strategy.

6) In theory, BCMA offers a checking mechanism that is not subject to cognitive bias; each scan is very literal, and makes no assumptions about the world around it.

7) In fact, BCMA system implementation and use introduces cognitive bias. To the extent that such implementations create the perception that an alert from such systems is more likely to be a system failure than an actual error, we create systems that will be ignored or worked around.

8) So evaluations of BCMA systems in the absence of well-planned and well-executed implementations are probably not revealing, to the extent that those systems are still subject to roughly the same cognitive biases as non-BCMA implementations. Based on my reading of the current studies, the quality of the implementation is not a reported parameter.

9) The better we control the environment around us (through adoption of e-MAR, better organization of medication supplies to prevent selection errors, and second-checks), the less frequently BCMA is likely to catch something. While it is possible that a sufficient number of such controls will eventually reduce the error rate significantly (and, apparently, e-MAR does, and so does bar code scanning on dispensing), these strategies are all still subject at some level to cognitive bias.

10) So basing a cost/benefit decision on these controls is still a gamble; there is some point, statistically, at which all the holes in the Swiss cheese will line up (see the rough numerical sketch after this list).

11) Current studies in critical care environments suggest that the rate at which BCMA systems detect errors in that environment is very low, perhaps undetectable. This may, in fact, be a generally applicable finding. Then again, it may not. Based on what I know about the errors BCMA systems have been reported to detect, the relatively low patient-to-nurse ratio and the relative geographic isolation of the care make some of those errors seem less likely in the critical care setting than in a general medical/surgical setting, where the case load per nurse is much higher and the likelihood of simply walking into the wrong room more acute. It is certainly possible that we will discover that the primary value of BCMA has more to do with assuring patient identification than it does with verifying medications. It may be that we discover that it is no better than other methods.

12) If nurse staffing remains a critical rate-limiting step, do we really want to commit that precious time to second-checking other nurses if we could perform that function another way?
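To put a rough number on the Swiss cheese point in item 10 (and the 10% double-check miss rate quoted in item 5), here is a minimal illustrative sketch. The layer names and all failure rates other than that 10% figure are hypothetical assumptions chosen only to show how residual risk compounds; none of them are reported data.

```python
# Illustrative sketch only. Each "layer" is treated as an independent safeguard
# with an assumed probability of missing a given error. The 10% figure is the
# double-check miss rate quoted at the IV Safety Summit; the other layer names
# and rates are hypothetical placeholders.
layers = {
    "human double-check": 0.10,             # quoted miss rate
    "e-MAR review": 0.30,                   # hypothetical
    "organized medication storage": 0.50,   # hypothetical
}

def escape_probability(miss_rates):
    """Probability that a single error slips past every independent layer."""
    p = 1.0
    for rate in miss_rates:
        p *= rate
    return p

p_escape = escape_probability(layers.values())
print(f"Chance a given error escapes all layers: {p_escape:.3%}")  # 1.500%

# Over many doses, "the holes line up" with near certainty. Assume a
# hypothetical 0.1% upstream error rate per dose administered.
upstream_error_rate = 0.001
doses = 100_000
p_at_least_one = 1 - (1 - upstream_error_rate * p_escape) ** doses
print(f"Chance of at least one escaped error in {doses:,} doses: {p_at_least_one:.1%}")  # ~77.7%
```

The arithmetic only illustrates the argument already made above: stacking imperfect, bias-prone checks drives the per-dose escape probability down but never to zero, so over enough administrations an escaped error becomes nearly certain.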

I do bridle at the juxtaposition of BCMA and CPOE… in my mind they are completely unrelated, other than their competition for capital dollars.

CPOE deals with the likelihood of a clinical error in the ordering of pharmaceutical care. Clearly this is an issue, and is, perhaps, the most significant at the end of the day. By the way, there is significant skepticism about the value of CPOE as well, expressed as concern over e-iatrogenesis. In my opinion, there is a considerable amount of work to be done regarding what a “safe” human interface for CPOE looks like, and it doesn’t appear to me that anybody is really researching that issue.

BCMA deals with the implementation of properly ordered care. It attempts to address the issue of incorrect product selection, preparation, or administration. CPOE and BCMA are at opposite ends of the medication-use process. BCMA benefits from some highly publicized medication error reports that, incidentally, it probably wouldn’t have prevented. I believe that BCMA systems must first become reliable; then we can figure out if they work. Measuring systems that are inherently unreliable is not revealing; we already know they are unreliable.

There are sources of error that can be captured by neither system. That is why we advocate for better systems in the preparation and dispensing process. There IS good evidence that these systems provide benefit.

A safe medication use system will not be created with just one of these.

Just one man’s opinion…

Dennis A. Tribble, Pharm. D. | Chief Pharmacy Officer|Chief Technology Officer

Phone: 386-506-5315 | Fax: 386-274-4114|Cell: 386-871-6940

Email: dtribble@fhtinc.com| www.fhtinc.com

790 Fentress Boulevard | Daytona Beach, FL | 32114

Posted via email
