From Today's Boston Globe
http://www.boston.com/news/globe/health ... tors_make/
The link should be good 'til at least Wednesday p.m.
Unfortunately, the web version doesn't have the snazzy graphic showing how an attribution error led to an anchoring effect, which led to confirmation bias, ultimately resulting in a dangerous misdiagnosis.
Of note "Misdiagnosis occurs in 15 to 20 percent of all cases, according to some research, and it is estimated that in half of these, serious harm occurs."
I know there have been a few folks responding to the call for blind testing and/or blind verification (second opinions) by saying that "gee, doctors don't do that". Perhaps they ought to -- it might cut down that 15-20% error rate!
Doctors too affected by various heuristics
-
L.J.Steele
- Posts: 430
- Joined: Mon Aug 22, 2005 6:26 am
- Location: Massachusetts
-
Pat A. Wertheim
- Posts: 872
- Joined: Thu Jul 07, 2005 6:48 am
- Location: Fort Worth, Texas
Hmmmmmmmm . . . . . . . .
The article reminds me of the time my First Born was 11 or 12 years old and came down with the flu one winter. At least, that was the diagnosis from the emergency clinic where my wife took him. He was running a fever, vomiting, and had other symptoms of the flu that escape me just now. Over a period of a week, she took him back to the clinic several times and each time, the doctor on duty checked his records, checked his symptoms, and assured my wife that he was just suffering the symptoms of the flu and he would get over it in another week or so. Spring break came, and wife and kids had planned a trip to see the rest of the family several hundred miles away. But poor First Born got left at home with me because he was too sick to travel. Wife assured me he was fine, just keep him in bed until the flu passed.
Well, I had a minor skin blemish my doctor, Dr. Bishop, had wanted to remove and biopsy. My appointment happened to fall in the middle of spring break week. I took First Born with me to Dr. Bishop's office. When we walked into the doctor's office, Dr. Bishop took one look at First Born, spent a minute or less doing quick tests, and told me to forget the skin blemish. Rush First Born to the hospital immediately -- Dr. Bishop was cancelling his afternoon schedule and would do emergency surgery as soon as he could get to the hospital. Turns out the kid's appendix had just ruptured that morning -- he didn't have the flu at all.
In retrospect, once the erroneous diagnosis was made, all the doctors at the emergency clinic suffered from confirmation bias. Dr. Bishop unknowingly engaged in "blind verification" but came up with a different -- and correct -- conclusion.
Some of you know First Born, but now you know "the rest of the story."
-
nigeus lowlef
- Posts: 22
- Joined: Sat Jun 10, 2006 10:42 am
The mistakes doctors make
Errors in thinking too often lead to wrong diagnoses
By Dr. Jerome Groopman |
March 19, 2007
http://www.boston.com/news/globe/health ... ke/?page=2
Five years ago, a woman named Leslie developed indigestion, abdominal discomfort, and, occasionally, diarrhea. She had just given birth to her third child, and life was understandably hectic at home. Her primary-care physician gave her antacids, but this afforded only slight relief.
"I feel really different. Something has changed in my body," Leslie told her doctor. But nothing abnormal was found on her physical examination or on routine tests. Leslie was sent to several specialists, and before each visit her primary-care doctor informed the consultants that Leslie was under "a lot of stress and seems anxious, and depressed."
"Nothing is physically wrong," the doctors reassured Leslie. An antidepressant was prescribed, but it did not ameliorate her condition.
Some four years later, Leslie felt dizzy and nearly fainted in the street. Her husband drove her to the local hospital, where she was found to be severely anemic. X-rays and scans showed a large mass where the small intestine meets the colon. Clearly, the tumor had caused her previous problems.
Leslie is in her early 40s, an intelligent and thoughtful woman who told me her story in a clear and organized way. "I felt like I must be losing my mind," she said when recalling how her symptoms had been attributed to "stress and anxiety," and treated with an antidepressant. Upon reviewing Leslie's medical records, I found a blood test ordered by one of the specialists several years earlier that had been clearly abnormal and indicative of an intestinal tumor called a carcinoid. Her symptoms were consistent with this type of endocrine tumor, and, indeed, this proved to be the diagnosis at surgery.
Misdiagnosis occurs in 15 to 20 percent of all cases, according to some research, and it is estimated that in half of these, serious harm occurs.
Why do we as physicians miss the correct diagnosis? It turns out that the mistakes are rarely due to technical factors, like the laboratory mixing up the blood specimen of one patient and reporting another's result. Nor is misdiagnosis usually due to a doctor's lack of knowledge about what later is found to be the underlying disease.
Rather, most errors in diagnosis arise because of mistakes in thinking.
Physicians diagnose diseases based on what is called "pattern recognition." We draw bits of information from our patients' symptoms, our findings on physical examination, the laboratory tests, and X-ray studies the way a magnet pulls from all directions. To form patterns in our minds, we use shortcuts in thinking, so-called "heuristics." Usually, a doctor generates one or two hypotheses about what is wrong within the first minutes of seeing the patient and listening to his or her story. Often, we are correct in these rapid judgments, but too often we can be wrong.
Physicians are rarely taught about pitfalls in cognition. During their training, they work as apprentices to senior doctors. They learn largely by doing. In today's medical system, where there is intense pressure to see as many patients as possible, the quick judgment is often rewarded.
Unfortunately, working in haste is a setup for errors in thinking.
Only very recently have medical educators begun to focus squarely on the problem of misdiagnosis, why it occurs, and what might be done to prevent it. It turns out that errors in thinking do not occur in isolation, but usually arise from a cascade of sequential cognitive mistakes.
I only learned this recently when I realized I did not know how I think; in fact, when I asked other clinicians how they succeeded or failed in making a diagnosis, very few could explain how their mind works to decipher a patient's problems.
Let's deconstruct Leslie's case. Yes, the arrival of a third child can cause stress in a family. This truth strongly colored the physicians' impressions, so they made what is called "an attribution error." This involves stereotyping -- in Leslie's case, casting her as an anxious and somewhat depressed and distraught postpartum woman. The diagnosis of indigestion and abdominal discomfort with occasional diarrhea was too quickly fit into the pattern of a stress-related condition.
YOUR VIEW: Did a clever doctor ever save the day for you, finding a medical problem others had missed?
The doctors fixed on this diagnosis, so-called "anchoring," where the mind attaches firmly to one possibility. Anchoring so tightly to one diagnosis and not broadly considering others is called "premature closure." Even when, later in Leslie's evaluation, a blood test result was obtained that was very abnormal, it was not sufficiently considered; no one involved in her case could lift their mental anchor and comprehensively explore other possibilities.
Discounting such discrepant or contradictory data is called "confirmation bias" -- the mind cherry-picks the available information to confirm the anchored assumption rather than revising the working diagnosis.
When I called one of Leslie's doctors, he was crestfallen that he had missed what was wrong. I knew all too well his feeling. Throughout my career I have made cognitive mistakes, some of them originating from an attribution error.
All of us as physicians are fallible, and while it is unrealistic to imagine a perfect clinical world, it is imperative to reduce the frequency of misdiagnosis. I believe all health professionals should learn in-depth about why and how and when we make errors in thinking, and I also believe that if our patients and their families and friends know about the common cognitive pitfalls, they can ask specific questions to help us think better.
We can interrupt the cascade of cognitive mistakes and return to an open-minded and deliberate consideration of symptoms, physical exams, and laboratory tests -- and in this way close an important gap in care.
Leslie was lucky, by the way. Her cancer turned out to be treatable, and she is doing fine.
Dr. Jerome Groopman is chief of experimental medicine at Beth Israel Deaconess Medical Center and a staff writer for The New Yorker. A more extensive look at these ideas appears in his new book "How Doctors Think."
© Copyright 2007 Globe Newspaper Company.
****************************************
Now, the modified version --
****************************************
The mistakes examiners make
Errors in thinking too often lead to wrong conclusions
Ten years ago, a woman named Shirley collected some evidence, visited a crime scene, and provided elimination prints. A few years earlier, she had gotten in trouble for visiting a scene without authorization to do so. An SCRO examiner looked at one of the crime scene prints and identified it as belonging to Shirley.
"I am really distressed. Something is very wrong here," Shirley told the SCRO. But nothing abnormal was found about the examination. Shirley's print was sent to several specialists, and before each examination the SCRO informed the consultants that Leslie "had gotten in trouble for visiting scenes before."
"Nothing is wrong with this conclusion," the experts reassured the SCRO. An antidepressant was prescribed, but it did not ameliorate her condition.
Later, Shirley felt depressed and nearly committed suicide. Her father helped her find an outside expert, who found the identification to be erroneous. Demonstrations were prepared that convinced even a lay jury, unanimously, that it was not an identification. Clearly, the erroneous match had been the cause of the problems.
Shirley is in her early 40s, an intelligent and thoughtful woman who told me her story in a clear and organized way.
"I felt like I must be losing my mind," she said when recalling how the print had been attributed to her. Upon reviewing Shirley's files, I found an independent examination conducted by one of the specialists several years earlier that had been clearly abnormal and indicative of an erroneous identification. The fingerprint detail was consistent with supporting an exclusion, and, indeed, this proved to be the conclusion in the end.
Misdiagnosis occurs in a small percent of all cases, according to some research, and it is estimated that in many of these cases, serious harm occurs.
Why do we as examiners miss the correct conclusion? It turns out that the mistakes are rarely due to technical factors, like the laboratory mixing up the fingerprint of one person and reporting another's result. Nor is an erroneous conclusion usually due to an examiner's lack of knowledge about distortion. Rather, most errors in conclusions arise because of mistakes in thinking.
Examiners make identifications based on what is called "pattern recognition." We draw bits of information from patterns, points, shapes, and pores the way a magnet pulls from all directions. To form patterns in our minds, we use shortcuts in thinking, so-called "heuristics." Usually, an examiner generates one or two hypotheses about the conclusion within the first minutes of seeing detail within the latent print and known print. Often, we are correct in these rapid judgments, but too often we can be wrong.
Examiners are rarely taught about pitfalls in cognition. During their training, they work as apprentices to senior examiners. They learn largely by doing. In today's forensic system, where there is intense pressure to complete as many examinations as possible, the quick judgment is often rewarded. Unfortunately, working in haste is a setup for errors in thinking.
Only very recently have forensic educators begun to focus squarely on the problem of erroneous identification, why it occurs, and what might be done to prevent it. It turns out that errors in thinking do not occur in isolation, but usually arise from a cascade of sequential cognitive mistakes.
I only learned this recently when I realized I did not know how I think; in fact, when I asked other examiners how they succeeded or failed in making conclusions, very few could explain how their mind works to decipher an impression's detail.
Let's deconstruct Shirley's case. Yes, the unauthorized presence on scene is not a good thing. This truth strongly colored the SCRO examiner's impressions, so they made what is called "an attribution error." This involves stereotyping -- in Shirley's case, casting her as an untrustworthy person who routinely breaks the rules. The conclusion of identity was too quickly fit into the pattern of historical knowledge. The examiner fixed on this conclusion, so-called "anchoring," where the mind attaches firmly to one possibility. Anchoring so tightly to one conclusion and not broadly considering others is called "premature closure." Even when, later in time, a second result was obtained that was very different, it was not sufficiently considered; no one involved in her case could lift their mental anchor and comprehensively explore other possibilities.
Discounting such discrepant or contradictory data is called "confirmation bias" -- the mind cherry-picks the available information to confirm the anchored assumption rather than revising the working conclusion.
When I called one of Shirley's examiners, I wish he had been crestfallen that he had made an erroneous identification. All of us as examiners are fallible, and while it is unrealistic to imagine a perfect forensic world, it is imperative to reduce the frequency of erroneous identifications. I believe all forensic professionals should learn in-depth about why and how and when we make errors in thinking, and I also believe that if lawyers know about the common cognitive pitfalls, they can ask specific questions to help us think better.
We can interrupt the cascade of cognitive mistakes and return to an open-minded and deliberate consideration of all qualities and quantities of detail and bias -- and in this way close an important gap in service.
Shirley was lucky, by the way. Her fingerprint turned out to be an exclusion, and she is doing fine.
-
H. B. James
- Posts: 92
- Joined: Tue Jan 30, 2007 9:02 am
- Location: US
-
L.J.Steele
- Posts: 430
- Joined: Mon Aug 22, 2005 6:26 am
- Location: Massachusetts
Groopman Interview
You should be able to pull up the recording of an interview with him from On Point (www.wbur.com) later tonight. Discussion of anchoring, confirmation bias, misdiagnosis, etc.