I've worked on five continents, in dozens of industries, in government, the private sector, and the not-for-profit arena, and wherever I went, one theme emerged.
The quality of information affects the quality of outcomes.
When I was the Senior Risk Advisor to the Australian Department of Health and Ageing, my sense of responsibility was particularly strong. Medical professionals and members of the public made critical decisions based on the material that we provided. Even a simple change to the website was vetted and critiqued many times over. We knew it was critical that our communications were accurate, timely, and effective and, mostly, they were. But, as is true of any health organization across the globe, we weren't always right.
While working on my fifth book last week, I was writing a section on communication. During a break, I reflected on why it is that we take communication for granted. And also, why are most of us still so poor at it?
This article is, as you may have guessed, about communication. I’ve drawn on examples from the medical industry in order to illustrate my point; partially because I'm passionate about health, but also because, regardless of what we do professionally, we can all relate to it. The concepts, however, are relevant to all areas of life.
This is a long article, so I've created a PDF copy which you can download to read offline as a booklet or on an e-reader.
WHY DOES COMMUNICATION MATTER?
For most of us, communication is mundane, with little thought given to the majority of our interactions. Nonetheless, it affects everything we do. Life, love, family, work. Everything.
The message transmitted is not what matters. It is the message received, and the actions that follow, that we should focus on.
At age 21, during her last semester of undergraduate studies, Zoe Ligon received not one but two positive HIV tests. Over the course of a few weeks, she recalls that she "spiraled from a well-adjusted ... woman into someone who was ready to take her own life over a disease she didn’t have.” [1]. Zoe, however, persevered with further tests, all with negative results.
Part of the problem was the way the news was delivered. Zoe describes the experience as a rush of harried words over the phone when her doctor said “You tested positive for HIV, but then we did the more extensive blood work, and that came back negative, so you’re fine.”
It wasn't entirely her doctor's fault. Zoe's mind would have been unable to take in all the information given to her. This is a normal self-protection mechanism we all employ at some point when we hear catastrophic news.
Zoe's suicidal thoughts were not unique. Take the cases of Juzar and Julius. Juzar Adamjee lived in Cornwall, England and was described as an 'exceptional' doctor. Julius Kyaligonza lived in Kagadi, Uganda and wasn’t a doctor, exceptional or otherwise. But they had at least one thing in common. Both men took their own lives shortly after receiving positive results to HIV tests. They are not alone. Many people have taken, or considered taking, their lives after positive tests for HIV, cancer, or other life-threatening illness.
"I have witnessed the response of countless people receiving life-threatening news about a critical diagnosis. After hearing the news, it is quite normal for them to block further information and to not hear anything more. It is an ego-protection mechanism for loss, and they are unable to absorb any new information until they have dealt with the initial part. It is possible that a doctor might offer further detailed information, but it is unlikely that most people would be able to assimilate it until they have dealt with the initial prognosis."
- Sister Ann Aichroy, OAM, SRN, ONC (UK), ONC, Clinical Nurse Consultant, Oncology and Palliative Care (Retired)
Few people realize that, for a person in a low-risk category, the chances of a positive test result being accurate are not high. For a definition of high-risk categories and behaviors refer to CDC.GOV or HIV.GOV.
A study of 22 blood donors in Florida [2] who had committed suicide after being informed that they were HIV positive concluded that the chances of these individuals actually being infected were only 50%. Might the outcomes have been different if they had also been told that they had only a 50% chance of being positive? We will never know. But as we will see, it is possible that their doctors and counselors were unaware of these facts.
THE COMMUNICATION PROBLEM
The primary purpose of communication is survival: we survive by making strong connections and informed decisions, which in turn lead to better outcomes.
When I was in my thirties, I decided that I should improve my communication skills. I thought that involved public speaking, writing skills, a larger vocabulary, and body language.
It did. But the more I learned, the more I realized that communication is not what you transmit, but what is received by the other person. And it is not their responsibility to understand; it is your responsibility to provide the information in a way that they will understand.
You would think that this ‘insight’ would have been obvious already by the age of 30. After all, we communicate almost unceasingly, particularly if you take into account non-verbal communication. But no. It was an epiphany. We get busy, harried and rarely consider the quality of our communication, even though our quality of life depends upon it.
Think about it for a moment. Your ability to maintain relationships, to learn, and to get things done relies on your ability to communicate. Managers, colleagues, shop assistants, and family can all make life easy or difficult for you. It depends on how you interact with them, and how effective you are at delivering your message.
In an emergency, good versus bad communication can mean the difference between life and death.
Such situations require precise but minimal information. "Grab the first aid kit" or "Get out now!" must be direct and unemotional. Most of our communication isn't so simple. Often, we are dealing with emotions and need to consider the style, form, and content of our communication.
If you use words without care or deliver them when your recipient is distracted, the results are unlikely to be as intended. When you see people make bad decisions, does it make you wonder why? Could it be that, like the Florida blood donors, they were distracted by the emotional impact of the information? Bad news triggers a grieving process and a stress response. Even if they had been told that they had only a 50% chance of being positive, they might not have absorbed it.
If the counselors at the blood banks in Florida had had additional information, could it have changed things? If they had known, for example, that other donors had taken their own lives after receiving the news, despite there being only a 50% chance that the diagnosis was accurate, would they have modified their approach? Blood banks don't have unlimited resources, of course, but some small changes might well have led to better outcomes.
There are no doubt some decisions in life which we all would have made differently had we received better information. On occasion, we find out what we need to know in time to make a wise decision. Sometimes we receive critical information too late, and sometimes not at all. Either way, most information on which we base our decisions is communicated to us by others, just as we communicate information to them. Hence the importance of ensuring that our messages are received as intended.
WHAT YOU SAY, OR HOW YOU SAY IT?
Even the tone of your voice conveys the significance of the information. Try saying the following phrase out loud: "I never said she stole a book.” Now read it out another seven times, but emphasize a different word each time. It’s a worthwhile exercise.
And it’s more than the tone of voice that matters. How you communicate information counts as much, if not more, than what you communicate. Non-verbal communication is also key. Your body language, kindness when delivering bad news, a touch of a hand on a shoulder, assertive posture when delivering instructions, and many other things can make critical differences.
CHOOSE YOUR WORDS CAREFULLY
Kindness goes a long way when delivering bad news. "The report has not come back in the way that we might have hoped, but it may not be as bad as it seems" would be a much kinder way to open than "your tumor is malignant" or "you have cancer".
Even when delivering a statement of fact that doesn't directly affect the individual, it needs to be delivered in ways that the recipient can understand. In ‘Thinking, Fast and Slow', Daniel Kahneman offers examples of how the way you craft a message shapes the communication. Consider the following example. “A vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability.”
Now consider another description of the same risk. “One of 100,000 vaccinated children will be permanently disabled.” The second statement does something to your mind that the first does not. It calls up the image of an individual child who is permanently disabled by a vaccine. The 99,999 vaccinated children who are safe have faded into the background. [3]
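The equivalence is easy to check. Here is a trivial sketch (my own illustration, not Kahneman's) that prints the same risk in both framings:

```python
# The same vaccine risk, framed two ways (figures from Kahneman's example above).
risk = 0.001 / 100  # "a 0.001% risk of permanent disability", expressed as a probability

children_per_case = round(1 / risk)  # how many vaccinated children per disabled child
print(f"Framing 1: a {risk:.3%} risk of permanent disability")
print(f"Framing 2: 1 of {children_per_case:,} vaccinated children will be permanently disabled")
```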
It’s the same information, but the recipient’s beliefs, feelings, and actions are likely to be very different.
PERCEPTION IS REALITY
What we believe and perceive can shape our response to a message. We all see things from our own perspective. That is, after all, our main window to the world. But it's not the only one. The idiom "Before you judge a man, walk a mile in his shoes" reminds us to see things from other people's perspectives, which is essential for good communication.
Consider an icy path in winter. It might inspire joy in a child who sees it as a playground for sliding on. For an older person, that same icy path might inspire a fear of falling that keeps them indoors.
Even then, it’s not so clearcut. Describing the path to an elderly person as "the same path you used to skate on with your childhood friends” will inspire different emotions from "that icy path is too dangerous and slippery.” Which statement is more likely to get them outdoors on a sunny winter's day?
FACTS CAN MISLEAD
The way in which you present data can change the way the recipient of your information is likely to behave. The O. J. Simpson murder trial offers a good example. His defense team was able to quash the prosecution's assertion that spousal abuse leads to murder. The defense argued that Simpson's history of assaulting his wife, Nicole Brown Simpson, was not relevant to whether or not he had murdered her. But was it?
The defense argued that “As many as 4 million women are battered annually by husbands and boyfriends. In 1992, according to the FBI Uniform Crime Reports, a total of 913 women were killed by their husbands and 519 were killed by their boyfriends. In other words, while there were … 4 million incidents of abuse, there were only 1,432 homicides. Some of these homicides may have occurred after a history of abuse but obviously most abuse, presumably even more serious abuse, does not end in murder”. [4]
Based on these figures, OJ's defense team argued that there is less than one homicide per 2,500 incidents of abuse. They used this to claim that there was no evidence of domestic violence being a prelude to murder, implying that only about 0.04% of abuse leads to murder. While this is factually true, it is also misleading.
Alan Dershowitz, a Harvard Law Professor, proposed that the question should have been: “How many women were murdered by men who had previously abused them?”
At the time of the trial, statistics showed that out of every 100,000 battered women, 45 were murdered. Of those 45, 40 were murdered by men who had previously battered them. In other words, roughly 90% of murdered women who had been battered by their partners were killed by those partners. Rather than a 0.04% probability, past data suggested a probability of roughly 90% that the murderer of a battered woman is her abuser. [4]
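To make the contrast between the two framings explicit, here is a rough sketch of the arithmetic using only the figures quoted above (the variable names are mine, for illustration):

```python
# Defense framing: homicides per incident of abuse in a given year.
abuse_incidents_per_year = 4_000_000
killed_by_husbands = 913
killed_by_boyfriends = 519
homicides = killed_by_husbands + killed_by_boyfriends  # 1,432

p_murder_given_abuse = homicides / abuse_incidents_per_year
print(f"Homicides per incident of abuse: {p_murder_given_abuse:.4%}")      # ~0.04%
print(f"That is roughly 1 in {abuse_incidents_per_year // homicides:,}")   # ~1 in 2,793

# Dershowitz's reframing: of battered women who were murdered,
# how many were killed by the man who battered them?
murdered = 45           # per 100,000 battered women
murdered_by_abuser = 40

p_abuser_given_murdered = murdered_by_abuser / murdered
print(f"Murdered battered women killed by their abuser: {p_abuser_given_murdered:.0%}")  # ~89%
```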
Does a 90% probability constitute evidence of OJ’s guilt? Of course not. Would it have influenced the jury? We can never know for sure. But is it likely to have influenced your views?
ANALYSIS IS HARD WORK
Analysis - turning data into information or noise into signal - is essential for decision-making. Choosing whether to have a critical test, much less how to interpret and act on it, is challenging. Yet, we must try. And, we must do a better job of communicating the results of that analysis.
This is especially true for issues such as climate change, healthcare, natural disasters, and national security. Not only are these problems complex, we also require specialist knowledge to understand them in any depth. But even for the experts, it isn’t easy.
Consider the results of some research by Gerd Gigerenzer. [5] He first phrased the following question to HIV counselors in probabilities, as is typical of the way statistics are presented to counselors and medical professionals.
“About 0.01 percent of men with no known risk behavior are infected with HIV. If such a man has the virus, there is a 99.99 percent chance that the test result will be positive. If a man is not infected, there is a 99.99 percent chance that the test result will be negative. What is the chance that a man with no known risk behavior who tests positive actually has the virus?”
What is your answer? A hint: most people think that the correct answer is 99.99 percent or higher.
The probability of being infected if one tests positive is also known as the positive predictive value (PPV). To calculate the PPV, we divide the number of true positives (TP) by the total number of positives, both true and false (TP + FP). The formula for this analysis is PPV = TP / (TP + FP), where, in this example:
p(HIV) = 0.0001
p(pos | HIV) = 0.9999
p(pos | no HIV) = 0.0001
Is this helping? Or giving you a headache? The full formula to work out the chance that a patient who tests positive is actually positive looks like this:
PPV = p(HIV)p(pos | HIV) / (p(HIV)p(pos | HIV) + p(no HIV)p(pos | no HIV))
The study noted: "Counselors communicated numerical information in terms of probabilities rather than absolute frequencies, became confused, and were inconsistent." It even noted that a majority of the counselors explained that false positives do not occur, and half of the counselors told the client that if he tests positive, it is 100% certain that he is infected with the virus. In fact, as I've already mentioned, the PPV is only around 50%.
PPV = (0.0001 * 0.9999) / ((0.0001 * 0.9999) + (0.9999 * 0.0001)) = 0.5
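For those who prefer to see it computed step by step, here is a minimal sketch in Python of the same Bayes calculation, using the figures quoted above (the function name and structure are mine, for illustration only):

```python
def positive_predictive_value(prevalence: float,
                              sensitivity: float,
                              false_positive_rate: float) -> float:
    """Probability of actually being infected, given a positive test (Bayes' theorem)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Figures for a man with no known risk behavior, as quoted above.
ppv = positive_predictive_value(prevalence=0.0001,          # 0.01% are infected
                                sensitivity=0.9999,         # 99.99% of infected test positive
                                false_positive_rate=0.0001) # 0.01% of uninfected test positive
print(f"Chance a positive result is correct: {ppv:.0%}")    # ~50%
```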
HOW CAN WE IMPROVE THIS SITUATION?
So if health professionals are confused, is it because they aren’t being provided with all the facts available to them? You would hope people who prepare health advice are consummate analysts and communicators, but that isn’t always the case.
Imagine you are responsible for publishing public health risk information for medical professionals. Your task today is to produce a leaflet for patients who are about to undertake a blood test for HIV. What would you say about false positives? As you now know, false positives in most tests are not uncommon. They are normal enough that when a test produces a positive result, the blood sample is retested to verify the result.
Despite the additional testing following a positive result, a small number of cases (roughly 0.01%) can still yield false positives (or false negatives) for a variety of reasons. These can include medical conditions, accidental swapping of blood samples, and data input error.
Despite this, most information about the disease does not mention the false positive rate. A study of 21 HIV/AIDS information leaflets in America found not one of the leaflets mentioned even the possibility of a false positive. [5]
An equally worrisome example of poor communication was identified in a 1998 German study of pre-test counseling for HIV tests. [6] Twenty counselors were assessed and, although they were very knowledgeable about most aspects of the topic, they exhibited significant gaps in the interpretation of tests.
Of the 20 health professionals in the study who gave pre-test counseling to a client with no known risk behavior (that is, not a homosexual man or IV drug user), five incorrectly claimed that false negatives never occur. A whopping sixteen (80%) incorrectly claimed that false positives never occur. The reasons included poor risk communication in their training, the illusion of certainty in testing, and a failure to understand that the proportion of false positives is highest in low-risk patients.
THE CURRENT SITUATION
You may have noticed that these studies are from last century. It would be reasonable to expect that our health professionals are more knowledgeable now, but that does not seem to be the case.
I did a Google search for “HIV test leaflet pdf” today. Nine out of the first ten pamphlets failed to mention the potential for false positives. The one that did, simply said that they were “very uncommon.” The word about the accuracy of blood tests and false positives may be getting out, but I couldn’t find any evidence. Zoe’s example from 2013, although anecdotal, also suggests that our communication skills are generally not improving.
What I found while researching false positives for this article was still disappointing. To take one of many examples, the mammogram is considered the gold standard for breast cancer screening. In 2007, the Annals of Internal Medicine [7] published a meta-analysis of 117 randomized, controlled mammogram trials. Amongst the findings: rates of false-positive results are high (20-56 percent cumulatively after ten mammograms).
Five years later, a study published in the British Medical Journal (BMJ), [8] concluded that "Annual mammography in women aged 40-59 does not reduce mortality from breast cancer beyond that of physical examination or usual care when adjuvant therapy for breast cancer is freely available. Overall, 22 percent of screen-detected invasive breast cancers were over-diagnosed, representing one over-diagnosed breast cancer for every 424 women who received mammography screening in the trial.”
After 15 years of follow-up, the mammography group had significantly more cancer diagnoses than the non-mammography group. These were considered to be attributable to over-diagnosis. "Overdiagnosis is the diagnosis of "disease" that will never cause symptoms or death during a patient's ordinarily expected lifetime." (Wikipedia)
What these studies suggest is that one of the most commonly recommended tests for cancer generates cumulative false-positive rates of up to 56% over ten screenings. I do not intend to undermine your faith in the medical profession; I could provide many similar examples from many other disciplines.
The key message is that you need to take charge and do your own research.
In the words of two experts who have studied this type of problem for years, Gerd Gigerenzer and J. A. Muir Gray, "Many doctors and most patients do not understand the available medical evidence. Seven ‘sins’ … have contributed to this lack of knowledge: biased funding; biased reporting in medical journals; biased patient pamphlets; biased reporting in the media; conflicts of interest; defensive medicine; and medical curricula that fail to teach doctors how to comprehend health statistics.” [9]
The next time you seek information in order to make a decision, look for the underlying data. Think of the Russian proverb, made popular in English by Ronald Reagan, when you next seek medical advice, “Trust but verify.” The same is true for any other discipline.
"67% of all statistics are made up on the spot."
- Anon
LIES, DAMN LIES, AND EMOTIONS
One can see how emotions influence our communications. In order to address this, start with accurate information. Consider, if you will, how we view data. Large numbers are abstract, vague concepts for those of us who aren’t physicists or mathematicians.
In contrast, gruesome events on the evening news are engraved in our minds. We call this tendency to overestimate the likelihood of memorable events the 'availability heuristic': events that are unusual or emotionally charged are easier to recall, and so they feel more likely than they really are. It explains why most people, for example, are more afraid of terrorists or sharks than they need to be.
But let’s compare those fears by looking at actual data. I’ll pick US data because the US offers a large sample size, as well as reliable records that are easy to find.
If you have read my Real World Threat Advisory article then you already know this. Suffice to say that the data tells a story which is almost the opposite of our fears. An average of 100 Americans die in terrorism-related events each year. During the same period, over 40,000 people are killed on US roads. And terrorists kill more people than sharks. Even bathtubs take the lives of more people than terrorists.
In 2013, in England and Wales, you were more likely to take your own life than to be killed by someone else. You were even twice as likely to die from medical care as from homicidal assault. Your chances of dying were:
Suicide: 1 in 140;
Misadventure during surgical and medical care: 1 in 636; and
Homicide or assault: 1 in 1,201.
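As a quick sanity check on that "twice as likely" comparison, here is a short sketch using only the odds listed above (the figures are from the list; the code is mine):

```python
# Odds of dying in 2013 (England and Wales), as listed above, expressed as "1 in N".
odds = {
    "suicide": 140,
    "medical misadventure": 636,
    "homicide or assault": 1201,
}

# Convert "1 in N" odds to probabilities and compare the two causes.
prob = {cause: 1 / n for cause, n in odds.items()}
ratio = prob["medical misadventure"] / prob["homicide or assault"]
print(f"Medical misadventure vs homicide: {ratio:.1f}x more likely")  # ~1.9x, i.e. roughly twice
```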
It turns out that even the things that are meant to help us can kill us. Roughly 100,000 Americans will die each year in hospital from documented and preventable medical errors [10]. Our lifestyle choices are also key sources of risk. Roughly 400,000 people die each year from tobacco-related illnesses. Diabetes and heart disease have reached epidemic proportions and the majority are due to preventable lifestyle choices.
NATURAL FREQUENCIES
If the first step is to start with accurate data, the way in which we communicate is the second. Communication itself can introduce risks. The problem is common, yet preventable. Consider what advertisers and lobbyists do. They use stories. We remember and understand information which is on a human scale. Particularly if it has emotional content.
We have lived for most of history in small villages. Groups of people and stories make sense to our minds. Presenting data as we evolved to understand it means using numbers we experience in our lives. If we talk, for example, about five people in a village of 100, that has more meaning to us than "five percent of a village". This is a technique known as 'Natural Frequencies'.
Consider our previous example of false positives for tests. One way the information could be better communicated would be to present it as a natural frequency. [5] For example:
The test is 99.99% accurate.
Imagine 10,000 people who are not in any known risk category.
One is infected and will test positive with practical certainty (99.99%).
Of the 9,999 who are not infected, one will return a false positive (0.01% or 1 in 10,000).
How many will test positive? And how many will actually be positive?
This graphic illustrates how our minds understand Natural Frequencies.
From this rephrased information, you can understand why it is likely that two people will test positive. One will be a false positive. The odds are roughly 1 in 2 (50%) that someone from a low-risk category who has a positive test result is actually positive.
This wording is clearer because our brain absorbs the information differently. Presenting the data in a complicated formula may produce the same answer, but it is anything but intuitive.
A word of caution. You would be unwise to become complacent about the results of such tests. For people in high-risk categories (homosexual men or IV drug users, for example), the chance that a positive result is a false positive is less than 1%. Out of 10,000 people in a high-risk category with a base rate of 1.5% infection, we would expect about 150 to be HIV positive, and we can say with practical certainty that they will all test positive. Of the 9,850 who are HIV negative, roughly one would test positive. The chance of a positive result being a false positive is therefore about 1 in 151, or less than 1%.
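For readers who like to check the counting, here is a small sketch that tabulates both the low-risk and high-risk scenarios as natural frequencies, following the logic above (the helper function and its name are my own):

```python
def natural_frequency_table(population: int, base_rate: float,
                            sensitivity: float, false_positive_rate: float) -> None:
    """Print the counts our minds find intuitive: out of `population` people, how many
    are infected, how many test positive, and how many of the positives are real."""
    infected = round(population * base_rate)
    uninfected = population - infected
    true_positives = round(infected * sensitivity)
    false_positives = round(uninfected * false_positive_rate)
    total_positives = true_positives + false_positives
    print(f"Of {population:,} people: {infected:,} infected, {total_positives:,} test positive, "
          f"of whom {true_positives:,} actually have the virus "
          f"({true_positives / total_positives:.0%} of positives are real).")

# Low-risk group: 1 in 10,000 infected, test 99.99% accurate either way.
natural_frequency_table(10_000, base_rate=0.0001, sensitivity=0.9999, false_positive_rate=0.0001)
# -> 2 test positive, 1 real: a positive result is only ~50% likely to be correct.

# High-risk group: 1.5% infected.
natural_frequency_table(10_000, base_rate=0.015, sensitivity=0.9999, false_positive_rate=0.0001)
# -> about 151 test positive, 150 real: roughly 1 in 151 positives is a false positive.
```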
CONCLUSION
We need to be clearer in our communications: to express factual information in a simple and considerate manner, and in a way that our recipients can understand.
This article focused on health and wellness - areas that I'm passionate about - in order to illustrate some points. I could have chosen climate change, transportation, education, crime, housing, national security, or a host of other poorly-communicated issues. They are all important; but health is something which we can all relate to.
We have great scientists, big data, and some brilliant analysts. But, with all of this, we could still do more. Everything you say, whatever you write, your behaviors, and even how you spend your time, are methods of communication. If you're a doctor, pilot, emergency worker, plumber, politician or any form of living biped composed of atoms, then what you say affects others.
In many cases, the problem comes down to a lack of funding. There is no use blaming the doctors, lawyers, etc. who don't communicate well if they simply don't have the time. Even their managers rarely have the funding to resource their communication effectively.
We must think about how we resource time and materials for delivering communication. Our advertising industry is well-funded. Our medical community, sadly, not as much.
Irrespective of resource limitations, we can all improve how we communicate. If you're looking for some practical advice, take heart. It isn't complicated and we have the tools for change. The power to:
Put our attention where we choose, including, for example, learning, thinking, and communicating.
Send a message to our politicians and marketers by spending our money on what we value.
Think about the recipient, their frame of mind, and their preferred methods of communication.
Be consistent and deliver your message as many times as is necessary for it to be received.
Remember that our barriers rise when our stress levels are elevated in response to bad news and during the grieving process.
Consider whether our communication is simple, logical information such as a train timetable, or complex emotional information such as the health of a loved one.
Improve our communication skills through study and practice.
DISCLAIMER
In 100 years' time, we (hopefully that will include me) will look back on our current technologies, communication skills, and medical systems with amusement (perhaps with horror), much as we now look back on the days when we 'knew' the world was flat, mercury cured syphilis, and snake oil improved everything.
If you've gained anything from this article, hopefully, it will be a healthy sense of skepticism. An understanding that much of what you have been told may be wrong.
Believe half of what you see, a quarter of what you read, and none of what you hear. If I'm right, you can probably trust about a quarter of this article. But then again, if I've just said you can believe a quarter of what you read, does that mean you should only believe one quarter of this sentence? Which quarter? Or a quarter of a quarter of this article? In a tailspin, my logic is failing me. I choose now to vanish in a puff of pixels as an act of damage control ...
P.S. Be careful in the bathtub.
REFERENCES
[1] Ligon, Zoe (2015), 'What Happened When I Got My False-Positive HIV Test Results', http://www.refinery29.com/false-positive-hiv
[2] Stine, G. J. (1996), Acquired Immune Deficiency Syndrome: Biological, Medical, Social, and Legal Issues (2nd ed.), Prentice-Hall, Englewood Cliffs, NJ, USA.
[3] Kahneman, Daniel, Thinking, Fast and Slow (p. 329), Farrar, Straus and Giroux, Kindle Edition.
[4] Dershowitz, Alan (1997), Reasonable Doubts: The Criminal Justice System and the O.J. Simpson Case, Touchstone, New York, USA.
[5] Gigerenzer, Gerd (2002), Calculated Risks, Simon & Schuster, New York, USA.
[6] Gigerenzer, Hoffrage, and Ebert (1998), 'AIDS counseling for low-risk patients', AIDS Care, 10, 197-211.
[7] Armstrong, Katrina; Moye, Elizabeth; Williams, Sankey; Berlin, Jesse A.; Reynolds, Eileen E., 'Screening Mammography in Women 40 to 49 Years of Age: A Systematic Review for the American College of Physicians', Annals of Internal Medicine, 3 April 2007;146(7):516-526, http://annals.org/aim/article/733981/screening-mammography-women-40-49-years-age-systematic-review-american
[8] BMJ, 2014;348:g366, http://www.bmj.com/content/348/bmj.g366
[9] Gigerenzer, G. and Muir Gray, J. A. (eds.), Better Doctors, Better Patients, Better Decisions: Envisioning Health Care 2020, The MIT Press.
[10] Kohn, L. T., Corrigan, J. M., and Donaldson, M. S. (2000), To Err Is Human: Building a Safer Health System, National Academy Press, Washington, DC, USA.