Many of our beliefs are factually wrong. For example, according to the General Social Survey, roughly 20% of Americans polled in 2014 think that the Sun revolves around the Earth. Although such incorrect beliefs are a justified source of dismay to educators and scientists, they have little bearing on our everyday life. A sunset would look the same regardless of what revolved around what.
Other incorrect beliefs—for example, that it is a good idea to ride a bicycle on ice—have more direct consequences, but tend to be quickly corrected by experience. The intuition most people have of what would happen is exactly what happens (though with some slight modifications, ice biking is possible!).
A more concerning situation arises when someone’s incorrect beliefs have real-world consequences—either for the person or those around them—but the consequences are difficult to link to the incorrect belief. For example, someone who does not believe that washing hands helps prevent the spread of diseases may get sick more often and is more likely to infect others, but these real-world consequences are difficult to link back to the original false belief and the associated risky behavior.
In fact, sometimes a false belief may even get strengthened by real-world experience. For example, someone’s incorrect belief that vaccines are not needed may lead that person to withhold vaccinations from their children and have their belief bolstered by their children remaining healthy (as they are likely to, thanks to herd immunity created by those who have been vaccinated).
It is tempting to think that simply giving people more information can correct false beliefs and lead to better outcomes. But as researchers Kara Weisman and Ellen Markman remind us in a review paper recently published in the Psychonomic Bulletin & Review, attempts to change beliefs and behavior by giving people the correct facts have often failed.
What works better than mere facts, argue Weisman and Markman, is providing people with simple but coherent explanations. The paper goes through four case studies in which providing explanations to children and adults has led to effective behavioral change: (1) using hand-washing to help prevent viral epidemics; (2) completing the full course of antibiotics to reduce antibiotic resistance; (3) vaccinating children to stem the spread of infectious diseases; and (4) eating a variety of healthy foods.
In all four cases, augmenting people’s lay theories with explanations of why the recommended behavior has the desired effect led to better outcomes than providing people with simple facts.
For example, in an educational program designed in the aftermath of Hong Kong’s 2003 SARS outbreak, Au and colleagues reasoned that children may exercise better hygiene if they understood that bacteria and viruses can cause disease only if they are alive and that they are killed by disinfectants. Following an educational program that emphasized this connection, children were more likely to use napkins when touching food to pass to others, and their hand-washing increased threefold compared to a control group that received a more comprehensive education program that did not emphasize a coherent explanatory framework.
In a second case study, adults were asked to reason about their use of antibiotics. The World Health Organization (WHO) recommends simple guidelines for antibiotic use: Use antibiotics only when prescribed by a health professional; always complete the full prescription; and never share antibiotics with others or use leftover prescriptions. Yet noncompliance is high. In particular, many patients stop taking antibiotics when their symptoms improve.
One might ascribe this noncompliance to forgetfulness, but there is likely a better explanation. If one takes antibiotics to feel better, then once one feels better, there is no longer a reason to continue taking the drugs. The problem is that cutting the regimen short increases the chance of selecting for more antibiotic-resistant bacteria, thereby reducing the future effectiveness of the antibiotic.
Weisman and Markman describe an explanation-based intervention aimed at getting people to complete their antibiotic regimens. The explanation focused on the difference between antibiotics and medications that treat symptoms. It described how bacteria vary in their tolerance to an antibiotic, so that stopping the regimen early kills the “weaker” bacteria but leaves the “stronger” bacteria alive, giving them a chance to reproduce further and cause diseases that are difficult to treat. Another group was directed to an information-rich website developed by the Centers for Disease Control and Prevention (CDC) to combat antibiotic resistance, and a third group was given the standard package insert, which clearly instructed patients to take the medication until the prescribed amount was finished. The results showed evidence of belief and behavior change only in the explanation group. For example, 66% of those in the explanation condition indicated that they would seek a replacement prescription if they dropped the last few days of antibiotics down the drain, compared to fewer than 50% in the other two conditions.
A common message across the case studies above and the two others included in the paper by Weisman and Markman is that more information is not always better. As Weisman and Markman argue, “a judiciously crafted explanation is very likely superior to a full-blown educational curriculum, in which irrelevant or overwhelming details might obscure the central message.”
If explanations are so much more powerful than even simply articulated guidelines, why do health organizations continue to focus on guidelines? One reason may be that good explanations are surprisingly hard to create, requiring knowledge of people’s prior beliefs and of which causal links need to be clarified and which can be left alone. Yet as difficult as good explanations are to create, they are often easy to recognize by the “aha” feeling they produce.
A good source for such explanations is the “Explain like I’m 5” (ELI5) section of Reddit, in which thousands of people vote on which explanations they like best, offering researchers and policy-makers insight into what qualities well-liked explanations share.
Article focused on in this post:
Weisman, K., & Markman, E. M. (2017). Theory-based explanation as intervention. Psychonomic Bulletin & Review. DOI: 10.3758/s13423-016-1207-2.