Wrong, But Still Right
In a recent opinion piece in The Washington Post, Emily Thorson, an assistant professor of political science at Boston College, wrote about an infographic that kept popping up over the Thanksgiving holiday in some of the Twitter feeds she follows. The graphic put a racial spin on gun death statistics, but none of the data was true. The tweets came from news outlets and pundits who, through myriad links and third-party references, were trying to correct the false data.
The reported source of the Twitter storm was a retweet from none other than presidential candidate Donald Trump. Politics aside, Thorson’s interest was piqued because she has done research that shows these kinds of fact-checking efforts and corrections often backfire, creating what she terms “belief echoes,” which can affect people’s attitudes even when they know something is false.
In one series of online experiments, she randomly assigned 905 participants to read one of three versions of a news article about a fictional political candidate. The control version just described the campaign; the second version added information that the candidate had been accused of accepting donations from a convicted felon; and the third version duplicated the second, but also contained a correction specifically stating that the accusation was false. No one in any group saw more than one version of the story.
After reading the story, the subjects were asked questions about the candidate. The control group viewed the candidate more positively than the second group did, but the third group, the people who had read the correction, was just as negative as the second. Thorson wondered whether they simply didn't believe the correction, so she asked all three groups additional factual questions about the story, including one about the accusation.
What she discovered was that the correction did work: the people who read it knew the accusation was false. But that knowledge didn't change the negative impression the original accusation had created.
False Positive
Thorson explains a number of ways that misinformation creates belief echoes, including the strong emotional response it can provoke and the human brain's tendency to create causal relationships even where none exist. She also notes that, although her experiments involved political themes, belief echoes undermine corrections in other areas of activity as well.
Her recommendation: “When we spread a correction, whether it’s through tweeting or conversation, we should do our best to avoid repeating the false information.”
What’s this got to do with remodeling? For one thing, I wonder whether the effect also applies to false positives, such as the exaggerated claims a salesperson might make about energy efficiency or a product warranty. Even if the information is later corrected, I suspect that people continue to believe what they wanted to hear: that they were making a smart investment.
But it’s the negative side of the phenomenon that is most worrisome, because “belief echoes” seem to explain what goes on when people read negative online reviews. If that’s true, then remodelers are more vulnerable than I thought, because no amount of rationally setting the record straight will do any good, and it may actually make things worse.
Many social media experts say that the best way to counter a bad online review is not to argue or obsessively correct the misinformation, but to bury the bad review with a couple hundred good reviews.
Maybe they’re onto something.