Terry H. Schwadron
The exhausting arguments about health care are bouncing back and forth, tennis ball style, just as previous political collisions over whether there were wiretaps on Trump Tower, whether the earth is heating up, whether crime rates are moving up or down, whether immigration is out of control, or even whether this inauguration was the largest ever.
Once, we thought, such things might be brought to a practical conclusion by the arrival of facts. No longer. Point of view and majority votes are the new arbiters. He who talks loudest and most often, with or without facts, wins today's argument, having simply outmuscled anyone marshaling facts for an opposing position.
Hence, political fundraising, lobbying and swamp-building of all kinds. And dealmakers like you know who.
Business Insider ran a piece recently arguing that we've been fooling ourselves, and reminding us that this has probably always been so.
Why do inaccuracies and misinformation persist despite earnest attempts to correct each falsehood after it is made, the article asks.
Indeed, the article cites research showing that, generally, misinformed people do not change their minds once they have been presented with facts that challenge their beliefs. In fact, they are likely to become even more sure of their mistaken beliefs. According to the article, factual information "backfires": when people do not agree with you, research suggests that bringing in facts to support your case might actually make them believe you less, something like fighting a grease fire with water. It seems like it should work, but it actually makes things worse.
The magazine article cited a 2010 academic study that probed this question. So, I went to find it. This issue seems important to me, since I am addicted to looking for factual information rather than just spit-balling opinions. And I do so in hopes of persuading others that the facts might matter.
As it turns out, for a paper in the journal Political Behavior, Brendan Nyhan (University of Michigan School of Public Health) and Jason Reifler (Georgia State University, Political Science) led experiments in which groups read newspaper articles that included statements from politicians. The statements supported pieces of misinformation. Some participants then immediately read articles that corrected the inaccurate statements, while others did not. All were asked a series of questions about the article and their personal opinions about the issue.
The researchers found that how people responded to the factual corrections in the articles they read varied systematically by how ideologically committed they already were to the beliefs that such facts supported. Among those who believed the popular misinformation in the first place, more information and actual facts challenging those beliefs did not cause a change of opinion — in fact, it often had the effect of strengthening those ideologically grounded beliefs.
The pair had undertaken the research because, they said, there was extensive literature addressing citizen ignorance but very little research on misperceptions, and previous studies had not tested the efficacy of corrections in a realistic format. Their question: Could false or unsubstantiated beliefs about politics be corrected effectively and realistically? Indeed, their paper quoted Mark Twain: "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
From previous studies, it seems to be an academic given that most citizens lack factual knowledge about political matters and that this deficit affects the issue opinions they express. Some scholars have responded that citizens can successfully use slogans and stock phrases as a substitute for detailed factual information in some circumstances. (Any of this sound familiar?) Still other studies have sought to distinguish those who are merely uninformed from those who are misinformed.
In their case, the researchers asked about information and news stories from would-be news sources involving WMDs in Iraq, tax cuts and stem cell research, all from an era in which strong advocacy of an issue might tend to drown out opposing voices, but one that did not involve the daily cries of FAKE NEWS that we hear today. Still, among both self-identified conservatives and liberals, attempts to correct misinformation they held as beliefs "backfired" and only strengthened their convictions.
It makes me worry even more about "fairness" in jury systems, science arguments, race and ethnicity issues and international relations. What bothers me about creating nutty government policy on matters as disparate as abortion, immigration and climate is that facts do matter in my world.
Would we hire a plumber who wants to believe that water flows uphill rather than downhill?
It makes me wonder how best to explain to well-meaning Trump supporters that there won't be a return of coal mining, and that they face the loss of their health care in return for some kind of ideological win about their "freedom" to no longer be able to afford it.
The only effective way for "facts" to enter under these conditions, then, may be to keep a close eye on the actual effects of policies. When people see those effects for themselves, they will declare a new "fact" they can accept. Of course, this may come too late to keep ill-intended policy from harming others.
If we can’t use facts, we had better hope for a snazzy-looking, youthful, optimistic, money-abled political opponent to appear on the scene who just happens to believe in facts. Know anyone?