An Ohio State Study — Demonstrated that Providing People with Immediately Corrected Information Does Not Reliably Change their Mistaken Minds — the Finding Corroborates Suspicion that Determined Stupidity Is Our Lot in Life — which May Account for American Politics’ Frequently Stubborn Resistance to Facing Facts

© 2013 Peter Free

 

06 February 2013

 

 

Citation — to the study

 

R. Kelly Garrett and Brian E. Weeks, The Promise and Peril of Real-Time Corrections to Political Misperceptions, Ohio State University School of Communication (January 2013) (PDF)

 

 

Note

This paper will be presented at the Association for Computing Machinery [ACM] 16th Conference on Computer Supported Cooperative Work and Social Computing (23-27 February 2013, in San Antonio, Texas).

 

 

Citation — to press release about the study

 

Jeff Grabmeier, False Beliefs Persist, Even After Instant Online Corrections, Ohio State University – Research and Innovation Communications (24 January 2013)

 

 

If we were rational beings, immediate correction of misinformation would change our mistaken minds, wouldn’t it?

 

Probably.

 

But, since we are not especially rational, immediate correction apparently makes pre-existing false beliefs more resistant to change.

 

 

The Ohio State experiment — the question

 

Professor Kelly Garrett and graduate student Brian Weeks wondered whether it was more effective to have an Internet browser:

 

(a) that immediately posts corrections to untrue political allegations,

 

or

 

(b) one that delays corrections until after the viewer has gone on to do something else.

 

Psychological research already indicates that immediate correction tends to breed resistance to accepting the correction.

 

But even if that is so, might not immediate correction still be preferable to delaying correction until after the misinformation has taken hold?

 

 

The key question is whether a correction embedded in an inaccurate message generates more resistance than a correction presented at a later time.

 

© 2013 R. Kelly Garrett and Brian E. Weeks, The Promise and Peril of Real-Time Corrections to Political Misperceptions, Ohio State University School of Communication (January 2013) (PDF) (at 4th paragraph under “Political fact-checking psychology”)

 

 

Describing the test subjects

 

The research team recruited 574 people from across the United States to participate in an online experiment designed to answer the “immediate versus later” correction question:

 

 

The sample is 49% male, has an average age of 45.8 years (SD = 15.8), and is racially diverse (86.9% White, 6.8% Black, 6.3% other).

 

Participants also had a range of party affiliations (25.4% Republican, 34.7% Democrat, 28.1% Independent, 11.9% other) and of ideologies (28.4% Liberal, 35.0% Moderate, 36.6% Conservative).

 

© 2013 R. Kelly Garrett and Brian E. Weeks, The Promise and Peril of Real-Time Corrections to Political Misperceptions, Ohio State University School of Communication (January 2013) (PDF) (at first paragraph under “Experimental Study”) (paragraph split)

 

 

How the experiment worked — a fake blog contained wrong facts

 

The experiment centered on an online political blog that the researchers had made up.  The blog incorporated both accurate and untrue information regarding electronic health records.

 

The untrue portion was the claim that “hospital administrators, health insurance companies, employers, and government officials have unrestricted access to personal health information.”

 

Before exposing the participants to the blog, the researchers:

 

(a) assessed their pre-existing familiarity with electronic health records

 

and

 

(b) provided them with an accurate introductory overview of how electronic health records actually work.

 

Participants were divided into three experimental groups:

 

The first group (191 people) read the blog entry and then completed an unrelated 3-minute image comparison task.  Afterward, a 378-word correction to the blog (attributed to FactCheck.org) appeared on the computer screen.

 

The second group (182 people) received immediate corrections to the blog’s mistaken claims.  These corrections were attributed to a third-party fact checker and were highlighted in red within the blog entry.  The same 378-word correction that the first group saw appeared at the bottom of the second group’s webpage.

 

A control group (201 people) read the blog but got no corrections.

  

Findings

 

Immediate correction worked slightly better than delayed correction, which obviously worked better than no correction.

 

But when the data was analyzed, the authors discovered that immediate correction fostered increased resistance among the participants who were already predisposed to believe the false claims.

 

The ineffectiveness of immediate correction (among the people predisposed to accept the falsehoods) appears to have been due to their tendency to assign less credibility to the correcting source.

 

This disparity was less evident during delayed correction.  Delay apparently arouses less instinctive argumentation.  Dr. Garrett said:

 

“The problem with trying to correct false information is that some people want to believe it, and simply telling them it is false won’t convince them.”

 

For example, the rumor that President Obama was not born in the United States was widely believed during the past election season, even though it was thoroughly debunked.

 

Garrett said the results of this study cast doubt on the theory that people who believe false rumors need only to be educated about the truth to change their minds.

 

 “Correcting misperceptions is really a persuasion task. You have to convince people that, while there are competing claims, one claim is clearly more accurate.”

 

© 2013 Jeff Grabmeier, False Beliefs Persist, Even After Instant Online Corrections, Ohio State University – Research and Innovation Communications (24 January 2013)

  

Achieving public rationality is a difficult goal

 

In my opinion, persuading people who hold irrationally based beliefs is a more difficult task than Dr. Garrett appears to imply.

 

When people are wired to deny facts, distrust reason, ignore science, and favor conspiracy theories, they are automatically resistant to logically based persuasion.  They do not care about accuracy in the way that Dr. Garrett implies they should.  Consequently, I do not think that it much matters (with this group) when, or how, reasoned “truth intervention” takes place.

 

Changing irrational people’s minds probably requires indulging in Machiavellian schemes that manipulate their milieu and alter their sense of who (or what) is credible and who (or what) is not.

 

 

A hypothetical example — changing my imaginary Mr. Rock Head’s mind

 

Assume that our experimental subject, Mister Rock Head, has an in-group that lives in City A and City B.  He lives in City C.

 

In the interest of investigating how to change irrational people’s minds, I come up with a scheme to irradiate City A with warming rays sent down by an invisible satellite.  I have configured the satellite to (equally undetectably) raise that metropolis’s temperature by 15 degrees Fahrenheit, year round.

 

After three years of doing this, I poll City A residents.

 

Predictably, its sub-population of global warming deniers has been almost entirely converted into climate change believers.

 

Not surprisingly, Mister Rock Head’s in-group in City B has been largely converted as well, even though City B’s temperatures have risen only as much as those in the rest of the region.

 

Rock Head himself is now also a global warming convert because his “credible” in-group told him that warming is occurring.  Of course, he still does not care to seek the scientific evidence that a more rational person would demand.

 

 

The moral? — Changing predominantly irrational people’s adherence to provably mistaken beliefs is an uphill task

 

And I suspect that the manipulative means that are probably required to do so would strike the majority of reasonably rational people as unethical.

 

Perhaps I am too pessimistic.  But over my nearly seven decades, the evidence has mostly gone the other way.