FDA Commissioner Robert Califf said in an interview that aired last week that misinformation is killing Americans — contributing to the fact that our life expectancy is three to five years shorter than that of people in comparably wealthy countries. He called for better regulation to crack down on misinformation. But would such rules help?
I studied medical misinformation as part of a journalism fellowship, and as I’ve written in previous columns, there is a real danger when misinformed people skip lifesaving vaccines or buy into risky, untested treatments. Yet policing misinformation is tricky.
The fact-checking industry may even make the problem worse by confusing value judgments with facts, and by portraying science as a set of immutable facts, rather than a system of inquiry that constructs provisional theories based on imperfect data.
The advent of artificial intelligence tools like ChatGPT will only magnify the confusion. As my Bloomberg Opinion colleague Niall Ferguson recently wrote, some AI enthusiasts are plotting to “flood the zone with truth” — but this is problematic when people have an inflated idea of their own abilities to identify truth.
According to a new study from Oxford University, the very people who are most worried about misinformation are also the most likely to consider themselves impervious to it. Sacha Altay, the cognitive scientist who led the study, said there’s a strong correlation between concern about misinformation and feelings of superiority in spotting it.
Altay, who tested participants from both the U.S. and the U.K., argued that we’re seeing a moral panic about misinformation that’s been exaggerated by people’s false sense of superior discernment. He said he thinks the media are contributing to an “alarmist” view with stories that, for example, overstate how many people believe in QAnon. Perhaps the public is not as gullible as has been assumed.
Rebuilding trust is key
Cambridge University psychologist Sander van der Linden, author of the new book “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity,” has done research showing that small nudges can motivate people to be smarter consumers of information. One of his most recent studies tested more than 3,000 U.S. participants on their ability to spot fake news stories with a political bent, and found that their performance improved markedly when they were offered cash for each correct answer.
How can we use insights like these to make the world less susceptible to deception and error? To Altay, stamping out misinformation is the wrong goal. Rebuilding public trust is much more important.
“It’s very dangerous for a democracy to promote ideas that people are stupid and there is misinformation everywhere,” he said. It’s far better to shore up trust in institutions and in reliable sources of information. His view reminded me of something I learned from former Soviet-bloc spy Larry Martin (formerly Ladislav Bittman), who defected to the U.S. in 1968.
When I interviewed him in 2017, he said that when the Soviets wanted to cause damage, they would spread propaganda designed to undermine trust in our institutions: the government, universities, the press. It’s bad for democracy if people lose faith in each other.
And assuming (other) people are stupid is also bad for our health. People in every country have a range of cognitive strengths and weaknesses. Blaming online misinformation for shrinking American life spans is a cop-out — especially when we have an overburdened health care system that has made serious mistakes, from overprescribing opioids to failing to come up with an effective COVID strategy.
Our brainpower is what it is, but our health care system can do a lot better.
Faye Flam is a Bloomberg Opinion columnist covering science.