As recently as 15 years ago, Americans had fairly limited choices when it came to the news. In many towns, there were one, maybe two, newspapers, plus the trio of three-letter networks with their nightly news programs. Those with cable might have been getting their first taste of the 24-hour news cycle from the likes of CNN; those without might have set one of their AM presets to a news station. Although you could turn the dial or flip from station to station, chances are most stories shared many of the same details and images.
The digital age, however, has made that a distant memory, bringing a plethora of options beyond the traditional sources, including a host of varying interpretations of each story. That new media world, and the fast-moving spread of news over social media, has made “fake news” a ubiquitous term.
While the term is usually used in the context of politics, “fake news” has also infiltrated medicine, with online discourse giving new life to those arguing against vaccinations, as well as to modern-day snake oil salesmen touting “cures” for cancers that can’t be cured and herbal remedies that are unproven at best.
It’s this environment that has both Raina Merchant, MD, director of the Center for Digital Health and associate vice president and associate professor of Emergency Medicine, and David A. Asch, MD, MBA, executive director of the Penn Medicine Center for Health Care Innovation, concerned. In a new perspective piece published in JAMA, they hope to provide some guidance for medical professionals and scientists as they wade into online discussions.
Prominent in their piece is the challenge of breaking through people’s spheres of influence. In it, Merchant and Asch suggest four things to keep in mind to protect scientific thought and cut through the fake news noise.
1. Everyone is Susceptible
Although scientists and health professionals have the best intentions when trying to promote health research, they’re still people. They can fall into the same traps.
“My view is that researchers have the same challenges with echo chambers,” Merchant said. “We read the same sources and that can cause us to only listen and think like each other. It can be a challenge to think of new research like someone who doesn’t have the same background and training that we do.”
Asch explained that people generally hold a mental model that allows them to accept or deny the truth of something, which can create resistance to new thinking. Researchers may be especially prone to this: Asch said that those who see evidence counter to what they’ve found, or what they’re looking for, can push back against it.
“Even scientists are at risk for discounting competing theories,” Asch said.
So before someone looks to break into someone else’s bubble, they need to think about their own preconceptions.
2. People Aren’t Uninformed on Purpose
The majority of people who promulgate medical fake news are not likely doing it maliciously. Most aren’t the snake oil salesmen mentioned earlier.
“There are some people who have some sort of non-evidenced or faith-based belief system, so they promote an approach consistent with that, such as using crystals or copper bracelets for medical reasons,” Asch said. “And then there are others who don’t have other scientifically proven options and are grasping at hope, making them willing to lower their guard.”
He called the latter the “Rasputin effect,” referring to the infamous Russian mystic who gained significant sway over the last czar and czarina by claiming healing powers over their child, who had hemophilia.
“It’s instructive to unpack why people believe what they do,” Asch said.
3. “Science is Not a Popularity Contest”
In their paper, Merchant and Asch agree that the journals where new studies are published are a great asset to discrediting bad information. However, the peer-review process that is critical to making sure content is accurate “may be the ally in greatest need of support.”
That’s because many scientists now move forward more quickly with their information, before it’s fully ready for publication. A recent movement encourages prepublication ahead of the peer review process to accelerate the spread of information. But Merchant and Asch argue that with the rise of fake news, we may be better off shoring up the peer review process, lengthy though it sometimes is, rather than bypassing it.
Complicating things, researchers themselves can now judge their impact not just by scientific citations in other journals, which take years to reveal themselves, but with tweets and posts on social media. Part of the allure might have to do with Altmetric, a tool attached to many papers now published online that, among other things, tracks who has tweeted about a paper or shared it online. Merchant and Asch describe it as providing researchers “fame in 15-minute doses.”
So why is social media so appealing for researchers?
“Messages on social media are designed to spread,” Merchant said. “They enable unprecedented connectivity and access that was previously much more cumbersome. For some, it can greatly increase your ability to share information and disseminate your work, especially with researchers who share your interests.”
That’s why Merchant and Asch agree: “Science is not supposed to be a popularity contest.”
4. Support Science with Science
None of this is to say that researchers should stay off social media. Without a healthy scientific presence there, unsubstantiated stories would truly run wild. So researchers and health professionals should engage on new media channels, but always emphasize the rigor behind the work and its review.
Overall, Merchant and Asch know that there is still much to figure out when confronting fake medical news—and defending true medical research—on the digital frontier. They hope engagement can become constructive and positive.
But it’s also no sure thing, and they hope the medical community will think harder about it.
“If we can’t fix this, if science moves into the post-truth era,” Asch warned, “then we are back in the dark ages.”