Medical misinformation is often not an outright lie. It’s more subtle than that

The most powerful forms of deception rely more on emotional manipulation and misdirection than on outright lies. That is what I have observed in nearly a year of research into the murky world of medical misinformation.

Take the episode of “The Joe Rogan Experience” podcast that prompted music legends Joni Mitchell and Neil Young to pull their music from Spotify, where Rogan is the platform’s most popular podcaster. The episode’s guest, medical researcher Robert Malone, created a distorted picture of the alleged dangers of vaccination through a combination of anecdotes, cherry-picking, innuendo and wildly improbable speculation – not deliberate lies.

Whether Malone’s juggling is called misinformation or something else, the resulting confusion can lead people to make fatal decisions not to be vaccinated. In media interviews and social media posts, unvaccinated people who had been infected with SARS-CoV-2, the virus that causes Covid-19, and who fought for their lives in intensive care urged others not to make the mistake they had made. Some said they had been so distracted by the hype about side effects that they forgot to think about the danger of the disease itself.


I first started grappling with the concept of medical misinformation when I proposed it as a topic for a fellowship offered by the Society of Professional Journalists. I have spoken with various infectious disease experts about confusing rules, fuzzy forecasts, breakthrough cases and the implications of gain-of-function research. I have talked to psychologists and computer scientists who examine why people spread false information on social networks, and to historians who study how we decide what constitutes legitimate science and what constitutes pseudoscience. In the future, I hope to interview more ordinary people in different regions about their experiences and how they made up their minds.

I had previously viewed medical misinformation as pseudoscience promoted by people selling alternative cures, fad diets and the like. That still exists, but it is now mixed with more politically motivated misinformation.


Spreading malicious rumors about one’s enemies is, political scientist and evolutionary psychologist Michael Bang Petersen told me, deeply rooted in humanity’s affinity for groups. Social media has not only polarized people on pandemic-related issues but also provided an easy channel for sowing confusion.

But the public’s confusion stems mostly from manipulation, slanting, spin and shoddy information rather than outright lies. Changes in the media landscape have encouraged such deceptions to thrive at the expense of honest reporting.

Although traditional media are not perfect, most publications maintain accountability to their readers and to the people they write about. Social media runs on algorithms designed to amplify whatever gets attention, and new policies of fact-checking and removing content without any transparency only increase the power of social media companies. But there are better alternatives to fact-checking and censorship for dealing with the range of issues that qualify as misinformation.

Fact-checking does not lend itself to scientific ideas, since science is not a body of facts. Instead, it is a system of inquiry that plays out in a dynamic interplay of data and theories. And it is not always easy to draw a sharp line between legitimate minority opinion and fringe science or pseudoscience. Fact-checkers could easily introduce their own political biases and snuff out innovative concepts and diversity of thought. During the pandemic, the word “misinformation” has been thrown around to describe ideas that people disagree with for political reasons.

Rogan’s three-hour conversation with Malone – which I listened to from start to finish – provided a textbook case of the complexity of misinformation. It also poses a challenge to those calling for greater public trust in scientists.

Malone is a scientist. His research in the 1980s and 1990s is recognized as legitimate and important. But on the podcast, he deployed the deceptive and manipulative tactics common among promoters of pseudoscience. He is not the first scientist to do so.

Malone has repeatedly cited connections and alleged insider status with the US Department of Defense and the FDA. He claims to have invented mRNA vaccine technology, although in reality he is recognized as one of many contributors.

Despite all the connections and accolades Malone claims, I was surprised by how few well-known members of the infectious disease research community had heard of him, even after all the controversy. Last week, at a Covid-19 press conference hosted by Harvard Medical School, the University of Massachusetts and other collaborators, I asked what people thought of Malone. The response was blank stares and quick Google searches.

Rogan’s listeners, however, were led to believe they were seeing science’s brightest star. Some may have thought they weren’t quick enough or smart enough to follow Malone’s logic, when in reality the logic was full of holes.

When he did cite data, it didn’t clearly support his claims – which included the idea that Covid-19 vaccines had caused premature menopause and could make the disease worse through a mechanism called antibody-dependent enhancement.

There are real data showing that some women missed their periods after getting the vaccine, and I’ve spoken with scientists who say that in animal studies some of the spike proteins generated by the vaccine can spread to different parts of the body – but there is no evidence that they cause harm. Malone also insisted there had been an effort to cover up the importance of the antiparasitic drug ivermectin, which has been the subject of dozens of studies and has not been shown to be effective against Covid-19.

I’ve covered almost every topic Malone raised on my own podcast, “Follow the Science.” But the scientists I interviewed gave more detailed arguments and came to different conclusions. In general, scientists making a legitimate effort to promote understanding will show how they drew an inference from a dataset and will often offer supporting evidence, such as basic biology or chemistry and plausible mechanisms.

One of my favorite podcast guests is medicinal chemist Derek Lowe, who used chemistry to help explain why ivermectin probably doesn’t work against viruses, and why vaccines are very unlikely to aggravate Covid-19, even though that has happened in rare cases with other vaccines.

Twitter de-platformed Malone in December. If anything, that attempt at censorship only increased his mystique. And Malone probably wouldn’t have become famous – or infamous, depending on your perspective – if social media hadn’t amplified his most provocative anti-vaccine claims, ones that didn’t impress his traditional colleagues or science journalists.

Comedian Jon Stewart has blamed social media algorithms for our misinformation problem – and a number of researchers agree. Studies by computer scientists have shown that experimental automated accounts get pushed toward extreme content and polarized bubbles, and that the algorithms in charge amplify information without regard to its actual importance or accuracy.

People care about accuracy, and they haven’t caved in to a post-truth world, social scientists David Rand of MIT and Gordon Pennycook of the University of Regina in Canada told me in an interview. They collaborated with colleagues on a study published in Nature showing that people genuinely want to share accurate information but give in to the temptation to share juicy gossip they think will please their friends or make them look good.

Rand himself admitted to giving in to temptation, sharing a tweet claiming that Ted Cruz had said he would believe in global warming when Texas froze over. The meme circulated when Texas was hit by unprecedented snowstorms and freezing weather. “It was just too delicious,” Rand told me.

In experiments, he and Pennycook showed that simply asking volunteers to rate the accuracy of a headline improved the accuracy of their social media sharing for the rest of the day.

Even more intriguing was their finding that asking 10 or 12 people to independently rate the accuracy of tweets yielded results that matched the assessments of professional fact-checkers about as well as the fact-checkers agreed with one another. Crowdsourcing can work as a form of fact-checking, but it is known to work only if the individuals in the crowd think independently. Social media encourages the opposite behavior: forming opinions based on what others say.

Instead of using crowdsourcing to flag content for censorship, it could be deployed to boost the visibility of the posts most likely to be accurate. There is nothing organic today about the way algorithms decide who and what becomes popular. Why not change them in a way that rewards accuracy and puts power back in the hands of the people?

The message from Malone’s conversation with Rogan that appears to have gotten the most attention is the idea that people had been hypnotized by what he called mass formation psychosis. It resonated – even though no one really knew what it meant. The world has become so polarized that it’s easy to imagine those on the other side of the left-right divide experiencing an alternate reality.

People are being manipulated by the sources they rely on for information – their social media feeds. But we don’t need to shut those down to regain control.

Faye Flam is a science journalist and columnist for Bloomberg Opinion and host of the “Follow the Science” podcast.