An anti-vaccine protest in London in November. Photograph: James Veysey/REX/Shutterstock

Twitter to remove tweets that spread lies about Covid vaccines

‘Anti-vaxxer’ misinformation to be tackled from next week, and conspiracy theories from 2021

Twitter will remove tweets that spread harmful misinformation, starting with the Covid-19 vaccine, the company has announced – and from 2021 it will begin to label tweets that push conspiracy theories.

The move sees the company follow Facebook and YouTube in tightening its policies on coronavirus vaccines as the rollout of the jab begins across the world.

“Starting next week, we will prioritise the removal of the most harmful misleading information,” the US company said in a blogpost.

“And during the coming weeks, we will begin to label tweets that contain potentially misleading information about the vaccines.”

Examples of posts that may be removed include false claims “that suggest immunisations and vaccines are used to intentionally cause harm to or control populations”, and claims “that Covid-19 is not real or not serious, and therefore that vaccinations are unnecessary”.

Tweets that do not reach the level of potential harm will not be removed, but may receive a label linking through to authoritative public health information, the company said.

Examples of that sort of claim include unsubstantiated rumours, disputed claims, and incomplete or out-of-context information about vaccines.

The labelling will have a similar visual appearance to the company’s notorious US election labels, regularly placed on tweets from Donald Trump in which he falsely claimed victory.

Twitter said it would enforce the policy “using a combination of technology and human review”.

Confusingly, the company offers users no dedicated way to report Covid misinformation, or misinformation about vaccines, despite the content being banned on the site.

Instead, Twitter says users who think a particular tweet breaks the company’s rules on the topic should report it for any other offence – such as “threatening harm” – and use the text box to add that it is banned misinformation.

The move comes two weeks after Facebook tightened its own policy about Covid vaccines.

The larger social network will remove claims that rise to the level of imminent physical harm, as well as claims that have been debunked by public health experts, even if they do not reach that level.

Chinese network TikTok has also strengthened its policies on vaccine misinformation, announcing on Tuesday that it has policies in place that prohibit misinformation “that could cause harm to an individual’s health or broader public safety”.

The company also said it would be marking all videos about vaccines with a link through to “verifiable, authoritative sources of information”.

Making such policies is easier than enforcing them, however, and at a parliamentary hearing on vaccine misinformation, TikTok was asked how one particular influencer, Olivia Madison, had managed to get 38,000 followers while making wildly false claims about vaccination.

Madison, an American, describes herself as “pro life, pro guns, pro Trump”.

During a digital, culture, media and sport select committee hearing in parliament on Thursday, the Scottish National party MP John Nicolson said of Madison: “She’s very beautiful and what she does is utterly wicked.”

If TikTok cannot remove someone with that large a following, he said, “what chances are there that you’re going to get rid of the smaller fry? I mean, this woman’s just screaming lies as publicly as she possibly can in very professionally produced videos.”

The company rapidly made up for its oversight, however, banning Madison from the platform entirely before the hearing had even ended.

“It’s a pity it takes a parliamentary select committee hearing to get rid of this stuff,” DCMS committee chair Julian Knight told TikTok director Theo Bertram. “We can’t do it every time.”
