
YouTube expanded its medical misinformation policies today with new guidelines that ban vaccine misinformation. The Google-owned video platform had previously removed more than 1 million videos spreading dangerous COVID-19 misinformation. Now, YouTube says it will also remove content that spreads misinformation about vaccine safety, efficacy, and ingredients. The platform previously banned misinformation specific to coronavirus vaccines, but its policies are now being updated to also cover misinformation about routine immunizations, like those for measles and hepatitis B, as well as general false claims about vaccines that are confirmed safe by local health authorities and the World Health Organization (WHO).

The policy change comes as COVID-19 vaccination rates slow. In the U.S., about 55% of people are fully vaccinated; the rates are higher in countries like Canada and the United Kingdom, which have fully vaccinated 71% and 67% of their populations against COVID-19, respectively. President Biden has pointed to social media platforms as places where vaccine misinformation spreads, and the White House has even enlisted rising superstars like Olivia Rodrigo to encourage Americans to get vaccinated.

With these new guidelines, YouTube is following in the footsteps of Facebook, which expanded the criteria it uses to take down false vaccine information in February. Twitter also bans the spread of misleading COVID-19 information and labels potentially misleading tweets using a combination of AI and human review. Twitter even suspended Georgia Representative Marjorie Taylor Greene after she falsely claimed that vaccines and masks do not reduce the spread of COVID-19.

Examples of content that violates YouTube’s new guidelines include videos claiming that vaccines cause chronic side effects like cancer or diabetes, that vaccines contain devices that can track those who are inoculated, or that vaccines are part of a depopulation agenda. If a user posts content that violates these guidelines, YouTube will remove the content and tell the uploader why the video was taken down. YouTube says that first-time violators will likely get a warning with no penalty; after that, each violation earns the channel a strike, and a channel that receives three strikes within 90 days is terminated. YouTube will also take down several channels associated with high-profile anti-vaccine figures like Joseph Mercola and Robert F. Kennedy Jr.
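To make that enforcement ladder concrete, here is a minimal sketch of how a three-strikes-in-90-days rule could be modeled. It is purely illustrative; YouTube has not published its implementation, and the class and function names below are hypothetical.

```python
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # strikes count toward termination for 90 days
STRIKE_LIMIT = 3                    # three strikes within the window ends the channel

class Channel:
    """Toy model of the enforcement ladder described above (hypothetical names)."""

    def __init__(self):
        self.warned = False       # first violation: warning, no penalty
        self.strikes = []         # timestamps of strikes received
        self.terminated = False

    def record_violation(self, when: datetime) -> str:
        if self.terminated:
            return "channel already terminated"
        # A first-time violator likely gets a warning with no penalty.
        if not self.warned:
            self.warned = True
            return "video removed, warning issued (no penalty)"
        # Subsequent violations earn a strike.
        self.strikes.append(when)
        # Only strikes within the trailing 90-day window count toward termination.
        active = [t for t in self.strikes if when - t < STRIKE_WINDOW]
        if len(active) >= STRIKE_LIMIT:
            self.terminated = True
            return "third strike within 90 days: channel terminated"
        return f"strike issued ({len(active)} active in the last 90 days)"

if __name__ == "__main__":
    channel = Channel()
    start = datetime(2021, 9, 29)
    for day in (0, 10, 30, 50):   # four violations over 50 days
        print(channel.record_violation(start + timedelta(days=day)))
```

In this sketch, the fourth violation lands as the third strike inside the rolling window and terminates the channel; a strike older than 90 days would simply stop counting.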

“There are important exceptions to our new guidelines,” YouTube wrote in a blog post. “Given the importance of public discussion and debate to the scientific process, we will continue to allow content about vaccine policies, new vaccine trials, and historical vaccine successes or failures on YouTube.”

The platform will also allow users to discuss their personal experiences with vaccines, so long as the content doesn’t violate other guidelines. If a channel shows a pattern of promoting vaccine hesitancy, however, YouTube may remove its content. Enforcement of the new guidelines begins today, but YouTube wrote that, as with any new policy, it will take time to “fully ramp up enforcement.”