YouTube said Friday it would stop removing content that falsely claims the 2020 election or other past U.S. presidential elections were marred by “widespread fraud, errors or issues.”
The change is a reversal for the Google-owned video service, which said a month after the 2020 election it would start removing new posts that falsely claimed widespread voter fraud or mistakes had changed the result.
YouTube said in a blog post that the updated policy was an attempt to protect the ability to “openly debate political ideas, even those that are controversial or based on disproved assumptions.”
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the blog post read.
The updated policy, which is effective immediately, will not prevent YouTube from removing content that attempts to mislead voters in the upcoming 2024 election or other future races in the United States and abroad. The company said its other existing rules against election misinformation remain unchanged.
The announcement comes after YouTube and other major social media companies, including Twitter and Meta-owned Facebook and Instagram, have come under fire in recent years for not doing more to tackle election misinformation and disinformation that spread on their platforms.
Copyright © 2023 Washington Times, LLC.