Facebook vows to crack down on voter intimidation in election
Facebook said on Wednesday it would crack down on efforts to intimidate voters, amid growing concerns over potential confrontations at polling places and calls from the Trump campaign for an "army" of poll watchers.
The company said it would remove posts urging people to monitor voting places if the posts "use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters."
Yet a prominent example of such a post, in which Donald Trump Jr., the president's son, urges "every able-bodied man, woman to join Army for Trump's election security operation," will remain on the platform. Facebook says it does not apply its policies retroactively.
The company will remove only new posts that use militarized language such as "army" or "battle," Monika Bickert, head of content policy, told reporters on Wednesday. "Under the new policy, if that [Trump] video were to be posted again, we would indeed be removing it," she said.
Facebook also announced it will stop running all political and issue ads in the U.S. for at least a week after polls close on Election Day "to reduce opportunities for confusion or abuse," amid growing expectations that the results of the presidential election may not be immediately known.
"We know this election will be unlike any other," Sara Schiff, Facebook's product lead for political advertising, told reporters.
Facebook defines "social issue" ads in the U.S. to cover a wide range of subjects, from civil and social rights, environmental politics and guns to health, immigration and education.
The post-election advertising ban brings the company in line with Google, which last month said it will not publish political ads after polls close. Twitter and TikTok banned all political ads last year. Facebook, which has resisted calls to fact check political ads, had previously said it would not accept any new political ads in the week before the Nov. 3 vote.
Election watchers and social media companies say they are bracing for the possibility that final results will be delayed because so many people are voting by mail during the pandemic.
Facebook is under pressure to rein in political content that could be used to mislead or manipulate voters, whether it comes from foreign actors like Russia, which used the platform to interfere in the 2016 presidential election, or from Americans. That includes President Donald Trump, who has repeatedly made false claims online about mail-in voting and attempted to undermine the legitimacy of the election.
Social media companies have tightened their rules against election misinformation and efforts to dissuade people from voting. But critics say Facebook in particular often waits too long to act on posts that break its rules, and has allowed the president too much leeway.
On Wednesday, Facebook also gave more details on its plans to prevent candidates from claiming premature victory or misrepresenting the vote-counting process. Once polls close, it will run a banner with information about the counting process at the top of the Facebook and Instagram apps.
If a presidential candidate claims victory before a race is called, Facebook says it will label such posts to "add more specific information" that counting is still in progress and a winner has not been declared.
Once a winner has been declared, if another candidate or party contests the results, Facebook will update the notification at the top of its apps with the winner's name and label posts from presidential candidates with the winner.
Editor's note: Facebook is among NPR's financial supporters.
Copyright 2020 NPR. To see more, visit https://www.npr.org.