Facebook plans to add 3,000 workers to monitor, remove violent content
Faced with a recent spate of violent videos and hate speech posted by users on its network, Facebook has announced plans for a heap of hires: 3,000 new employees worldwide to review and react to reports of harm and harassment.
"Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," CEO Mark Zuckerberg announced Wednesday in a Facebook post.
"If we're going to build a safe community," he continued, "we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down."
The new positions are dedicated specifically to these concerns, increasing the company's current review staff of 4,500 by nearly 70 percent. The hires will be made over the next year.
It's a high-profile announcement for what has become a high-profile problem for the tech giant. In recent months, an elderly man was murdered in Cleveland in a video later uploaded to Facebook; a teenage girl was sexually assaulted in a live-streamed video; and four people were charged with hate crimes for the assault of a man who authorities said had "mental health challenges" in a video that was also streamed on the site.
And then, there have been issues in the other direction — where the problem wasn't that violence went unflagged, but that an overactive flagging process removed less-than-offensive content. Perhaps the most notable of these incidents came last year, when the Pulitzer Prize-winning "Napalm Girl" photograph was removed from the site for violating Facebook's Community Standards, before the company finally relented and allowed it after an international outcry.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " Stanford University law professor Daphne Keller told NPR's Laura Sydell last month. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Zuckerberg intends the hiring spree to help ease this see-saw between too little enforcement and too much. Noting that the company also aims to combat hate speech and child exploitation, he explained that the next steps include closer work with law enforcement and streamlined reporting mechanisms.
"As these become available," Zuckerberg wrote, "they should help make our community safer."