Facebook tries to tackle misinformation with new climate change hub
Facebook is launching a climate change information page in an effort to promote facts about climate change from trusted sources.
Users in the U.S., U.K., France and Germany are seeing links and information from Facebook's Climate Change Information Center starting Tuesday. It's similar to the COVID-19 information page launched in March.
The spread of misinformation of all kinds has been a huge problem for the company, especially after evidence emerged of fake news on Facebook during the 2016 election campaign.
The climate information page features articles from reputable news sources and social media posts from government and international agencies like the National Oceanic and Atmospheric Administration and the World Meteorological Organization.
Facebook says it partners with more than 70 organizations to fact-check misinformation, including that related to climate change. There has been some controversy over what is exempted from fact-checking as "opinion" when it comes to climate change.
"We are hopeful this climate science information center will be very effective," says Facebook Vice President of Global Affairs and Communications Nick Clegg.
"It provides a simple, easy-to-find repository for authoritative information about what is happening to our climate, how it's changing. And our experience with the COVID information hub is that there is a real appetite for people to find out more for themselves."
Clegg talked with NPR's Audie Cornish on All Things Considered about misinformation on Facebook. Here are selected excerpts of that interview:
Is there anything that Facebook can do more proactively to prevent misinformation from spreading?
We do already more than any other company in the industry. No other company partners with 70-plus fact-checkers around the world. Fact-checking is not some sort of light sanction. The suggestion that the only remedy to misinformation or the only credible remedy ... is when content is removed, I think overlooks how effective this program of fact-checking can be. And crucially, also, removing content is the ultimate sanction that a company like Facebook has. And we confine that to content where we think there is a clear and impending link to real-world harm.
So, for instance, this last weekend we removed content which circulated which made claims that far-left groups were responsible for the fires in Oregon. And we did so because the police and law enforcement emergency services told us that these rumors were actually taking their resources away from helping people whose homes were ablaze. And so that's a clear example where we felt that by removing that content, we were also diminishing the real-world harm that could follow.
Recently Department of Health and Human Services public affairs official Michael Caputo used a Facebook livestream to make all kinds of claims against the federal government, ones that do sound threatening or dangerous, claiming that there were [left-wing] hit squads that were preparing for armed insurrection. How do you refute this?
So I haven't seen the video, so I simply can't comment on the video itself. ...
Look, the country is going through a highly, highly polarized time with people from all wings of opinion saying things to each other and about each other, not least in the run-up to this highly consequential election.
And we have put in place a whole range of guardrails to try and make sure that our platform remains a place for open debate. Sometimes we act swiftly and immediately. Sometimes we don't act quickly enough and then quite rightly, people criticize Facebook.
But I don't think it would be fair to overlook the huge, huge undertaking which now exists on Facebook, whether it is the millions of fake accounts that are removed every day, the 35,000 people who have been employed to help monitor what goes on on our platform.
I guess what I want to say is these things aren't disconnected. Here you have a science hub and here you have someone using Facebook, someone in an official capacity who claims federal government scientists are engaging in sedition. I mean, this is going against the let's call it ‘good’ or ‘clean’ information you're trying to put out on the one hand. And on the other hand, there's a steady stream of this [misinformation].
I think it would be unrealistic to imagine that the only thing one is ever going to see on Facebook is material that you feel comfortable or happy about. That's not the nature of the world we live in. And it's certainly not the nature of the highly polarized debate we have in the United States. If that video broke our rules and we didn't act on it, I'm sure that is a legitimate source of criticism. But I think the underlying point you're making is that, yes, one of the reasons for this climate science center is precisely because good speech is often the most effective antidote to bad speech.
Editor's note: Facebook is among NPR's financial supporters.
Peter Granitz and Patrick Jarenwattananon produced and edited the audio interview.
Copyright 2020 NPR. To see more, visit https://www.npr.org.