Mark Zuckerberg on Facebook's new Shops feature and the company's responsibility to curb misinformation
The coronavirus pandemic has had a devastating impact on the economy, as a recent survey found nearly half of all small businesses have closed temporarily. Meanwhile, misinformation about causes and cures for COVID-19 has spread rapidly online.
Facebook co-founder and CEO Mark Zuckerberg on Tuesday joined the "CBS Evening News with Norah O'Donnell" to discuss the social media platform's responsibility in slowing the spread of misinformation, and to spotlight a new online shopping portal called Shops that Facebook rolled out Tuesday.
Norah O'Donnell: So tell us what makes this new feature unique?
Mark Zuckerberg: COVID has not just been a health emergency. It's been a real economic crisis that is putting a lot of strain on small businesses.
And what we are seeing is a lot of small businesses moving more of their business online, and we want to make it easier for them to do that, too. So, Facebook Shops allows a small business to easily set up a shop inside our apps. And it will be a very fast experience for people to discover their products and be able to buy things directly.
Privacy is always a big question, so, will my friends know when I buy something on Facebook Shops?
No, we're not going to tell anyone what you're buying or your shopping history across our services without your permission. And that's not really a big part of this experience. This is really more about people being able to connect with the small businesses that they care about.
There was a conspiracy video circulating a few weeks ago called "Plandemic." That was widely debunked. Why did Facebook take it down?
We view one of our primary responsibilities now as making sure that we can connect people with authoritative information from governments and health officials, and that's one of the reasons why we built this Coronavirus Information Center that we put at the top of our apps and directed more than 2 billion people to go to.
There's harmful misinformation, which is the type of thing that puts people at imminent physical risk. So, if you're telling someone that social distancing doesn't work, or that something is a proven cure when it isn't, we want to take that off our services completely.
There's other misinformation which is not generally going to cause physical harm, it's just stuff that's wrong. We want to stop it from going viral, and there we work with independent fact-checkers, which has led to us showing about 50 million warning labels on content that people have seen.
We have an indication that those warning labels work because 95% of the time when someone sees a piece of content with a warning label, they don't click through to it.
Facebook has been hesitant in the past to take down misinformation and deepfakes. How is this different from political content?
I think one of the things that's different during a pandemic is that if people are saying that something is a cure, when in fact it could hurt you, I do think that is qualitatively different, and within the long tradition of free speech and free expression that our country has, to not allow things like that. I think you want to have a pretty high bar for telling people they can't say something.
But if a political figure shares information about a cure that could cause imminent harm, would that be taken down?
Yes, it would. You know, there's no exemption on this for anyone in particular. And in fact, we've had cases of this, and the president of Brazil did make a statement saying that there's a drug that was proven everywhere to work and be effective. And of course, there is no drug currently — that I'm aware of, at least — that has been proven to be effective everywhere. And we took down his statement, and that was very controversial because we were basically taking down something that the president of a major country said. But it clearly violated our policy.