Threads' ban on search terms like "COVID" is temporary, head of Instagram says
The head of Instagram said Tuesday that Threads will soon stop blocking search terms like "COVID," a practice currently in place on the app, which Meta launched in July.
In an exchange on the Threads platform, Instagram head Adam Mosseri said the app's ban on certain search words is temporary, but he doesn't know exactly when the blocks will be lifted.
Threads is Meta's newest social media platform, and it is similar to X, the platform formerly known as Twitter. Meta is also the parent company of Facebook, which has come under scrutiny in the past for permitting the dissemination of misinformation.
In September, The Washington Post first reported that explicit words, as well as words and phrases that may be seen as contentious by some, would be blocked from the search feature on Threads. The list of banned search terms includes words like "sex" and "porn" as well as "coronavirus," "long covid," "vaccines" and other related words.
Washington Post reporter Taylor Lorenz, who wrote the original report on Threads' search term bans, posted on the platform this week saying Threads users were "somehow 150x times more insufferable than Twitter."
That seemed to catch Mosseri's attention. Lorenz went on to tell Mosseri in a post that communications representatives for Threads told her the block on words like "covid" and "long covid" would be temporary, while representatives at Meta told her it was not.
"I'm sympathetic to the fact that Meta gets a lot of unfair criticism around misinfo, but Covid is surging/ongoing," she wrote.
Mosseri replied: "I don't have an ETA to give you unfortunately, but it is temporary and we are working on it. We're just getting pulled in a lot of directions at once right now."
In another reply, Mosseri said that the ban on some words would be lifted in "weeks or months." He added, "The reality is that we have lots of important work to do. The team is moving fast, but we're not yet where we want to be."
Both Facebook and Twitter have come under fire for their alleged mishandling of misinformation on their platforms, particularly surrounding elections and the pandemic.
In 2016, Twitter and Facebook were subjected to an interference campaign by the Russian government, which created confusion on the platforms surrounding issues related to the presidential election. The heads of both platforms have testified before Congress about the alleged allowance – and even alleged promotion – of misinformation.
Both companies said they would make changes to their platforms to try to stop the spread of misinformation.
Ahead of the 2020 presidential election, Twitter said it created a "2020 U.S. election hub" to promote accurate election information and resources. The company implemented stricter content policies and even banned former President Donald Trump, saying he violated the policies after the Jan. 6 insurrection and citing "the risk of further incitement of violence."
Facebook, which also deleted Trump's account for violating content policies with his posts about the Jan. 6 insurrection, said in 2020 that it had created methods to fight misinformation.
The company said it would remove posts with "clear misinformation" about the election and coronavirus. Still, the platforms' fight against misinformation has been scrutinized. In 2021, a whistleblower who had resigned from Facebook in 2020 said executives at the company knew about the spread of misinformation on the platform but failed to act on it at times.
While Twitter, which was rebranded and restructured by Elon Musk, has long been criticized as an app where people spread misinformation and hateful content, Threads was framed as a safer alternative. In an Instagram post about the new platform, Meta founder and CEO Mark Zuckerberg called Threads a "friendly community."
Some social media users posted memes comparing the lighthearted culture of Threads to the darker reputation Twitter had earned in the eyes of some. Some users even called Threads, which gained more than 100 million users in five days, a "Twitter killer."
In an interview with The Verge in September, Zuckerberg called Twitter "quite negative and critical."
"I always just thought you could create a discussion experience that wasn't quite so negative or toxic," Zuckerberg said. "I think in doing so, it would actually be more accessible to a lot of people. I think a lot of people just don't want to use an app where they come away feeling bad all the time, right?"