Keller: Needham woman says Instagram's teen accounts are just lip service
The opinions expressed below are Jon Keller's, not those of WBZ, CBS News or Paramount Global.
BOSTON - After facing pressure from legislators and being sued in 2023 by dozens of states for allegedly causing harm to children and teens, Meta is rolling out new teen accounts on Instagram.
"If we see a teen try to set up a new account as an adult account, we're gonna ask them to verify their age," said Antigone Davis, global head of safety for tech giant Meta.
That's just one of the new "safety" precautions the company is putting in place on its Instagram app, including new "teen accounts" that will be private by default, restrict who can message the teen, and limit teens' exposure to "sensitive content."
"We've listened to parents and tried to take in their concerns," Davis said in a CBS News interview.
Bereaved parents say social media hurt their children
Deb Schmill, whose daughter Becca died of a drug overdose after being victimized by a one-two-three punch of online sexual predators, bullies and drug pushers, says she doesn't believe that social media companies care about the kids. Schmill, who runs a Needham foundation in her daughter's name devoted to pressuring the social media companies to clean up their act, said, "They have been watching what's going on in Congress, and they're getting nervous."
Schmill was among the bereaved parents at a Senate hearing last winter that featured an apology of sorts from Meta CEO Mark Zuckerberg. "No one should have to go through the things that your families have suffered," Zuckerberg told the parents after being prodded by senators to offer an apology.
The Senate later passed the Kids Online Safety Act, which would impose relatively tough new regulations on social media, and Instagram's voluntary new measures come one day before the House is due to take up that bill.
Will social media really change?
Dr. David Bickham, of Children's Hospital, studies what social media does to kids. He likes some of what Instagram is doing, but noted, "There's a lot of time and effort put into making the platform really engaging and looking nice and really successful, and then the parental tools can be tacked on the end, and they can be harder to use and harder to access."
But Deb Schmill sees a political con game in progress. "This is an insult to the parents who've lost their children," she said. "What they're proposing is good, definitely, but I also think it's too little too late. They can't be trusted."
It's a familiar pattern: the tech giants point to moves like these and insist they're doing their best, while their lobbyists go to work behind the scenes.
But the whistleblowers and the continued safety issues with social media tell a different story. And a nearly 30-year-old federal law shielding the tech giants from legal liability (Section 230 of the Communications Decency Act) may be threatened by a current suit claiming TikTok is liable for the death of a 10-year-old girl after its algorithm recommended the infamous Blackout Challenge to her.
We'll see if the threat of massive legal judgments promotes more accountability than mere political threats.