Senators reintroduce Kids Online Safety Act to help protect kids from harmful online content
Social media has become an important part of the lives of many teenagers, including 14-year-old Jasmine Hernandez. But what started as a means of connection and self-expression took a dark turn when the Texas student fell victim to racist cyberbullying.
Like countless other teenagers, Hernandez experienced online harassment. According to Hernandez, images of her were superimposed onto depictions of "someone hanging on a tree or like the KKK surrounding someone burning on the cross."
The incident deeply affected her mental health, forcing her to miss weeks of school and seek counseling for the remainder of the year. It also left LaQuanta Hernandez, Jasmine's mother, in fear that she would lose her daughter.
"One of the things that the licensed school professional counselor said was if she [Jasmine] did not have the faith that she had she would have committed suicide," LaQuanta Hernandez said. "That's major to me. This was my baby we're talking about."
Jasmine has since returned to social media, and her mother believes social media platforms should bear some responsibility.
"I keep up with what she's looking at. I check things. But there has to be some accountability," she said.
Hoping to help families like the Hernandezes, Connecticut Democratic Sen. Richard Blumenthal and Tennessee Republican Sen. Marsha Blackburn have reintroduced the Kids Online Safety Act in the Senate. The proposed legislation would give parents new controls to identify harmful behavior and report negative content. It would also require social media companies to provide options for minors to protect their information and disable addictive features.
The bill was introduced in the previous Congress and is now being revived as the mental health crisis among young people continues to escalate. As parents themselves, the senators share a deep concern for the welfare of children in the digital age.
"There would be a responsibility on these platforms to monitor what is there, things that cause self-harm," Blackburn said.
According to a recent survey by the Centers for Disease Control and Prevention, nearly one in three teen girls seriously considered suicide, and studies have shown that teen depression rates have doubled over the past decade.
"We're in the midst of a mental health crisis and moms are coming forward and saying big tech is aggravating and exacerbating, in fact profiting from driving toxic content at kids ... eating disorders, bullying, sexual harassment, substance abuse," Blumenthal said.
TikTok says its guidelines don't allow content that could lead to suicide, self-harm or unhealthy eating behaviors. Meta, which owns Facebook and Instagram, told CBS News it has "developed more than 30 tools to support families" and will "continue evaluating proposed legislation."
A spokesperson for Snap, the parent company of Snapchat, told CBS News in a statement that it offers "extra protection for Snapchatters ages 13-17" and will continue its work with policymakers.
The bill is one of several bipartisan legislative efforts targeting social media platforms, including legislation introduced Wednesday by Massachusetts Sen. Ed Markey and Louisiana Sen. Bill Cassidy designed to protect children's online privacy.
Similarly, a bipartisan bill introduced last week would bar children under the age of 13 from using social media, while those between the ages of 13 and 17 would need parental consent to create an account.