Bill In Minnesota Legislature Would Prohibit Social Media Algorithms Targeting Kids
ST. PAUL, Minn. (WCCO) -- A proposal with bipartisan support advancing in the Minnesota Legislature would prohibit social media companies from targeting children under 18 with algorithms, a move that supporters say will help mitigate the harmful effects of certain content on kids.
It would prohibit social media platforms with a million or more users from using those algorithms on Minnesota children. There would be a penalty of $1,000 per violation if the legislation passes, and big tech companies would also be liable for damages.
"It's really causing unnecessary harm to our kids by constantly feeding them and trying to keep them engaged on these sites," Rep. Kristin Robbins, R-Maple Grove, said.
The proposal moved out of a key committee last week with nearly unanimous support. Changes adopted during a committee hearing Tuesday night exempt from the bill's ban algorithms intended to block inappropriate or harmful content, as well as parental controls that filter for age-appropriate material.
The legislation also specifies that it doesn't cover internet search providers or e-mail.
Katie Bell, a medical and psychiatric nurse practitioner and co-founder of the Healthy Teen Project, on Tuesday shared testimonials from teenagers she works with, focusing on the body-image effects of newer social media app TikTok, where users post videos.
"The teens are like, 'you can't un-TikTok,' once you've entered certain words then they just continue to come at you," she said.
Robbins said she was disturbed after reading a Wall Street Journal report about the ways TikTok inundates teens with eating disorder videos, which helped inspire her effort to pass legislation. TikTok has since said it's adjusting its algorithm.
Trade groups representing big tech and e-commerce companies oppose the bill, arguing it's too broad and would also stifle beneficial algorithms that at times can curb misinformation. Some also raised First Amendment questions.
"A better solution is to empower parents and teenagers to understand the content they consume online and make the appropriate choices," wrote Jennifer Huddleston, policy counsel for NetChoice in a letter to a House committee. "[The legislation] lacks the nuances of different online experiences and treats all recommendations as equally harmful."
William McGeveran, a law professor at the University of Minnesota who specializes in internet privacy and First Amendment law, said states across the U.S. and Congress are introducing dozens of policies to regulate or break up big tech companies.
Among them is a proposal in the United States Senate dubbed the "Kids Online Safety Act," which states that platforms have a "duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors" from certain content. It also includes language allowing children to "opt out" of algorithms that use their data.
Some rules, McGeveran said, could pass the First Amendment test, depending on how they are written. He noted the Minnesota House proposal is the only one so far targeting algorithms.
"It's much less likely to be a be a first amendment problem if it regulates how you do it instead of what you say," McGeveran said. "So a rule that says algorithms can be used or can't has a better chance than a rule that says this is the speech that's harmful get rid of it."
Robbins said she is attempting to balance free speech protections and content regulation by doing just that: targeting how social media companies operate rather than what the content specifically is.
"It doesn't address to nature of the content. It doesn't address content creators. It's just that you can't use the algorithm to target kids," she said.
The proposal advanced unanimously Tuesday.