A proposed bill would remove any content suggested by an algorithm from social media feeds for Massachusetts teens.
Social media platforms would only show teens content from accounts they subscribe to or search for, eliminating suggested, or “for you,” content.
The bill, known as “An act protecting children from addictive social media feeds,” was written by Rep. Bill MacGregor of Boston, with a Senate version being pushed by Massachusetts Senate Majority Leader Cynthia Creem of Newton.
“Right now, children across the Commonwealth are being targeted by social media companies who seek to take advantage of and profit from their impressionable minds and developing brains,” MacGregor wrote in a statement to Boston.com.
“I filed this bill because social media companies are deliberately designing feeds that keep teens endlessly scrolling — fueling anxiety, depression, and sleep deprivation,” Creem wrote in a statement to Boston.com.
Teenagers spend an average of five hours on social media per day, according to a Gallup survey. The 2023 Social Media and Youth Mental Health advisory issued by the US Surgeon General found that teenagers who spend more than three hours on social media per day are twice as likely to experience serious mental health problems.
In response, the bill aims to protect teens from potentially harmful content, such as videos that could worsen eating disorders or other mental health issues.
“I was taken on an inescapable spiral into an eating disorder, almost passively,” said MacGregor’s teenage intern, Mary Ferrari, at a July 10 hearing of the Joint Committee on Advanced Information Technology, the Internet and Cybersecurity. “Pro-eating disorder content creators speak in code words that fly under the radar of restrictions. This means that the only way to protect teenagers, like myself, is to completely ban content that operates in the form of an algorithm.”
The Surgeon General’s advisory also reported that frequent social media use impacts brain development and is linked to depression, anxiety, attention issues, and sleep problems. To help combat this, the bill would prohibit operators from sending notifications to minors from midnight to 6 a.m.
“These changes are essential to protecting teen mental health and fostering a healthier relationship with technology — one that serves, rather than exploits,” wrote Creem.
Rep. MacGregor said, “I want to ensure that by the time my daughters are old enough to have a phone, we did our part in making their media landscape as safe as possible for them and their friends.”
A spokesperson for Meta, parent company to both Facebook and Instagram, told Boston.com that the company and the legislators “share the same goal: creating safer, more supportive online environments for teens.”
“But it’s important that we focus on solutions that empower parents, not one-size-fits-all mandates,” the spokesperson said in a statement to Boston.com. “This bill takes control away from parents, overlooks the responsibility of app stores, and broadly blocks features instead of encouraging smarter, age-appropriate experiences.”
The bill faces pushback closer to home as well.
Eliminating algorithmic curation would force a chronological feed, which could cause its own harm, for example by showing a teen only cyberbullying posts from their peers, said Briana January, Northeast State & Local Government Relations Director at Chamber of Progress, a trade group that represents technology companies, at the July 10 hearing.
Representative Tommy Vitolo of Brookline said at the hearing that he considers the bill “a half measure,” since the Legislature will “never look” at the changes the social media platforms make.
The bill also does not yet outline how platforms would be required to verify user data. To verify age, a platform itself, or a third party, could collect “sensitive identifying data,” posing the risk of cybersecurity breaches and of the Trump administration using the data for immigration enforcement, January said.
The bill went through the hearing process at the Joint Committee on Advanced Information Technology, the Internet and Cybersecurity on Thursday, July 10. Now, the committee will vote on the bill to determine if it moves on to the Senate Committee on Ways and Means, according to Creem’s office.
If the bill passes, Massachusetts would join New York and California as the third state to attempt to limit social media’s impact on adolescents. While the New York bill prohibits all suggested content for minors, the California bill, which will take effect in 2027, will allow parental consent to override the law and allow social media platforms to provide suggested content to teenage users.
The Massachusetts bill would make it illegal for social media operators to show an “addictive feed” to a user unless the platform determines the user is not a minor. The attorney general would determine the reasonable methods a platform operator must take to verify whether a user is a minor in order to comply with the proposed law.
Under the law, the attorney general would also manage a website for complaints and tips from community members regarding platform operators complying, or not complying, with this law.
Platform operators would be fined $5,000 for each violation of the law, including failure to destroy unlawfully obtained data.
“We look forward to working with lawmakers on approaches that better support teens and families,” the Meta spokesperson said.