AG Proposes Rules Restricting Children’s Social Media Use

New York Attorney General Letitia James released a set of proposed rules on how social media companies should restrict addictive features on their platforms to protect children’s mental health, as required by the Stop Addictive Feeds Exploitation (SAFE) for Kids Act. 

The SAFE for Kids Act, championed by Attorney General James, sponsored by Senator Andrew Gounardes (D-Brooklyn) and Assemblymember Nily Rozic (D-Flushing), and signed into law by Governor Kathy Hochul, requires social media companies to restrict algorithmically personalized feeds, or addictive feeds, and nighttime notifications for users under the age of 18 unless parental consent is granted. 

The law also prohibits social media platforms from sending notifications to users under 18 from 12:00 a.m. to 6:00 a.m. without parental consent.

Addictive feeds and nighttime notifications are tied to depression, anxiety, eating and sleep disorders, and other mental health issues for children and teenagers. The proposed rules released today explain which companies must comply with the law and outline standards to determine users’ age and obtain parental consent. A public comment period on the proposed rules is open for 60 days.

“Children and teenagers are struggling with high rates of anxiety and depression because of addictive features on social media platforms,” James said. “The proposed rules released by my office today will help us tackle the youth mental health crisis and make social media safer for kids and families. This is an issue that affects all of us, and I encourage parents, educators, young people, industry groups, and others to review the proposed rules and submit a comment during the public comment period.”

“We know that kids are happier and healthier when they’re learning and growing, not clicking and scrolling,” Hochul said. 

“I passed the SAFE for Kids Act in 2024 for one simple reason: I refuse to raise my children in a world where Big Tech profits at their expense,” Gounardes said. “Big Tech spent millions last year to defeat this bill and continue trapping kids into addictive algorithms, leading to a youth mental health crisis and sky-high rates of depression, anxiety, suicidal ideation, and self-harm.” 

“The SAFE for Kids law is a landmark step toward protecting kids online, and I am proud to see these strong regulations moving forward,” Rozic said. “This is a vital step in halting harmful, addictive feeds and putting kids’ health ahead of corporate profits. I applaud Attorney General Letitia James and her team for their thoughtful and groundbreaking work in protecting our kids.” 

Algorithmically personalized feeds, or addictive feeds, recommend or personalize content for users in an endless stream based on data that the platform gathered about the user. They are a feature designed to encourage a user to continue to use and return to a platform. Content displayed in addictive feeds is often from accounts that a user does not follow and is often displayed out of chronological order.

Algorithmically personalized feeds are known to drive unhealthy levels of social media use in minors that can affect their mental health. Research shows that children as young as 10 to 14 years old experience addictive use of social media, and the more time children spend online, the more likely they are to experience negative mental health outcomes such as depression, anxiety, and eating and sleep disorders.

The SAFE for Kids Act addresses these mental health concerns by requiring social media companies to restrict addictive feeds for users under 18. Instead of the default algorithmically personalized feeds that keep young people on the platform, users under 18 will be shown content only from accounts they follow or otherwise select, displayed in a set sequence such as chronological order, unless they get parental consent for an algorithmically personalized feed. Users cannot be cut off from the platform simply because they do not want, or do not have, parental consent for an addictive feed; all users will still be able to access the same content they can access now.

Additionally, the law authorizes the Office of the Attorney General (OAG) to promulgate rules on how companies should comply with the law before the statute goes into effect, including rules setting standards to determine a user’s age and obtain parental consent. Before drafting the proposed rules, OAG issued an advance notice of proposed rulemaking on August 1, 2024, and provided the public a 60-day period to submit comments. The OAG reviewed all comments that were submitted and used the public’s input, industry research, and its own experience to inform the proposed rules.

Age Assurance

  • Before allowing a user to access algorithmic feeds and/or nighttime notifications, social media companies must ascertain that the user is an adult (age 18 or older). Companies may confirm a user’s age using a number of existing methods, as long as the methods are shown to be effective and protect users’ data. Options include:
    • Requesting an uploaded image or video; or
    • Verifying a user’s email address or phone number to cross-check other information that reflects a user’s age.
  • Social media companies must offer at least one other alternative method for age assurance besides providing a government-issued ID.
  • Any information used to determine age or obtain parental consent must not be used for any other purpose and must be deleted or de-identified immediately after its intended use.
  • Users who turn 18 must have an option to update their age status on the platform.
  • Social media companies must choose an age assurance method with a high accuracy rate, conduct annual testing, and retain the results of the testing for a minimum of 10 years.

Parental Consent

  • Social media companies must first receive a minor’s approval to request parental consent for algorithmic feeds and/or nighttime notifications. Once a minor approves, the platform may seek verifiable parental consent to allow a minor to access algorithmic feeds and/or nighttime notifications.
  • The platform may not block the minor from generally accessing the platform or its content through, for example, searches, simply because they or their parent has refused to consent.
  • The platform is not required to show parents the user’s search history or topics of interest to obtain parental consent.
  • Parents and minors must also have the option to withdraw their consent at any time.

These proposed rules apply to companies that display user-generated content and have users who spend at least 20 percent of their time on the platform’s addictive feeds.

The full proposed rules can be found here. 

“For too long, we’ve allowed social media companies to rake in enormous profits at the expense of our kids’ mental and physical health,” said James P. Steyer, Founder and CEO of Common Sense Media. “The passage of the SAFE for Kids Act, which Common Sense Media strongly supported, sent a message to social media companies that in New York, our kids’ safety comes first.” 

“These groundbreaking rules are an important step to prevent social media platforms from exploiting our children’s attention and mental health for profit,” said Julie Scelfo, founder and executive director of Mothers Against Media Addiction (MAMA).

The public comment period on the proposed rules is open for 60 days; the deadline to submit comments is December 1, 2025. The OAG seeks comment on every aspect of the proposed rules, including personal experiences, research, technology standards, and industry information, together with examples, data, and analysis in support of any comment. The OAG seeks comments from parents and other caretakers of children, young people, educators, members of academia, mental health professionals, consumer and child advocacy groups, privacy advocacy groups, industry participants, and other members of the public.

To submit a comment on the proposed rules, email ProtectNYKidsOnline@ag.ny.gov.

After the public comment period closes, OAG has one year to finalize the rules. Once the final rules are released, the SAFE for Kids Act goes into effect after 180 days.

For companies that violate the SAFE for Kids Act, the law authorizes OAG to bring an action to stop violations as well as to seek civil penalties of up to $5,000 per violation, among other remedies.