The impact of social media on young people’s mental health remains unclear. Even so, lawmakers and health officials are pressing ahead with measures such as age restrictions and warning labels for platforms like YouTube, TikTok, and Instagram.
Congress, state legislatures, and the U.S. Surgeon General are all taking steps to address those concerns. Some experts warn, however, that focusing on potential harms risks overlooking the benefits these platforms can offer teenagers.
In June, U.S. Surgeon General Vivek Murthy advocated for warning labels on social media platforms. On July 30, the Senate passed the bipartisan Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act. Additionally, over 30 states are considering laws related to social media, ranging from age restrictions and parental consent to new digital literacy programs for K-12 students.
Some studies highlight social media’s downsides, including the distorted reality of algorithm-driven feeds, the distraction of incessant notifications, and the ease with which cyberbullying spreads. But there are potential benefits as well. Linda Charmaraman, a research scientist at the Wellesley Centers for Women, points out that social media can be a lifeline for marginalized groups.
Charmaraman’s research, published in the Handbook of Adolescent Digital Media Use and Mental Health, shows that social media can help children of color and LGBTQ+ youth by reducing feelings of isolation. She argues that age restrictions could disproportionately affect these groups, who often use social media to find a sense of identity and community.
Arianne McCullough, a 17-year-old student at Willamette University, uses Instagram to connect with other Black students at a predominantly white institution, a connection she describes as crucial for warding off isolation.
McCullough’s experience reflects a broader trend. During the pandemic, she struggled with weight gain and a negative body image as her feeds filled with workout and dieting content, and she grew more irritable and sad. After cutting back on social media she felt better, though she then wrestled with the fear of missing out.
Research indicates that young people were already reporting mental health problems at rising rates before the COVID-19 pandemic; the American Academy of Pediatrics, among other groups, declared the situation a “national emergency” in child and adolescent mental health.
A recent committee report from the National Academies of Sciences, Engineering, and Medicine suggests that the relationship between social media and mental health is complex. While social media can contribute to mental health issues, it can also provide benefits. The report calls for more research into the effects of social media on youth well-being.
The report also cautions against policies like Utah’s age and time limits on social media use, which may have unintended negative consequences, such as increased isolation from support systems.
In response, some states have enacted legislation aligned with the National Academies’ recommendations. Virginia and Maryland have passed laws protecting children’s personal data and privacy, while states such as Colorado, Georgia, and West Virginia have introduced school curricula on the mental health impacts of social media.
The Kids Online Safety Act, still pending in the House of Representatives, would require parental consent for users under 13 and impose a “duty of care” on companies to protect users under 17. The Children and Teens’ Online Privacy Protection Act would ban targeted advertising to minors and the collection of their personal data.
Several states, including California, Louisiana, and Minnesota, have filed lawsuits against Meta, accusing the company of misleading the public about social media’s dangers and neglecting the mental health risks to young users.
Most social media platforms require users to be at least 13 and offer safety features like blocking adults from messaging minors. However, the Department of Justice recently sued TikTok’s parent company for allegedly violating child privacy laws by allowing children under 13 on the platform and collecting their data.
Public support for age restrictions and parental consent requirements is strong. However, industry group NetChoice, representing companies like Meta and Alphabet, has challenged several state laws aimed at protecting children.
Jenny Radesky, a physician and co-director of the American Academy of Pediatrics’ Center of Excellence on Social Media and Youth Mental Health, believes that current proposals do not address the root problem. She argues that the real issue lies in business models designed to maximize engagement and profit rather than promote mental health.
Radesky concludes, “The system we’ve created prioritizes profit over the well-being of young users, leading to a design that is not conducive to promoting mental health.”