
The Push to Ban Social Media for Kids: Who Should Be Responsible?

  • Socialode Team
  • Mar 28
  • 2 min read

Updated: Apr 1

[Image: Girl lying on her stomach indoors, focused intently on a smartphone.]

How young is too young for social media? This question has been at the center of debate in North Carolina, where lawmakers are considering a bill that would ban social media accounts for children under 14 and require parental consent for 14- and 15-year-olds. The move highlights growing concerns about the negative effects of social media on children and raises important questions about who should bear the responsibility for enforcing age restrictions.


The Growing Concern Among Parents and Teachers

Many parents and educators have long worried about children’s exposure to inappropriate content and the overall impact of social media on mental health. Cynthia Marshall, a middle school teacher in North Carolina, sees these effects firsthand in her classroom. “There’s just so many terms, video exposures, recordings, silly things that shouldn’t be allowed, for parents to not have that control over,” she said.


It’s not just anecdotal evidence. According to the Pew Research Center, 95% of U.S. teens ages 13 to 17 use social media, and over one-third say they use it “almost constantly.” While platforms like TikTok, Facebook, and Instagram officially require users to be at least 13, many younger kids easily bypass these restrictions, with or without parental permission.


Shifting the Responsibility to Tech Companies

House Bill 301 aims to hold social media companies accountable for preventing children under 14 from creating accounts. While the companies argue that they already comply with federal laws, critics point out that enforcement remains weak, allowing many underage users to slip through the cracks.


This isn’t the first time social media giants have faced scrutiny. Last year, North Carolina joined 32 other states in suing Meta, the parent company of Facebook and Instagram, alleging that its platforms contribute to the youth mental health crisis. The lawsuit accused Meta of deliberately designing addictive features targeted at children, an allegation that echoes broader concerns about social media’s role in rising anxiety and depression among young users.


The Challenges of Regulation

[Image: Smartphone held in a hand, displaying social media apps with notification badges.]

Even when social media companies introduce safeguards, children often find ways around them. For example, TikTok’s default 60-minute time limit for users under 18 can be bypassed with a simple passcode. This raises a crucial question: Can tech companies truly regulate underage users effectively, should more responsibility fall on parents and policymakers, or should we ban social media for kids?


Social media is a multibillion-dollar industry, and companies have strong lobbying efforts in state capitals, making regulation a challenge. While North Carolina’s bill was scheduled for discussion, it was pulled from the agenda, a reminder of how difficult it is to pass legislation that challenges the power of major tech firms.


Where Do We Go from Here? Ban Social Media for Kids?

The conversation around children’s social media use is just beginning, but one thing is clear: a solution will require cooperation between tech companies, lawmakers, and parents.



