San Diego School Districts File Federal Lawsuit Targeting Social Media Apps

Frantz Law Group held a press conference at its downtown offices Thursday morning to announce its federal lawsuit against social media apps including TikTok, Meta and Snapchat

By now we've all heard from doctors, teachers, even the CDC that mental health in young people is in crisis. Many studies report a correlation between depression and social media use. Two local school districts, Coronado and Oceanside, are trying to do something about it by joining a federal lawsuit targeting many of the social media giants.

NBC 7 looked into the likelihood of a lawsuit actually gaining ground. Frantz Law Group held a press conference at its downtown offices Thursday morning to announce its federal lawsuit against social media apps including TikTok, Meta and Snapchat. James Frantz said the lawsuit, which is not a class action, represents 22 school districts across the country, totaling more than 200,000 students.

“Social media companies, we are coming for you. And you better get your act in place soon. And we’re gonna make you do it if you don’t like it. Period,” said Frantz.

Frantz mentioned a few solutions he hopes this lawsuit will help bring:

  1. Funding for school districts to add mental health resources and educational programs alerting students to social media addiction
  2. Control of, or more transparency into, the apps’ algorithm processes
  3. Potentially giving educators the ability to shut down internet access on campus at certain times

“If we get a court order that says we are gonna have a regulator in your company, a computer expert in your company. They're gonna look at what these algorithms are doing. That’s what I’m talking about. Injunctive relief to force them to change. Period,” said Frantz.

NBC 7 spoke to local attorney Dick Semerdjian, who is not connected to the case, about the impact this lawsuit could have.

“I think every juror that is going to hear these cases has a device. They have a personal device. Whether it’s an iPhone, cellphone, computer. Nearly 100% of the jurors are gonna have experience with social media,” said Semerdjian.

In past statements, some social media companies have said they take the safety of their users seriously, and that they have added tools for parents to monitor children’s screen time and access.

In a statement to NBC 7, a representative from Meta detailed some of the company's efforts.

“We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online. We’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram, age verification technology, automatically setting accounts belonging to those under 16 to private when they join Instagram, and sending notifications encouraging teens to take regular breaks. We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us. These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.” - Antigone Davis, Head of Safety, Meta

Davis also provided examples of tools the company has implemented to support families, including:

  • Age verification technology to help teens have experiences appropriate for their age, including limits on the types of content they see and on who can see and interact with them
  • Automatically setting accounts belonging to teens under 16 to private when they join Instagram, and preventing people a teen doesn’t follow from tagging or mentioning them, or including their content in Reels Remixes or Guides
  • Seeking to prevent unwanted interactions between teens and adults by limiting adults from messaging teens who don’t follow them on Instagram, and showing Safety Notices if an adult a teen is connected to tries to DM them
  • Technology to help prevent suspicious adults from engaging with teens
  • Limits on the types of content teens can see in Explore, Search and Reels via a Sensitive Content Control
  • Not allowing content that promotes suicide, self-harm or eating disorders
  • Showing expert-backed, in-app resources when someone searches for, or posts, content related to suicide, self-harm, eating disorders or body image issues

Frantz hopes a lawsuit like this will force social media companies to be responsible actors.

“We want something in the middle that’s not harming them, that’s educating them. That’s making our life more interesting having it. But we don’t need these secret little algorithms out there trying to get you addicted when you don’t know you're addicted. So we are trying to prevent that,” said Frantz.
