Seattle Schools Take Legal Action Against TikTok, Meta and Other Platforms over Youth Mental Health Crisis
Section 230 of the US Communications Decency Act generally exempts online platforms from being held responsible for content posted by third parties. However, the lawsuit claims that the provision does not protect social media companies from the harm caused by their recommendations, distribution, and promotion of content.
Tech giants have defended themselves, pointing to tools and features they have put in place to prioritize the well-being of children and teenagers. For example, Google’s Family Link lets parents set reminders, limit screen time and block certain types of content on supervised devices. Meta’s Global Head of Safety, Antigone Davis, said the company has developed more than 30 tools for teens and families, including supervision tools that limit time spent on Instagram and age verification technology to help ensure age-appropriate experiences. TikTok has yet to respond to the lawsuit, but the company has previously said it is dedicated to safety.