Meta, the parent company of Facebook and Instagram, has joined Take It Down, an initiative from the National Center for Missing and Exploited Children (NCMEC), to help prevent the spread of intimate photos of young people online. The system works with photos stored locally on a user's device: concerned users generate hashes of those photos and upload only the hashes to Take It Down, never the photos themselves. If program members like Facebook or Instagram spot these hashes elsewhere, they can block the content to prevent its proliferation. The program is available not just to those under 18, but also to parents who want to act on a child's behalf and to adults who want to scrub images taken of them when they were younger. While the NCMEC warns that platforms have limited ability to remove content that is already online, the initiative could help mitigate or undo the damage from unwanted sharing.
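The hash-matching flow described above can be sketched in a few lines. This is purely illustrative: the function names are hypothetical, and SHA-256 stands in for whatever hashing scheme the real service uses (a production system would more likely use a perceptual hash such as PDQ, so that re-encoded or resized copies of the same photo still match).

```python
import hashlib


def generate_photo_hash(image_bytes: bytes) -> str:
    """Compute a fingerprint of an image entirely on-device.

    SHA-256 is an illustrative stand-in; only this hex digest,
    never the photo itself, would be sent to the service.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_reported_hashes(image_bytes: bytes, reported: set[str]) -> bool:
    """Hypothetical platform-side check: hash newly uploaded content
    and compare it against hashes submitted to the program."""
    return generate_photo_hash(image_bytes) in reported


# Example: a user reports a photo by submitting its hash...
reported_hashes = {generate_photo_hash(b"example photo bytes")}

# ...and a participating platform can later flag a matching upload.
print(matches_reported_hashes(b"example photo bytes", reported_hashes))   # True
print(matches_reported_hashes(b"different photo bytes", reported_hashes))  # False
```

Note that an exact cryptographic hash like SHA-256 only matches byte-identical files, which is why perceptual hashing is the more plausible choice for this kind of service.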
Meta’s involvement in Take It Down is part of its ongoing effort to combat sextortion and child sexual abuse material (CSAM) on its social networks. In November, the company announced plans to crack down on “suspicious” adults messaging teens, and this latest initiative builds on its StopNCII technology, which was developed to fight revenge porn. Meta has faced pressure from state attorneys general and other government bodies to protect teens, particularly in light of whistleblower Frances Haugen’s allegations that the company downplayed research into Instagram’s effects on mental health. Meta already restricts ads targeting young audiences and limits sensitive content for teen Instagram users. The new takedown platform gives abuse survivors more control over their online presence and may help ease some of the pressure on Meta to protect young people on its social networks.