Meta, TikTok and Snap pledge to participate in program to combat suicide and self-harm content

In an attempt to prevent suicide and self-harm content from spreading online, the nonprofit Mental Health Coalition (MHC) today announced a new program, Thrive, aimed at encouraging online platforms to share “signals” of potentially harmful material.

Thrive, which counts Meta, Snap and TikTok as founding members, will provide ways for platforms to share hashes — essentially unique fingerprints — of graphic suicide and self-harm content and content depicting or encouraging viral challenges. The hashes will only tie to content, the MHC says, and won’t include identifiable information about accounts or individuals.
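
To illustrate the basic idea (this is not Thrive's published implementation, and the program's actual hashing scheme has not been detailed), a content fingerprint can be as simple as a digest computed over the media bytes. A minimal Python sketch, with a placeholder filename:

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest that identifies a piece of content.

    A real signal-sharing program would more likely rely on a
    perceptual hash (for example, Meta's open-source PDQ for images),
    which still matches after resizing or re-encoding; a cryptographic
    hash like SHA-256 only matches byte-identical files.
    """
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical usage: two platforms holding the same file derive the
# same fingerprint and can compare hashes without exchanging the media
# itself or any account-level information.
with open("flagged_video.mp4", "rb") as f:  # placeholder filename
    print(fingerprint(f.read()))
```

Because only the digest is exchanged, platforms can flag matching uploads to one another without sharing the underlying media or anything that identifies the account that posted it.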

Meta contributed Thrive's technical infrastructure, the same infrastructure the company provided to the Tech Coalition's Lantern child safety program last November.

Thrive members will be able to aggregate information on self-harm content and receive alerts about material that raises concerns or violates their policies, the MHC says. From there, they'll be able to independently assess whether to take action.

Thrive's director, Dan Reidenberg, who is also managing director at the National Council for Suicide Prevention, will oversee the program's operations, facilitating and monitoring its activities. Participating companies will be responsible for uploading, reviewing and taking action on any content shared through Thrive, and for contributing to an annual report that will provide insight into the program's impact.

“We at the MHC are excited to work with Thrive, a unique collaborative of the most influential social media platforms that have come together to address suicide and self-harm content,” Kenneth Cole, founder of the MHC, said in a statement. “Meta, Snap and TikTok are some of the initial partners to join ‘the exchange’ committing to make an even greater impact and help save lives.”

Conspicuously absent from Thrive is X, the platform formerly known as Twitter — which doesn’t exactly have the best track record when it comes to moderation.

Data suggests that X has significantly fewer moderation staff than other platforms, partly a consequence of CEO Elon Musk cutting an estimated 80% of the company’s engineers dedicated to trust and safety. Earlier this year, X promised to establish a new trust and safety center of excellence in Austin, Texas. But the company ended up hiring far fewer moderators for the center than initially projected, according to Bloomberg.

Google, which owns YouTube, also isn’t a Thrive member. YouTube has been in the spotlight for failures to protect users from self-harm content. A summer 2024 study from the Institute for Strategic Dialogue found that YouTube readily recommends videos encouraging or normalizing suicide to children.

We’ve reached out to Google and X and will update this piece if we hear back.

That's not to suggest that Meta, Snap and TikTok have fared better; hundreds of lawsuits, including one recently filed by New York City, accuse the tech giants of contributing to a mental health crisis. In a landmark ruling two years ago, a British authority found Meta-owned Instagram culpable for the suicide of a 14-year-old girl after she was exposed to self-harm content on the platform.

Studies have begun to show a causal link between high social media use and reduced well-being or mood disorders, chiefly depression and anxiety. Most suggest that heavy social media users are much more likely than light users to be depressed and to view themselves in an unflattering light, particularly with regard to their physical appearance.
