Meta, Snap, and TikTok have partnered to launch a new initiative aimed at curbing the spread of suicide and self-harm content on their platforms. The program, named Thrive, is designed to allow these companies to share signals with each other, alerting them to content that violates policies by graphically depicting or encouraging self-harm and suicide.
The Thrive initiative is being developed in collaboration with the Mental Health Coalition, an organization committed to reducing the stigma around mental health conversations. Meta is providing the technical infrastructure that lets signals be shared securely across platforms, a system similar to the Lantern program for preventing child abuse online. Through Thrive, participating companies can exchange media hashes (digital fingerprints of specific content), so that when one platform identifies violating material, the others can be alerted.
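Thrive's implementation has not been published beyond the fact that it exchanges media hashes, but the basic mechanism can be sketched. In the minimal Python example below, all names (`shared_signal_hashes`, `share_signal`, `matches_known_signal`) are hypothetical, and SHA-256 stands in for whatever hashing Thrive actually uses; real content-matching systems typically prefer perceptual hashes such as PDQ, which still match re-encoded or resized copies, whereas a cryptographic hash only matches byte-identical files.

```python
import hashlib

# Hypothetical in-memory stand-in for the shared signal database;
# the names and structure here are illustrative, not Thrive's actual API.
shared_signal_hashes: set[str] = set()

def media_hash(data: bytes) -> str:
    # SHA-256 fingerprints the exact bytes; production systems usually
    # add perceptual hashing so near-duplicate copies also match.
    return hashlib.sha256(data).hexdigest()

def share_signal(media: bytes) -> None:
    """One platform flags violating media by publishing its hash."""
    shared_signal_hashes.add(media_hash(media))

def matches_known_signal(media: bytes) -> bool:
    """Another platform checks an upload against the shared hashes."""
    return media_hash(media) in shared_signal_hashes

# Example: platform A flags a file; platform B later sees the same bytes.
flagged = b"...violating media bytes..."
share_signal(flagged)
assert matches_known_signal(flagged)
```

A notable property of this design is that only hashes cross company lines: platforms can learn that a piece of content matches a known signal without the violating media itself ever being retransmitted.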
Meta has indicated that its platforms already work to make this kind of harmful content harder to find. At the same time, the company says it aims to balance moderation against users' ability to discuss their own mental health experiences: content about suicide and self-harm remains permitted as long as it does not promote or graphically depict those acts.
Meta's own enforcement figures suggest the scale of the effort: every quarter, the company reports taking action on millions of pieces of suicide- and self-harm-related content. In the most recent quarter alone, an estimated 25,000 of those posts were restored following user appeals.
Logos used in Featured Image courtesy of Meta, Snap, and TikTok.