Meta, in collaboration with Snap and TikTok, has launched Thrive, a program to prevent the spread of suicide and self-harm content online. Established with the Mental Health Coalition, Thrive lets participating tech companies securely share signals about violating content, with Meta providing the technical infrastructure that makes sharing those signals across different platforms possible.
Key Points
- Thrive Program: A new initiative to share signals about violating suicide or self-harm content among tech companies to prevent its spread.
- Collaboration: Meta, Snap, and TikTok are founding members, encouraging other industry players to join.
- Technical Infrastructure: Meta provides the secure technology for sharing signals, similar to the Tech Coalition’s Lantern program.
- Content Sharing: Companies will share hashes (numerical fingerprints) of violating images and videos rather than the content itself, focusing on graphic content and viral challenges (see the sketch after this list).
- Privacy: Shared signals will not include identifiable information about accounts or individuals.
- Existing Efforts: Meta already removes harmful content and supports users by connecting them to local organizations such as the Suicide and Crisis Lifeline.
- Recent Actions: Between April and June, Meta took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram. Efforts include making harmful content harder to find and hiding it from teens.
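The hash-sharing approach means platforms exchange fingerprints of content rather than the content itself, so each company can check uploads against signals received from the others without ever handling the underlying media or any account information. Thrive's exact hashing scheme isn't public, and industry systems typically use perceptual hashes rather than cryptographic ones; the sketch below uses a plain SHA-256 digest purely to illustrate the idea, and every name in it is hypothetical.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest that can be shared in place of the media itself."""
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical participant workflow: hashes received from other Thrive
# members are kept locally and compared against newly uploaded files.
known_violating_hashes: set[str] = set()  # populated from shared signals

def matches_shared_signal(media_bytes: bytes) -> bool:
    """True if this upload matches a signal shared by another platform."""
    return fingerprint(media_bytes) in known_violating_hashes
```

Because only hashes travel between companies, the scheme is consistent with the privacy point above: no identifiable information about accounts or individuals is exchanged.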
Thrive aims to improve safety across apps and services by leveraging industry collaboration to address a complex mental health problem.