An Important Step Towards Better Protecting Our Community in Europe

We want to do everything we can to keep people safe on Instagram. We’ve worked with experts to better understand the deeply complex issues of mental health, suicide and self-harm, and how best to support those who are vulnerable. No one at Instagram takes these issues lightly, including me. We’ve made progress over the past few years, and today we’re rolling out more technology in Europe to support these efforts. But our work here is never done, and we need to keep looking for ways to do more.

We’ve never allowed anyone to promote or encourage suicide or self-harm on Instagram, and last year we updated our policies to remove all graphic suicide and self-harm content. We also extended our policies to disallow fictional depictions like drawings, memes or other imagery that shows materials or methods associated with suicide or self-harm.

We also believe it’s important to provide help and support to the people who are struggling. We offer support to people who search for accounts or hashtags related to suicide and self-harm and direct them to local organizations that can help. We’ve also collaborated with Samaritans, the suicide prevention charity, on their industry guidelines, which are designed to help platforms like ours strike the important balance between tackling harmful content and providing sources of support to those who need it.

The Numbers

We believe our community should be able to hold us accountable for how well we enforce our policies and take action on harmful content. That’s why we publish regular Community Standards Enforcement Reports to share global data on how much violating content we’re taking action on, and what percentage of that content we’re finding ourselves, before it’s reported.

Nov 19, 2020 at 23:22
