Instagram is attempting to improve its regulation of content related to self-harm and suicide. The new restrictions will cover illustrations, drawings, films, memes, and cartoons.
“This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery,” says the head of Instagram, Adam Mosseri. “We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”
Earlier in the year, Mosseri spoke with the United Kingdom’s Health Secretary, Matt Hancock, about the company’s policy on posts linked to self-harm. The catalyst was the death of 14-year-old Molly Russell, who took her own life after viewing similar content on Instagram.
The social media platform said in February that it would restrict certain self-harm-related images, such as those of cutting and scars from deliberately self-inflicted injuries. Now, instead of merely excluding such posts from search results, it will delete this kind of material, along with fictional content and anything depicting methods or materials associated with suicide and self-harm.
Finding a middle ground
Mosseri’s blog post on the policy shift says the updated policy is “based on expert advice from academics and mental health organisations like the Samaritans in the U.K. and National Suicide Prevention Line in the U.S.” Mosseri says, “We aim to strike the difficult balance between allowing people to share their mental health experiences while also protecting others from being exposed to potentially harmful content.”
Mosseri notes that finding the right balance between freedom and restriction is paramount, and that policy changes are a constantly evolving part of content regulation.
Mosseri explained, “Experts tell us that giving people a chance to share their most difficult moments and their stories of recovery can be a vital means of support” and “preventing people from sharing this type of content could not only stigmatize these types of mental health issues, but might hinder loved ones from identifying and responding to a cry for help”.