Instagram has pledged to extend its ban on images that promote self-harm and suicide. The move comes after public pressure following the death of British teenager Molly Russell, who took her own life days before her 15th birthday in 2017.
The teenager’s father, Ian Russell, accused Instagram of being partly responsible for his daughter’s death after he discovered she had been looking at graphic content on the app shortly before she died.
Speaking to the BBC, he said: “I think Molly probably found herself becoming depressed. She was always very self-sufficient and liked to find her own answers. I think she looked towards the internet to give her support and help.”
Russell claims that the algorithms used by some platforms “push similar content towards you” based on what you have been previously looking at.
He said: “I think Molly entered that dark rabbit hole of depressive suicidal content. Some were as simple as little cartoons – a black and white pencil drawing of a girl that said ‘Who would love a suicidal girl?’. Some were much more graphic and shocking.”
In February, the Facebook-owned app banned graphic images showing self-harm as a result of Russell’s protests and the subsequent public outcry. The company implemented new measures to better monitor and censor such content, and to ensure it was not shared or recommended. According to Instagram, within three months of its February policy change it was able to find more than 77% of this sort of content before it was reported.
In a recent post on the company’s blog, Instagram said it would try to “strike a balance” between allowing people to share their mental health experiences and protecting others from being exposed to potentially harmful content.
The platform’s announcement coincided with a visit made by Russell to Silicon Valley to discuss the problem of self-harm on social media and to meet other families affected by the issue.
Adam Mosseri, the head of Instagram, told BBC News: “It will take time to fully implement, but it’s not going to be the last step we take. There is still very clearly more work to do. This work never ends.”
Russell described the new commitment as sincere, but added he hoped that Mosseri would deliver on this pledge.
Andy Burrows, the head of child safety online policy at the NSPCC, noted that Instagram’s policy change did not alter the fact that most of the industry was behaving irresponsibly, and called on the government to progress legislation intended to impose a duty of care upon social media companies.
“Molly’s death should be a galvanising moment to act,” Burrows said, “but the reality is while Instagram has taken positive steps the rest of the tech industry has been slow to respond – on self-harm, suicide and other online harms.
“As Ian Russell says there is a pressure of time and there is a price for not moving quickly enough, which is children’s lives. That is why the government needs to introduce a draft bill to introduce the duty of care regulator by next Easter and commit to ensuring it tackles all the most serious online threats to children.”