But Instagram has responded to the accusations, saying it does not allow content that glorifies self-harm or suicide.
“Our thoughts go out to Molly’s family and anyone dealing with the issues raised,” a spokesperson told Yahoo UK in an email.
“We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and work hard to remove it.
“However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues, is an important part of their recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
Instagram’s guidelines say posts should not “glorify self-injury”, and users who search for terms like ‘self-harm’ are met with an offer of support and signposted to the Samaritans and other sources of help.
But users are still able to view the image by ignoring the offer of help from the social media platform.
The social media site says it encourages anyone who comes across content they believe violates its Community Guidelines to report it using its in-app tools.
And when it comes to hashtags, if a certain hashtag is consistently used to share violating content, or the hashtag itself inherently promotes suicide, self-injury or eating disorders, Instagram will block it, meaning it can no longer be searched or recommended and cannot be used on future posts.
The site also has a sensitive hashtag list which is updated daily. Once a hashtag is added to this list, it cannot be recommended and it will not appear on a related hashtags list.
Users looking at content on these hashtags will also not receive recommendations to view other hashtags.
Following Molly’s death, her family found the teenager had been interacting with accounts from people who were depressed, self-harming or suicidal.
“Since her death, we’ve been able to look back and just scratch the surface at some of the social media accounts that she had been following,” Ian explained.
One account she followed featured an image of a blindfolded girl, seemingly with bleeding eyes, hugging a teddy bear.
The caption reads: ‘This world is so cruel, and I don’t wanna see it any more.’
Among the accounts they found, her father acknowledged that some of the content was positive, with groups of people appearing to support each other, trying to remain positive or finding ways to stop self-harming.
But Mr Russell said Molly had access to “quite a lot of content” that raised concern.
“Some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter,” Ian told the BBC in an interview.
The UK government is urging social media companies to take more responsibility for harmful online content which illustrates and promotes methods of suicide and self-harm.
And back in 2016, a study found that Instagram was more damaging to self-esteem than traditional magazines and adverts.
Molly’s family have set up the Molly Rose Foundation in her honour, focused on suicide prevention among young people under 25.
“We want to help spot those suffering from mental illness and connect them to the help, support and practical advice they need,” a statement on the page reads.