Doctors urge social media firms to share data to help 'vulnerable' young people

Psychiatrists are calling for social media companies to hand over data about their users and why they use their sites [Photo: Getty]

Social media companies such as Facebook and Instagram should be forced to hand over data about who their users are and why they use the sites in an attempt to reduce suicide among children and young people, a new report suggests.

The call comes from the Royal College of Psychiatrists and is backed by the grieving father of Molly Russell.

Concerns about the impact of social media on vulnerable people have been heightened by deaths such as that of 14-year-old schoolgirl Molly, who took her own life in 2017 and was found to have viewed harmful content online before she died.

The government plans to set up a new online safety regulator and has proposed a 2% levy on the UK revenues of major tech companies, but the college believes these measures don’t go far enough.

It says the regulator should be given the power to compel firms to hand over data, and that the forthcoming “turnover tax” on social media companies’ income should be extended to cover their international turnover, not just revenue generated in the UK.

The college is also calling for some of the money from the levy to be used for mental health research.

Molly's father, Ian Russell, spoke of the urgent need for greater action in an emotional foreword to the report, in which he described the “wrecking ball of suicide” that “smashed brutally” into his family, blaming “pushy algorithms”.

Speaking about his daughter’s social media accounts he said: “Among the usual school friends, pop groups and celebrities followed by 14-year-olds, we found bleak depressive material, graphic self-harm content and suicide-encouraging memes.

“I have no doubt that social media helped kill my daughter.”

Molly Russell took her own life at the age of 14 and had viewed harmful content online [Photo: PA]

Mr Russell also detailed one of Molly's final notes which described how she felt “with heart-breaking clarity”.

“I'm the weird sister, quiet daughter, depressed friend, lonely classmate,” she wrote.

“I'm nothing, I'm worthless, I'm numb, I'm lost, I'm weak, I'm gone. I'm sorry. I'll see you in a little while. I love you all so much. Have a happy life. Stay strong xxx.”

While welcoming the UK Government's White Paper on online harms, the college's report calls for an independent regulator with the power to establish a protocol for sharing data with universities on how children and young people are using the likes of Instagram, Facebook and Twitter, not just how much time they spend online.

The data collected would be anonymous, the report says.

The report also points to evidence that increased social media use may result in poorer mental health, particularly in girls.

Commenting on the recommendations, Dr Bernadka Dubicka, chairwoman of the child and adolescent faculty at the Royal College of Psychiatrists and co-author of the report, said: “As a psychiatrist working on the front line, I am seeing more and more children self-harming and attempting suicide as a result of their social media use and online discussions.

“We will never understand the risks and benefits of social media use unless the likes of Twitter, Facebook and Instagram share their data with researchers.

“Their research will help shine a light on how young people are interacting with social media, not just how much time they spend online.

“Self-regulation is not working. It is time for Government to step up and take decisive action to hold social media companies to account for escalating harmful content to vulnerable children and young people.”

In a joint article for the Daily Telegraph, Mr Russell and Dr Dubicka wrote: “On social media, Molly found a world, sadly full of similarly struggling people with a marked lack of access to professional help, that grew in importance to her.

“Social media's pushy algorithms sucked her further into her digital life, and continued to feed harmful content to her.

“The posts she saw would clearly have normalised, encouraged and escalated her depression; isolated her and persuaded her to keep it all to herself. They convinced her she had no hope."

Claire Murdoch, NHS national director for mental health, said: “If these tech giants really want to be a force for good, put a premium on their users’ well-being and take their responsibilities seriously, then they should do all that they can to help researchers better understand how they operate and the risks posed. Until then they cannot confidently say whether the good outweighs the bad.”

In response to the Royal College of Psychiatrists' report, Facebook, the biggest social network, said it is “already taking a number of the steps recommended”.

“We remove harmful content from our platforms and provide support for those who search for it,” a spokesman said.

“We are working closely with organisations such as the Samaritans and the Government to develop industry guidelines in this area.”

Last year Instagram revealed it was banning graphic images of self-harm after Health Secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.

A Government spokesman said: “We are developing world-leading plans to make the UK a safer place to be online. This includes a duty of care on online companies, overseen by an independent regulator with tough enforcement powers, to hold them to account.

“The regulator will have the power to require transparency reports from companies outlining what they are doing to protect people online. These reports will be published so parents and children can make informed decisions about their internet use.”

The Royal College of Psychiatrists believes more should be done to protect vulnerable youngsters online [Photo: Getty]

The report follows research revealed last year suggesting that teenagers who spend more than three hours a day on social media may be at higher risk of mental health problems.

A US study of 6,595 youngsters aged 12 to 15 found that those who used social media more heavily were more likely to report issues such as depression, anxiety and loneliness, as well as aggression and anti-social behaviour, than teenagers who did not use social media.

Back in 2018, a study published by NHS Digital revealed that 11 to 19-year-olds with mental health issues are more likely to use social media every day.

That prompted new guidance suggesting NHS psychiatrists should be encouraged to ask under-18s with mental health issues about their social media usage.

The Royal College of Psychiatrists suggests that psychiatrists should ask whether social media use is affecting a young person's school work, sleep, eating habits and general mood.

Additional reporting by PA.