
‘Suicide content helps users express themselves’, Meta executive tells Molly Russell inquest

Molly Russell from Harrow in north-west London

A Meta executive has defended the posting of suicidal content on social media, claiming it helps people to “share feelings and express themselves”, the Molly Russell inquest has heard.

Molly, from Harrow, north-west London, was 14 when she ended her life in November 2017. In the months before her death she had browsed, liked and saved thousands of disturbing and dark images on social media platforms, including Instagram and Pinterest.

Her inquest at North London Coroner’s Court has already seen a senior executive from Pinterest fly from the US to issue an apology for Molly being able to view graphic self-harm and suicide imagery on the platform.

However, Elizabeth Lagone, head of health and wellbeing at Meta, the Instagram parent company formerly known as Facebook, gave evidence for the first time on Friday. She defended suicidal posts remaining on the platform, describing them as “cries for help”, and denied claims that depressed children were “just guinea pigs” in Instagram’s content policies.

She told the court that such posts give people the chance to “share their feelings” and enable them “to find community and express themselves”. She insisted that they should be allowed as long as they are “posted in order to create awareness”, for people to “come together for support” or for someone to “talk about their own experience”.

Elizabeth Lagone - PA
Ian and Janet Russell

However, Oliver Sanders KC, representing the Russell family - who have become internet safety campaigners since Molly’s death - repeatedly challenged Ms Lagone on whether a child could tell the difference between content that “encourages” suicide and self-harm and content that “creates awareness” of it.

During the hearing, details of the Instagram accounts Molly interacted with before her death were revealed for the first time - seven per cent of the accounts “recommended” for her to follow were “sad or depressive related” - and some of the “most distressing” content she viewed was played to the court.

Following legal discussions about whether to edit the video content because it was “so uncomfortable to view”, Coroner Andrew Walker ruled that the court should see what he described as footage which appeared to “glamourise harm to young people”. The graphic content, which was mainly black-and-white, depicted violent incidents of self-harm and suicide.

He concluded: “Molly had no such choice, so we would in effect be editing the footage for adult viewing when it was available in an unedited form for a child.”

During her questioning, Ms Lagone, who had flown in from Meta’s headquarters in Menlo Park, California, said it was an important consideration for the company, even at the time of Molly’s death, to “consider the broad and unbelievable harm that can be done by silencing [a poster] when talking about their troubles”.

However, Mr Sanders repeatedly challenged her, asking: “Would a 14-year-old be able to tell the difference between a post that encourages self-injury and one that creates awareness?”

Ms Lagone replied: “I really can’t answer that question because we don’t allow content that encourages self-injury.”

Molly Russell

Instagram’s guidelines at the time of Molly’s death, which were shown to the court, said that users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users, but not if it “encouraged or promoted” such behaviour.

Ms Lagone also denied Instagram had treated children like Molly as “guinea pigs” when it launched content ranking - a new algorithmic system for personalising and sorting content - in 2016.

Mr Sanders said: “It’s right, isn’t it, that children, including children suffering from depression like Molly, who were on Instagram in 2016 were just guinea pigs in an experiment?”

Ms Lagone replied: “That is specifically not the way we develop policies and procedures at the company.”

Despite being ordered by the UK courts to hand over information relating to Molly’s account, Meta has handed over only partial details of the accounts she interacted with, citing GDPR regulations, under a legal practice known as “gisted disclosure”, in which individuals’ identifying features are omitted or redacted.

The inquest heard a breakdown of the Instagram accounts that Molly was interacting with in the run-up to her death. Of the 476 accounts the app’s algorithm “recommended” for her to follow, 34 - around seven per cent - were “sad or depressive related”.

Mr Sanders also told the court that there were two accounts that had blocked Molly, 23 that she had blocked, 846 that she followed and 272 that were following her. “We don’t know who they were,” he said of all of them.

Ms Lagone left the court via the back entrance in a convoy of two Mercedes vehicles as well as a people carrier.

She will return to court on Monday to conclude her evidence. The inquest continues.

Tory peers say social media firms must not be forced to remove ‘legal but harmful’ content

The Online Safety Bill must strip out curbs on “legal but harmful” content to protect free speech on social media, say Conservative peers.

Lord Frost, the former Brexit minister, welcomed promises by ministers to “tweak” what he claimed was a “frighteningly illiberal” bill, but said any changes must “at the bare minimum” include removing the plans forcing social media firms to take down “legal but harmful” content.

Backed by Lord Moylan, Lord Strathcarron and Baroness Stowell, the former Cabinet minister, Lord Frost said: “As things stand, the Bill would see companies like Meta enforcing dangerously vague requirements to remove harmful content on their platforms.

“It would also hand the Secretary of State the power to designate what constitutes such material in future, with only minimal parliamentary involvement.

“The Bill needs to protect the concept that most people think is self-evident, that if you can say something in the real world you should be able to say it online too. If it is legal to say then it must be legal to type.”

The peers fear that free speech could be curbed if social media firms censor content according to “woke” prejudices or through algorithms.

Freedom of speech

Their warning, backed by the former Supreme Court judge Lord Sumption, comes as Index on Censorship, a free-speech campaign group, sets out a series of amendments to protect free speech.

As well as removing the Bill’s clause 13, which covers “legal but harmful” content, Index on Censorship proposes narrowing the definition of “illegal” content to safeguard against algorithms censoring material in a way that could compromise freedom of speech.

It has also proposed protections for end-to-end encrypted communications, arguing that any monitoring of private messaging could expose users to hacking through a backdoor and leave the UK vulnerable to cyber attacks.

Lord Sumption said: “This is a serious and constructive proposal for changing the more objectionable parts of this controversial Bill. The government would do well to take note.”

Lord Moylan said: “It needs radical revision to ensure that it does not create any new limitations on freedom of expression or give power to large unsupervised corporations.”

Baroness Stowell said: “It’s vital that the Online Safety Bill is amended so that we can introduce speedily the much-needed online protections for our children without jeopardising the principle of freedom of speech for adults.”