Father of UK teenager blames social media for exposing her to ‘bleakest of worlds’

The father of a 14-year-old British teenager who took her own life after viewing harmful content online told an inquest that social media companies’ algorithms trapped his daughter in the “bleakest of worlds”.

In a hearing that is putting tech giants under the spotlight, the North London Coroner’s Court heard on Wednesday that Molly Russell from Harrow, London, died in November 2017 after viewing “hideous, graphic, harmful” content on social media sites.

In the months leading up to her death, Molly had viewed a large volume of posts on sites including Instagram and Pinterest related to anxiety, depression, suicide and self-harm.

On the second day of the two-week hearing, Molly’s father, Ian Russell, said he was “shocked” that such graphic content was readily available online. He said the “feeling inside her must have come from the vast contact that she had with so many of these posts”.

After her death, Molly continued to receive emails from Pinterest promoting distressing content, he said.

The high-profile inquest has ignited debate about the duty of care social media sites owe to potentially vulnerable users, and the extent to which algorithms play a role in the consumption of harmful and disturbing content.

Giving evidence on Wednesday, Russell said a search for the kind of content his daughter had viewed revealed disturbing posts.

Molly Russell’s father Ian Russell (centre), mother Janet Russell (right) and her sister (left) arrive at the coroner’s hearing in Barnet, north London, on the first day of the inquest into her death. © Kirsty O’Connor/PA

“You see wounds that may well have been quite freshly made, [those posts] are shocking to see,” he said, adding there was “other content that you see that suggest other forms of self harm . . . ways of ending your life, window ledges, bridges, railway tracks, nooses, guns, it’s just the bleakest of worlds.”

“If it isn’t flowers and it isn’t football, but it’s . . . self harm or suicide, and that content is recommended to you, pushed to you . . . even emailed to you by the platforms, the effect is obvious,” he added.

Russell, who has become a prominent campaigner for stronger regulation of tech sites, also read out a sometimes emotional “pen portrait” of his daughter.

“It is nearly five years since Molly died,” he said. “Five years ago the Russell family life was unremarkable, but imperceptibly our adorable youngest family member Molly had been struggling with her mental health and hiding her struggles from the rest of us while she battled her demons.”

According to a police statement read out in court, Molly had saved and downloaded a “significant number of depression quotes” to her phone.

Executives from Instagram-owner Meta and Pinterest will give evidence at the inquest after senior coroner Andrew Walker ordered them to appear in person rather than via remote link.

Elizabeth Lagone, head of health and wellbeing at Meta, and Jud Hoffman, head of community operations at Pinterest, are both due to testify.

Both Meta and Pinterest have reviewed tens of thousands of posts to examine the type of content Molly was engaging with in the months leading up to her death.

When asked about recent efforts taken by social media companies to remove harmful content on their sites, Russell told the inquest: “As recently as in August of this year I have seen similarly horrific content on platforms . . . So whatever steps have been taken, it’s apparent to me that they’re not effective enough and that young people are still in danger.”

The hearing comes as the passage through parliament of the Online Safety Bill, which aims to compel internet companies to keep their platforms safe, has been paused. Liz Truss, the new prime minister, is said to be considering relaxing a controversial clause that would make platforms responsible for removing content that is “legal but harmful”, such as bullying.