An Online Safety Bill amendment will be brought forward to help bereaved parents access information from social media companies, a peer said after the landmark ruling over 14-year-old Molly Russell’s death.
Baroness Beeban Kidron said she will table a change to the proposed legislation in the House of Lords after a coroner concluded content viewed on the internet contributed to the schoolgirl’s death.
Molly died in November 2017 after engaging with 2,100 depression, self-harm or suicide-related posts over a period of six months, an inquest at North London Coroner’s Court heard.
Despite appearing a “normal, healthy girl” flourishing at school, she was suffering from depression and vulnerable, the court was told.
Senior coroner Andrew Walker said material viewed by Molly on sites such as Instagram and Pinterest “was not safe” and “shouldn’t have been available for a child to see”.
Speaking at a press conference in Barnet after the conclusion of the inquest on Friday, crossbench peer Baroness Kidron said: “I think it was historic – nothing short of historic – the way that the conclusion was held.
“And we do know, and I’m afraid my inbox in Parliament is full of people who have lost children sadly, and many of them struggle to get the information that they want, to get the access, to get that transparency.
“And I will be bringing forward an amendment to the Online Safety Bill in the House of Lords that seeks to make it easier for bereaved parents to access information from social media companies.”
Speaking earlier on Friday, Molly’s father Ian Russell said the coroner’s conclusions are “an important step in bringing about much-needed change.”
He said his message to Mark Zuckerberg, boss of Instagram and Facebook, would be: “Just to listen. Listen to the people that use his platform, listen to the conclusions the coroner gave at this inquest and then do something about it.”
Mr Russell added later: “We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly as ‘safe’ and not contravening the platform’s policies.
“If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly.
“It’s time the toxic corporate culture at the heart of the world’s biggest social media platform changed.
“It’s time for the Government’s Online Safety Bill to urgently deliver its long-promised legislation.
“It’s time to protect our innocent young people instead of allowing platforms to prioritise their profits by monetising their misery.”
A Meta spokeswoman said the company is “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers” and will “carefully consider” the coroner’s full report.
The ruling has been described as a global first of its kind, having concluded that content tech companies allowed Molly to view contributed to her death.
Andy Burrows, head of child safety online policy at the NSPCC, said: “This is social media’s big tobacco moment. For the first time globally, it has been ruled content a child was allowed and even encouraged to see by tech companies contributed to their death.
“The world will be watching their response.”