David Babbs

Reflections on the inquest into the death of Molly Russell

It was impossible not to feel shocked, and angry, at the findings of the inquest into the tragic death of Molly Russell, which found that she “died from an act of self-harm whilst suffering from depression and the negative effects of online content”. It was also impossible not to be full of admiration at the dignity, bravery and persistence of her family. They had to fight for several years to uncover the truth about her activity on social media in the months leading up to her death.

We hope that it provides some comfort to them to know that thanks to their determination, there is now an opportunity for lessons to be learned which could prevent the same things happening to other teenagers.

Clean Up The Internet’s work to date has not focused specifically on child safety, or on social media platforms’ role in encouraging self harm. Nonetheless, the findings from the inquest into Molly Russell’s death are relevant to our work and we have considered them carefully. Here are some reflections.


Safety-by-design

The inquest revealed that the harm caused to Molly stemmed from the very design of several social media platforms. The problem was not with one or two specific pieces of content. Rather, it was caused by the platforms’ recommender algorithms, which the platforms designed with the overriding aim of serving their advertising business model by maximising the time users spend on them. These algorithms identified, correctly, that serving Molly content about suicide, depression, anxiety and self-harm was an effective way of maintaining her engagement. The platforms therefore served her more and more such “images, video clips and text some of which were selected and provided without Molly requesting them”, with devastating consequences for her mental health.


The cumulative impact of the content selected and served up by these algorithms was not only harmful to children. It also posed risks to adults, including those professionally involved in the inquest. A child psychologist, acting as an expert witness to the inquest, described finding the material which the social media platforms had fed Molly “very disturbing”, saying that it had affected his sleep for several weeks. Molly, as a young girl, was particularly vulnerable. But the dangers of design features such as recommender algorithms, which can identify a user’s vulnerabilities and target them with content which exacerbates those vulnerabilities, extend to adult users as well.


This reinforces our view that design, systems, and processes should be a priority focus of regulation of social media platforms. Clearly, as the Coroner highlights in Molly’s case, this should include specific consideration of safety for children, backed up by effective age assurance or age verification processes to ensure children access a version of the platform designed with child safety specifically in mind. But as the experience of the child psychologist highlighted, risky design features such as recommender algorithms don’t just pose risks to children. The same goes for other risky design features, including the one which has been our recent focus: the ease with which anonymous and pseudonymous accounts can be created and misused.

Not all design features will pose the same level of risk for adult and child users, and a proportionate approach to mitigating those risks will often mandate differing treatment of different categories of users. For example, in some cases it may make sense for adult users to be given a choice as to whether a risky design feature is enabled or disabled for their account, whereas for a child’s account it may make sense for the feature to be simply unavailable, or at the very least disabled by default.

However, it seems clear to us that children will be best protected by a regulatory regime which requires platforms to take a safety-by-design approach to all their users, alongside child-specific safety obligations. This would also help ensure that children are still able to access, enjoy, and learn from as much of the internet as possible, rather than encouraging tech companies to create small walled gardens for children whilst the rest of their platform is left riddled with unsafe features. An Online Safety Bill which takes the safety of all users seriously would also make safety standards more consistent and easier for users to understand, for example by aligning guidance and standards for Age Verification with those for Identity Verification.


Transparency and Accountability

The inquest was only able to access information about what Molly had been shown on social media thanks to the exceptional persistence of both her family and the coroner. The companies’ resistance to disclosing the information must have added to the distress of Molly’s family, and it unnecessarily delayed the uncovering of important lessons about how social media needs to change.


It is not hard to understand why the companies were resistant to disclosing this information voluntarily. The picture of Molly’s social media feeds which finally emerged led the coroner to conclude that content fed to her by social media recommender algorithms contributed to her death, and that some of the social media sites she visited “were not safe”.


If the companies had been successful in avoiding disclosure, as they had been in some previous inquests, the coroner might not have been able to reach this conclusion, nor to issue the Prevention of Future Deaths Report which made wide-ranging recommendations including that “consideration is given by the Government to reviewing the provision of internet platforms to children, with reference to harmful on-line content, separate platforms for adults and children, verification of age before joining the platform, provision of age specific content, the use of algorithms to provide content, the use of advertising and parental guardian or carer control including access to material viewed by a child, and retention of material viewed by a child.”

This was a particularly important example of why platforms must be required to be more transparent. We must stop relying on tech companies’ self-disclosure and self-assessment of what is happening on their platforms. We were reminded of our own experiences of seeking information from Twitter, regarding their claims that almost none of the abuse directed at footballers following the Euro 2020 final came from anonymous accounts - claims for which they failed to provide evidence, and which we were ultimately able to show relied upon a wilfully misleading definition of an anonymous account.


Bereaved families should have a clear legal route to accessing information in the event of the death of a child. More broadly, an independent regulator needs broad powers to access the information it requires to investigate harms and form its own assessments of the safety of a platform.


The Online Safety Bill - an opportunity to stop this happening again

Most of the issues raised by the Coroner, and discussed in this post, have been raised and explored in a variety of different ways during the very long-running policy development process behind the Online Safety Bill. This process has included the Online Harms White Paper in April 2019, commitments to bring forward a Bill in several Queen’s Speeches, a draft Bill, a pre-legislative scrutiny committee, and a Bill which underwent weeks of debate in Committee but is currently paused part-way through Commons Report Stage.


The current version of the Bill is not perfect. We are not alone in having set out some suggestions for amendments, which we hope can strengthen it as it passes to the House of Lords. But the current Bill would make some very big steps in the right direction. Introducing new safety duties on platforms, and making Ofcom the independent regulator, would start to make the tech companies adopt more of a safety-by-design approach. Platforms likely to be accessed by children would be held to age-appropriate standards. Ofcom’s powers to require platforms to disclose information would reduce their ability to conceal risks and problems.


Many politicians have identified the Online Safety Bill as the legislative opportunity to implement the lessons which have been learnt from Molly’s death. Molly’s father, Ian Russell, was invited to give evidence at many stages during this process. He has now expressed serious concerns about the Bill being delayed, yet again, due to another change of Prime Minister followed by changes to the DCMS ministerial team. We agree with his view that “For the sake of our young, who are currently exposed to online dangers on a daily basis, this legislation can’t come soon enough,” and that “delaying the Online Safety Bill, for the second time in four months, demonstrates the damaging effects of the current political turmoil.”


We share Ian Russell’s hope that the latest delay to the Bill’s progress is a short one. The Secretary of State, Michelle Donelan, who retained her position through the latest round of changes, recently promised Ian Russell that the Bill would pass to the House of Lords before Christmas. Clean Up The Internet will certainly be urging the government to keep this promise.

