I recently watched the documentary Molly vs the Machines (Channel 4, first aired on 5th March at 9pm), which I knew had been initiated by Molly’s family after she took her own life at the age of 14. I was interested to understand what they had found, both as a way of making sense of such a tragedy and because it may help other parents think through what might have happened or could happen to their own child. It also raises awareness of potential dangers that many of us may not fully appreciate.
Although it is a documentary, the film is put together in a very clever and engaging way, worth watching for that alone. It re-enacts the inquest into Molly’s death, alongside contributions from very senior figures in Silicon Valley, discussing the nature of social media content and how it is delivered.
One of the key points raised was that Molly had been viewing content on her phone in her bedroom—a place that, to her parents, felt safe. The front door was locked; she was at home surrounded by love. Yet they had no idea what she was being exposed to. Molly sounded like a lovely young girl in her teens, interested in life, fashion, and friends—like so many others. But at some point, she must have come across a post related to self-harm. As anyone might, she was curious. What is this? What does it mean? So she clicked, read, and explored.
That curiosity is completely human. But from that moment, it seems the algorithms recognised her interest and began feeding her more and more of the same content. While it is possible to click the three dots on a post and say you’re not interested, or report something as harmful, a 14-year-old may not know to do that—or may not think to. Instead, she appears to have been drawn further and further into increasingly harmful material.
At that time, she may already have been feeling low. Whatever the underlying reasons, her mental state seems to have been fragile, and the content she was seeing may have contributed to a downward spiral. It was stated in the inquest that there may have been many factors involved—that this could have been the “straw that broke the camel’s back.” That may be true. But most people in a healthy state of mind would not seek out this kind of content, and the concern is that the algorithms can encourage a person’s feed to move in that direction.
It does make you wonder what our young people are seeing.
After we lost Andrew, I searched through his social media and online history as far as I could, trying to understand whether he had been influenced by anything or anyone, or if there had been bullying. As far as we could tell, there was nothing. He wasn’t someone who used social media much; he preferred connecting with friends through online games and in person.
Social media platforms, and Silicon Valley more broadly, are often seen as symbols of progress. And for many people, they are a way to connect—with friends, with communities, with the wider world. But clearly, things can go wrong.
One quote from the film, by Joe Newquest, stayed with me: “The feature of all revolutions is progress first, and safety of the vulnerable second.” It does feel as though that may be what has happened here.
When questioned, Meta emphasised the importance of freedom of speech—the idea that people should be able to share thoughts and experiences openly and seek different perspectives. But at the same time, this means exposing young people to ideas and content that may never otherwise have entered their minds, giving them something to think about at a vulnerable time.
When I was a teenager, before the ‘web’, accessing information or images about self-harm simply wasn’t something that happened easily. The only exposure we might have had was through hearing about a suicide and wondering why it had happened, or through speculation and gossip. Even then, it was limited.
Now, that kind of content is not only easily accessible—it can also be actively promoted through algorithms.
After the film, I was left with many thoughts about how easy it is to access this kind of information. I’m a keen fan of ChatGPT, so I asked it the sort of question that could easily start a descent into concerning content. It immediately responded with concern and offered suggestions for helplines and contacts. I was relieved to see that at least some boundaries are being upheld online.
The inquest concluded that Molly’s death was contributed to by social media.