Meta failed to safeguard children despite ability, whistleblower says
Arturo Béjar says Meta could take effective action against self-harm content on its platforms, yet it is still doing nothing about it.
Mark Zuckerberg’s Meta has not done enough to safeguard children since Molly Russell’s death, according to Arturo Béjar, a former senior engineer and consultant at Instagram and Facebook. He says the company already has the infrastructure needed to protect teenagers from harmful content, but has chosen not to use it.
Speaking to the Guardian, Béjar criticized Meta, stating that “if they had learned the lessons from Molly Russell, they would create a product safe for 13-15-year-olds where in the last week one in 12 don’t see someone harm themselves, or threaten to do so. And where the vast majority of them feel supported when they do come across self-harm content.” The figure draws on his earlier research into Instagram users, which found that 8.4% of 13- to 15-year-olds had seen someone harm themselves, or threaten to do so, within the past week.
“They either need a different chief executive or they need him to wake up tomorrow morning and say: ‘This kind of content is not allowed on the platform’, because they already have the infrastructure and the tools for that [content] to be impossible to find,” Béjar said of Zuckerberg.
Who was Molly Russell?
Molly Russell, a 14-year-old girl from Harrow, north-west London, took her own life in 2017 after viewing harmful content related to suicide, self-harm, depression, and anxiety on Instagram and Pinterest.
In a landmark ruling in 2022, an inquest into her death found that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content.”
Béjar met Molly’s father, Ian Russell, in the UK this week, along with politicians, regulators, and campaigners.
Meta is facing a lawsuit
Raúl Torrez, the New Mexico attorney general, has brought a lawsuit against Meta that draws on Béjar’s research at Instagram, accusing the company of failing to protect children from sexual abuse, predatory approaches, and human trafficking.
Unredacted documents from the lawsuit show that Meta employees warned the company was “defending the status quo” in the wake of Molly’s death when “the status quo is unacceptable to media, many impacted families and … will be unacceptable to the wider public.”
In addition, Béjar testified before Congress last year, detailing his experience at Meta and the “awful experiences” of his teenage daughter and her friends on Instagram, including unwanted sexual advances and harassment.
Béjar also stressed that Meta could stop self-harm content within just three months, saying: “they have all the machinery necessary to do that. What it requires is the will and the policy decision to say, for teenagers, we’re going to create a truly safe environment that we’re going to measure and report on publicly.”
Arturo Béjar knows what he's talking about
Béjar left Meta as a senior engineer in 2015 and returned as a consultant in 2019 for a two-year period, during which he conducted research showing that one in eight children aged 13 to 15 on Instagram had received unwanted sexual advances, one in five had been bullied on the platform, and 8% had viewed self-harm content. During his time at Meta, his main responsibilities were developing child safety tools and helping children cope with harmful content such as bullying.
Béjar has proposed several solutions to Meta for tackling self-harm content, including making it easier for users to flag unwanted content and state why they don’t want to see it; regularly surveying users about their experiences on Meta platforms; and making it easier for users to submit reports about their experiences on Meta services.
Although Instagram has a minimum age limit of 13, Béjar said underage users remain on the app. He continues to monitor Instagram and says harmful content, including self-harm material, still floods the platform.