FB knew Instagram's danger to girls: Internal doc
A previously unpublished internal document reveals that Facebook, now known as Meta, was aware that Instagram was exposing girls to dangerous content.
In 2021, an Instagram employee investigated how the platform handled eating-disorder content by creating a fake account posing as a 13-year-old girl seeking diet advice, according to the document. The account was directed to graphic content and to accounts titled "skinny binge" and "apple core anorexic."
Other internal memos show Facebook employees raising concerns about company research that found Instagram made one-third of teen girls feel bad about their bodies and that teens who used the app experienced higher rates of anxiety and depression.
Attorney Matt Bergman started the Social Media Victims Law Center after reading the so-called "Facebook Papers," disclosed by whistleblower Frances Haugen last year. He is now representing more than 1,200 families in lawsuits against social media companies.
Next year, Bergman and his team will begin the discovery process for the consolidated federal cases against Meta and other companies, which he claims are more about changing policy than monetary compensation. "Time after time, when they have an opportunity to choose between safety of our kids and profits, they always choose profits," Bergman told 60 Minutes correspondent Sharyn Alfonsi.
Bergman worked as a product liability attorney for 25 years, specializing in asbestos and mesothelioma cases. He argues that the very design of social media platforms harms children. "They have intentionally designed a product that is addictive," Bergman said.
"They understand that if children stay online, they make more money. It doesn't matter how harmful the material is."
"So, the fact that these kids ended up seeing the things that they saw, that were so disturbing," Alfonsi asked, "wasn't by accident; it was by design?" "Absolutely," Bergman said. "This is not a coincidence."
He argues that the apps were designed to evade parental authority and is calling for better age and identity verification protocols. "That technology exists," Bergman said. "If people are trying to hook up on Tinder, there's technology to make sure the people are who they say they are."
Bergman also wants to do away with algorithms that drive content to users.
"There's no reason why Alexis Spence, who was interested in exercise, should have been directed to anorexic content," Bergman said. "Number three would be warnings so that parents know what's going on. Let's be realistic, you're never going to have social media platforms be 100% safe. But these changes would make them safer."
Meta, the parent company of Facebook and Instagram, declined 60 Minutes' request for an interview, but its global head of safety Antigone Davis said, "We want teens to be safe online" and that Instagram doesn't "allow content promoting self-harm or eating disorders." Davis also said Meta has improved Instagram's "age verification technology."
In a test conducted by 60 Minutes two months ago, however, a producer was able to lie about her age and sign up for Instagram as a 13-year-old with no verification. The account was also able to search for "skinny" and other harmful content. And while a prompt came up asking if the user wanted help, the producer instead clicked "see posts" and easily found content promoting anorexia and self-harm.