How Facebook drives skeptics towards denying climate change
A human rights organization has suggested that Facebook pushes disinformation and conspiracy content to users who are climate skeptics.
According to a report issued on Wednesday by Global Witness, Facebook's algorithm exacerbated doubts regarding climate change rather than directing users to credible information.
According to Facebook, its procedures are "designed to reduce misinformation."
Researchers created two user accounts: "Jane," a climate skeptic, and "John," who followed recognized scientific groups.
They then observed what Facebook's algorithm recommended to each account. Jane quickly came across content denying man-made climate change, including pages calling it a "hoax" and denouncing efforts to mitigate it.
Examples included posts accusing the "green movement" of "enslaving humanity" and calling the United Nations "an authoritarian regime with less legitimacy than Bugs Bunny." Other posts, such as one from CFact Campus, disputed that humans have any impact on the climate.
The organization is affiliated with the Committee for a Constructive Tomorrow, a Washington-based libertarian think tank that opposes the scientific consensus on climate change.
The researchers used Jane's account to "like" a Facebook page promoting climate disinformation as a "starter" page, then repeated the procedure twice more, each time selecting a page with at least 14,000 followers that cast doubt on the existence of climate change or its human causes.
Jane "liked" a Facebook page named I Love Carbon Dioxide in one scenario.
Some of the content she was shown misrepresented a 2009 statement by former US Vice President Al Gore, who, citing climate experts, said that "there is a 75% chance that the entire North Polar ice cap during some of the summer months could be completely ice-free within the next five to seven years."
Although the statement mischaracterized the findings it cited, it was not a prediction that "all ice would melt by 2013."
Over the course of two months, the researchers say, Jane was recommended increasingly conspiratorial and anti-science content.
Only one of the pages recommended to her account was devoid of climate-change misinformation.
Two-thirds of the pages lacked a warning label directing users to Facebook's climate-science center, an information hub established last year after Meta CEO Mark Zuckerberg told a US congressional committee that climate misinformation was a "big issue."
Meanwhile, John's account began by liking the page of the United Nations' climate science body, the Intergovernmental Panel on Climate Change (IPCC).
In contrast to Jane, John was continuously exposed to trustworthy science-based information.
As the simulation progressed, Facebook began to recommend ever more extreme and fringe content to Jane, including conspiracy theories about "chemtrails" - the false claim that condensation trails left by planes contain chemical agents capable of controlling the weather.
In other areas, such as gender-based abuse, Facebook's algorithm has been found to push people into rabbit holes, where content becomes increasingly fringe as users engage with posts on a given topic.
According to the IPCC, misinformation is one of the obstacles preventing governments and the public from tackling climate change. Its most recent report, endorsed by 195 nations, emphasizes how climate disinformation "undermines climate science and disregards risk and urgency."
Facebook has claimed that its system is "designed to reduce misinformation, including climate misinformation, not to amplify it."
However, another recent study, conducted by the Center for Countering Digital Hate and the Institute for Strategic Dialogue, found that less than 10% of misleading posts on the platform were labeled as misinformation.
According to Global Witness researcher Mai Rosner, "Facebook is not just a neutral online space where climate disinformation exists - it is quite literally putting such views in front of users' eyes."
Facebook, which has censored Palestinian voices, enabled hate speech against the Rohingya and dismissed Filipina domestic workers' complaints of abuse made on its platform, accepts almost none of the responsibility for the spread of misinformation.
In an interview with Axios, Andrew Bosworth, Meta's vice president for virtual reality, attributed the spread of political and COVID-19 misinformation to the choices of the users who share it.
Facebook has also been blocked in Russia: Moscow's media regulator, Roskomnadzor, cited repeated instances of what it called discrimination against Russian state media, as well as the platform briefly allowing posts inciting hatred against Russians and the Russian government.