Facebook exec blames society for misinformation
In an interview with Axios, Facebook veteran Andrew Bosworth refuses to take responsibility for the spread of political and COVID-19 misinformation.
For a trillion-dollar company to make such a broad statement raises eyebrows. Facebook, which has censored Palestinian voices, enabled hate speech against the Rohingya and dismissed Filipina maids' complaints of abuse made on its platform, accepts almost none of the responsibility for the spread of misinformation.
In an interview on "Axios on HBO," Meta's Vice President of Virtual Reality Andrew Bosworth attributed the spread of political and COVID-19 misinformation to the choices of the users spreading it.
"Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing," Bosworth said in the interview.
Asked whether vaccine hesitancy is the result of Facebook's failure to deal with COVID-19 misinformation, Bosworth defended the company, asserting that it ran major information campaigns to encourage vaccination. Opposing views on vaccination are a "choice," according to Bosworth.
"That's their choice. They are allowed to do that. You have an issue with those people. You don't have an issue with Facebook. You can't put that on me," he said.
Facebook kept it under wraps
However, all of that pales beside the recent investigation, which saw Facebook order its employees, following a leak, to keep internal documents private. According to the investigation, Facebook chooses profit over impact.
Facebook whistleblower Frances Haugen expressed concerns about the company's business operations and subsequently testified before the US Congress and UK lawmakers.
The documents revealed several explosive pieces of information about the company's growth strategies, including bids to market its products directly to children. The company's internal research deemed its Instagram platform harmful to the mental health of young girls.
According to Haugen, Facebook has long prioritized profit over global impact.
News reports have recently raised a wide range of questions about the company's operations, including “the impact of its apps on young people's mental health, the company’s knowledge about the aggressive spread of misinformation and hate speech on its platforms, and the measures it took to stop the proliferation of human trafficking operations on its apps.”
Meta's latest endeavor: Human trafficking
Two years ago, Apple threatened to pull Facebook and Instagram from its app store over concerns about the platform being used as a tool to trade and sell maids in West Asia.
After publicly promising to deal with the issue, Facebook acknowledged in internal documents obtained by The Associated Press (AP) that it was “under-enforcing on confirmed abusive activity” that saw Filipina maids complaining on Facebook of being abused. Apple relented and Facebook and Instagram remained in the app store.
In a statement to the AP, Saudi Arabia's Ministry of Human Resources and Social Development said the kingdom, from which 60% of trafficking ads originate, "stands firmly against all types of illegal practices in the labor market" and that all labor contracts must be approved by authorities. While it maintains contact with the Philippines and other nations on labor issues, the ministry said Facebook had never been in touch with it about the problem.
It is true that Facebook disabled over 1,000 accounts on its platforms, but its internal research acknowledged that as early as 2018, the company knew it had a problem with what it referred to as "domestic servitude." It defined the problem as a "form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception."
Mustafa Qadri, executive director of Equidem Research, which studies migrant labor, reasoned: "While Facebook is a private company, when you have billions of users, you are effectively like a state and therefore you have social responsibilities de facto, whether you like it or not."
It remains unclear whether any such program ever began, though Facebook said in its statement to the AP that it delivers "targeted prevention and support ad campaigns in countries such as the Philippines where data suggests people may be at high risk of exploitation." Facebook did not answer specific questions posed by the AP about its practices.