The Apple threat, first reported last month by The Wall Street Journal, shows the potential consequences of Facebook's ongoing difficulties with moderating problematic content, especially in non-English-speaking countries. In an SEC complaint, representatives for Haugen stated that they were interested in learning the truth about Facebook's failure to stop human trafficking and its near loss of access to the Apple App Store. The revelation comes as tensions between Apple and Facebook have risen in recent months over privacy concerns.
Stone directed SME to a letter Facebook sent last summer to several United Nations representatives, addressing human trafficking and the company's efforts to stop it. The letter notes that domestic servitude content is "rarely" reported to Facebook by users.
Facebook's business faces a 'severe' threat
A Facebook employee described the actions the company took over the course of a week to mitigate the threat. These included taking action against more than 130,000 pieces of Arabic-language domestic servitude content on Facebook and Instagram, expanding its policy against domestic servitude, and launching proactive detection tools in Arabic and English.
Despite the chaos of that week, Facebook was well aware that such content existed before the BBC reached out. "Was this issue known to Facebook prior to the BBC inquiry and Apple escalation?" the internal report asked. Its answer: "Yes." Internal documents show that Facebook workers in the Middle East and North Africa region flagged reports about Instagram profiles dedicated to selling domestic laborers as early as March 2018. Those reports were not followed up on "because our policies did not acknowledge the violation," according to an internal report on domestic servitude content from September 2019.
Stone, the Facebook spokesperson, stated that the company had a policy against human exploitation abuses at the time. "We have had such policies for a long time," he said, adding that the policy was strengthened after that point.
Internal documents show that Facebook introduced an expanded "Human Exploitation" policy on May 29, 2019, which included a ban on domestic servitude content related to recruitment, facilitation, or exploitation.
In September 2019, a Facebook employee posted a summary of an investigation into a transnational human trafficking network that used Facebook's apps to facilitate the sale and sexual exploitation of at least 20 victims. The criminal network used more than 100 fake Instagram and Facebook accounts to recruit female victims, and used Messenger and WhatsApp to coordinate transporting the women to Dubai, where they were forced into "massage parlors."
The investigation revealed that $152,000 had been spent on advertisements on Facebook's platforms to promote the scheme, including ads targeting men in Dubai. According to the report, the company deleted all pages and accounts related to the trafficking ring. One of the report's recommended "action items" was to ask Facebook to clarify its policies regarding ad revenue from human trafficking in order to "prevent reputational risks for the company (not profit from ads spent on HT)."
A week later, a second report detailed the problem of domestic servitude abuse on Facebook's platforms more extensively. The document contains sample Instagram ads for workers; one describes a 38-year-old Indian woman being sold for $350. (The company says it has removed the related accounts.)
More recent documents show that despite its efforts, both immediately after the Apple threat and in the weeks and months that followed, Facebook has not been able to stamp out domestic servitude content.
A report distributed internally in January 2020 found that "our platform enables all three stages (recruitment, facilitation, exploitation) via complex, real-world networks." It also identified common naming conventions for domestic servitude accounts, aiding detection. "Facebook profiles, IG profiles and Pages were used by labor traffickers to exchange victims' documentation… promote victims for sale, and arrange buying, selling and other fees," the document stated.
Researchers found that labor recruitment agencies often communicated with victims through direct messages but rarely posted violating content publicly, making them difficult to detect. The report stated that, despite the Philippines being a top source country for victims, Facebook lacked "robust proactive detection methods… of Domestic Servitude in English and Tagalog to prevent recruitment," and that it had no detection capabilities enabled for Facebook Stories. Researchers identified at least 1.7 million users who could benefit from information about worker rights.
"While our past efforts are a beginning to address the off-platform harm that results from domestic servitude, there are still opportunities for prevention, detection, enforcement," the February report stated. Stone said the company has made on-platform interventions to remind job seekers of their rights and provides information through its Help Center for those who encounter human trafficking content.
Even though Facebook researchers have investigated the issue in detail, domestic servitude content can still be easily found on Instagram. Last Wednesday, SME identified multiple Instagram accounts claiming to offer domestic workers for sale, several of them using the common naming conventions, including one account called "Offering Domestic Workers" that featured photos and descriptions of women, listing their age, weight, length of contract, and other personal information. After SME asked Facebook about the accounts, the posts were removed.
Stone said Facebook launched "search interventions" in early 2021 that create friction when users search for keywords related to specific topics "that we have vetted through academic experts." He said the interventions were launched for sex trading, sexual solicitation and prostitution in English, and for domestic servitude and labor exploitation in Arabic.
"Our goal was to help deter people searching for this kind of content," Stone said. "We are continually improving this experience to add links to useful resources and expert organisations."
This article is part of an SME series on The Facebook Papers, which comprise over ten thousand pages of leaked internal Facebook documents. They provide deep insight into the company's internal culture and its moderation of misinformation and hate speech.
The entire series can be viewed here.