The Oversight Board, a body set up to review content moderation decisions by Meta (formerly Facebook), has questioned the company’s investment in moderating content in languages other than English, pointing to the small number of user appeals from India and other countries.
In its first annual report, published on June 22, the Board, which was set up in 2018 by Meta as a quasi-judicial body to oversee content moderation on its platforms, noted that 49 percent of the appeals came from the US and Canada.
In contrast, only 14 percent came from Latin America and the Caribbean, 9 percent from Asia Pacific and Oceania, 4 percent from the Middle East and North Africa, 2 percent from Central and South Asia, and 2 percent from Sub-Saharan Africa.
Notably, India has more Facebook and Instagram users than any other country.
While recognising that the number of user appeals does not reflect the total number of Facebook and Instagram users in a country, the Oversight Board said: “Our decisions so far, which covered posts from India and Ethiopia, have raised concerns about whether Meta has invested sufficient resources in moderating content in languages other than English.”
The Oversight Board noted that in India and surrounding countries, where there is a mismatch in the number of users and user appeals, people may not be aware that they can appeal to the Board against Meta’s content moderation decisions.
“We also do not believe that the distribution of appeals data on this map reflects the actual distribution of content moderation issues around the globe. If anything, we have reason to believe that users in Asia, Africa, and the Middle East experience more, not fewer, problems with Meta’s platforms than other parts of the world,” the Oversight Board added.
The Board also said Meta had ‘committed’ to translating its Community Standards into several languages spoken in India. “Once completed, more than 400 million more people will be able to read Facebook’s rules in their native language,” the report read.
Meta’s alleged lack of resources for moderating non-English content was also highlighted earlier in eight complaints that former Meta employee turned whistleblower Frances Haugen filed with the US Securities and Exchange Commission.
One of the complaints, citing internal records, blamed Facebook users affiliated with the Rashtriya Swayamsevak Sangh (RSS) for allegedly spreading hate in Hindi and Bengali.
An excerpt from an internal Meta document cited in the SEC complaint read: “Our lack of Hindi and Bengali classifiers means much of this content is never flagged or actioned, and we have yet to put forth a nomination for designation of this group given political sensitivities.”
The questions Meta did not answer the Oversight Board
In its annual report, the Oversight Board highlighted two India-related cases.
The first concerned a video post by Global Punjab TV, featuring a 17-minute interview with a professor, which was shared by a user. The caption accompanying the video alleged that the “Rashtriya Swayamsevak Sangh (RSS) and India’s ruling party Bharatiya Janata Party (BJP) were threatening to kill Sikhs, a minority religious group in India.”
After being reported by a user, a human moderator removed it on the grounds that it violated Facebook’s Dangerous Individuals and Organisations Community Standard. This action also led to restrictions on the account of the user who shared the content.
After the user appealed Meta’s decision to the Oversight Board, and after the board selected the case, Meta restored the content, “conceding that its initial decision was wrong”.
In relation to this matter, the Board said that Meta did not answer two questions: “The first question asked what specific language in the content caused Meta to remove it under the Dangerous Individuals and Organizations Community Standard. Meta responded that it was unable to identify the specific language that led to the erroneous conclusion…” the Board said.
“The second question included asking how many “strikes” users need for Meta to impose an account restriction, and how many violations of the Dangerous Individuals and Organizations policy are required for account-level restrictions. Meta responded that this information was not reasonably required for decision-making…” it added.
The second case involved a photo posted by a user in a Facebook group, depicting a man holding a sheathed sword, with accompanying text that described President Emmanuel Macron of France as the “devil”.
Facebook removed the content for violating its policy on violence and incitement. However, the Oversight Board overturned the decision.
The Board said Meta had failed to answer a question. “The Board’s question asked whether Meta had previously enforced violations under the Violence and Incitement Community Standard against the user or group. Meta responded that information about the user’s previous behavior on the platform was irrelevant to the Board’s determination.”
In conclusion, the Board said, “Despite issues with Meta’s responses to our first recommendations, both Meta and the Board have taken action to improve the recommendations process during 2021.
“While this work is already producing results, key questions remain. How can we make our recommendations more meaningful? How can we work with Meta in this area without compromising our independence? And, crucially, how can we ensure that Meta honors its commitments, through actions that can be measured by the Board and felt by people across the world?” it added. Moneycontrol