2018 has not been a pleasant year for Facebook, and recently, 1,400 leaked pages of the rulebook its moderators use to review reported content revealed several loopholes in the company's content management. The leak showed that there are serious problems not just with the guidelines themselves, but also with how moderation is actually carried out.
The unnamed Facebook employee who leaked the documents reportedly feared that the social network was making too many mistakes. He claimed that Facebook's team of over 7,500 moderators does not review posts vigilantly and has little understanding of linguistic nuance.
A report by The New York Times claims that the moderation guidelines are mainly set by Facebook's own employees, mostly engineers and lawyers, who have little knowledge of the regions they are setting guidelines for. These engineers and lawyers, for example, are likely to have practically no knowledge of political matters in India, Myanmar or Pakistan.
These rules are then sent to third-party companies that hire the moderators. Most moderators come from call centres or other low-paying jobs and have a poor understanding of the rules.
In the Indian context, for example, one Facebook slide tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal. The reality, though, is different: Indian law prohibits blasphemy only in certain conditions, such as when the speaker intends to incite violence.
Another problem is that Facebook's rules appear to be written for English-speaking moderators. The firm clearly lacks local moderators, who would have a better understanding of local context and language. English-speaking moderators generally rely on Google Translate to make sense of non-English content.
Machine translation often strips out context and nuance, distorting the meaning of a post. This is where moderators generally go wrong in understanding the content that appears on the site.
According to the moderators, the entire exercise is frustrating. First, they have to memorise all the rules that define what violates Facebook's community standards, rules which change or get updated regularly. Then, given Facebook's user base of more than 2 billion, they have around 8-10 seconds per post to recall those rules and take action on a piece of content.
Facebook was recently embroiled in another row when a bug exposed users' unshared photos to third-party apps for a period of two weeks.
In fact, since the beginning of the year, the social media giant has fallen prey to one controversy after another.
In April this year, Facebook stated that the personal data of nearly 87 million users may have been 'improperly shared' with political consultancy firm Cambridge Analytica. Facebook CEO Mark Zuckerberg accepted blame for the data breach.
Having failed to protect user data, Facebook faced severe criticism worldwide. The Government of India also sent Facebook a notice seeking details on the data of Indian users leaked to Cambridge Analytica.
Recently, the Italian Competition Authority (AGCM) imposed a penalty of 10 million euros ($11.4 million) on Facebook for illegally harvesting its users' data for commercial purposes. The Authority also asked Facebook to publish an apology on its website and app.