Thursday, October 14, 2021, 19:25 (GMT+7)
Some experts believe that the "blacklist" Facebook has built is one-sided and difficult to apply fairly at the scale of the social network.
In an effort to keep terrorist and extremist groups from spreading, Facebook has built up, over the years, a long list of organizations and individuals banned from the social network. Part of the detailed list and policy was recently published by The Intercept, drawing mixed reactions.
The list dates back to 2012, when Facebook began banning terrorist organizations and criminal activity on the platform in response to warnings from the US Congress about the risk of terrorist recruitment and propaganda on the Internet.
Since then, this ban has evolved into the Dangerous Individuals and Organizations (DIO) policy, a set of speech and activity restrictions that applies to nearly 3 billion users worldwide.
In recent years, the DIO policy has been invoked more often as Facebook's influence has grown and political and social unrest has spread on the platform, such as the US Capitol riot on January 6, 2021, or the political upheaval in Myanmar.
However, Facebook has never made public the full list of individuals and organizations banned or restricted by severity, despite requests from advocacy groups and even Facebook's independent Oversight Board. Recently, though, part of the list, along with Facebook's internal DIO enforcement policy, was leaked and published.
3 levels in the DIO list
The DIO list names thousands of individuals and organizations categorized by activity, including crime, terrorism, hate speech, militarized social movements, and violent non-state actors. These groups are further classified into three levels under Facebook's policy update from June, each with corresponding enforcement measures.
Regardless of level, no organization or individual on the DIO list may maintain a presence on Facebook's platforms, and users may not present themselves as representatives of the named organizations. The levels determine what users are allowed to say about these organizations and individuals.
At level 1, users may not praise or support a listed entity, even for non-violent activities. Level 1 includes terrorist, criminal, and hate organizations and their members.
At level 2, users may praise a group's non-violent activities but may not offer "substantive support" to the named entities themselves. Level 2 covers violent non-state actors.
At level 3, users may discuss the listed entities freely. Level 3 includes militarized social movements: groups that have not yet acted violently but frequently engage in hate speech, show a high potential for violence, or repeatedly violate the DIO policy.
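The three-level speech rules described above amount to a small rule table. The following is a hypothetical sketch of that structure; the level names and restrictions follow the leaked policy as reported by The Intercept, but the data model and function names are illustrative, not Facebook's actual system:

```python
# Illustrative model of the three DIO levels and the speech each permits.
# This is a sketch based on the reported policy, not Facebook's real code.

LEVEL_RULES = {
    1: {  # terrorist, criminal, and hate organizations
        "may_praise": False,   # no praise, even of non-violent activity
        "may_support": False,
    },
    2: {  # violent non-state actors
        "may_praise": True,    # praise of non-violent activity allowed
        "may_support": False,  # no "substantive support"
    },
    3: {  # militarized social movements
        "may_praise": True,    # users may discuss these groups freely
        "may_support": True,
    },
}

def post_is_allowed(level: int, praises: bool, supports: bool) -> bool:
    """Return True if a post about a listed entity passes its level's rules."""
    rules = LEVEL_RULES[level]
    if praises and not rules["may_praise"]:
        return False
    if supports and not rules["may_support"]:
        return False
    return True
```

For instance, a post praising a level-1 group would be removed, while the identical post about a level-3 group would be permitted. (Note that hosting an account for any listed entity is banned at every level; the rules above only govern what other users may say.)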
Partial list of banned terrorist organizations on Facebook. Source: The Intercept
According to The Intercept, terrorist entities make up 53.7% of the DIO list, followed by militarized social movements (23.3%), hate groups and individuals (17%), criminals (4.9%), and violent non-state actors (1%).
Some experts and commentators argue that Facebook's DIO list hews too closely to US government foreign policy, particularly because many of the organizations named at level 1 are drawn directly from government databases and several controversial watchlists.
In addition, some say the list is more lenient toward far-right and white-supremacist groups than the threat level previously identified by US security agencies would warrant. There are also entities not known to have committed violent or hateful acts that appear on the DIO list simply because they are linked to geopolitical opponents of the US.
Responding to these criticisms, Facebook said it applies careful criteria when classifying the named organizations and individuals. The company also maintains that the DIO list is more comprehensive and detailed than the lists kept by many governments, thanks to advice from independent scholars and experts.
Problems in applying the policy
While Facebook publicly discloses a simplified version of its content policy, the internal policy and enforcement processes are in fact extremely complex, with many examples, rationales, and handling criteria that are difficult to tell apart.
Facebook's global content moderation and operations teams must use this internal guidance to identify the groups and individuals mentioned in user posts, determine each post's stance, and decide how to handle it, working alongside automated content-filtering systems.
A section of Facebook's internal content moderation guide, with many examples. Source: The Intercept
According to one moderator working outside the US, judging the nature of content and deciding how to handle it is a never-ending battle. Statements are usually context-dependent, so moderators must spend considerable effort reviewing whether the context in which content was posted makes it acceptable.
In addition, moderators' decisions are compared against one another, adding a further difficulty: each moderator must also consider how others would handle the same content.
Content moderation becomes even harder in the context of protests or socio-political upheaval in certain countries. Faiza Patel, an expert at the Brennan Center for Justice, notes that some organizations on the DIO list play a significant role in the socio-political reality of those countries, so blanket removal of all content deemed "supportive" suppresses public discussion. Commentary on geopolitical or military situations likewise becomes difficult to moderate under Facebook's guidelines.