Concept Note

1. Context

In accordance with its mandate and in response to ongoing debates on the impact of digital technologies on the exercise of human rights, the Inter-American Commission on Human Rights (IACHR) and the Office of the Special Rapporteur for Freedom of Expression (RFOE) are focused on promoting guarantees for freedom of expression in digital spaces, monitoring possible limitations, and warning about possible links between the use of stigmatizing speech online and violent actions against certain groups.

The main challenges identified relate to the deterioration of public debate, as reflected in the increase in digital and physical violence against certain people and groups who exercise their right to freedom of expression, deliberate disinformation, and the inadequacy of companies' policies when measured against democratic and human rights principles.

Understanding that the internet is an indispensable instrument for the full exercise of human rights, including freedom of expression, freedom of association, and other rights of an economic, cultural, and political nature, concrete actions are required to facilitate the conditions of access to, use of, and benefit from the internet and technology itself. This includes a transversal approach that addresses the particular deficiencies and vulnerabilities of historically discriminated groups, as well as the pace and reach of constant technological innovation.

In this regard, the IACHR has entrusted the RFOE with carrying out a multistakeholder inter-American dialogue (the "Dialogue of the Americas") structured around three thematic axes:

  • The deterioration of public debate
  • Digital literacy for the development of civic skills
  • The compatibility of content moderation policies with human rights standards

During the different phases of the Dialogue, the RFOE will address each of the thematic axes from two perspectives on the exercise of the right to freedom of expression by internet users: the first, that of users who access information; the second, that of users who produce content. Each thematic axis requires particular treatment under both perspectives, according to the respective needs involved.

This note briefly summarizes what is meant by Content Moderation. It also points out, without being exhaustive, themes and sub-themes that will help frame and delimit the scope of the Dialogue.

2. Definition

Content Moderation is the organized practice of filtering content generated and viewed by users and posted on internet sites, social networks, and other online media. There are several forms of Content Moderation: pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. The process can be carried out, for example, through the direct action of a person (the moderator, who acts as the platform's agent) or by an automated process based on artificial intelligence combined with the processing of large amounts of user data. Modern Content Moderation processes draw on multiple sources, from keyword or image processing to the initiative of other users who use alert mechanisms to request that content be moderated. Content Moderation does not always result in content removal. It can also determine how content is prioritized or whether it appears in search results at all. In this way, Content Moderation affects not only what a user is allowed to post (user as "Creator"), but also what a user is allowed to see, consume, buy, and believe (user as "Recipient").
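To make these mechanisms more concrete, the following is a minimal, purely illustrative sketch of an automated moderation step of the kind described above. It does not describe any platform's actual system: the keyword list, the reporting threshold, and the decision labels are hypothetical simplifications, and real systems combine machine-learning classifiers, human review, and appeal processes.

    # Minimal illustrative sketch of an automated moderation decision (Python).
    # All rule names, thresholds, and labels are hypothetical simplifications.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        user_flags: int  # number of times other users reported the post

    BLOCKED_KEYWORDS = {"example-banned-term"}  # hypothetical keyword list
    FLAG_REVIEW_THRESHOLD = 5                   # hypothetical reporting threshold

    def moderate(post: Post) -> str:
        """Return 'remove', 'downrank', or 'allow'."""
        words = set(post.text.lower().split())
        if words & BLOCKED_KEYWORDS:
            return "remove"      # keyword match: content is taken down
        if post.user_flags >= FLAG_REVIEW_THRESHOLD:
            return "downrank"    # many user reports: visibility is reduced
        return "allow"

    # A post that is not removed may still be deprioritized, illustrating
    # that moderation is not limited to removal.
    print(moderate(Post(text="an ordinary comment", user_flags=0)))        # allow
    print(moderate(Post(text="a heavily reported comment", user_flags=7)))  # downrank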

For this reason, for the purposes of the Dialogue of the Americas, Content Moderation includes all those regulations, decisions, or processes that an internet platform adopts internally to control the type, scope, and way in which content created by third parties flows ("Joint Declaration on Freedom of Expression and 'Fake News', Disinformation and Propaganda", Office of the Special Rapporteur for Freedom of Expression, 2017). Content Moderation also refers to any interference that has the potential to privilege or silence part of the democratic debate, including, for example, through clustering processes ("filter bubbles"), among others.

3. Categories and subcategories

To analyze the main elements that influence Content Moderation on the internet, it is proposed to break the analysis down according to the two main roles of internet users: those who access and receive information ("Recipients") and those who create content ("Creators"). This distinction is made with full awareness that, given the very nature of the internet, people alternate between the roles of creator and recipient across their many different uses of technology.

Within each approach, there are categories of actions that define and influence Content Moderation, which may include, but are not limited to:

  • Filter / Promote: actions on content in general, and on unprotected speech (for instance, hate speech) in particular, including the measures and policies governing its handling on platforms and the actions of States to limit the dissemination of content.
  • Block: actions that prevent access to information or make it more difficult to disseminate online.

These categories (general types of actions) and subcategories (impacted content) are not exhaustive and more may be added during the Dialogue of the Americas.

Category: FILTER / PROMOTE
Subcategories:
  • Deliberate disinformation
  • Hate speech
  • War propaganda
  • Incitement to illegal actions

Category: BLOCK
Subcategories:
  • Website blocking
  • Internet slowdowns and blocking

4. Challenges

Each of the three thematic axes of the Dialogue of the Americas presents particular challenges. This does not mean that these are the only related problems, but rather that they are issues that must be analyzed in order to build comprehensive and sustainable proposals for action. This division is artificial and may change during the Dialogue of the Americas, but it is intended to cover the challenges already documented by the IACHR and the RFOE.

In relation to Content Moderation on the internet, these challenges are divided into three broad categories: political / legal, technical and social.

  • Challenges of a political/legal nature include those related to the actions of the State, the private sector, and organizations in a position of power to influence legislative reforms and public policies that regulate the circulation of content on the internet and content moderation policies, as well as the adoption of proportionate and necessary measures in the face of speech that threatens safe spaces for freedom of expression. They also include the political will to respect and not criminalize freedom of expression, and the establishment of alliances for the implementation of political decisions.
  • Challenges of a technical nature include the practices and processes that sustain content moderation, as well as those that guarantee the stability of connections and the integrity of the spaces in which content is distributed. These challenges are shaped by the States and by the public or private institutions that make decisions about policies and internal tools for content analysis, about restorative measures for improperly removed content, and about guarantees of connection stability at key moments of content dissemination.
  • Challenges of a social nature. On one hand, the internet has been shown to be an essential platform for public deliberation. On the other hand, there are fears that its use for certain purposes will accelerate violence, or that decisions made on the internet will diminish the voice of those who wish to express themselves. This feeds a generalized perception of excess and censorship of speech, which increases the anxiety to ensure that one's own voice is the only relevant one. Examples of this type of challenge will be explored in greater depth during the Dialogue.

Challenges of an economic and cultural nature are also recognized; they cut across the three categories identified above.

Nature: Political
Examples of challenges:
  • The discretion and degree of transparency in determining which content may be filtered or obstructed, and therefore not protected, and whether the filtering or obstruction actions are necessary, proportionate, and consistent with human rights standards.
  • The pressure that States can exert on social media platforms or intermediaries so that these actors conform to the content moderation criteria proposed by the States.

Nature: Technical
Examples of challenges:
  • The adoption of positive measures so that populations in vulnerable situations can share content without being criminalized and have the technical means to do so.
  • The degree of efficiency and contextual analysis needed to reach better determinations (reducing errors) in content moderation consistent with human rights.
  • The quality of the databases that inform algorithms and the "quality" of the algorithms themselves.