What happened to the “Internet for trust”? Report on the UNESCO Conference*
#elections #freedom of expression #gender violence #misinformation #political violence

The “Internet for Trust” Conference, organized by UNESCO, addressed the regulation of platforms, an important topic for democracies, lives and agencies around the world. There is a need for more spaces for the exchange of ideas and experiences, especially from an intersectional and Global South perspective, in the formulation of Guidelines for the Regulation of Digital Platforms, considering that the topic involves enormous complexities. Furthermore, it is important that these Guidelines be the product of a careful process that reinforces the human rights to freedom of expression and access to information.
Several national and global initiatives aimed at formulating policy solutions to combat the proliferation of fake news, defamation and hate speech online are underway. The issue is gaining momentum beyond digital rights and technology activism circles, especially in the wake of the assault on the U.S. Capitol on January 6, 2021 and the attack on the headquarters of Brazil’s three branches of government (Legislative, Executive and Judicial) on January 8, 2023.
UNESCO (United Nations Educational, Scientific and Cultural Organization) is leading one of these initiatives and, from February 21 to 23, 2023, organized the “Internet for Trust” conference in Paris, France. The aim of the conference was to discuss draft global guidelines for the regulation of platforms focused on protecting the human rights to freedom of expression and access to information. The event, held in hybrid form (in person and online), was attended by more than 4,000 people, including government officials, regulators, academics, journalists, and private sector and civil society representatives.
The organizations Coding Rights, Centro de Estudios en Libertad de Expresión y Acceso a la Información de la Universidad de Palermo – CELE (Argentina), Derechos Digitales (Chile), Fundación Karisma (Colombia), Hiperderecho (Peru), InternetLab (Brazil) and TEDIC (Paraguay), all members of the AlSur consortium, were present.
Coding Rights had the opportunity to follow the UNESCO Conference in Paris and, in this brief contribution, we try to give an account of some of the spaces and speeches that caught our attention, especially from a Latin American perspective. The complete program, materials and videos of the “Internet for Trust” Conference are available here. During the three days, the debates were guided by version 2.0 of the guidelines document for the regulation of digital platforms prepared by UNESCO.
The document proposed, in addition to the guidelines themselves, principles for regulation that assign responsibilities to digital platforms beyond the duty of transparency regarding tools, systems and processes for content moderation and curation. During the meeting in Paris, the document drew criticism, ranging from its use of vague legal concepts to the hasty way in which it was drafted, which hindered broad participation by different sectors, especially civil society from the Global South.
Day 0, Side Events
On Day 0, side events were held, most conducted by partner organizations and some by UNESCO itself. Among them was the panel “Transparency, content moderation and freedom of expression. Multistakeholder perspectives from Latin America”, organized by OBSERVACOM and UNESCO, with the participation of Amalia Toledo (Wikimedia Foundation), Bia Barbosa (Diracom and Advisor to the Internet Steering Committee in Brazil) and Gustavo Gómez (Executive Director of OBSERVACOM), among others, whose contributions focused on the possibilities and effects of content regulation in this part of the American continent.
During the debate, the Special Rapporteur for Freedom of Expression of the Inter-American Commission on Human Rights (IACHR), Pedro Vaca Villareal, voiced the IACHR’s concern over the deterioration of public debate and the lack of participation of governments in the discussions on the theme of the Conference. He also pointed out the need to invest in digital literacy. The private sector representative, Raúl Echeberría, warned that the issue is complex and that it would be “naive to think that there is an alternative universe where we can solve problems on the Internet that we cannot solve in the real world”.

Catalina Botero Marino, member of Facebook’s Oversight Board, argued that it is necessary to deconcentrate the regulation of platforms through three components: a regulatory one led by the State, one of co-regulation (such as oversight boards) and a third of self-regulation by the companies. In addition, she said it is essential that regulatory initiatives follow a truly multisectoral process that includes not only “organizations that work with freedom of expression”, but also “organizations that work with women, with indigenous people, with racialized people”.

The director of the Center for Studies on Freedom of Expression and Access to Information (CELE) at the University of Palermo, Agustina Del Campo, listed the legal problems of the latest version of the draft proposed by UNESCO, including the lack of distinction between the types of services offered by digital platforms (cloud computing, messaging, etc.) and their business models (for-profit or not-for-profit), the lack of clarity on the objectives of the regulation and the lack of definition of how oversight of the obligations to be imposed on digital platforms will be carried out.
At the closing of Day 0, Harvard Business School professor emerita Shoshana Zuboff made important remarks under the title “Information as a Public Good: What Platform Regulation for a Troubled Digital Age?”. While acknowledging UNESCO’s regulatory efforts, she asserted that none of the principles presented would be a match for the economic power of digital platforms, and that what she calls human commodification, a product of the massive extraction and sale of personal data in the current stage of capitalism, cannot be regulated (and therefore must be eliminated).
Days 1 and 2, “Internet for Trust” Conference
The “Internet for Trust” Conference opened on February 22 with messages from Audrey Azoulay, Director-General of UNESCO, the Prime Minister of Iceland, Katrín Jakobsdóttir, and the President of Brazil, Luiz Inácio Lula da Silva. The Brazilian president highlighted the need for balance between, on the one hand, the individual exercise of freedom of expression and, on the other, the right of access to reliable information, as well as the need to legally regulate the activities of the platforms.
“(…) Regulation must guarantee the exercise of individual and collective rights. It must correct the distortions of a business model that generates profits by exploiting users’ personal data. To be more efficient, the regulation of digital platforms must be designed with transparency and social participation. At the international level, this issue must be coordinated multilaterally”, President Luiz Inácio Lula da Silva
Philippine journalist Maria Angelita Ressa, 2021 Nobel Peace Prize Laureate, delivered the opening lecture and recounted her courageous trajectory in the fight against fake news amid the political persecution waged against her by the Philippine government.
In addition to the Nobel Peace Prize Laureate, the inaugural round table was attended by Melissa Fleming (UN Under-Secretary-General for Global Communications) and Brazilian youtuber Felipe Neto. With more than 44 million followers, the influencer emphasized the need to address the logic of “recommendations”, demanding transparency from digital platforms.
The conference featured nine sessions and more than forty speakers on its panels. Of particular note was the contribution of the UN Special Rapporteur for Freedom of Expression, Irene Khan, in Session 3 – Promoting freedom of expression and information in the digital ecosystem. Khan warned that some governments are already moderating content by creating increasingly restrictive regulations, and pointed out the risks of using vague terms in the document, such as “democracy”, which may be defined differently from one country to another when applied to a specific case. In addition, she stressed the need for companies to adopt human rights standards, as well as the need for more time, consultation and listening to different sectors on the regulation of digital platforms, reinforcing the need to bring companies to the table in the debate.
“I am sorry to say: it is very dangerous to let governments regulate content, because the tendency will be a lot of censorship. We have already seen it with many laws where governments try to regulate content that affects the sentiment of religious groups (…), or that is offensive to the image of political leaders holding high office. Those are not legitimate grounds under international law. So you have to be very careful. On the other hand, it is being left to companies, to private entities with commercial intentions, to regulate what they think is harmful. That is equally dangerous. And that is why you have to go back to international human rights standards and the development of those standards in the UN, in the UN system and in other international bodies, where there are very clear standards on how to identify hate speech,” Irene Khan.
On February 23, Session 9: Defining the Way Forward closed the conference’s discussions. Participants included Kerri-Ann Jones (Deputy Secretary-General of the OECD), Roberto Viola (Director-General of DG CONNECT, European Commission) and Agustina del Campo (CELE), among others.
Luís Roberto Barroso, a Justice of the Brazilian Supreme Federal Court, also took part in the session and defended the regulation of the activities of digital platforms. He attributed the need for regulation to economic reasons (in order to tax the activity), to the protection of privacy and to the fight against what he called “coordinated inauthentic behavior, as well as illegal content and, in some cases, disinformation”. At the end of his speech, he said that the fight against disinformation is “a war of truth against falsehood, of trust against discredit, of good against evil”.
Meta’s representative, Miranda Sissons, called for a more clearly outlined drafting process and questioned how the UNESCO document relates to other UN-led processes, such as the Global Digital Compact and the Code of Conduct for Information Integrity on Digital Platforms.
Peggy Hicks, Director at the Office of the UN High Commissioner for Human Rights, emphasized the need to contextualize the discussions, as the impacts differ across regions of the world and the UNESCO Guidelines cannot take a “one size fits all” approach.
In his closing remarks, Tawfik Jelassi, Assistant Director-General for Communication and Information at UNESCO, indicated that UNESCO will continue to collect contributions for the drafting of a new version of the document and will hold regional consultations to discuss the Guidelines. The Guidelines are expected to be finalized by the end of September 2023.
Challenges for international regulation of digital platforms
UNESCO’s initiative to hold the multistakeholder conference “Internet for Trust” is commendable, as it brings to the debate such an important issue for democracies, lives and agencies around the world. We need more spaces for the exchange of ideas and experiences, especially from an intersectional and Global South perspective, considering that the regulation of digital platforms is an issue that involves “technical, legal, emotional, political and procedural complexities”, as highlighted by Agustina Del Campo during the Conference.
The topic requires much reflection, debate, listening and participation from different sectors for the formulation of an instrument that respects international human rights normative instruments and is adapted to the different realities of the world, where, unfortunately, democracy is not a constant.
Several normative initiatives are underway. European countries will soon be applying the Digital Services Act and the Digital Markets Act, Germany has the NetzDG, and the UK is debating the Online Safety Bill, for example. Brazil has accumulated experience on the subject around the Marco Civil da Internet and Bill 2.630/2020; in addition, the new government intends to put forward new regulatory proposals. Listening to and reflecting on the voices of the South is extremely important. Not by chance, there was a strong presence of Brazilians and Latin Americans at the Conference, which shows the relevance of the topic in the region.
Even so, it is important to bring into the discussion, and provide real conditions for participation to, the groups that directly and daily suffer online violence, such as the women’s movement, Indigenous peoples, Black communities and the LGBTQIA+ population, in order to build fair and effective regulation that is attentive to the human rights of all. In this sense, it is necessary to make the Guidelines available in a wider variety of languages and to seek clearer definitions and legal frameworks. It is also important to define how the different global initiatives relate to each other, especially those led by the United Nations itself.
The challenges are enormous, and the Guidelines must not be merely an immediate and circumstantial response to recent dark anti-democratic events, such as the one in Brazil. A drafting process that is not careful and does not allow for the broad participation of the various sectors may result in a document whose effects run contrary to strengthening the rights to freedom of expression and access to information, serving as ammunition for anti-democratic governments to become even more authoritarian.
At the same time, it is necessary to recognize that platforms already exercise content moderation. Coding Rights’ research “Visibilidade Sapatão nas Redes”, by Ivanilda Figueiredo and Joana Varon, shows, among other findings, that content related to lesbian visibility is controlled on social networks and that digital platforms’ policies lack transparency, reinforcing the argument that these companies already concentrate enormous power in their hands.
Finally, the Internet for Trust Conference provided a rich debate, especially on the issue of content moderation. However, there is an urgent need for a debate on regulating the platforms’ business model, which is based on the massive exploitation of data, the segmentation, recommendation and monetization of content and behaviors, data monopoly and market concentration. Addressing these issues is essential to combat the various power asymmetries of the present, which already compromise our future.
* Text by Vanessa Koetz, originally published at AL SUR