Paris (February 24, 2023) - The United Nations Educational, Scientific and Cultural Organization (UNESCO) concluded its conference on content regulation and moderation on digital platforms, held here from February 21 to 23. UNESCO will present its recommendations in September.
The Inter-American Press Association (IAPA) position was presented by Jorge Canahuati, former president of the organization and head of its Advisory Council. During the panel "Unpacking the implementation of regulating platforms," organized by Research ICT Africa (RIA) on February 21, Canahuati warned about the risk of governments over-regulating.
"In the eagerness to regulate disinformation and other vices that tarnish freedom of expression, abuses can be committed against freedom of the press and journalistic content," Canahuati said. He advised that any application of a legitimate regulatory model " must meet specific requirements: independent and autonomous, free from economic, political, and government pressures." He added: "the first line of defense against disinformation and other vices - hate speech, digital harassment, and all kinds of abuses - should be self-regulation."
The session was moderated by Alison Gillwald of RIA, a digital policy, regulation, and governance think tank based in Cape Town, South Africa. Gillwald summarized research commissioned by UNESCO and conducted by RIA, published as a conference working paper. The first part explains why lies and hate speech proliferate on the internet, the second addresses the problems with platforms' content moderation policies, and the third deals with a hybrid mix of regulatory provisions by all stakeholders.
Maria Donde, Head of International Content Policy at the U.K. communications regulator Ofcom, highlighted the importance of initiatives such as the Global Online Safety Regulators Network.
Pansy Tlakula, of the South African regulator, argued that regulation should build on existing regulators and "not create new ones."
Joan Donovan, a Harvard academic, proposed that companies be required to comply with their own rules, starting with consumer privacy and data protection. She added that technology regulation should focus on processes rather than products.
Nighat Dad, of the Digital Rights Foundation in Pakistan, said problematic laws should be reformed. She cited legislation meant to protect women and children that is instead used against journalists.
The UNESCO conference "Internet for Trust" was attended by more than four thousand people, including government officials, representatives of regulatory bodies, media organizations, digital companies, academics, and journalists.
Related documents:

- Digital governance and the challenges for trust and safety (Part 1)
- Digital governance and the challenges for trust and safety (Part 2)
- Digital governance and the challenges for trust and safety (Part 3)
- Internet for Trust - Towards Guidelines for Regulating Digital Platforms for Information as a Public Good, Paris, 2023, UNESCO
- Windhoek+30 Declaration, May 3, 2021
Other approaches by Canahuati:
"We agree that digital platforms should have more editorial responsibility for distributing their content. However, the first line of defense against disinformation and other vices - hate speech, digital harassment, and all kinds of abuses - should be self-regulation in terms of moderation and content curation."
"The best regulation of freedom of expression in the Americas is expressed in most constitutions. Almost like a carbon copy of the U.S. Constitution, they state that Congresses may not enact laws restricting freedom of the press and freedom of expression."
"Many governments of all political stripes create laws and decrees to limit, prohibit or block information. They illegitimately use rules in defense of national security and national sovereignty, protection of privacy or against discrimination, on crimes of defamation and contempt or insult to the authorities, to control information and criticism."
"We believe that government regulation on content moderation may become an indirect but legal, practical, and illegitimate formula to censor journalistic content distributed by the platforms."
"Platforms must have high standards of editorial responsibility because they not only distribute journalistic content but also generate their position and recommend it in search results. Therefore, they must comply with transparency policies, especially those related to their algorithms and the use of artificial intelligence. Furthermore, they must comply with explicit guidelines on content and act quickly and efficiently in the event of complaints. Finally, they must put universal principles on human rights and freedom of expression before political, corporate, and commercial interests."
"Excessive regulation should not be imposed on digital platforms and other private companies that generate content. Instead, requirements already established in laws and norms that respect human rights, such as those referring to the apology of violence and hate speech, the protection of minors, and the elimination of malicious information from sources without proper identification, should be considered."