Response from IT for Change to the 'UNESCO Guidance for Regulating Digital Platforms: A Multistakeholder Approach'

UNESCO has released the initial draft of its Guidelines on Regulating Digital Platforms: A Multistakeholder Approach (Guidance Document) for inputs and observations. The document is meant to serve as guidance on the regulation, co-regulation, and self-regulation of digital platforms, “with the aim of supporting freedom of expression and the availability of accurate and reliable information in the public sphere, while dealing with content that potentially damages human rights and democracy”.

The Guidance Document focuses specifically on the structures and processes for regulating online content on these platforms so that users can have a ‘safer, critical and self-determined interaction’ with it. The Document also places great emphasis on the role of an independent regulator that sets the goals of online content regulation and oversees platform compliance, without intervening in individual content decisions.

In a detailed submission, IT for Change provided comments and recommendations on the Guidance Document, drawing on several years of research on the digitally transformed information ecosystem, the platform-mediated and algorithm-driven public sphere, and the legal, institutional, and design safeguards necessary to secure a just and equitable online space that fosters democratic values.

The key areas of our submission are outlined below:

  1. Definition of digital platforms: The Guidance Document does not define the term ‘digital platforms’; it only gives a few examples, such as social media networks, search engines, and content-sharing platforms. Different platforms perform different functions, exercise different levels of control over the content they host, and operate at different scales of reach, which necessitates different regulatory approaches. In its current form, the framework and obligations prescribed in the Guidance Document pertain most aptly to the regulation of large social media companies that host user-generated content and perform moderation and curation functions. Our comments therefore address the regulation of online content on large social media platforms, which we refer to as ‘digital content platforms’.
     

  2. Responsibilities of the government: First, all direct and indirect government restrictions on online content should follow due process, and the reasons for such restrictions should be made public. Second, governments should take legislative and policy measures that require digital content platforms to respect users’ rights to freedom of expression, information, and equality and non-discrimination, and should hold platforms liable for actions that threaten user rights or facilitate various forms of online harm, including online gender-based violence.
     

  3. On the transparency processes of digital content platforms: The Guidance Document should require digital content platforms to make more robust disclosures about their policies, practices, and operations. This should cover not only the means employed for content moderation, as the Document already requires, but also the algorithmic tools used for content curation; the platform’s policies and practices on the placement of advertisements alongside content; the number, expertise, and employment status of human moderators; and details of the complaints the platform receives, all with greater nuance and attention to the local and cultural specificities of the country concerned.
     

  4. On content management policies: Content management policies should clearly define, and inform users about, the types of content liable to be taken down under the mandate of the law and the types that will be restricted because of their harmful nature despite being lawful. Platforms should disclose the criteria they use to determine whether content that is legal is nonetheless harmful enough to be restricted, and what actions will be taken with respect to such content. Further, content management policies should take into account the particular context of the country in which the platform operates, especially the unique disadvantages and locally specific forms of online abuse faced by social groups at the intersection of marginalized identities.
     

  5. On the enabling environment and content that potentially damages democracy and human rights: Digital content platforms should conduct periodic risk assessments and submit the resulting reports to the independent regulator in order to identify and address any actual or potential harm or human rights impact of their operations. Such risk assessment is necessary to check the platform’s role in facilitating content that potentially damages democracy and human rights, including mis- and disinformation and hate speech. Further, digital platforms hosting news, such as news apps and news aggregators, as well as social media platforms on which news content is shared, should optimize their curation algorithms for diversity in order to advance the goal of securing information as a public good.
     

  6. On user reporting: Users should have access to all information related to reporting and grievance redressal, including the details of the representative appointed by the digital content platform in their country, the platform’s policies in a digestible format and in all relevant languages, and the recourse and appeal mechanisms available for content- or account-related decisions taken by the platform.
     

  7. On media and information literacy: Digital content platforms should implement specific media and information literacy measures for women, children, youth, and indigenous groups, as well as for gender and sexual minorities.
     

  8. On data access: In the interest of transparency leading to accountability, the regulator should encourage, and where necessary require, digital content platforms to share their data with independent researchers, auditors, supervisory authorities, and other external stakeholders.
     

  9. On interoperability and data portability: To minimize internet fragmentation and promote diversity of information and viewpoints, platforms should allow for data portability and interoperability. The concerned regulators must lay down standards for both and oversee their implementation.
     

  10. On the regulator’s constitution and powers: The regulator should be multi-sectoral, and the Guidance Document should elaborate on the composition and qualifications of its members, the civil measures it can take against erring platforms, the details of any investigation it carries out or entrusts to others, and the transparency expected of its prioritizations and decisions.
     
Read our full submission here.
