Defining a policy framework to respond to infodemics

First published on Sunday Business & IT, August 2, 2020.

As early as February this year, the World Health Organization (WHO) warned that the coronavirus disease 2019 (Covid-19) outbreak was accompanied by a massive "infodemic." The plethora of information makes it difficult for people to find trustworthy sources and reliable guidance when they need them. I watched the viral video of United States-based, Nigerian-trained doctor Stella Immanuel before Facebook, YouTube and Twitter took it down for promoting misinformation. Dr. Immanuel claimed to have treated over 350 Covid-infected people in the US using hydroxychloroquine (HCQ), zinc and Zithromax. US President Donald Trump even shared multiple versions of the video with his 84 million Twitter followers. I flagged it as false news on Facebook because the doctor said, “You don’t need a mask. There is a cure.” I think Dr. Immanuel overstated her case in claiming that HCQ was a cure and that face masks were unnecessary. Half-truths about HCQ shared with false contextual information are a type of misinformation.

The New England Journal of Medicine reported that, as of June 1, 2020, there were 203 registered Covid-19 trials involving HCQ, 60 of which focused on prophylaxis. During a pandemic, doctors cannot afford to wait for hard evidence when faced with patients who might die. In fact, coronavirus patients in the Philippines who need the investigational drug remdesivir can obtain special “compassionate permits” from the Food and Drug Administration, yet conspiracy theories surround the use of the anti-malarial drug HCQ against Covid-19.

“It’s easy to dismiss conspiracies, but we have to understand why they’re taking hold,” according to First Draft’s co-founder and US Director Dr. Claire Wardle. “We have to recognize that there’s a lot of misinformation and people are becoming nodes for this because they’re scared.”

At RightsCon Online, the Forum on Information and Democracy created a working group on the fight against the infodemic and the response to information chaos. The group is co-chaired by Rappler Chief Executive Officer Maria Ressa and Marietje Schaake, former member of the European Parliament and International Policy Director at Stanford’s Cyber Policy Center. Final recommendations will be presented to the signatory States of the International Partnership on Information and Democracy.

Participation in the inaugural working group does not end at RightsCon Online. Experts, academics and jurists all over the world can still contribute to defining a policy framework, or set of recommendations, to respond to infodemics. The work is organized around four structural challenges: meta regulation of content moderation; platforms’ design and the reliability of information; rules for messaging systems that exploit the possibilities of the online public domain; and transparency of structuring platforms regarding their business models and algorithmic choices. The contributions will enable the working group’s team of rapporteurs to draw up recommendations for governments and digital platforms. The Forum’s policy framework covers the following:

Meta regulation of content moderation. To evolve from content regulation to meta regulation (regulation of the corporate actors that dictate the moderation rules), we need to develop a set of principles that platforms and social media would have to accept in accordance with international standards of freedom of opinion and expression.
Platforms’ design and reliability of information. The pandemic has shown the need to reverse the amplification of sensational content and rumors by promoting reliable news and information in a structural manner. Based on established criteria, mechanisms and policies to promote the authenticity, reliability and findability of content are to be determined.

Mixed private and public spaces on private messaging systems. The virality of fake news shared on messaging apps is reinforced by the use of groups that sometimes have thousands of members. It is important to define minimal rules for messaging apps that exploit the possibilities of the online public domain while complying with international standards on freedom of opinion and expression.

Transparency of digital platforms. Access to the qualitative and quantitative data of leading digital platforms, as well as access to their algorithms, is a prerequisite for evaluating them. Transparency requirements must, therefore, be imposed on the platforms to determine whether they are respecting their responsibilities in the aforementioned areas and with regard to their business models and algorithmic choices.

The call for contributions is open until Sept. 15, 2020. You can sign up here: