ACSUS

Africa-U.S. Journalism Forum: Civil Society Responses to Disinformation Challenges in Africa

Thanks to Bob and others for having me here today. The National Endowment for Democracy is an independent grantmaking institution dedicated to the growth and strengthening of democratic institutions around the world, and the International Forum for Democratic Studies – my section at NED – is essentially NED’s research arm. Here and in a previous job countering disinformation at the U.S. State Department, I’ve been consistently surprised by the enormity of the threat of disinformation and the power of the information space to either strengthen or undermine governance. Information is power. It’s a weapon and a shield. Healthy, independent forms of information are integral to the functioning of democracy. And bad, corrupted information creates divisions and distrust, which will eat away at democracy. Political polarization, distrust in government and key institutions like science, kleptocracy, and instability often have roots in a corrupted information space.

But I look forward today to hearing from you all about how this issue looks in the African context, to hear your ideas, and learn from you.

Before talking about the African context, I wanted to first step back to make sure we have a common understanding about what is and isn’t disinformation.

[Slide 2]

First: some definitions of terms.

At its most basic, disinformation is deliberately misleading messaging that is socially or politically motivated. Let’s use an example from perhaps the world’s most common language…the language of football (or in my case, soccer). “Cristiano Ronaldo is a terrible soccer player and hates children.” This statement is deliberately misleading (he is amazing) and it has a social or political effect (making him a pariah in society for hating his most adoring fans).

Now, to misinformation: false information that is spread, regardless of intent. Suppose a casual soccer fan reads this statement on Facebook from a friend, and then likes or shares the post for any variety of reasons (maybe it was funny, he’s a Manchester United fan, whatever). That is misinformation – the unintentional spread of false information – and it’s a major problem in today’s highly stressed and decentralized media industry.

Next, malinformation. Malinformation is genuine information presented without proper context in order to deceive audiences. Back to our example: “Cristiano Ronaldo misses goals every game, and children can’t stand him.” What is missing here is the context: Cristiano Ronaldo takes a little over 5 shots per game…and makes just over 1 per game (an incredibly high rate). And the children who hate him are simply children who live in Milan and grew up rooting for the main rival of Ronaldo’s team in nearby Turin. Most young soccer fans really like Ronaldo, but that vital, essential context is left out. Propaganda operates similarly to malinformation, but it is state-generated messaging. So, in our example, the mayor of Turin says “Cristiano Ronaldo misses goals every game, and children can’t stand him.”

The term “fake news” has become a loaded term, and one that is poorly understood and honestly difficult to separate from the abuse of the term by illiberal leaders, everyday citizens, and media organizations.

Disinformation, which really is at the root of these manipulations of the information space, operates in a few ways globally and locally. I would separate these activities into two categories: authentic behavior and inauthentic behavior. Authentic behavior takes the form of official, intentional, and overt amplification of false information. This can mean press outlets or political leaders spreading false messaging, or enclosed groups – like Facebook or WhatsApp groups – doing the same, where real, genuine citizens develop and spread lies. Inauthentic behavior is the use of bots, paid accounts, imposter websites, or other covert tools. These activities rely on personas that are not who they say they are. A common tactic is called “astroturfing,” named after the artificial grass used in place of real lawns. Astroturfing involves the use of multiple, centrally controlled social media accounts to amplify official, state-driven disinformation, giving the appearance of broad support for that state’s official position. The use of fake Chinese Twitter accounts – called the 50-cent army because that is reportedly how much they are paid per post – is common among China’s official propaganda outlets. A huge number of Twitter accounts – usually newly created ones that only follow each other’s accounts – immediately like and retweet the misleading tweets of China’s official foreign ministry spokesperson.

Disinformation can spread even more quickly in social media through the use of advanced technologies like natural language processing to make content sound like it came from a human (but without having to pay for one), or micro-targeting of specific populations, or the use of certain algorithms that continue feeding disinformation to groups when those groups heavily engage the content.

[Slide 3]

Now, to the African context. Recently, we organized a workshop with a variety of African civil society organizations, journalists, and researchers to discuss the impact of disinformation on the African media environment. A few trends emerged from these discussions.

  1. COVID-19 is accelerating the spread of disinformation. Media consumption is high. Public health is an intensely emotional and personal topic. Illiberal actors often amplify or create disinformation through statements designed to achieve some type of political end during this time of crisis.
  2. Disinformation has become a factor in political transitions in emerging democracies. Disinformation – both domestic- and foreign-based – has complicated the ongoing transitions in Sudan, Somalia, and Ethiopia.
  3. The lines are blurring between domestically-sourced and foreign-sourced disinformation. Disinformation activities by Russia and China – due to prominent examples in the U.S., Eastern Europe, and Taiwan – are well-known. However, domestically sourced and amplified disinformation continues to grow. Authoritarian or illiberal regimes in the Gulf and Turkey have become sources of disinformation in Africa, often through geopolitically motivated campaigns to increase their influence and access within certain countries. Domestic governments are utilizing the same tactics that more experienced players, like Russia, have long practiced. There is a growth in the use of public relations firms hired by government agencies to find ways to spread deliberately false and misleading political content. In Nigeria, we’ve come across instances of these “firms” operating essentially as loose, unincorporated groups of individuals-for-hire.
  4. New social media apps are being put to use to spread disinformation, but also by civil society organizations to counter it. The growth of disinformation in Africa through WhatsApp is an excellent example. Quartz and other media organizations have reported on the spread of misinformation and disinformation about COVID-19’s effects and “cures.” One example was the spread of the false claim in Nigeria that salt baths could cure COVID. As vaccine distribution has slowly ramped up globally, vaccine hesitancy is rampant among these closed networks, which are free from common fact-checking and content moderation policies like Twitter’s. At the same time, new sources of independent media – like South Africa’s The Continent – have grown and been able to penetrate these networks.
  5. Lastly, radio, print, and television are still in play, as access to technology varies across the African continent.

[Slide 4]

The workshop we held also developed some solutions for civil society organizations, and for journalism more specifically.

  1. We – and I mean funders like NED and those like us – need to invest in the capacity of organizations to track, analyze, and counter disinformation. Countering disinformation takes the form of research, fact-checking, improving citizen detection of disinformation, and improving overall media literacy.
  2. Responses must be localized. Counter-disinformation activities need to use local languages. You can’t respond in English to disinformation being spread in Swahili or French. Credible voices that present true information or fact-check false information are often local voices and personalities that people know. Local media platforms must be involved in the counter-disinformation effort and better resourced in the ways that bigger media enterprises already are, so they can fact-check their content and analyze these trends.
  3. Like the example I gave about WhatsApp, responses to disinformation must be targeted to where the audience is. A South African company developed a bot, just purchased by the WHO for use globally, as an automated means of providing true content about COVID-19 on the WhatsApp platform in a wide variety of languages.

[Slide 5]

  • Lastly, here are some resources I’d like to bring to your attention that are specific to journalists looking to better understand the spread of disinformation, and also to ensure accuracy in their reporting.
    • First Draft often provides journalists and researchers with solid tools to both analyze disinformation and track the spread of true information in response. They recently created a Vaccine Insights Hub specific to COVID that tracks disinformation across various social media platforms.
    • Code for Africa is an organization that partners with other global counter-disinformation researchers to analyze disinformation trends in Africa. One recent example of their work is a testament to how these organizations are working together. The recent Ugandan election was rife with practices that undermined democratic processes, including the shutting down of the internet around election time. A South Africa-based researcher with the Digital Forensic Research Lab, begun by a U.S. think tank, detected a complex network of disinformation and inauthentic behavior from the Ugandan government against opponents of the President. When the internet was shut down by the government, the lab’s partner, Code for Africa, worked with an East African fact-checking organization, PesaCheck, to help fact-check disinformation through phone calls to verify (or debunk) claims by the government and visits to sites that were cited in official statements about the election. When internet access began to be restored, PesaCheck returned to its steady practice of providing fact-checking services online and on social media platforms like WhatsApp.
    • Two NED partners – the International Republican Institute and the National Democratic Institute – along with the International Foundation for Electoral Systems, worked together to develop an online guide to the full suite of organizations countering disinformation, to help better understand what works…and what doesn’t.

I’ll stop there, and I’m looking forward to hearing your perspectives and questions on this dynamic challenge.