“The power of visual evidence is under threat”: What do Nigerian elections teach us?

Only 26.72 percent of Nigeria’s 93.47 million registered voters came out to vote during last year’s presidential elections. These numbers, as reported by our client Dataphyte, are the lowest since the country’s return to democracy in 1999. The erosion of trust amongst Nigerian voters cannot simply be blamed on the state of the country’s media. Still, with nine out of 10 Nigerians accessing news online, headlines about a deluge of fake news on social media in the run-up to the elections, and about political parties secretly paying social media influencers to spread disinformation, raise some questions.

With the proliferation of AI-powered deepfakes and narrative wars built on systemic disinformation, will fake news become a permanent feature of elections going forward? Are media organizations with limited funds ready to tackle this challenge? If not, how can we help them prepare for it?

We discussed this with Oluwadara Ajala, Program Manager for our Nigeria Media Innovation Program. Having worked closely with a slew of independent newsrooms during last year’s elections, and having recently delivered a keynote address on ‘Media manipulation and technology’s spread of information divergence in Nigeria’ at the Media Party in New York, Dara shared her insights on the threat to the power of visual evidence in the light of growing deepfakes, how newsrooms are using AI to counter AI, and more.

Question: You used a quote in your presentation deck at the Media Party in New York which said, ‘Seeing is no more believing’. Can you tell us how this was reflected in the elections in Nigeria?

Answer: The power of visual evidence is under threat, and the trust basis of audio-visual evidence is being undermined. More significant than people struggling to believe something is the phenomenon where nobody seems to believe anything, because the foundations of what people hold to be true are being eroded. Holders of power tend to take advantage of this through what is called the Liar’s Dividend, where successfully dismissing one true thing as fake leads people to doubt everything else as well. In Nigeria, from the audience perspective, we saw significant erosion of trust, news apathy and news avoidance, but worse, ‘ostriching’ (“the deliberate avoidance or ignorance of conditions as they exist”, according to Merriam-Webster). Only 26% of registered voters turned up to vote. People seemed to dismiss major truths, no matter how groundbreaking the investigation, because they had been inundated with allegations of the real being fake and the fake being real. This had an impact on the media: investigative stories that journalists risked their lives for ended up having significantly low impact, or even low views and reaction. Two examples of how this illusion played out were an alleged audio conspiracy between the Peoples Democratic Party presidential candidate and some other personalities to rig the 2023 elections, and another purported audio conversation between the Labour Party candidate and another influential figure that circulated widely on social media. Naturally, both audio files were dismissed by the parties involved as ‘deepfakes’.

Question: You also mention “AI to counter AI” in your talk. Could you tell us how media organizations are using AI to tackle AI-related issues?

Answer: AI is a tool that dances to the tune of the highest bidder, not just in financial terms but in skills as well. We saw the development of a self-funded alliance of 12 fact-checking news and research organisations, including Dubawa (a NAMIP cohort member). These organisations worked before, during and after the elections, fact-checking declarations made during debates and campaigns, information spread across social media, and post-voting results information. These points were critical because these are the areas where misinformation spreads fastest and is most detrimental. Dubawa’s Kemi Busari mentioned that matching the speed of misinformation was herculean, so they found creative ways to use AI to develop a system for verifying digital news items that is readily available to journalists and newsrooms. Open-source AI tools now exist that can help journalists and citizens quickly detect AI-enabled misinformation, particularly shallow-fake photos or videos. These were, and continue to be, actively utilised not only to match the speed of misinformation, but also to specifically reach the audiences that may have been influenced by misinformation spread through synthetic media.

Question: What are some of the common shortcomings you notice when it comes to independent media tackling AI deepfakes and misinformation? What are some steps that young organizations can take to build a solid foundation to counter the emerging tech wave in media?

Answer: Media have struggled to catch up with the audience’s perspective on the confirmation of what is true. A groundbreaking national-level investigative piece that possibly took years to uncover can be dismissed with as little as a sentence from a power holder saying it is fake. Suddenly, truth has to start defending itself and providing evidence of its reality post-accusation; the accuser now becomes the defender. Organizations can therefore take the scientific approach to reporting, as a fellow speaker at Media Party New York, Julia Angwin, put it. Here, like a published scientific paper, each story needs to be accompanied by what she called an “ingredients label”, or what I call a ‘proof of concept’, that lays out its hypothesis, sample size, reporting techniques, key findings and limitations.

Media’s important role can be to help audiences make sense of the chaos and to provide the information and insights citizens need to quickly develop a whole new set of skills, helping to anchor the drifting ship that is truth, reality and facts. Media also need to understand that, with the erosion of public trust, citizens are moving to more community-focused ways of identifying the truth. What is needed is a stronger understanding of the audience, and products that not only meet the audience’s needs but empower them to take small, tangible action at the community level.