AI, Hashtags and Deepfakes: The New Arsenal in Kenya’s Political Sphere

  • 18 Jul 2025
  • by James Ngunjiri

The Kenyan political scene is experiencing sophisticated and well-coordinated disinformation campaigns that are shaping public perception and limiting civic participation. Disinformation is already on full display, rampant at both national and grassroots levels and fuelled by political, economic, and personal interests. Various actors, including politicians, strategists, and members of the public, contribute to the disinformation ecosystem.

A recent analysis conducted under Baraza Media Lab's Fumbua initiative shows that the Kenyan information ecosystem in 2025 predominantly reflects negative sentiment. This is driven by coordinated disinformation campaigns, targeted propaganda, fear-based messaging, and suppression tactics.

The analysis indicates that the Kenyan digital space is increasingly weaponised, with state-aligned and non-state actors leveraging advanced tactics to manipulate public perception. These sentiment manipulations are designed to evoke a range of emotions, chiefly fear, anger, distrust, urgency, and grief, especially in relation to state violence, youth protests, abductions, and disinformation.

The Constitution limits freedom of expression to prevent the spread of false information. Key laws include the Computer Misuse and Cybercrimes Act (CMCA) and the Kenya Information and Communications Act (KICA). For instance, the CMCA criminalises the intentional publication of false information, with penalties including fines of up to KSh 5 million or imprisonment. However, there is no clear legal distinction between misinformation and disinformation, which complicates enforcement.

Even though multiple laws, social media platform guidelines, and user awareness efforts exist, disinformation remains difficult to address. This is particularly true in Kenya's currently polarised political environment, compounded by sophisticated technological tools, the technical ability of perpetrators to create and disseminate content, and a public that is not sufficiently aware of disinformation.

The Baraza Media Lab report reveals the weaponisation of hashtags and psychological operations (psyops). For instance, hashtags like #BBCForChaos, #DogsOfWar, #ArrestHanifa, and #Tulienitubonge exhibit a pattern of coordinated, state-aligned narratives aimed at delegitimising dissent or external criticism, often utilising state officials, AI-generated content, and influencer accounts.

The report also notes a pattern of narrative hijacking, in which online conversations are co-opted to distract, delegitimise critics, and reframe protests as “foreign-funded chaos.”

In addition, there are counter-narratives and rebuttal campaigns. Grassroots and activist-driven hashtags, such as #JusticeForGenZ, #SiriNiNumbers, and #RejectFinanceBill2024, push back against the official narrative, demand justice, and rally for protest action.

The report also highlights psychological warfare and fear-based narratives. Under this, digital themes such as death warnings (for example, “BloodDonation Wednesday”), mortuary bookings, abductions, and violent threats show a deliberate attempt to scare citizens away from protesting.

Use of Emerging Technology

Deepfake videos, AI-generated imagery, and automated content trends are evident in several campaigns, indicating a sophisticated propaganda machine utilising modern tools to influence public discourse.

The report highlights a case in which AI-generated videos of CNN’s Fareed Zakaria have been used multiple times, even by senior government officials.

This may indicate a group that has trained an AI model on Zakaria’s mannerisms and is deploying it repeatedly.

AI-generated content has also been used to dissuade the public from participating in protests and to discourage digital activism.

Tribal & Regional Messaging

The report notes that narrative manipulation surrounding development in Central Kenya contributes to ethnic divisions—for instance, the 41 vs 1 narrative leverages existing political fault lines for polarisation.

This has also been fuelled by the “Handshake”, the cooperation between the “opposition” and the government, with sentiments from opposition leaders encouraging “our time to eat” narratives that further deepen the tribal divide.

Past Studies

Studies conducted earlier by the Kenya ICT Action Network (KICTANet), a multi-stakeholder think tank focused on ICT policy and regulations in Kenya, established that the publication and sharing of fake news on social media were rampant both before and after the general elections. Among the reasons given for the increase in fake news circulation were the speed and anonymity with which digital technology enables the spread and reach of information, financial profit gained by fake news purveyors, and the ability to reach large audiences quickly.

During this period, public institutions, particularly those involved in political processes such as the Independent Electoral and Boundaries Commission (IEBC) and the Office of the Registrar of Political Parties, faced disinformation.