By Victor Bwire
Now more than ever, those working to counter radicalization and violent extremism in Kenya need to target their interventions, including countering violent extremist ideologies and recruitment in virtual spaces.
Increasingly, the recruitment and spread of violent extremist ideas have expanded in space, reach, and influence through digital platforms, especially among the youth and other vulnerable groups.
The use of social media, messaging apps, gaming platforms, and related online forums for recruitment, information manipulation, crowdsourcing, and fundraising is among the leading threats posed by violent extremism globally.
To counter this and provide sufficient mitigation, prevention interventions must increasingly target sustained behavior change among the most vulnerable groups rather than rely on one-off campaigns.
Counter-narratives should focus on knowledge, attitude, and behavior change, not just sensitization.
A review and customization of communication interventions and tools for narrative development is urgently needed, and information exchange among actors is equally critical.
Niche messaging, and above all the use of appropriate tools for specific communities and audiences, guided by the accessibility of channels, the relevance of applications, and the affordability of campaign methodologies, are key variables, especially as interventions move into virtual spaces.
While global, regional, and country strategies are important, community-specific considerations remain critical. Youth are in virtual spaces, but traditional media has also converged and now operates online. Integrated communication is therefore very relevant.
Digital platforms, through their various tools including gaming applications, are the new theatre for radicalization and recruitment, and agencies involved in the fight against terrorism must invest in strategic communications to counter this threat.
Countering violent extremism narratives and deconstructing radical ideologies targeting youth through online platforms are critical interventions.
The involvement of digital platform providers is also critical, as most of their community rules are currently unable to deal effectively with the dangers and threats of violent extremism.
Big tech companies must appreciate that while freedom of expression is vital, they must, at the source and innovation level, provide adequate mitigation measures.
They host the applications and games that lure youth online and, by extension, make them targets for recruitment into violent extremism.
They must not focus only on opportunities or hide behind free speech arguments at the expense of protecting communities from the threats of radicalization.
On the criminal front, they must work with governments to ensure platforms are not agents of crime. They have a responsibility to society.
It is imperative and urgent that government, intelligence, law enforcement agencies, and other non-state actors—including the private sector, faith-based organizations, and civil society—intensify collaborative and joint actions to prevent recruitment through virtual spaces.
Efforts to implement the country’s multi-agency approach must include tech companies and communities in delivering proactive media and digital literacy sessions.
These should be rolled out across communities, especially targeting youth in both formal and informal education institutions, with clear messages on the dangers of consuming harmful content.
We must move prevention of violent extremism conversations into digital spaces using sharp and targeted messaging, as traditional counter-narrative campaigns can no longer keep pace with online realities.
Most youth are on Facebook, TikTok, YouTube, and related digital platforms, and that is where prevention campaigns must be taken.
Unlike a few years ago, when media was the oxygen of terrorism, virtual spaces have now taken over and become the primary arena for action.
Current prevention efforts by governments and other players have yielded results, but more is needed as recruitment shifts online.
Governments must increase security operations through intelligence gathering, information sharing—including cross-border cooperation—and enhanced deterrence operations.
Non-state actors must strengthen community resilience through counter-narratives, media and digital literacy, online messaging, and counter-radicalization initiatives.
Media-security dialogues are also needed to minimize terrorist publicity while ensuring information sharing does not compromise national security.
Cooperation among all players is strategic to ensure coordination across sectors.
This includes nurturing partnerships between communities, civil society, governments, and security agencies, and enabling communities to embrace technology in identifying and responding to early warning signs of radicalization and violent extremism.
Trends in terrorism financing have been established through research, and these findings must be disseminated to support effective interventions. With collaboration between security agencies and the media, such trends can be exposed.
Terrorism financing is increasingly online, including through online banking, transnational crimes, organized criminal gangs, cross-border poaching activities, gaming platforms, and related systems.
With intelligence support, the media can trail and expose these activities through investigative reporting, helping to name and shame perpetrators.
Sector players must use online platforms and local media spaces to scale up interventions targeting vulnerable groups in areas where violent extremism has taken root, in order to build resilience against harmful narratives.
Media must be increasingly included in counter-terrorism efforts, as journalists are often exposed to physical danger during operations.
They risk being targeted by terrorists or security agencies, experience trauma, or face attempts at recruitment or radicalization.
Specific interventions for the media are therefore necessary, as they are a high-risk group in the war on terror.
The Media Council of Kenya Code of Conduct for Media Practice 2025 already includes provisions on artificial intelligence, user-generated content, and gaming. These provisions restrict the media from allowing such tools to be used to exploit vulnerable populations in any way.
This discussion must now be taken directly to the doors of digital platform providers.
The writer, Victor Bwire, is the Director of Media Training and Development at MCK.