From the sidelines to the frontlines: Why philanthropy must equip activists for the age of AI

The rapid rise of artificial intelligence (AI) is fundamentally reshaping society – and with it, the landscape of activism. Authoritarian governments are swiftly adopting AI for surveillance, predictive policing and the suppression of dissent, while corporations deploy it in ways that deepen inequality and weaken accountability. Legal frameworks lag behind almost everywhere.
Meanwhile, grassroots activists – the very individuals defending democratic values – remain largely unequipped. Lacking the resources, knowledge and tools to respond to or leverage AI, they are dangerously exposed in this new technological era. Even in democracies, unregulated AI poses growing threats to civil liberties and public trust.
Philanthropy has a critical role to play in correcting this imbalance. Yet many foundations remain uncertain about how to engage. Still, those committed to human rights and democratic resilience must act with urgency – not only to shield civil society from AI’s harms but to empower it to shape and use the technology ethically.
Activists are being left behind – even as AI is used to undermine rights
Activists worldwide increasingly recognise that AI is reshaping their work – both as a tool they might use and a threat they must confront. In a recent survey of CANVAS partners, 100% of respondents expressed a desire to better understand and use AI. They see opportunities to enhance organising but face major barriers: limited resources, unclear risks and a lack of trusted tools tailored to their needs. This leaves them exposed.
Meanwhile, authoritarian actors are deploying AI for surveillance, predictive policing and social media monitoring to pre-empt dissent. Generative AI is fabricating scandals, spreading disinformation and harassing activists – shrinking civic space not only in repressive regimes, but increasingly in democracies.
While state and corporate actors rapidly adopt AI tools, grassroots groups struggle to build even basic AI literacy or threat awareness. Ironically, activists in entrenched authoritarian regimes are sometimes more digitally aware of technological threats than those in backsliding democracies. The difference? Experience. Where activists have long faced surveillance and information control, they’ve learned to adapt. In democracies, however, many seem unprepared. This isn’t a failure of imagination – it’s a failure of access and experience. Safe, responsible AI is costly, often proprietary and rarely built for civic resistance. The result: a widening power imbalance, with civil society reacting to technologies they had no role in shaping.
What activists hope for from philanthropy
Through our recent work – training activists on AI, building AI tools and engaging hundreds of frontline defenders – we’ve heard consistent hopes for how philanthropy can help:
- Invest in capacity, not just tools: Activists need more than access – they need training, strategic support and guidance on surveillance, privacy and bias. Philanthropy can fund hands-on learning, AI literacy and responsible use frameworks rooted in real-world needs.
- Support safe, equitable AI infrastructure: Many activists distrust corporate AI. Philanthropy can back open-source, community-driven alternatives grounded in transparency and data sovereignty.
- Fund rapid response to AI threats: Activists need flexible support – from digital security audits to legal aid – to counter AI-driven harms.
- Enable policy advocacy and governance: Philanthropy can ensure that activists are represented in national and global AI policy spaces, where their voices remain scarce.
Responsible AI needs civil society
Too often, AI policy conversations occur in exclusive spaces – labs, boardrooms and summits – with limited grassroots engagement. When civil society is included, it’s often too late.
Activists must be central to AI design. They know how facial recognition chills free speech, how predictive policing reinforces injustice, how opaque algorithms cause harm. Their lived experience isn’t peripheral – it’s critical.
Staying ahead of the curve: Movement strategy for a new era
At CANVAS, we approach AI through a movement-building lens grounded in strategy and adaptation. Just as successful campaigns require discipline in the streets or halls of government, they require foresight in digital spaces. As AI becomes both a tool of repression and a potential asset for resistance, activists must treat it as strategic terrain – requiring preparation, skills and coordination.
AI-powered tactics – deepfakes, surveillance, disinformation – are being used to divide and suppress civic movements. Though most visible in repressive regimes, these tactics increasingly surface in democratic contexts as well – used by political actors, private firms or bad-faith networks to suppress dissent, distort debate or intimidate activists. In this environment, increased understanding and coordination are crucial. Movements need collective protocols to respond – whether to disinformation, surveillance, reputational attacks or other AI-enabled threats. Philanthropy can support this strategic resilience.
Activists also need more than tools – they need strategy tailored to their realities. CANVAS workshops help non-violent campaigners map power, anticipate countermeasures and reduce risk. Embedding AI literacy into these processes ensures that technology isn’t an afterthought – it’s integral to a movement’s long-term strategy.
Supporting civil society in the AI era is not a one-off – it’s a long-term commitment. Like any non-violent campaign, timing, sequencing and adaptability matter. Philanthropy must be prepared to fund not only trainings or tools, but the strategic growth of resilient movements.
From risk to vision
Empowering activists to use AI responsibly isn’t just defensive – it’s visionary. Imagine AI helping land defenders track encroachment, surfacing underreported trends in gender-based violence, or powering real-time legal support for communities in need. All of this is possible – if philanthropy invests now.
This means funding experimentation, supporting local technologists and creating space for civic-led innovation. Most of all, it means rejecting the notion that civil society must catch up only after AI is fully developed. Instead, let’s co-create AI systems and tools that reflect civic values and grassroots needs. Let’s support AI – built by activists, for activists.
As more foundations and grassroots organisations engage with AI, flexibility is key. The technology is evolving rapidly and so must strategies. Delaying engagement until everything is clear means missing the chance to shape norms and priorities – and then having to react later to problems defined or created by others. Civil society voices are urgently needed – especially as synthetic media, autonomous agents and surveillance tools raise new threats of manipulation and control.
It’s time to act
The AI era is here – reshaping public discourse, redistributing power and redrawing the boundaries of freedom and control. Civil society cannot remain on the sidelines, and philanthropy cannot afford to leave it there.
Foundations have long championed human rights, justice and democracy. Now is the time to extend that leadership into the age of AI. The need is urgent. The risks are real. And the potential – for both harm and radical progress – is immense. Let’s equip activists not just to survive this moment but to shape what comes next.