
When All Tech Is Human released its 2025 Responsible Tech Guide, it crystallized something I’ve been feeling: the work we’ve been doing at the intersection of technology and social impact has a name, a framework, and an increasingly urgent mandate.
The comprehensive guide reveals a field in rapid maturation—one that ICT4D professionals are already part of, whether we realize it or not.
But there’s a catch: while Responsible Tech is moving from “emerging idea” to “established field,” it’s also fragmenting across competing priorities, geographic boundaries, and resource constraints that should sound familiar to anyone working in global development.
The Agency Question
The guide opens with a philosophical anchor: the tension between Human Agency and Tech Determinism. This is the core question facing every digital development project. Do we believe technology’s trajectory is inevitable (determinism), or do we believe humans can actively shape it (agency)?
All Tech Is Human firmly stakes its position: “Our tech future happens BY us, not TO us.”
For ICT4D practitioners deploying digital health systems in fragmented healthcare environments or building farmer information platforms in low-connectivity contexts, this framing should resonate. We’ve never had the luxury of tech determinism. We’ve always had to bend technology to local realities, not the other way around.
But the guide’s survey of 275 practitioners reveals a troubling reality: the biggest concern about AI governance in 2025 isn’t technical capability—it’s regulatory lag and fragmentation, followed closely by political pressures and global governance challenges.
In other words, the very contexts where ICT4D operates.
Intersections We Need to Explore
The guide identifies three main Responsible Tech disciplines: Responsible AI (RAI), Trust & Safety (T&S), and Public Interest Technology (PIT). For development practitioners, the intersection of RAI and PIT is where the action is.
Both fields emphasize inclusion and harm reduction, particularly for marginalized communities who are disproportionately impacted by algorithmic systems. RAI focuses on how we build AI systems responsibly; PIT ensures those systems serve democratic and societal goals.
This isn’t theoretical. Consider the AI governance challenges facing many African countries: governments simultaneously trying to leapfrog development stages while protecting citizens from algorithmic harms, often with limited regulatory capacity.
The guide notes that while Europe leads with frameworks like GDPR and the EU AI Act, the Global South faces a capacity gap: the harms of technology are significant, but the resources for enforcement, oversight, and technical expertise are limited.
Skills Gap as a Recognition Gap
One of the guide’s most valuable contributions is mapping career pathways. The 2025 careers report reveals that about 75% of jobs labeled “entry-level” in Responsible Tech require prior experience, a catch-22 that should sound familiar to anyone who’s tried to break into international development.
However, you likely already have Responsible Tech skills.
The guide identifies “translation”—the ability to blend business, ethical, technical, and legal frameworks across different organizational structures—as the highest-value skill. If you’ve ever had to explain why a perfectly functional technical solution won’t work in a specific cultural context, or navigated donor requirements while respecting community needs, you’re already doing this work.
The challenge is that ICT4D and Responsible Tech communities aren’t talking to each other enough. The guide explicitly notes the need to bridge technical and non-technical communities and emphasizes that the most pressing challenges cannot be solved from a single vantage point.
Yep, we know and live that reality.
From Principles to Practice
The guide’s case studies illustrate why siloed approaches fail. As it frames the problem:
“Thorny tech & society issues are not problems with simple solutions; they’re systemic challenges with technical, legal, cultural, and economic roots.”
Responsible Tech isn’t a separate field from ICT4D. It’s the same work with different terminology, facing parallel challenges at different scales. The sooner we recognize that, the sooner we can share lessons, build coalitions, and ensure that governance frameworks emerging in Brussels and Washington actually work in Nairobi and Dhaka.
Our tech future isn’t inevitable. But shaping it requires showing up. For ICT4D practitioners, this suggests three practical shifts:
1. Stop treating “ethics” as a checkbox exercise.
The guide’s survey found a significant gap between the skills listed in job descriptions and the skills actually needed; soft skills like “institutional navigation” and “complexity communication” often matter more than technical depth.
2. Engage with the growing governance infrastructure.
The guide tracks “Moments That Mattered in 2025,” including the release of ISO/IEC 42005:2025 for AI impact assessments and state-level AI legislation across the U.S. Understanding these frameworks helps position your work strategically.
3. Join the conversation.
All Tech Is Human’s network spans 114 countries with 50,000+ practitioners. Regional gatherings through their ATIHx initiative have emerged in cities from Barcelona to Boston to Philadelphia. The ecosystem exists—it just needs more voices from contexts where technology’s stakes are highest.

