The datafication of refugees: humanitarian agencies & biometrics with Zara Rahman from the Superrr Lab

Biometrics – our fingerprints, faces, and irises, for instance – are increasingly used to verify identity. But what happens when this data collection is applied to vulnerable populations, like refugees and asylum seekers, in ways that remove agency rather than offer protection? In the humanitarian space, organizations justify biometric data collection as a way to increase efficiency, yet stories have shown that such mechanisms can be weaponized: data handed over to oppressive governments, misidentifications leading to life-altering mistakes, and accountability often falling on the very people humanitarian programs claim to help. Beyond survival depending on data-driven systems, racial capitalism also plays a critical role, reinforcing the same global inequalities that force people to migrate in the first place. Who benefits from implementing biometric data collection in a humanitarian context, and who bears the consequences when it fails?

To answer these questions and more, I had the pleasure of talking with Zara Rahman, author of “Machine Readable Me: The Hidden Ways That Technology Shapes Our Identities”, Strategic Advisor at the SUPERRR Lab, and Visiting Research Collaborator at the Citizens and Technology Lab at Cornell University. Zara is a researcher, writer, public speaker, and non-profit executive whose interests lie at the intersection of technology, justice, and community. For over a decade, her work has focused on supporting the responsible use of data and technology in advocacy and social justice, working with activists from around the world to support context-driven and thoughtful uses of tech and data.

About the Podcast

Shifting the narrative from Big Tech to Responsible Tech by sharing and archiving the work of change makers.

At the intersection of technology and social justice, Activists Of Tech is a seasonal weekly podcast dedicated to amplifying and archiving minority voices among activists, thought leaders, academics, and practitioners of responsible tech. Shifting the narrative from Big Tech to responsible tech takes honesty: this is a "say it as it is" type of podcast, and no topic is too taboo to be named and addressed.

The topics covered span a variety of responsible tech areas and focus on social justice, AI harm, AI bias, AI regulation and advocacy, minorities in tech, gender equality, tech and democracy, social media, and algorithmic recommendations, to name a few. We also talk about solutions and how to make tech inclusive and beneficial for all.