The TESCREAL to fascism pipeline with Adrienne Williams from the Distributed AI Research Institute (DAIR)

Reading about TESCREAL feels like reading a bad sci-fi storyline written by a man with a god complex. Unfortunately, it's real: a movement whose proponents use the threat of human extinction to justify expensive or harmful projects and demand billions of dollars to save us from these "existential threats". Sounds familiar, doesn't it? These aspirations are very real, and some of the tech billionaires setting the rules of the tech industry even call themselves "creators". It's worse than "godfather", if you ask me, though it's fairly close. We've seen it, for example, with OpenAI asking for billions to "save us" from the very AI systems they are still trying to build, and more generally with AI gurus working on how to save us from killer robots or AI systems becoming conscious instead of addressing hunger, homelessness, inequality, or environmental issues in their own country (even just in their city would have more of an impact than extracting money under the excuse of saving people who don't exist yet from evil AI systems that also don't exist yet). But it's not about helping, or "AI for Humanity" as they like to call it: it's about power, influence, and money. And the pipeline from these delusions to far-right ideology and technofascism is pretty straightforward.

Adrienne Williams, researcher at the Distributed AI Research Institute, joined me to talk about TESCREAL, neocolonialism, policymaking, and everything in between.

Created, hosted, and produced by Mélissa M'Raidi-Kechichian.

Om Podcasten

Shifting the narrative from Big Tech to Responsible Tech by sharing and archiving the work of change makers.

At the intersection of technology and social justice, Activists Of Tech is a seasonal weekly podcast dedicated to the amplification and archival of minority voices among activists, thought leaders, academics, and practitioners of responsible tech. Shifting the narrative from Big Tech to responsible tech takes honesty: this is a "say it as it is" type of podcast, and no topic is too taboo to be named and addressed.

The topics covered span a variety of responsible tech areas, with a focus on social justice, AI harms, AI bias, AI regulation and advocacy, minorities in tech, gender equality, tech and democracy, social media, and algorithmic recommendations, to name a few. We also talk about solutions and how to make tech inclusive and beneficial for all.