Suzy Shepherd on Imagining Superintelligence and "Writing Doom"

Suzy Shepherd joins the podcast to discuss her new short film "Writing Doom", which deals with AI risk. We discuss how to use humor in film, how to write concisely, how filmmaking is evolving, in what ways AI is useful for filmmakers, and how we will find meaning in an increasingly automated world.

Here's Writing Doom: https://www.youtube.com/watch?v=xfMQ7hzyFW4

Timestamps:
00:00 Writing Doom
08:23 Humor in Writing Doom
13:31 Concise writing
18:37 Getting feedback
27:02 Alternative characters
36:31 Popular video formats
46:53 AI in filmmaking
49:52 Meaning in the future

About the Podcast

The Future of Life Institute (FLI) is a nonprofit working to reduce global catastrophic and existential risk from powerful technologies. In particular, FLI focuses on risks from artificial intelligence (AI), biotechnology, nuclear weapons, and climate change. The Institute's work is made up of three main strands: grantmaking for risk reduction, educational outreach, and advocacy within the United Nations, US government, and European Union institutions. FLI has become one of the world's leading voices on the governance of AI, having created one of the earliest and most influential sets of governance principles: the Asilomar AI Principles.