Perils of Principles in AI Ethics

The hospital faced an ethical question: should it deploy robots to help with elder care? Consider a standard list of AI ethics values: justice/fairness, privacy, transparency, accountability, explainability. As Ami points out in our conversation, that standard list leaves out a core value at the hospital: the value of caring. That omission illustrates one of his three objections to a view he calls "Principlism," the view that we do AI ethics best by first defining our AI ethics values or principles at a very abstract level. The objection is that any such list will always be incomplete. Given Ami's expertise in ethics and his experience as a clinical ethicist, it was insightful to hear how he gets ethics done on the ground and how he thinks organizations should approach ethics more generally.

Ami Palmer received his PhD in philosophy from Bowling Green State University, where he wrote his dissertation on the challenges conspiracism and science denialism pose to democratic policymaking. His primary research areas include the effects of medical misinformation on clinical interactions and the ethics of AI in healthcare. On medical misinformation, he has recently developed a conversation guide to help providers better navigate conversations with patients who endorse it. In the ethics of AI, he coauthored the American Nurses Association's Position Statement on the Ethics of AI in Nursing. His hobbies include judo, Brazilian Jiu-Jitsu, dance, and hiking with his four wiener dogs.

About the Podcast

I talk with the smartest people I can find working or researching anywhere near the intersection of emerging technologies and their ethical impacts: from AI to social media to quantum computers and blockchain, from hallucinating chatbots to AI judges to who gets control over decentralized applications. If it's coming down the tech pipeline (or it's here already), we'll pick it apart, figure out its implications, and break down what we should do about it.