#65 Prof. PEDRO DOMINGOS [Unplugged]

Note: no politics are discussed in this show; please do not interpret it as any kind of political statement from us. We have decided not to discuss politics on MLST anymore due to its divisive nature.

Patreon: https://www.patreon.com/mlst
Discord: https://discord.gg/HNnAwSduud

[00:00:00] Intro
[00:01:36] What we all need to understand about machine learning
[00:06:05] The Master Algorithm Target Audience
[00:09:50] Deeply Connected Algorithms seen from Divergent Frames of Reference
[00:12:49] There is a Master Algorithm; and it's mine!
[00:14:59] The Tribe of Evolution
[00:17:17] Biological Inspirations and Predictive Coding
[00:22:09] Shoe-Horning Gradient Descent
[00:27:12] Sparsity at Training Time vs Prediction Time
[00:30:00] World Models and Predictive Coding
[00:33:24] The Cartoons of System 1 and System 2
[00:40:37] AlphaGo Searching vs Learning
[00:45:56] Discriminative Models evolve into Generative Models
[00:50:36] Generative Models, Predictive Coding, GFlowNets
[00:55:50] Sympathy for a Thousand Brains
[00:59:05] A Spectrum of Tribes
[01:04:29] Causal Structure and Modelling
[01:09:39] Entropy and The Duality of Past vs Future, Knowledge vs Control
[01:16:14] A Discrete Universe?
[01:19:49] And yet continuous models work so well
[01:23:31] Finding a Discretised Theory of Everything

About the Podcast

Welcome! We engage in fascinating discussions with pre-eminent figures in the AI field. Our flagship show covers current affairs in AI, cognitive science, neuroscience and philosophy of mind with in-depth analysis. Our approach is unrivalled in terms of scope and rigour; we believe in intellectual diversity in AI, and we touch on all of the main ideas in the field with the hype surgically removed. MLST is run by Tim Scarfe, Ph.D. (https://www.linkedin.com/in/ecsquizor/) and features regular appearances from Keith Duggar, Ph.D. (MIT) (https://www.linkedin.com/in/dr-keith-duggar/).