ICLR 2020: Yoshua Bengio and the Nature of Consciousness

In this episode of Machine Learning Street Talk, Tim Scarfe, Connor Shorten and Yannic Kilcher react to Yoshua Bengio’s ICLR 2020 keynote “Deep Learning Priors Associated with Conscious Processing”. Bengio covers many future research directions for deep learning, such as the role of attention in consciousness, sparse factor graphs and causality, and systematic generalization. He also presents big ideas about intelligence that sit on the border between philosophy and practical machine learning, including consciousness in machines and System 1 and System 2 thinking, as described in Daniel Kahneman’s book “Thinking, Fast and Slow”. Like Yann LeCun’s half of the 2020 ICLR keynote, this talk takes on many challenging ideas, and hopefully this video helps you get a better understanding of some of them! Thanks for watching! Please subscribe for more videos!

Paper Links:
Link to Talk: https://iclr.cc/virtual_2020/speaker_7.html
The Consciousness Prior: https://arxiv.org/abs/1709.08568
Thinking, Fast and Slow: https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp/0374533555
Systematic Generalization: https://arxiv.org/abs/1811.12889
CLOSURE: Assessing Systematic Generalization of CLEVR Models: https://arxiv.org/abs/1912.05783
Neural Module Networks: https://arxiv.org/abs/1511.02799
Experience Grounds Language: https://arxiv.org/pdf/2004.10151.pdf
Benchmarking Graph Neural Networks: https://arxiv.org/pdf/2003.00982.pdf
On the Measure of Intelligence: https://arxiv.org/abs/1911.01547

Please check out our individual channels as well!
Machine Learning Dojo with Tim Scarfe: https://www.youtube.com/channel/UCXvHuBMbgJw67i5vrMBBobA
Yannic Kilcher: https://www.youtube.com/channel/UCZHmQk67mSJgfCCTn7xBfe
Henry AI Labs: https://www.youtube.com/channel/UCHB9VepY6kYvZjj0Bgxnpbw

Timestamps:
00:00:00 Tim and Yannic's takes
00:01:37 Intro to Bengio
00:03:13 System 2, language and Chomsky
00:05:58 Christof Koch on consciousness
00:07:25 Francois Chollet on intelligence and consciousness
00:09:29 Meditation and Sam Harris on consciousness
00:11:35 Connor Intro
00:13:20 Show Main Intro
00:17:55 Priors Associated with Conscious Processing
00:26:25 System 1 / System 2
00:42:47 Implicit and Verbalized Knowledge [DON'T MISS THIS!]
01:08:24 Inductive Priors for DL 2.0
01:27:20 Systematic Generalization
01:37:53 Contrast with the Symbolic AI Program
01:54:55 Attention
02:00:25 From Attention to Consciousness
02:05:31 Thoughts, Consciousness, Language
02:06:55 Sparse Factor Graph
02:10:52 Sparse Change in Abstract Latent Space
02:15:10 Discovering Cause and Effect
02:20:00 Factorize the Joint Distribution
02:22:30 RIMs: Modular Computation
02:24:30 Conclusion

#machinelearning #deeplearning

About the Podcast

Welcome! We engage in fascinating discussions with pre-eminent figures in the AI field. Our flagship show covers current affairs in AI, cognitive science, neuroscience and philosophy of mind with in-depth analysis. Our approach is unrivalled in scope and rigour: we believe in intellectual diversity in AI, and we touch on all of the main ideas in the field with the hype surgically removed. MLST is run by Tim Scarfe, PhD (https://www.linkedin.com/in/ecsquizor/) and features regular appearances from MIT PhD Keith Duggar (https://www.linkedin.com/in/dr-keith-duggar/).