Machine Learning Street Talk

ICLR 2020: Yoshua Bengio and the Nature of Consciousness

by Machine Learning Street Talk | Published 5/22/2020

In this episode of Machine Learning Street Talk, Tim Scarfe, Connor Shorten, and Yannic Kilcher react to Yoshua Bengio’s ICLR 2020 keynote, “Deep Learning Priors Associated with Conscious Processing”. Bengio covers many future directions for Deep Learning research, such as the role of attention in consciousness, sparse factor graphs and causality, and the study of systematic generalization. He also presents big ideas about intelligence that sit on the border between philosophy and practical machine learning, including consciousness in machines and System 1 / System 2 thinking, as described in Daniel Kahneman’s book “Thinking, Fast and Slow”. Like Yann LeCun’s half of the 2020 ICLR keynote, this talk takes on many challenging ideas, and hopefully this video helps you get a better understanding of some of them. Thanks for watching!

Please Subscribe for more videos!

Paper Links:

Link to Talk:

The Consciousness Prior:

Thinking, Fast and Slow:

Systematic Generalization:

CLOSURE: Assessing Systematic Generalization of CLEVR Models:

Neural Module Networks:

Experience Grounds Language:

Benchmarking Graph Neural Networks:

On the Measure of Intelligence:

Please check out our individual channels as well!

Machine Learning Dojo with Tim Scarfe:

Yannic Kilcher:

Henry AI Labs:

00:00:00 Tim and Yannic's takes

00:01:37 Intro to Bengio

00:03:13 System 2, language and Chomsky

00:05:58 Christof Koch on consciousness

00:07:25 François Chollet on intelligence and consciousness

00:09:29 Meditation and Sam Harris on consciousness

00:11:35 Connor Intro

00:13:20 Show Main Intro

00:17:55 Priors associated with Conscious Processing

00:26:25 System 1 / System 2

00:42:47 Implicit and Verbalized Knowledge [DON'T MISS THIS!]

01:08:24 Inductive Priors for DL 2.0

01:27:20 Systematic Generalization

01:37:53 Contrast with the Symbolic AI Program

01:54:55 Attention

02:00:25 From Attention to Consciousness

02:05:31 Thoughts, Consciousness, Language

02:06:55 Sparse Factor Graphs

02:10:52 Sparse Change in Abstract Latent Space

02:15:10 Discovering Cause and Effect

02:20:00 Factorize the joint distribution

02:22:30 RIMS: Modular Computation

02:24:30 Conclusion

#machinelearning #deeplearning

About the Podcast

This is the audio podcast for the ML Street Talk YouTube channel. Thanks for checking out Machine Learning Street Talk! Join in our discussion of the most exciting papers in Machine Learning and Artificial Intelligence! This channel is managed by Yannic Kilcher (Yannic Kilcher), Tim Scarfe (Machine Learning Dojo with Tim Scarfe), and Connor Shorten (Henry AI Labs). Please leave a comment with your thoughts on the papers, and one of us will get around to chatting with you!