Jay McClelland | Neural Networks: Artificial and Biological

Jay McClelland is a pioneer in the field of artificial intelligence: a cognitive psychologist and professor at Stanford University in the psychology, linguistics, and computer science departments. Together with David Rumelhart, Jay published the two-volume work Parallel Distributed Processing, which led to the flourishing of the connectionist approach to understanding cognition. In this conversation, Jay gives us a crash course in how neurons and biological brains work. This sets the stage for how psychologists such as Jay, David Rumelhart, and Geoffrey Hinton historically approached the development of models of cognition and, ultimately, artificial intelligence. We also discuss alternative approaches to neural computation, such as symbolic and neuroscientific ones.

Patreon (bonus materials + video chat): https://www.patreon.com/timothynguyen

Part I. Introduction
00:00 : Preview
01:10 : Cognitive psychology
07:14 : Interdisciplinary work and Jay's academic journey
12:39 : Context affects perception
13:05 : Chomsky and psycholinguists
18:03 : Technical outline

Part II. The Brain
00:20:20 : Structure of neurons
00:25:26 : Action potentials
00:27:00 : Synaptic processes and neuron firing
00:29:18 : Inhibitory neurons
00:33:10 : Feedforward neural networks
00:34:57 : Visual system
00:39:46 : Various parts of the visual cortex
00:45:31 : Columnar organization in the cortex
00:47:04 : Colocation in artificial vs biological networks
00:53:03 : Sensory systems and brain maps

Part III. Approaches to AI, PDP, and Learning Rules
01:12:35 : Chomsky, symbolic rules, universal grammar
01:28:28 : Neuroscience, Francis Crick, vision vs language
01:32:36 : Neuroscience = bottom up
01:37:20 : Jay’s path to AI
01:43:51 : James Anderson
01:44:51 : Geoff Hinton
01:54:25 : Parallel Distributed Processing (PDP)
02:03:40 : McClelland & Rumelhart’s reading model
02:31:25 : Theories of learning (see the code sketch below)
02:35:52 : Hebbian learning
02:43:23 : Rumelhart’s Delta rule
02:44:45 : Gradient descent
02:47:04 : Backpropagation
02:54:52 : Outro: Retrospective and looking ahead

Image credits: http://timothynguyen.org/image-credits/

Further reading:
Rumelhart, D. E., & McClelland, J. L. Parallel Distributed Processing.
McClelland, J. L. (2013). Integrating probabilistic models of perception and interactive neural networks: A historical and tutorial review.

Twitter: @iamtimnguyen
Webpage: http://www.timothynguyen.org
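
For listeners who want a concrete handle on the learning rules covered in Part III (Hebbian learning, Rumelhart's delta rule, gradient descent), here is a minimal NumPy sketch of the two weight-update rules for a single linear unit. It is not taken from the episode; the function names, learning rate, and toy data are illustrative assumptions.

import numpy as np

def hebbian_update(w, x, lr=0.1):
    # Hebbian rule: weights grow in proportion to the co-activity of
    # input x and the unit's output y ("fire together, wire together").
    y = w @ x
    return w + lr * y * x

def delta_rule_update(w, x, t, lr=0.1):
    # Delta rule: adjust weights along the negative gradient of the
    # squared error between target t and output y, i.e. one step of gradient descent.
    y = w @ x
    error = t - y
    return w + lr * error * x

# Toy example: one update of each rule on a three-dimensional input.
w = np.array([0.2, -0.1, 0.05])        # illustrative initial weights
x = np.array([1.0, 0.5, -1.0])         # illustrative input pattern
print(hebbian_update(w, x))            # Hebbian step
print(delta_rule_update(w, x, t=1.0))  # delta-rule step toward target 1.0

Backpropagation, discussed at the end of Part III, generalizes the delta rule to multi-layer networks by propagating the error term backward through the layers.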

About the Podcast

The Cartesian Cafe is the podcast where an expert guest and Timothy Nguyen map out scientific and mathematical subjects in detail. This collaborative journey has us writing down formulas, drawing pictures, and reasoning about them together on a whiteboard. If you’ve been longing for a deeper dive into the intricacies of scientific subjects, then this is the podcast for you. Topics covered include mathematics, physics, machine learning, artificial intelligence, and computer science. Content is also viewable on YouTube (www.youtube.com/timothynguyen) and Spotify. Timothy Nguyen is a mathematician and AI researcher working in industry.
Homepage: www.timothynguyen.com
Twitter: @IAmTimNguyen
Patreon: www.patreon.com/timothynguyen