
Attention in Neural Nets - Linear Digressions

Episode published: 6/17/2019

There’s been a lot of interest lately in the attention mechanism in neural nets—it’s got a colloquial name (who’s not familiar with the idea of “attention”?) but it’s more like a technical trick that’s been pivotal to some recent advances in computer vision and especially word embeddings. It’s an interesting example of trying out human-cognitive-ish ideas (like focusing consideration more on some inputs than others) in neural nets, and one of the more high-profile recent successes in playing around with neural net architectures for fun and profit.
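The core idea the episode describes, weighting some inputs more heavily than others, can be sketched as scaled dot-product attention, the variant popularized by the Transformer architecture. This is a minimal illustrative NumPy sketch, not code from the episode; the shapes and toy inputs are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: score each key against the query,
    # normalize the scores into weights, and take a weighted sum of values.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # query-key similarity
    weights = softmax(scores, axis=-1)   # each row sums to 1: the "attention"
    return weights @ V, weights

# Toy example: one query "attending" over three inputs
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 4))   # one query vector
K = rng.normal(size=(3, 4))   # three key vectors
V = rng.normal(size=(3, 4))   # three value vectors
out, w = attention(Q, K, V)
print(w)  # a row of nonnegative weights summing to 1
```

The output is a blend of the value vectors, with inputs whose keys best match the query contributing the most, which is the "focusing consideration more on some inputs than others" intuition in mechanical form.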

About the podcast

In each episode, your hosts explore machine learning and data science through interesting (and often very unusual) applications.