Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

In this episode of Machine Learning Street Talk, Tim Scarfe, Yannic Kilcher and Connor Shorten chat about large-scale transfer learning in natural language processing. The Text-to-Text Transfer Transformer (T5) paper from Google AI presents an exhaustive survey of what matters for transfer learning in NLP and what doesn't. In this conversation, we go through the key takeaways of the paper: the text-to-text input/output format (a short code sketch of this format appears after the link list below), architecture choice, dataset size and composition, fine-tuning strategy, and how best to use more computation. Starting from these topics, we diverge into exciting ideas such as embodied cognition, meta-learning, and the measure of intelligence.

We are still at the beginning of our podcast journey and really appreciate any feedback from our listeners. Is the chat too technical? Do you prefer group discussions, interviews with experts, or chats between the three of us? Thanks for watching, and if you haven't already, please subscribe!

Paper links discussed in the chat:

Text-to-Text Transfer Transformer: https://arxiv.org/abs/1910.10683
Experience Grounds Language (relevant to the divergent discussion about embodied cognition): https://arxiv.org/pdf/2004.10151.pdf
On the Measure of Intelligence: https://arxiv.org/abs/1911.01547
Train Large, Then Compress: https://arxiv.org/pdf/2002.11794.pdf
Scaling Laws for Neural Language Models: https://arxiv.org/pdf/2001.08361.pdf
The Illustrated Transformer: http://jalammar.github.io/illustrated...
ELECTRA: https://arxiv.org/pdf/2003.10555.pdf
Transformer-XL: https://arxiv.org/pdf/1901.02860.pdf
Reformer: The Efficient Transformer: https://openreview.net/pdf?id=rkgNKkHtvB
The Evolved Transformer: https://arxiv.org/pdf/1901.11117.pdf
DistilBERT: https://arxiv.org/pdf/1910.01108.pdf
How to generate text (HIGHLY RECOMMEND): https://huggingface.co/blog/how-to-ge...
Tokenizers: https://blog.floydhub.com/tokenization-nlp/
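For readers curious what the text-to-text format looks like in practice, here is a minimal sketch using the Hugging Face transformers library. This is our illustration rather than anything shown in the episode; it assumes the transformers, sentencepiece and torch packages are installed and uses the public t5-small checkpoint from the Hugging Face model hub. The point T5 makes is that every task is posed as a task-prefixed input string and answered with an output string:

    # Minimal sketch of T5's text-to-text interface (assumes:
    # pip install transformers sentencepiece torch).
    from transformers import T5Tokenizer, T5ForConditionalGeneration

    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Every task is a plain string with a task prefix; the answer is
    # also a plain string, so translation, summarization and
    # classification all share one input/output format.
    input_ids = tokenizer(
        "translate English to German: The house is wonderful.",
        return_tensors="pt",
    ).input_ids

    outputs = model.generate(input_ids)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same model handles other tasks simply by changing the prefix (the T5 paper uses, e.g., "summarize:" and "sst2 sentence:") and reading the answer off as generated text.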

About the Podcast

Welcome! We engage in fascinating discussions with pre-eminent figures in the AI field. Our flagship show covers current affairs in AI, cognitive science, neuroscience and philosophy of mind with in-depth analysis. Our approach is unrivalled in scope and rigour: we believe in intellectual diversity in AI, and we touch on all of the main ideas in the field with the hype surgically removed. MLST is run by Tim Scarfe, PhD (https://www.linkedin.com/in/ecsquizor/) and features regular appearances from Keith Duggar, PhD (MIT) (https://www.linkedin.com/in/dr-keith-duggar/).