[Linkpost] “Notes and updates on GPT-5” by Yadav

This is a link post. (I wrote this quickly for my own edification. It isn't original thinking; it condenses takes I've read on X over the past few days. Where relevant I've linked tweets. I hope it's helpful for people.)

Watching an OpenAI launch now feels like watching an Apple keynote in the early 2010s. As a kid, I was giddy about which new, unaffordable iPhone would drop. I'll probably look back on my twenties as the years I felt the same mix of excitement, and perhaps fear, about whatever OpenAI released next.

At first glance on X, before I had access to the model, the mood was mostly disappointment. Some people pushed their timelines out; a few even talked about the end of the pre-training paradigm. The live presentation had a couple of funny slip-ups. I'm not a technical person, and my naive expectation was that GPT-5 would feel [...]

---

First published: August 9th, 2025

Source: https://forum.effectivealtruism.org/posts/m8H2QooDJzGaTDXYK/notes-and-updates-on-gpt-5

Linkpost URL: https://robertgaurav.xyz/notes/gpt5

---

Narrated by TYPE III AUDIO.

---

Image from the article (a tweet): "gpt-3 -> gpt-4 were all ~100x scaleups in pretraining compute. gpt-5 is not. building projects like stargate and colossus2 to actually 100x gpt-4 compute takes time, let the labs cook :)" The tweet offers a perspective on AI model development, suggesting that while previous GPT models saw significant computational increases, GPT-5 development may require more patience as larger projects are being built.

Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts, or another podcast app.

About the Podcast

Audio narrations from the Effective Altruism Forum, including curated posts, posts with 30+ karma, and other great writing. If you'd like fewer episodes, subscribe to the "EA Forum (Curated & Popular)" podcast instead.