Episode 397 – Local LLMs: Why Every Microsoft 365 & Azure Pro Should Explore Them

Welcome to Episode 397 of the Microsoft Cloud IT Pro Podcast. In this episode, Scott and Ben dive into the world of local LLMs: large language models that run entirely on your device. They explore why more IT pros and developers are experimenting with them, the kinds of models you can run, and how you can integrate them directly into your workflow, including in Visual Studio Code for AI-assisted coding.

Your support makes this show possible! Please consider becoming a premium member for access to live shows and more. Check out our membership options.

Show Notes

Ollama
Running LLMs Locally: A Beginner's Guide to Using Ollama
open-webui/open-webui
LM Studio
LM Studio Model Catalog
Why do people like Ollama more than LM Studio?
A Starter Guide for Playing with Your Own Local AI!
host ALL your AI locally
Run your own AI (but private)

About the sponsors

Would you like to become the irreplaceable Microsoft 365 resource for your organization? Let us know!
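As a small taste of the Ollama workflow discussed in the episode, here is a minimal Python sketch of querying a locally running Ollama server over its REST API. It assumes Ollama is installed, `ollama serve` is running on its default port (11434), and a model has been pulled; the model name `llama3` is illustrative only.

```python
import json
import urllib.request

# Default local endpoint for Ollama's generate API
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of
    a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with the model pulled):
# print(ask("llama3", "Explain what a local LLM is in one sentence."))
```

Because everything runs on localhost, no prompt data leaves your machine, which is one of the main draws of local LLMs the episode covers.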

About the Podcast

On the MS Cloud IT Pro Podcast, Scott and Ben discuss the Microsoft Cloud with a focus on IT pros. They discuss the latest Microsoft 365, Office 365, and Azure news, talk about their experiences managing the Microsoft Cloud, and interview industry experts on various cloud technologies. They cover SharePoint, Exchange, Microsoft Teams, PowerShell, Azure, Azure AD, security, networking, storage, and the many other technologies and products that have made their way into the Microsoft 365 suite and Azure. To stay up to date on the latest Microsoft Cloud news and gain valuable knowledge as you deploy it within your own organization, make sure to tune in every week! Find out more at https://msclouditpro.com.