Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

As folks continue to use LLMs, best practices are emerging to help users get the most out of them. OpenAI's ChatGPT allows users to tailor responses to match their tone and desired output goals, and many have reported that using custom instructions yields much more accurate, precise, consistent, and predictable results. But why would you want to do this, and why does it matter? In this episode, hosts Kathleen Walch and Ron Schmelzer discuss why this is a best practice.

What are custom instructions in ChatGPT? In ChatGPT, custom instructions are set by answering two questions in settings, and your answers are then sent along with your prompts:

What would you like ChatGPT to know about you to provide better responses?
How would you like ChatGPT to respond?

It's important to note that once created, these instructions apply to all future chat sessions (not previous or existing ones). This lets you make somewhat permanent settings that don't have to be constantly reset. Custom instructions are short, generally limited to about 1,500 characters, so keep them precise and concise.

Show Notes:
Free Intro to CPMAI course
CPMAI Certification
Subscribe to Cognilytica newsletter on LinkedIn
Properly Scoping AI Projects [AI Today Podcast]
Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
AI Today Podcast: AI Glossary Series – OpenAI, GPT, DALL-E, Stable Diffusion
AI Today Podcast: AI Glossary Series – Tokenization and Vectorization
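Custom instructions are a feature of the ChatGPT interface, but the same idea carries over to code that calls a model API directly: fold the answers to the two questions into a system message that accompanies every user prompt. Below is a minimal sketch of that pattern, assuming a chat-style message format with "system" and "user" roles; the `build_messages` helper, the 1,500-character limit check, and the sample instruction text are illustrative, not an official API.

```python
# Sketch: emulating ChatGPT-style custom instructions in API code.
# The two custom-instruction answers become a single system message
# that is prepended to each user prompt.

CUSTOM_INSTRUCTION_LIMIT = 1500  # ChatGPT limits each answer to roughly 1,500 characters


def build_messages(about_me: str, response_style: str, prompt: str) -> list[dict]:
    """Combine the two custom-instruction answers into a system message
    sent along with the user's prompt (hypothetical helper)."""
    for answer in (about_me, response_style):
        if len(answer) > CUSTOM_INSTRUCTION_LIMIT:
            raise ValueError(
                f"Custom instruction exceeds {CUSTOM_INSTRUCTION_LIMIT} characters"
            )
    system = (
        "What you should know about the user:\n"
        f"{about_me}\n\n"
        "How you should respond:\n"
        f"{response_style}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]


# Example: set the instructions once, then reuse them for every prompt.
messages = build_messages(
    about_me="I'm a project manager new to AI; I prefer plain language.",
    response_style="Be concise. Use bullet points. Avoid jargon.",
    prompt="Explain prompt chaining in two sentences.",
)
```

Because the system message is built once and reused, this mirrors the "somewhat permanent settings" behavior described above: every new conversation starts with the same instructions without restating them in each prompt.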

About the Podcast

Cognilytica's AI Today podcast focuses on relevant information about what's going on today in the world of artificial intelligence. Hosts Kathleen Walch and Ron Schmelzer discuss pressing topics around artificial intelligence with easy-to-digest content, interview guests and experts on the subject, and cut through the hype and noise to identify what is really happening with adoption and implementation of AI.