Module 3, Lecture 3.1 | Working with LLMs in Practice
This lecture moves from theory to practice: the hands-on mechanics of working with LLMs through their APIs. It covers the anatomy of an API call (model, system prompt, messages, response), message roles and how they shape model behavior, conversation history as context management, streaming responses, and error handling with exponential backoff. Three live coding demos walk through a first API call, a multi-turn conversation with a growing token count, and a production retry pattern.
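As a preview of the patterns above, here is a minimal Python sketch of two of them: the shape of a Messages API request (model, system prompt, and role-tagged message history) and a generic retry helper with exponential backoff. The field names follow Anthropic's Messages API; the model name is an example placeholder, and the retry helper is a simplified pattern, not production code.

```python
import random
import time

# Sketch of a Messages API request body. Field names follow Anthropic's
# Messages API; the model name here is an example placeholder.
request = {
    "model": "claude-sonnet-4-5",
    "max_tokens": 1024,
    "system": "You are a concise coding tutor.",  # system prompt shapes behavior
    "messages": [
        # Multi-turn conversation: the full history is resent on every call,
        # which is why token counts grow turn by turn.
        {"role": "user", "content": "What does a list comprehension do?"},
        {"role": "assistant", "content": "It builds a list from an iterable in one expression."},
        {"role": "user", "content": "Show me an example."},
    ],
}

def call_with_backoff(send, max_retries=5, base_delay=1.0):
    """Retry send() on transient errors, doubling the wait each attempt."""
    for attempt in range(max_retries):
        try:
            return send()
        except ConnectionError:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            # Exponential backoff with jitter: ~1s, ~2s, ~4s, ...
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
```

In a real client you would pass a function that performs the HTTP call as `send`, and catch the SDK's transient error types (rate limits, timeouts) instead of the bare `ConnectionError` used here for illustration.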
CMPS 130: Python Programming — If you're new to Python or need a refresher, this course covers the fundamentals this lecture assumes: variables, functions, loops, dictionaries, and working with libraries.
Messages API Reference — Anthropic Docs — The official reference for Anthropic's Messages API: request format, response structure, parameters, and streaming. The primary API you'll use throughout the course.
Chat Completions API — OpenAI Docs — OpenAI's equivalent guide for their Chat Completions endpoint. Worth reading alongside the Anthropic docs to see how the same message-role-response pattern appears across providers.
Anthropic Cookbook — Anthropic's official collection of code examples and patterns, including multi-turn conversations, error handling, and streaming implementations.