Day 02 Core Concepts

Streaming Responses: Real-Time AI Output

Streaming makes your AI app feel instant — the response appears word by word instead of all at once after a delay. Day 2 implements streaming with the Claude API.

~1 hour · Hands-on · Precision AI Academy

Today’s Objective

Without streaming, the user stares at a blank screen for 3-10 seconds waiting for a complete response. With streaming, they see text appear immediately. This is how every major AI product works, and users expect it.
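The difference is perceived latency, not total latency. A toy simulation (plain JavaScript, not the Claude API — `fakeModel` and `timeToFirstChunk` are illustrative names) makes this concrete:

```javascript
// Toy simulation of chunked output — NOT the Claude API.
// Total generation time is identical either way; streaming just lets
// the UI paint the first words as soon as they exist.
async function* fakeModel(words, delayMs) {
  for (const word of words) {
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    yield word + ' ';
  }
}

// Measure how long the user waits before ANY text is visible.
async function timeToFirstChunk(stream) {
  const start = Date.now();
  for await (const chunk of stream) {
    return { waitedMs: Date.now() - start, firstChunk: chunk };
  }
}
```

With 40 words at 100 ms each, a non-streaming response shows nothing for ~4 seconds; a streaming one shows the first word after ~100 ms.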

stream.js — Basic Streaming
import Anthropic from '@anthropic-ai/sdk';
import * as dotenv from 'dotenv';
dotenv.config();

const client = new Anthropic();

async function streamChat(userMessage) {
  console.log('Assistant: ');
  const stream = client.messages.stream({
    model: 'claude-opus-4-5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: userMessage }]
  });

  // Print each text chunk as it arrives
  for await (const chunk of stream) {
    if (chunk.type === 'content_block_delta' && chunk.delta.type === 'text_delta') {
      process.stdout.write(chunk.delta.text);
    }
  }

  // Get final message after streaming completes
  const message = await stream.finalMessage();
  console.log('\n[Tokens used:', message.usage.input_tokens, '+', message.usage.output_tokens, ']');
  return message.content[0].text;
}

await streamChat('Write a short poem about JavaScript.');

Streaming in an HTTP Response

When you stream to a browser via HTTP, you use Server-Sent Events (SSE). Here is the Express route pattern:
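The wire format itself is simple: each event is a `data:` line terminated by a blank line. A minimal helper that produces one frame (illustrative — not part of the lesson's server code):

```javascript
// Sketch: format one SSE frame. The trailing "\n\n" (blank line)
// is what marks the end of an event for the browser.
function sseEvent(payload) {
  return `data: ${JSON.stringify(payload)}\n\n`;
}
```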

Express Streaming Route
app.post('/chat/stream', async (req, res) => {
  const { message } = req.body;

  // Set SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  const stream = client.messages.stream({
    model: 'claude-opus-4-5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: message }]
  });

  for await (const chunk of stream) {
    if (chunk.type === 'content_block_delta' && chunk.delta.type === 'text_delta') {
      res.write(`data: ${JSON.stringify({ text: chunk.delta.text })}\n\n`);
    }
  }

  res.write('data: [DONE]\n\n');
  res.end();
});
Browser — Consuming SSE Stream
async function streamMessage(message, onChunk) {
  const res = await fetch('/chat/stream', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message })
  });

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const lines = decoder.decode(value).split('\n');
    for (const line of lines) {
      if (line.startsWith('data: ')) {
        const data = line.slice(6);
        if (data === '[DONE]') return;
        const { text } = JSON.parse(data);
        onChunk(text);
      }
    }
  }
}

// Usage
let output = '';
await streamMessage('Tell me a story', (chunk) => {
  output += chunk;
  document.getElementById('output').textContent = output;
});
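One caveat with the reader loop above: a network chunk can end mid-line, splitting an SSE event in half. A chunk-safe variant buffers the incomplete tail before parsing — `createSSEParser` is a hypothetical helper name, sketched here:

```javascript
// Sketch: buffer partial SSE lines across network chunks.
// Only complete lines (everything before the last "\n") are processed;
// the possibly-incomplete tail is kept for the next chunk.
function createSSEParser(onData) {
  let buffer = '';
  return (chunkText) => {
    buffer += chunkText;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the partial last line
    for (const line of lines) {
      if (line.startsWith('data: ')) onData(line.slice(6));
    }
  };
}
```

Inside the reader loop you would call `feed(decoder.decode(value, { stream: true }))`; the `{ stream: true }` option similarly protects multi-byte characters that get split across chunks.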
Day 2 Exercise: Build a Streaming CLI App
  1. Create stream.js and run it — watch the response appear word by word in your terminal.
  2. Time the difference between streaming and non-streaming for a long response.
  3. Add the streaming Express route to a server.
  4. Build a simple HTML page that renders streaming text in real time.
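For step 2, a small timing wrapper is enough — `timed` here is an illustrative helper, not part of the lesson's code; wrap your non-streaming call in one `timed(...)` and your streaming loop in another:

```javascript
// Sketch: time any async call and log the elapsed milliseconds.
// Compare total time and, more importantly, time until the first
// visible output in each mode.
async function timed(label, fn) {
  const start = performance.now();
  const result = await fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(0)} ms`);
  return result;
}
```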

Supporting Resources

Go deeper with these references.

Anthropic
Claude API Reference — Official documentation for the Messages API, tool use, and streaming.
npm
@anthropic-ai/sdk — Official Node.js SDK for the Anthropic API, with TypeScript support.
GitHub
Anthropic Cookbook — Official Anthropic code examples for common JavaScript + Claude patterns.

Day 2 Checkpoint

Before moving on, make sure you can answer these without looking:

Continue To Day 3
Express Chat Server: Multi-User Conversations