diff --git a/src/content/changelog/agents/2025-09-03-agents-sdk-beta-v5.mdx b/src/content/changelog/agents/2025-09-03-agents-sdk-beta-v5.mdx
new file mode 100644
index 000000000000000..97c8323f3f59e9f
--- /dev/null
+++ b/src/content/changelog/agents/2025-09-03-agents-sdk-beta-v5.mdx
@@ -0,0 +1,268 @@
---
title: Agents SDK v0.1.0 and workers-ai-provider v2.0.0 with AI SDK v5 support
description: This release brings full AI SDK v5 compatibility to the Agents SDK, updates workers-ai-provider to v2.0.0 with enhanced streaming and tool support, and adds seamless legacy message migration, tool confirmation detection, and React hooks for building production-ready AI chat interfaces.
products:
  - agents
  - workers
date: 2025-09-10
---

We've shipped a new release of the [Agents SDK](https://github.com/cloudflare/agents), bringing full compatibility with [AI SDK v5](https://ai-sdk.dev/docs/introduction) and introducing automatic message migration that handles all legacy formats transparently.

This release includes improved streaming and tool support, tool confirmation detection (for human-in-the-loop systems), enhanced React hooks with automatic tool resolution, improved error handling for streaming responses, and migration utilities that work seamlessly behind the scenes.

This makes it ideal for building production AI chat interfaces with Cloudflare Workers AI models, agent workflows, human-in-the-loop systems, or any application that needs reliable message handling across SDK versions, all while maintaining backward compatibility.

Additionally, we've released workers-ai-provider v2.0.0, the official provider for Cloudflare Workers AI models, which is fully compatible with AI SDK v5.

#### useAgentChat(options)

Creates a new chat interface with enhanced v5 capabilities.

```ts
// Basic chat setup
const { messages, sendMessage, addToolResult } = useAgentChat({
  agent,
  experimental_automaticToolResolution: true,
  tools,
});

// With custom tool confirmation
const chat = useAgentChat({
  agent,
  experimental_automaticToolResolution: true,
  toolsRequiringConfirmation: ["dangerousOperation"],
});
```

#### Automatic Tool Resolution

Tools are automatically categorized based on their configuration:

```ts
const tools = {
  // Auto-executes (has an execute function)
  getLocalTime: {
    description: "Get current local time",
    inputSchema: z.object({}),
    execute: async () => new Date().toLocaleString(),
  },

  // Requires confirmation (no execute function)
  deleteFile: {
    description: "Delete a file from the system",
    inputSchema: z.object({
      filename: z.string(),
    }),
  },

  // Server-executed (no client confirmation)
  analyzeData: {
    description: "Analyze dataset on server",
    inputSchema: z.object({ data: z.array(z.number()) }),
    serverExecuted: true,
  },
} satisfies Record<string, unknown>;
```

#### Message Handling

Send messages using the new v5 format with a parts array:

```ts
// Text message
sendMessage({
  role: "user",
  parts: [{ type: "text", text: "Hello, assistant!" }],
});

// Multi-part message with a file
sendMessage({
  role: "user",
  parts: [
    { type: "text", text: "Analyze this image:" },
    { type: "image", image: imageData },
  ],
});
```
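
For illustration, here's a minimal React sketch that ties the hook and the v5 message format together. It is a sketch only: the `useAgent` wiring from `agents/react` and the agent name `"my-agent"` are placeholders for your own setup, and only text parts are rendered.

```tsx
import { useState } from "react";
import { useAgent } from "agents/react";
import { useAgentChat } from "agents/ai-react";

export function Chat() {
  const [input, setInput] = useState("");
  // "my-agent" is a placeholder; use the name of your own Agent
  const agent = useAgent({ agent: "my-agent" });
  const { messages, sendMessage } = useAgentChat({ agent });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {/* Render only text parts; tool and file parts need their own UI */}
          {m.parts?.map((part, i) =>
            part.type === "text" ? <p key={i}>{part.text}</p> : null,
          )}
        </div>
      ))}
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button
        onClick={() => {
          sendMessage({ role: "user", parts: [{ type: "text", text: input }] });
          setInput("");
        }}
      >
        Send
      </button>
    </div>
  );
}
```

Tool and file parts would get their own rendering in a real interface; the next section shows how a pending tool part is detected and resolved with `addToolResult`.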

#### Tool Confirmation Detection

Simplified logic for detecting pending tool confirmations:

```ts
const pendingToolCallConfirmation = messages.some((m) =>
  m.parts?.some(
    (part) => isToolUIPart(part) && part.state === "input-available",
  ),
);

// Handle the tool confirmation (`part` here is the tool part in the
// "input-available" state, e.g. the one your confirmation UI is rendering)
if (pendingToolCallConfirmation) {
  await addToolResult({
    toolCallId: part.toolCallId,
    tool: getToolName(part),
    output: "User approved the action",
  });
}
```

### Automatic Message Migration

Seamlessly handle legacy message formats without code changes.

```ts
// All these formats are automatically converted:

// Legacy v4 string content
const legacyMessage = {
  role: "user",
  content: "Hello world",
};

// Legacy v4 with tool calls
const legacyWithTools = {
  role: "assistant",
  content: "",
  toolInvocations: [
    {
      toolCallId: "123",
      toolName: "weather",
      args: { city: "SF" },
      state: "result",
      result: "Sunny, 72°F",
    },
  ],
};

// Automatically becomes v5 format:
// {
//   role: "assistant",
//   parts: [{
//     type: "tool-call",
//     toolCallId: "123",
//     toolName: "weather",
//     args: { city: "SF" },
//     state: "result",
//     result: "Sunny, 72°F"
//   }]
// }
```

### Tool Definition Updates

Migrate tool definitions to use the new `inputSchema` property.

```ts
// Before (AI SDK v4)
const tools = {
  weather: {
    description: "Get weather information",
    parameters: z.object({
      city: z.string(),
    }),
    execute: async (args) => {
      return await getWeather(args.city);
    },
  },
};

// After (AI SDK v5)
const tools = {
  weather: {
    description: "Get weather information",
    inputSchema: z.object({
      city: z.string(),
    }),
    execute: async (args) => {
      return await getWeather(args.city);
    },
  },
};
```

### Cloudflare Workers AI Integration

Seamless integration with Cloudflare Workers AI models through the updated workers-ai-provider v2.0.0.

#### Model Setup with Workers AI

Use Cloudflare Workers AI models directly in your agent workflows:

```ts
import { createWorkersAI } from "workers-ai-provider";
import { useAgentChat } from "agents/ai-react";

// Create a Workers AI model (v2.0.0 - same API, enhanced v5 internals)
const model = createWorkersAI({
  binding: env.AI,
})("@cf/meta/llama-3.2-3b-instruct");
```
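
To show where this model fits on the server, here's a sketch of a plain Worker route that accepts v5 `UIMessage`s and streams a reply back with AI SDK v5. The route and the `{ messages }` request body are assumptions for illustration; inside an agent you would run the same `streamText` flow in your chat handler.

```ts
import { convertToModelMessages, streamText, type UIMessage } from "ai";
import { createWorkersAI } from "workers-ai-provider";

export default {
  async fetch(request: Request, env: { AI: Ai }): Promise<Response> {
    // Assumed request shape: the client posts its v5 UI messages as JSON
    const { messages } = (await request.json()) as { messages: UIMessage[] };

    const workersai = createWorkersAI({ binding: env.AI });

    const result = streamText({
      // The same Workers AI model as above
      model: workersai("@cf/meta/llama-3.2-3b-instruct"),
      // UI messages must be converted to model messages before the call
      messages: convertToModelMessages(messages),
    });

    // Stream the result back in the v5 UI message stream format
    return result.toUIMessageStreamResponse();
  },
};
```

`toUIMessageStreamResponse()` replaces the v4 data-stream response helpers and emits the v5 UI message stream that the v5 React hooks consume.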

#### Enhanced File and Image Support

Workers AI models now support v5 file handling with automatic conversion:

```ts
// Send images and files to Workers AI models
sendMessage({
  role: "user",
  parts: [
    { type: "text", text: "Analyze this image:" },
    {
      type: "file",
      data: imageBuffer,
      mediaType: "image/jpeg",
    },
  ],
});

// The Workers AI provider automatically converts this to the proper format
```

#### Streaming with Workers AI

Enhanced streaming support with automatic warning detection:

```ts
// Streaming with Workers AI models
const result = await streamText({
  model: createWorkersAI({ binding: env.AI })("@cf/meta/llama-3.2-3b-instruct"),
  messages,
  onChunk: (chunk) => {
    // Enhanced streaming with warning handling
    console.log(chunk);
  },
});
```

### Import Updates

Update your imports to use the new v5 types:

```ts
// Before (AI SDK v4)
import type { Message } from "ai";
import { useChat } from "ai/react";

// After (AI SDK v5)
import type { UIMessage } from "ai";
// or alias it for compatibility
import type { UIMessage as Message } from "ai";
import { useChat } from "@ai-sdk/react";
```

## Resources

- [Migration Guide](https://github.com/cloudflare/agents/blob/main/docs/migration-to-ai-sdk-v5.md) - Comprehensive migration documentation
- [AI SDK v5 Documentation](https://ai-sdk.dev/docs/migration-guides/migration-guide-5-0) - Official AI SDK migration guide
- [Example PR showing the migration from AI SDK v4 to v5](https://github.com/cloudflare/agents-starter/pull/105)
- [GitHub Issues](https://github.com/cloudflare/agents/issues) - Report bugs or request features

## Feedback Welcome

We'd love to hear from you! We're particularly interested in:

- **Migration experience** - How smooth was the upgrade process?
- **Tool confirmation workflow** - Does the new automatic detection work as expected?
- **Message format handling** - Any edge cases with legacy message conversion?