AI SDK Chat

Vercel AI SDK Example

The `result.toUIMessageStreamResponse()` method returns a Response object, which is forwarded directly to the Next.js route handler. On the client side, you can use the `useChat` hook to consume the stream and render chat messages seamlessly, as outlined in the AI SDK documentation.
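
For the client side, here is a minimal sketch of a React component built around the `useChat` hook. It assumes AI SDK v5 (`@ai-sdk/react`) and that Vovk exposes the controller at `/api/ai-sdk/chat`; the actual path depends on your route setup, so treat it as a placeholder.

```tsx
'use client';

import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

export default function Chat() {
  const [input, setInput] = useState('');
  const { messages, sendMessage } = useChat({
    // Assumed endpoint path, derived from @prefix('ai-sdk') and @post('chat')
    transport: new DefaultChatTransport({ api: '/api/ai-sdk/chat' }),
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}:</strong>{' '}
          {/* UIMessage content is a list of parts; render only the text parts */}
          {message.parts.map((part, i) => (part.type === 'text' ? <span key={i}>{part.text}</span> : null))}
        </div>
      ))}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          if (!input.trim()) return;
          sendMessage({ text: input });
          setInput('');
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} placeholder="Say something..." />
      </form>
    </div>
  );
}
```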


Code

src/modules/ai-sdk/AiSdkController.ts
```ts
import { HttpException, HttpStatus, post, prefix, operation, type VovkRequest } from 'vovk';
import { streamText, convertToModelMessages, UIMessage } from 'ai';
import { openai } from '@ai-sdk/openai';

@prefix('ai-sdk')
export default class AiSdkController {
  @operation({
    summary: 'Vercel AI SDK',
    description:
      'Uses [@ai-sdk/openai](https://www.npmjs.com/package/@ai-sdk/openai) and ai packages to chat with an AI model',
  })
  @post('chat')
  static async chat(req: VovkRequest<{ messages: UIMessage[] }>) {
    const { messages } = await req.json();
    // Cap the number of user messages allowed in one conversation
    const LIMIT = 5;

    if (messages.filter(({ role }) => role === 'user').length > LIMIT) {
      throw new HttpException(HttpStatus.BAD_REQUEST, `You can only send ${LIMIT} messages at a time`);
    }

    // Stream the completion and return it as a UI message stream Response
    return streamText({
      model: openai('gpt-5-nano'),
      system: 'You are a helpful assistant.',
      messages: convertToModelMessages(messages),
    }).toUIMessageStreamResponse();
  }
}
```
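
If you want to try the endpoint without a UI, a plain `fetch` call works as well. The sketch below is illustrative: the `/api/ai-sdk/chat` path is the same assumption as above, and the request body follows the `UIMessage` shape the controller expects.

```ts
// Hypothetical smoke test; the endpoint path is an assumption based on the decorators above.
const res = await fetch('/api/ai-sdk/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    // UIMessage shape: an id, a role, and an array of content parts
    messages: [{ id: '1', role: 'user', parts: [{ type: 'text', text: 'Hello!' }] }],
  }),
});

// The response body is a UI message stream; read and print it chunk by chunk
const reader = res.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value));
}
```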

The code above is fetched from the GitHub repository.
