Example: Chat SDK / Vercel AI
Build a streaming chat assistant that creates, lists, and moves kanban-lite tasks through a small Next.js app powered by the Vercel AI SDK.
What you will build
The example in examples/chat-sdk-vercel-ai/ is a compact App Router app with one chat page,
one server route, and one kanban-lite client module. The assistant can:
- Create tasks from a natural-language request
- List tasks, optionally filtered by status column
- Move tasks across standard columns such as backlog, in-progress, review, and done
- Stream tool-backed responses into a simple chat UI on http://localhost:3001
Check the prerequisites
You need the following before starting:
- Node.js 20+ for the Next.js app in examples/chat-sdk-vercel-ai/
- A running kanban-lite standalone server, unless you choose mock mode
- An OpenAI API key, or another provider supported by the Vercel AI SDK if you swap providers in the source

The example defaults to talking to a live kanban-lite server at http://localhost:3000 and a board named default.
Start kanban-lite
In a separate terminal, start the standalone server that the chat route will call over HTTP:
npx kanban-lite serve
# or, if you have the CLI installed globally:
kl serve

This serves the board API on http://localhost:3000. The chat app intentionally runs on 3001 to avoid port collisions.

If you do not want to run the server yet, you can set KANBAN_USE_MOCK=true later and the example will keep all task operations in memory instead.
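Conceptually, mock mode amounts to an in-memory task store. The sketch below illustrates the idea; the function names, Task shape, and id format are assumptions, not the shipped lib/kanban.ts code:

```typescript
// Hypothetical in-memory store behind KANBAN_USE_MOCK=true.
interface Task {
  id: string;
  title: string;
  column: string;
}

const tasks: Task[] = [];
let nextId = 1;

// New tasks land in backlog unless a column is given.
function mockCreateTask(title: string, column = 'backlog'): Task {
  const task = { id: `mock-${nextId++}`, title, column };
  tasks.push(task);
  return task;
}

// Optionally filter by status column, mirroring the live list endpoint.
function mockListTasks(column?: string): Task[] {
  return column ? tasks.filter((t) => t.column === column) : tasks.slice();
}

function mockMoveTask(id: string, column: string): Task {
  const task = tasks.find((t) => t.id === id);
  if (!task) throw new Error(`No task ${id}`);
  task.column = column;
  return task;
}
```

Ids like mock-1 match the starter prompt shown later, which moves a mock task to done.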
Install the example app
cd examples/chat-sdk-vercel-ai
npm install

The shipped package.json exposes three scripts: npm run dev, npm run build, and npm run start. The app uses Next.js 15, React 19, the Vercel AI SDK, and @ai-sdk/openai.
Configure the environment
Copy the template exactly as the example README describes:
cp .env.example .env.local

Then set the values used by the shipped source:
- OPENAI_API_KEY — required for the default OpenAI provider
- KANBAN_API_URL — defaults to http://localhost:3000
- KANBAN_BOARD_ID — defaults to default
- KANBAN_API_TOKEN — optional bearer token for auth-protected servers
- KANBAN_USE_MOCK — set to true to skip the live server
Mock mode is handy for UI demos and local experimentation. Live mode is the important integration path because it exercises the public kanban-lite REST API.
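How the shipped module interprets these variables can be sketched as a small resolver. Only the variable names and defaults come from the list above; the KanbanConfig type and resolveConfig function are hypothetical:

```typescript
// Illustrative sketch: resolve kanban-lite settings from environment variables.
interface KanbanConfig {
  apiUrl: string;
  boardId: string;
  apiToken?: string;
  useMock: boolean;
}

function resolveConfig(env: Record<string, string | undefined>): KanbanConfig {
  return {
    apiUrl: env.KANBAN_API_URL ?? 'http://localhost:3000', // documented default
    boardId: env.KANBAN_BOARD_ID ?? 'default',             // documented default
    apiToken: env.KANBAN_API_TOKEN,                        // optional bearer token
    useMock: env.KANBAN_USE_MOCK === 'true',               // opt-in mock mode
  };
}
```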
Run the chat app
npm run dev

Open http://localhost:3001. The app renders a single chat page with starter prompts such as:
- Create a task: Fix the signup email flow, high priority
- List all backlog tasks
- What tasks are in progress?
- Move task mock-1 to done

Messages stream through the Vercel AI SDK useChat hook, and tool results are rendered inline under the assistant response.
Build for production
npm run build
npm run start

The production server also binds to port 3001, matching the shipped script contract in the example app.
How the app is organized
The example stays intentionally tiny. These are the files worth reading first:
app/page.tsx
Client component that uses useChat, renders the message list, and shows tool invocation results inline.
app/api/chat/route.ts
Server route that calls streamText, defines the tool contract, and turns tool executions into a streaming response.
lib/kanban.ts
HTTP client and mock implementation for kanban-lite. This is the integration seam between the chat app and the board.
.env.example
Documents every environment variable consumed by the example.
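As a rough illustration of the inline tool-result rendering in app/page.tsx, a formatter like the one below could turn a tool invocation into display text. The ToolInvocation shape and describeInvocation name are simplified assumptions, not the shipped component:

```typescript
// Illustrative: format a tool invocation for inline display under a message.
interface ToolInvocation {
  toolName: string;
  state: 'call' | 'result';
  result?: unknown;
}

function describeInvocation(inv: ToolInvocation): string {
  if (inv.state === 'call') {
    // The tool has been called but has not returned yet.
    return `Running ${inv.toolName}...`;
  }
  // Show the raw result next to the tool name.
  return `${inv.toolName}: ${JSON.stringify(inv.result)}`;
}
```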
The kanban-lite integration seam
The important design choice in this example is that it does not import kanban-lite internals. Instead,
lib/kanban.ts talks to the standalone server over the documented REST API. That keeps the chat app decoupled,
and it mirrors how a real external app would integrate with kanban-lite in production.
The shipped module wraps three operations:
- createTask(...) → POST /api/boards/:boardId/tasks
- listTasks(...) → GET /api/boards/:boardId/tasks
- moveTask(...) → PATCH /api/boards/:boardId/tasks/:id/move
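A minimal sketch of that wrapper, assuming the routes above. The taskUrl helper, the parameter shapes, and the { column } request body are illustrative, not the shipped lib/kanban.ts signatures:

```typescript
// Illustrative: build the task-collection or single-task-move URL for a board.
// In the shipped module the base URL would come from KANBAN_API_URL.
function taskUrl(baseUrl: string, boardId: string, taskId?: string): string {
  const collection = `${baseUrl}/api/boards/${boardId}/tasks`;
  return taskId ? `${collection}/${taskId}/move` : collection;
}

// Illustrative move operation against the documented PATCH route.
async function moveTask(baseUrl: string, boardId: string, taskId: string, column: string) {
  const res = await fetch(taskUrl(baseUrl, boardId, taskId), {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ column }), // payload shape is an assumption
  });
  if (!res.ok) throw new Error(`kanban-lite returned ${res.status}`);
  return res.json();
}
```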
The chat route in app/api/chat/route.ts exposes those operations as three AI tools: create_task,
list_tasks, and move_task. The model can call a tool, wait for the result, and then stream a short
human-readable summary back to the user. The surface area is tiny, but the behavior is real.
Key code
The core integration pattern is a Vercel AI SDK streamText call with server-side tools that delegate to the kanban-lite REST API.
Here is a condensed version of app/api/chat/route.ts:
// app/api/chat/route.ts — tool registration + streaming (condensed)
import { openai } from '@ai-sdk/openai';
import { convertToCoreMessages, streamText, tool } from 'ai';
import { z } from 'zod';
import { createTask, listTasks, moveTask } from '@/lib/kanban';

// SYSTEM_PROMPT is defined in the full source file.

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    system: SYSTEM_PROMPT,
    messages: convertToCoreMessages(messages),
    tools: {
      create_task: tool({
        description: 'Create a new kanban task',
        parameters: z.object({
          title: z.string(),
          priority: z.enum(['critical', 'high', 'medium', 'low']),
        }),
        execute: async ({ title, priority }) =>
          createTask({ title, priority }),
      }),
      list_tasks: tool({ /* ... */ }),
      move_task: tool({ /* ... */ }),
    },
  });

  return result.toDataStreamResponse();
}
Why kanban-lite fits this flow
The board is the durable system of record, while the LLM acts as a conversational layer on top. That split matters: the assistant can interpret intent in natural language, but task state still lives in explicit kanban-lite columns and cards, where humans and other tools can inspect it later.
- The chat UI gives you natural-language control
- The kanban-lite server gives you stable task storage and a documented HTTP contract
- The mock mode keeps the UI usable even when you are prototyping without a backend
Provider swap notes
The shipped example uses @ai-sdk/openai and openai('gpt-4o-mini'). The source comments in
app/api/chat/route.ts also show how to swap to another Vercel AI SDK provider, such as Anthropic,
by changing the import, model call, and corresponding API key.
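As a sketch, the swap touches only the provider import, the model call, and the key in .env.local. The Anthropic package name and model id below are assumptions based on the Vercel AI SDK's provider layout, not code shipped in the example:

```typescript
// Hypothetical swap: OpenAI provider out, Anthropic provider in.
// import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

// ...inside the streamText({...}) call:
//   model: openai('gpt-4o-mini'),
model: anthropic('claude-3-5-sonnet-latest'),

// And set ANTHROPIC_API_KEY instead of OPENAI_API_KEY in .env.local.
```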
Extension ideas
- Add more tools for comments, labels, metadata filters, or form submission
- Show richer task cards in the UI instead of raw JSON tool output
- Connect to an auth-protected kanban-lite server with KANBAN_API_TOKEN
- Offer a board selector by changing KANBAN_BOARD_ID dynamically
- Persist chat history or add guardrails around allowed columns and priorities
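The column guardrail idea can be sketched as a validator that a move tool could run before hitting the API. The helper name is illustrative; the column list mirrors the standard columns documented earlier:

```typescript
// Illustrative guardrail: reject moves to columns outside the known set.
const ALLOWED_COLUMNS = ['backlog', 'in-progress', 'review', 'done'] as const;
type Column = (typeof ALLOWED_COLUMNS)[number];

function assertColumn(input: string): Column {
  if (!(ALLOWED_COLUMNS as readonly string[]).includes(input)) {
    throw new Error(
      `Unknown column "${input}"; expected one of ${ALLOWED_COLUMNS.join(', ')}`
    );
  }
  return input as Column;
}
```

Running this inside the tool's execute function turns a hallucinated column name into a clear error the model can relay, instead of a failed API call.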
Source and related docs
The runnable app lives in examples/chat-sdk-vercel-ai/. For adjacent reference material, use the links below.