Memstate + Vercel AI SDK
Add persistent, versioned memory to Vercel AI SDK agents via MCP.
The Vercel AI SDK supports MCP via the @ai-sdk/mcp package. Memstate plugs in as a local MCP server spoken to over stdio, automatically exposing all memory tools to generateText, streamText, and generateObject.
Requirements
- Node.js 18+
- TypeScript (recommended) or JavaScript
- Anthropic or OpenAI API key
- Memstate API key — get one free
1. Install dependencies
Terminal
pnpm add ai @ai-sdk/anthropic @ai-sdk/mcp @modelcontextprotocol/sdk

Also works with npm install or yarn add. Swap @ai-sdk/anthropic for @ai-sdk/openai if using OpenAI models.
2. Set environment variables
Shell
export ANTHROPIC_API_KEY="sk-ant-..."
export MEMSTATE_API_KEY="mst_..."
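A forgotten environment variable usually surfaces only as an opaque auth failure inside the spawned MCP subprocess. A minimal sketch of a fail-fast check you can run before connecting (the helper name is hypothetical, not part of Memstate or the AI SDK):

```typescript
// Hypothetical helper: fail fast when a required environment variable is
// missing, before spawning the @memstate/mcp subprocess.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage:
// const memstateKey = requireEnv("MEMSTATE_API_KEY");
// const anthropicKey = requireEnv("ANTHROPIC_API_KEY");
```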
3. Connect Memstate via MCP
Create an MCP client with createMCPClient, call mcpClient.tools() to discover all Memstate tools, then pass them to generateText.
agent.ts
/**
 * Memstate AI + Vercel AI SDK
 * Requires: pnpm add ai @ai-sdk/anthropic @ai-sdk/mcp @modelcontextprotocol/sdk
 * Env: ANTHROPIC_API_KEY, MEMSTATE_API_KEY
 */
import { createMCPClient } from "@ai-sdk/mcp";
import { anthropic } from "@ai-sdk/anthropic";
import { generateText } from "ai";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Create the MCP client; spawns @memstate/mcp as a local subprocess
  const mcpClient = await createMCPClient({
    transport: new StdioClientTransport({
      command: "npx",
      args: ["-y", "@memstate/mcp"],
      env: {
        ...process.env,
        MEMSTATE_API_KEY: process.env.MEMSTATE_API_KEY!,
      },
    }),
  });

  try {
    // Discover all Memstate tools automatically
    const tools = await mcpClient.tools();

    const { text } = await generateText({
      model: anthropic("claude-opus-4-5"),
      tools: tools as any, // MCP tools are compatible with the Vercel AI tool schema
      maxSteps: 5,
      prompt:
        'Save a memory: memstate_remember(project_id="demo", ' +
        'content="## Vercel AI SDK\\nMemstate connected via Vercel AI SDK MCP.", ' +
        'source="agent"). Then call memstate_get(project_id="demo") and show the result.',
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main().catch(console.error);

Streaming responses
Use streamText for streaming responses — the MCP tools work identically:
streaming.ts
import { streamText } from "ai";

const { textStream } = streamText({
  model: anthropic("claude-opus-4-5"),
  tools: tools as any,
  maxSteps: 5,
  prompt: "What do you remember about the demo project?",
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
Test it — run the onboarding prompt
Onboarding prompt
I'm onboarding Memstate AI memory for this project. Please:
1. Analyze this codebase and write a concise high-level architecture overview in markdown.
2. Save it with: memstate_remember(project_id="<your_project>", content="<the markdown>", source="agent")
3. Then call memstate_get(project_id="<your_project>") and show me the memory tree.
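If you run the onboarding prompt from code rather than pasting it into a chat, the project id can be interpolated once. A minimal sketch; the helper name is hypothetical and the template is the prompt above verbatim:

```typescript
// Hypothetical helper: fill the onboarding prompt template for a project id,
// ready to pass as the `prompt` option of generateText or streamText.
function onboardingPrompt(projectId: string): string {
  return [
    "I'm onboarding Memstate AI memory for this project. Please:",
    "1. Analyze this codebase and write a concise high-level architecture overview in markdown.",
    `2. Save it with: memstate_remember(project_id="${projectId}", content="<the markdown>", source="agent")`,
    `3. Then call memstate_get(project_id="${projectId}") and show me the memory tree.`,
  ].join("\n");
}

// Usage:
// const { text } = await generateText({ model, tools, maxSteps: 5, prompt: onboardingPrompt("demo") });
```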