
Memstate + LangChain

Add persistent, versioned memory to LangChain and LangGraph agents via MCP.

The langchain-mcp-adapters package bridges LangChain tools with any MCP server. Memstate plugs in as a local stdio MCP server, giving every LangGraph agent automatic access to structured, project-scoped memory.

Requirements

  • Python 3.10+
  • Node.js 18+ (for npx @memstate/mcp)
  • Anthropic or OpenAI API key
  • Memstate API key — get one free
1. Install dependencies

Terminal
pip install langchain-mcp-adapters langchain-anthropic langgraph
2. Set environment variables

Shell
export ANTHROPIC_API_KEY="sk-ant-..."
export MEMSTATE_API_KEY="mst_..."
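A quick sanity check before launching the agent saves a confusing stack trace later. This is a minimal sketch (the helper name is ours, not part of any package) that reports which of the two required variables are unset or empty:

```python
import os

REQUIRED_VARS = ("ANTHROPIC_API_KEY", "MEMSTATE_API_KEY")


def missing_env(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


# Example: fail fast at the top of agent.py
# if missing_env():
#     raise SystemExit(f"Set these first: {', '.join(missing_env())}")
```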
3. Connect Memstate via MCP

Instantiate MultiServerMCPClient with the server config. It spawns the MCP server subprocess and converts all Memstate tools into LangChain-compatible tools automatically. (Since langchain-mcp-adapters 0.1.0, the client is no longer used as a context manager and get_tools() is async.)

agent.py
"""
Memstate AI + LangChain / LangGraph
Requires: pip install langchain-mcp-adapters langchain-anthropic langgraph
Env: ANTHROPIC_API_KEY, MEMSTATE_API_KEY
"""
import asyncio
import os

from langchain_anthropic import ChatAnthropic
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    client = MultiServerMCPClient(
        {
            "memstate": {
                "transport": "stdio",
                "command": "npx",
                "args": ["-y", "@memstate/mcp"],
                "env": {
                    **os.environ,  # npx needs PATH etc. from the parent process
                    "MEMSTATE_API_KEY": os.environ["MEMSTATE_API_KEY"],  # KeyError here means the key is not set
                },
            }
        }
    )
    tools = await client.get_tools()
    model = ChatAnthropic(model="claude-opus-4-5")
    agent = create_react_agent(model, tools)

    response = await agent.ainvoke(
        {
            "messages": [
                {
                    "role": "user",
                    "content": (
                        'Save a memory: memstate_remember(project_id="demo", '
                        'content="## LangChain Integration\\nMemstate connected via LangChain MCP adapters.", '
                        'source="agent"). Then retrieve it with memstate_get(project_id="demo").'
                    ),
                }
            ]
        }
    )
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
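If you want an agent with narrower permissions (say, read-only memory), you can filter the tool list before passing it to create_react_agent. LangChain tool objects expose a .name attribute; in the sketch below, FakeTool is only an illustrative stand-in, and the memstate_* names match the tools used in the prompt above:

```python
from dataclasses import dataclass


@dataclass
class FakeTool:
    """Illustrative stand-in; real LangChain BaseTool objects also expose .name."""
    name: str


def keep_tools(tools, allowed):
    """Return only the tools whose names appear in `allowed`."""
    allowed = set(allowed)
    return [t for t in tools if t.name in allowed]


# Read-only agent: drop the write tool before building the agent.
all_tools = [FakeTool("memstate_remember"), FakeTool("memstate_get")]
readonly_tools = keep_tools(all_tools, {"memstate_get"})
```

The same `keep_tools` call works on the list returned by `get_tools()`, since filtering only relies on the `.name` attribute.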

Using OpenAI models

Swap in any LangChain-compatible model — the MCP integration is model-agnostic:

agent.py (OpenAI)
"""Using OpenAI models instead of Anthropic"""
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o-mini")
agent = create_react_agent(model, tools)
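Because the MCP integration is model-agnostic, one convenient pattern is picking the provider from whichever API key is present in the environment. A minimal sketch (the helper name is ours, not part of any package):

```python
import os


def pick_provider(env=os.environ):
    """Choose a chat-model provider based on which API key is set."""
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    raise RuntimeError("Set ANTHROPIC_API_KEY or OPENAI_API_KEY")


# Map the result to a model class yourself, e.g.:
#   "anthropic" -> ChatAnthropic(model="claude-opus-4-5")
#   "openai"    -> ChatOpenAI(model="gpt-4o-mini")
```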

Test it — run the onboarding prompt

Onboarding prompt
I'm onboarding Memstate AI memory for this project. Please:
1. Analyze this codebase and write a concise high-level architecture overview in markdown.
2. Save it with: memstate_remember(project_id="<your_project>", content="<the markdown>", source="agent")
3. Then call memstate_get(project_id="<your_project>") and show me the memory tree.

Native LangChain integration available

For deeper LangChain integration — including MemstateStore (LangGraph BaseStore), MemstateRetriever, and MemstateChatMessageHistory — see the LangChain integration guide.