How to Add Persistent Memory to LangChain Agents
If you build a LangChain agent, it will forget everything the moment the Python script finishes executing. To build an agent that learns over time, you need persistent memory. Here is how to add it using the Model Context Protocol (MCP).
The traditional way to add memory to LangChain is to use `ConversationBufferMemory` backed by a store like Redis. This works fine for simple chatbots, but it breaks down for autonomous agents.
If your agent needs to remember facts (like "the user prefers dark mode" or "the API key is stored in AWS Secrets Manager"), dumping the entire raw chat history into the prompt is incredibly inefficient and prone to hallucination.
Instead, we want to give the agent the ability to actively read and write structured facts. We can do this by exposing the Memstate MCP server as a set of LangChain tools.
The Architecture
Instead of wrapping our LLM in a memory chain, we are going to build a standard ReAct agent (or LangGraph state machine) and give it "Memory Tools."
- `store_memory`: The agent uses this to save a new fact.
- `search_memories`: The agent uses this to query past facts.
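To make the interface concrete, here is a toy in-memory stand-in for those two tools. The real tools call the Memstate API over MCP; this sketch only illustrates the read/write contract the agent sees (the keypath format and return strings here are illustrative assumptions):

```python
# Toy in-memory stand-in for the two memory tools. The real versions call
# the Memstate API; this only illustrates the interface the agent sees.
_facts: dict[str, str] = {}

def store_memory(keypath: str, value: str) -> str:
    """Save a fact under a dotted keypath, e.g. 'user.preferences.theme'."""
    _facts[keypath] = value
    return f"Stored {value!r} at {keypath}"

def search_memories(query: str) -> str:
    """Return stored facts whose keypath or value mentions the query term."""
    q = query.lower()
    hits = [f"{k} = {v}" for k, v in _facts.items()
            if q in k.lower() or q in v.lower()]
    return "\n".join(hits) if hits else "No matching memories."

store_memory("user.preferences.theme", "dark mode")
print(search_memories("theme"))  # user.preferences.theme = dark mode
```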
Why This is Better
By treating memory as a tool rather than a buffer, the LLM decides when it needs to remember something. This saves massive amounts of tokens because you aren't injecting irrelevant history into every single prompt.
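To make the token argument concrete, here is a toy comparison (word counts standing in for tokens; the numbers are illustrative, not a benchmark). A buffer-style prompt grows with every turn, while a tool-based lookup injects only the fact that matched:

```python
# Toy comparison: prompt size with a full history buffer vs. a single
# retrieved fact. Word counts stand in for tokens; numbers are illustrative.
history = [f"turn {i}: some earlier conversation text" for i in range(200)]
retrieved_fact = "user.preferences.theme = dark mode"

buffer_prompt_words = sum(len(msg.split()) for msg in history)
tool_prompt_words = len(retrieved_fact.split())

print(buffer_prompt_words)  # 1200 (grows linearly with the conversation)
print(tool_prompt_words)    # 4 (constant, regardless of history length)
```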
Step 1: Set Up the Environment
First, you need a Memstate API key. You can get one for free at memstate.ai.
Then, install the necessary packages. (We use the Python ecosystem in this example, but the concepts carry over to LangChain.js as well.)
```bash
pip install langchain langchain-openai mcp
```
Step 2: Connect the MCP Client
The LangChain ecosystem now supports MCP tools. You can use the `mcp` package to connect to the Memstate server and convert its tools into LangChain `Tool` objects.
```python
import os

# MCP client imports (used when loading the Memstate tools dynamically)
from mcp.client.stdio import stdio_client
from mcp.client.session import ClientSession

from langchain_core.tools import tool

# Set your API key
os.environ["MEMSTATE_API_KEY"] = "your_api_key_here"

# In a real app, you would use the MCP client to dynamically load tools.
# For simplicity, here is how you define them manually:

@tool
def store_memory(keypath: str, value: str) -> str:
    """Store a fact in persistent memory.

    Use this when you learn something new about the user or project.
    Example keypath: user.preferences.theme
    """
    # Logic to call the Memstate API goes here
    return f"Successfully stored {value} at {keypath}"


@tool
def search_memories(query: str) -> str:
    """Search persistent memory for facts.

    Use this before answering questions to check if you already know the answer.
    """
    # Logic to call the Memstate API goes here
    return "Retrieved memory: user prefers dark mode."
```

Step 3: Create the Agent
Now, we simply pass these tools to our LangChain agent.
```python
from langchain_openai import ChatOpenAI
from langchain.agents import create_tool_calling_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o")
tools = [store_memory, search_memories]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI assistant. You have access to persistent "
               "memory tools. ALWAYS use search_memories to check for user "
               "preferences before answering. If the user tells you a new fact, "
               "use store_memory to save it."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Test the agent
agent_executor.invoke({"input": "I like my UI in dark mode. Please remember that."})
```

How It Executes
When you run that script, the agent will recognize the instruction, format a keypath (e.g., `user.ui.theme`), and call the `store_memory` tool. The fact is now saved securely in Memstate.
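A dotted keypath like that maps naturally onto a nested structure. Assuming Memstate treats `a.b.c` as nesting (an assumption for illustration; the real storage layout may differ), the mapping looks like this:

```python
# Sketch: expand a dotted keypath into a nested dict. This assumes the
# store treats 'user.ui.theme' as nesting; the real Memstate layout may differ.
def set_keypath(store: dict, keypath: str, value: str) -> None:
    *parents, leaf = keypath.split(".")
    node = store
    for part in parents:
        node = node.setdefault(part, {})  # descend, creating levels as needed
    node[leaf] = value

memory: dict = {}
set_keypath(memory, "user.ui.theme", "dark")
print(memory)  # {'user': {'ui': {'theme': 'dark'}}}
```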
If you kill the Python script, start a completely new session tomorrow, and run:
```python
agent_executor.invoke({"input": "Write a CSS snippet for a button."})
```

The agent will first call `search_memories`, retrieve the fact that you prefer dark mode, and write the CSS using dark-mode colors.
Conclusion
By decoupling memory from the LangChain framework and treating it as an external tool via MCP, you create agents that are significantly smarter, use fewer tokens, and can share their brain with other tools (like Cursor or Windsurf).
Add Memory to Your Agent
Memstate AI provides the structured, versioned memory your LangChain agent needs.