
LangChain & LangGraph

Give your LangChain and LangGraph agents structured, versioned, semantic memory. Memstate organizes facts in a keypath hierarchy — like a filesystem for knowledge — with automatic versioning and time-travel built in.

Installation

Terminal
pip install langchain-memstate

Requires Python 3.10+. Dependencies: langchain-core, langgraph, httpx, pydantic.

Quick Start

The fastest way to get started: create a MemstateStore, give your agent the Memstate tools, and run it.

quick_start.py
from langchain_memstate import MemstateStore, get_memstate_tools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# 1. Create the store
store = MemstateStore(api_key="mst_...", project_id="my-agent")

# 2. Give the agent tools to interact with memory directly
tools = get_memstate_tools(api_key="mst_...", project_id="my-agent")

# 3. Create the agent
agent = create_react_agent(
    ChatOpenAI(model="gpt-4o-mini"),
    tools=tools,
    store=store,
)

# 4. Run it
result = agent.invoke({
    "messages": [{"role": "user", "content": "Remember that Alice prefers Python for backend work"}]
})

Get your API key

Sign up at memstate.ai/dashboard and create a free API key. No credit card required.
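Rather than hardcoding the key in source files, it is good practice to read it from an environment variable; the full demo later in this guide uses MEMSTATE_API_KEY for this. A minimal sketch (the helper name is ours, not part of langchain-memstate):

```python
import os

def load_memstate_api_key() -> str:
    # Read the key from the environment instead of hardcoding "mst_..." strings.
    key = os.environ.get("MEMSTATE_API_KEY", "")
    if not key.startswith("mst_"):
        raise RuntimeError(
            "Set MEMSTATE_API_KEY to an API key from memstate.ai/dashboard"
        )
    return key
```

Then pass `load_memstate_api_key()` wherever the examples below show `api_key="mst_..."`.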

Keypath Hierarchy

Memstate's defining feature is the keypath system. Instead of storing memories as unstructured blobs, every fact lives at a dot-separated path like project.myapp.auth.provider. This makes your agent's knowledge graph navigable, searchable by prefix, and semantically queryable — all at once.

// Example knowledge tree
project
└── myapp
    ├── auth.provider      → "OAuth2 + JWT"
    ├── database.engine    → "PostgreSQL 16"
    └── cache.engine       → "Redis 7"
users
└── alice
    ├── preferences.language   → "Python"
    ├── preferences.framework  → "FastAPI"
    └── role                   → "Senior Backend Engineer"
keypath_demo.py
from langchain_memstate import MemstateStore

store = MemstateStore(api_key="mst_...", project_id="my-project")

# Store facts at structured keypaths — like a filesystem for knowledge
store.put(("users", "alice", "preferences"), "language",   {"value": "Python"})
store.put(("users", "alice", "preferences"), "framework",  {"value": "FastAPI"})
store.put(("users", "alice"),                "role",       {"value": "Senior Backend Engineer"})
store.put(("project", "myapp", "auth"),      "provider",   {"value": "OAuth2 + JWT"})
store.put(("project", "myapp", "database"),  "engine",     {"value": "PostgreSQL 16"})
store.put(("project", "myapp", "cache"),     "engine",     {"value": "Redis 7"})

# Retrieve a specific fact by its exact keypath
item = store.get(("users", "alice", "preferences"), "language")
print(item.value)  # {'value': 'Python'}
print(item.namespace)  # ('users', 'alice', 'preferences')
print(item.key)        # 'language'

# Semantic search — finds the most relevant memories across the whole tree
results = store.search(("users",), query="what does alice prefer for backend work?")
for r in results:
    print(f"[{'.'.join(r.namespace)}.{r.key}] score={r.score:.3f}")
    print(f"  {r.value}")

# Browse the entire knowledge tree under a prefix
tree = store.browse(("project", "myapp"))
for keypath, summary in tree.items():
    print(f"  {keypath}: {summary}")
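The demo above uses two spellings of the same idea: dot-separated keypaths like project.myapp.auth.provider, and the (namespace tuple, key) pairs that put() and get() take. A small helper makes the mapping explicit (purely illustrative, not part of langchain-memstate):

```python
def split_keypath(keypath: str) -> tuple[tuple[str, ...], str]:
    # "users.alice.preferences.language" -> (("users", "alice", "preferences"), "language")
    *namespace, key = keypath.split(".")
    return tuple(namespace), key

def join_keypath(namespace: tuple[str, ...], key: str) -> str:
    # The inverse: (("users", "alice"), "role") -> "users.alice.role"
    return ".".join((*namespace, key))
```

With it, a call like store.get(*split_keypath("users.alice.role")) reads the same way the keypath is written.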

Automatic Versioning

Every time a memory is updated, Memstate creates a new version. The previous value is never deleted — it is preserved in the version history. This means your agent always has a complete audit trail of how its knowledge has evolved.

No data loss by design

Calling store.put() on an existing keypath does not overwrite — it creates a new version. The old value remains accessible via get_history() and time-travel queries.

versioning_demo.py
from langchain_memstate import MemstateStore

store = MemstateStore(api_key="mst_...", project_id="my-project")

# Write the initial fact
store.put(("project", "myapp", "auth"), "provider", {"value": "Basic Auth"})

# Six months later — the team migrates to OAuth2
# Memstate automatically creates a new version; the old one is NEVER deleted
store.put(("project", "myapp", "auth"), "provider", {"value": "OAuth2 + JWT"})

# Another upgrade — add MFA
store.put(("project", "myapp", "auth"), "provider", {"value": "OAuth2 + JWT + MFA"})

# See the full audit trail
history = store.get_history(("project", "myapp", "auth"), "provider")
for version in history:
    print(f"v{version['version']} [{version.get('created_at', '')}]: {version.get('summary', '')}")
# v1 [2024-01-15]: Basic Auth was used for API authentication
# v2 [2024-07-22]: OAuth2 + JWT is now used for API authentication
# v3 [2025-01-10]: OAuth2 + JWT + MFA is now used for API authentication (current)
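The append-only behavior can be pictured with a toy in-memory model (a conceptual sketch only; this is not how Memstate is implemented):

```python
class VersionedCell:
    """Toy model of a single keypath: writes append versions, never overwrite."""

    def __init__(self) -> None:
        self._versions: list[str] = []

    def put(self, value: str) -> None:
        self._versions.append(value)  # old values are preserved, never replaced

    @property
    def current(self) -> str:
        return self._versions[-1]

    def history(self) -> list[tuple[int, str]]:
        # (version number, value), oldest first
        return list(enumerate(self._versions, start=1))

cell = VersionedCell()
cell.put("Basic Auth")
cell.put("OAuth2 + JWT")
cell.put("OAuth2 + JWT + MFA")
print(cell.current)       # OAuth2 + JWT + MFA
print(cell.history()[0])  # (1, 'Basic Auth')
```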

Time-Travel Queries

Every remember or store operation creates a new global revision. You can reconstruct exactly what your agent knew at any revision, which is useful for debugging decisions, auditing changes, and rolling back to a known-good state.

time_travel_demo.py
from langchain_memstate import MemstateStore

store = MemstateStore(api_key="mst_...", project_id="my-project")

# What did the agent know about the project at revision 5?
# This reconstructs the exact state of memory at that point in time.
snapshot = store.get_at_revision(("project", "myapp"), at_revision=5)

print("Project state at revision 5:")
for keypath, summary in snapshot.items():
    print(f"  {keypath}: {summary}")

# Useful for:
# - Debugging: "What did the agent know when it made that decision?"
# - Auditing: "How has our architecture evolved over time?"
# - Rollback: "What was the correct value before the bad update?"
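The replay semantics behind get_at_revision() can be sketched in plain Python: each write appends to a global log, and a snapshot at revision N is the result of replaying every write at or before N. This is a conceptual model only, not Memstate's implementation:

```python
# Each write appends (global_revision, keypath, value) to an ordered log.
WriteLog = list[tuple[int, str, str]]

def state_at(log: WriteLog, at_revision: int) -> dict[str, str]:
    # Replay writes in order, keeping only those at or before the cutoff;
    # later writes to the same keypath shadow earlier ones.
    snapshot: dict[str, str] = {}
    for rev, keypath, value in log:
        if rev <= at_revision:
            snapshot[keypath] = value
    return snapshot

log = [
    (1, "project.myapp.auth.provider", "Basic Auth"),
    (2, "project.myapp.database.engine", "PostgreSQL 14"),
    (3, "project.myapp.auth.provider", "OAuth2 + JWT"),
]
print(state_at(log, 2))  # auth is still "Basic Auth" at revision 2
```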

Agent Tools

get_memstate_tools() returns a list of LangChain tools that give your agent direct access to the full Memstate API. These tools are designed with rich descriptions so the LLM knows exactly when and how to use each one. The primary tool is memstate_remember: just pass any text and Memstate's custom-trained models automatically extract and organize the facts.

| Tool | Description |
| --- | --- |
| memstate_remember | Auto-extract facts from any text. Custom-trained AI organizes them into keypaths automatically. Recommended primary tool. |
| memstate_store | Store a specific value at an exact keypath. Use when you already know the keypath. Auto-versioned. |
| memstate_recall | Semantic search across all memories. Returns top-k results with scores. |
| memstate_browse | Browse the knowledge tree by keypath prefix. Returns a structured map. |
| memstate_get_history | Get the full version history for a keypath. |
| memstate_time_travel | Retrieve memory state at a specific past revision. |
agent_tools_demo.py
from langchain_memstate import get_memstate_tools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# Get all 6 Memstate tools
tools = get_memstate_tools(api_key="mst_...", project_id="my-agent")

# Or select specific tools
tools = get_memstate_tools(
    api_key="mst_...",
    project_id="my-agent",
    include_tools=["remember", "recall", "browse"],  # skip store, get_history, time_travel
)

agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)

# The agent can now:
# - memstate_remember: Auto-extract facts from any text (recommended primary tool)
# - memstate_store: Store a specific value at an exact keypath (precise writes)
# - memstate_recall: Semantic search across all memories
# - memstate_browse: Browse the knowledge tree by keypath prefix
# - memstate_get_history: See how a fact has changed over time
# - memstate_time_travel: Reconstruct memory state at any past revision
result = agent.invoke({"messages": [
    {"role": "user", "content": (
        "I'm starting work on the myapp project. "
        "First check what you already know about it, "
        "then remember that we're adding a new payments module using Stripe."
    )}
]})
print(result["messages"][-1].content)

RAG Retriever

MemstateRetriever implements LangChain's BaseRetriever interface, making it a drop-in for any RAG chain, RetrievalQA, or LCEL pipeline that accepts a retriever. Each result is returned as a LangChain Document with rich metadata including keypath, score, and version.

retriever_demo.py
from langchain_memstate import MemstateRetriever
from langchain.chains import RetrievalQA
from langchain_openai import ChatOpenAI

# Use Memstate as a retriever in any LangChain RAG pipeline
retriever = MemstateRetriever(
    api_key="mst_...",
    project_id="my-agent",
    k=5,                    # return top 5 results
    score_threshold=0.3,    # filter out low-relevance results
)

# Drop it into RetrievalQA
qa_chain = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model="gpt-4o-mini"),
    retriever=retriever,
)

answer = qa_chain.invoke({"query": "What database does myapp use?"})
print(answer["result"])

# Or use it directly
docs = retriever.invoke("authentication provider for myapp")
for doc in docs:
    print(f"[{doc.metadata['keypath']}] score={doc.metadata['score']:.3f}")
    print(f"  {doc.page_content}")
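It is worth pinning down how k and score_threshold interact. Assuming results below the threshold are dropped before the top-k cut (a plausible reading of the parameters above, not a confirmed implementation detail), the selection works like this:

```python
def select_results(scored: list[tuple[str, float]], k: int = 5,
                   score_threshold: float = 0.0) -> list[tuple[str, float]]:
    # Drop low-relevance hits first, then keep the k best of what remains.
    kept = [r for r in scored if r[1] >= score_threshold]
    kept.sort(key=lambda r: r[1], reverse=True)
    return kept[:k]

hits = [
    ("project.myapp.auth.provider", 0.91),
    ("project.myapp.cache.engine", 0.22),
    ("project.myapp.database.engine", 0.57),
]
print(select_results(hits, k=2, score_threshold=0.3))
# [('project.myapp.auth.provider', 0.91), ('project.myapp.database.engine', 0.57)]
```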

Persistent Chat History

MemstateChatMessageHistory implements LangChain's BaseChatMessageHistory interface. Conversation messages are stored at structured keypaths, making them searchable and persistent across sessions, server restarts, and deployments.

chat_history_demo.py
from langchain_memstate import MemstateChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

# Build a chain with persistent conversation history
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])

chain = prompt | ChatOpenAI(model="gpt-4o-mini")

chain_with_history = RunnableWithMessageHistory(
    chain,
    # Each session_id gets its own conversation stored in Memstate
    lambda session_id: MemstateChatMessageHistory(
        api_key="mst_...",
        session_id=session_id,
        project_id="my-chatbot",
    ),
    input_messages_key="input",
    history_messages_key="history",
)

# Session 1 — first conversation
response = chain_with_history.invoke(
    {"input": "My name is Alice and I prefer Python."},
    config={"configurable": {"session_id": "alice-session-1"}},
)

# Session 1 — later in the same session (history is preserved)
response = chain_with_history.invoke(
    {"input": "What programming language do I prefer?"},
    config={"configurable": {"session_id": "alice-session-1"}},
)
print(response.content)  # "You mentioned you prefer Python."

Full Demo: Project Architect Agent

This end-to-end demo shows all of Memstate's unique capabilities working together: a project architect agent that builds a living knowledge graph, automatically versions every change, and can time-travel back to any point in history.

architect_agent_demo.py
"""
Full demo: A project architect agent that builds a living knowledge graph.

This demo shows Memstate's unique strengths:
1. Keypath hierarchy — facts organized like a filesystem
2. Automatic versioning — every update preserves history
3. Semantic search — find facts by meaning, not just exact path
4. Time-travel — reconstruct what the agent knew at any point
"""

import os
from langchain_memstate import MemstateStore, get_memstate_tools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

API_KEY = os.environ["MEMSTATE_API_KEY"]
PROJECT_ID = "demo-architect-agent"

store = MemstateStore(api_key=API_KEY, project_id=PROJECT_ID)
tools = get_memstate_tools(api_key=API_KEY, project_id=PROJECT_ID)

agent = create_react_agent(
    ChatOpenAI(model="gpt-4o-mini"),
    tools=tools,
    store=store,
    prompt=(
        "You are a software architect agent. You maintain a structured knowledge graph "
        "about the project using Memstate. Use keypaths like 'project.myapp.auth.provider' "
        "to organize facts hierarchically. Always check what you already know before "
        "answering, and always remember new information you learn."
    ),
)

# Turn 1: Learn the initial architecture
agent.invoke({"messages": [{"role": "user", "content": """
    Here's our initial architecture:
    - Database: PostgreSQL 14
    - Auth: Basic Auth with bcrypt
    - Cache: None
    - Frontend: React 17
    - Backend: Python/Flask
    Please remember all of this.
"""}]})

# Turn 2: Architecture evolves — Memstate auto-versions each change
agent.invoke({"messages": [{"role": "user", "content": """
    We've upgraded:
    - Database: PostgreSQL 16 (added pgvector)
    - Auth: Migrated to OAuth2 + JWT
    - Cache: Added Redis 7
    - Frontend: Upgraded to React 18
    Please update your memory.
"""}]})

# Turn 3: Query the current state
result = agent.invoke({"messages": [{"role": "user", "content":
    "Give me a full overview of our current architecture."
}]})
print("Current architecture:")
print(result["messages"][-1].content)

# Turn 4: Time-travel — what did we know at the start?
print("\n--- Direct time-travel query ---")
snapshot = store.get_at_revision(("project",), at_revision=1)
print("Architecture at revision 1:")
for keypath, summary in snapshot.items():
    print(f"  {keypath}: {summary}")

# Turn 5: See the full version history for auth
print("\n--- Auth provider version history ---")
history = store.get_history(("project", "myapp", "auth"), "provider")
for v in history:
    print(f"  v{v['version']}: {v.get('summary', '')}")

API Reference

MemstateStore

LangGraph BaseStore implementation. Use as the store= argument in any LangGraph agent.

| Param | Type | Description |
| --- | --- | --- |
| api_key | str | Your Memstate API key |
| project_id | str | Project to scope all operations to |
| base_url | str | API base URL (default: https://api.memstate.ai) |

MemstateRetriever

LangChain BaseRetriever. Drop into any RAG chain or LCEL pipeline.

| Param | Type | Description |
| --- | --- | --- |
| api_key | str | Your Memstate API key |
| project_id | str | Project to search |
| k | int | Number of results to return (default: 5) |
| score_threshold | float | Minimum relevance score 0.0–1.0 (default: 0.0) |
| keypath_prefix | str? | Optional keypath prefix to scope searches |

MemstateChatMessageHistory

LangChain BaseChatMessageHistory. Persistent cross-session conversation memory.

| Param | Type | Description |
| --- | --- | --- |
| api_key | str | Your Memstate API key |
| session_id | str | Unique identifier for this conversation session |
| project_id | str | Project to store messages in |
| keypath_prefix | str | Keypath prefix for messages (default: "conversations") |
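To picture what keypath_prefix does, here is a hypothetical layout sketch. The real storage schema is internal to langchain-memstate; this only illustrates how a prefix keeps each session's messages under its own branch of the keypath tree:

```python
def session_namespace(session_id: str, prefix: str = "conversations") -> tuple[str, ...]:
    # Hypothetical layout: each session's messages live under their own
    # namespace, e.g. ("conversations", "alice-session-1").
    return (prefix, session_id)

print(session_namespace("alice-session-1"))
# ('conversations', 'alice-session-1')
```

Changing keypath_prefix (e.g. to "support_chats") would move all stored conversations under a different branch, keeping them separate from other facts in the same project.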