# LangGraph Integration

Add persistent memory to LangGraph agents and workflows.

CortexDB provides persistent long-term memory for LangGraph agents, complementing LangGraph's built-in short-term state management.
## Installation

```bash
pip install "cortexdbai[langgraph]"
```
## Setup

```python
from langgraph.graph import StateGraph, MessagesState
from cortexdb.integrations.langgraph import cortex_remember, cortex_recall

builder = StateGraph(MessagesState)

# Recall relevant context before the agent responds
builder.add_node("recall_memory", cortex_recall(
    tenant_id="my-app",
))

# Store the conversation turn after the agent responds
builder.add_node("store_memory", cortex_remember(
    tenant_id="my-app",
))

# `agent_node` is your own agent function (not shown here)
builder.add_node("agent", agent_node)

# Wire the graph: recall -> agent -> store
builder.add_edge("recall_memory", "agent")
builder.add_edge("agent", "store_memory")
builder.set_entry_point("recall_memory")

graph = builder.compile()
```
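The wiring above runs recall, then the agent, then store on every turn. As a plain-Python sketch of that data flow (using stand-in functions in place of the real CortexDB nodes, since `cortex_recall`/`cortex_remember` require a running backend):

```python
# Plain-Python sketch of the recall -> agent -> store pipeline.
# Each function below is a stand-in for the corresponding graph node.

def recall_memory(state):
    # cortex_recall would fetch relevant context for the last message
    state["memory_context"] = f"context for: {state['messages'][-1]}"
    return state

def agent(state):
    # The agent node answers using both the message and the recalled context
    state["messages"].append(f"answer using {state['memory_context']}")
    return state

def store_memory(state):
    # cortex_remember would persist the user + assistant turn
    state["stored"] = state["messages"][-2:]
    return state

def run_turn(state):
    # Same ordering as the edges wired above
    for node in (recall_memory, agent, store_memory):
        state = node(state)
    return state

state = run_turn({"messages": ["What is the deployment status?"]})
```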
## Memory-Augmented Agent

```python
from langgraph.graph import MessagesState
from cortexdb import Cortex

cortex = Cortex(api_key="your-api-key")

# MessagesState only defines `messages`; extend it so the recalled
# context has a place to live in the graph state.
class AgentState(MessagesState):
    memory_context: str

def recall_node(state: AgentState):
    """Recall relevant memories before the agent processes the message."""
    last_message = state["messages"][-1].content
    result = cortex.recall(
        query=last_message,
        tenant_id="my-app",
    )
    return {"memory_context": result.context}

def remember_node(state: AgentState):
    """Store the conversation turn in CortexDB."""
    for msg in state["messages"][-2:]:  # user + assistant
        cortex.remember(
            content=msg.content,
            tenant_id="my-app",
        )
    return {}
```
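You can exercise the same node logic without a live CortexDB backend by substituting a stub client. In the sketch below, `FakeCortex` and `Msg` are hypothetical test scaffolding (not part of the SDK), and a plain dict stands in for the graph state:

```python
from types import SimpleNamespace

class FakeCortex:
    """Stub standing in for cortexdb.Cortex; records remember() calls."""
    def __init__(self):
        self.stored = []

    def recall(self, query, tenant_id):
        # The real recall() queries the CortexDB backend; here we echo the query.
        return SimpleNamespace(context=f"recalled for: {query}")

    def remember(self, content, tenant_id):
        self.stored.append(content)

cortex = FakeCortex()

def recall_node(state):
    last_message = state["messages"][-1].content
    result = cortex.recall(query=last_message, tenant_id="my-app")
    return {"memory_context": result.context}

def remember_node(state):
    for msg in state["messages"][-2:]:  # user + assistant
        cortex.remember(content=msg.content, tenant_id="my-app")
    return {}

Msg = SimpleNamespace  # minimal stand-in for a LangChain message object
state = {"messages": [Msg(content="hi"), Msg(content="hello!")]}
state.update(recall_node(state))  # the graph runtime applies updates like this
remember_node(state)
```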
## Under the Hood

The integration wraps CortexDB's REST API. Here are the equivalent calls:
```bash
# remember() — store a conversation turn
curl -X POST https://api.cortexdb.ai/v1/remember \
  -H "Authorization: Bearer your-cortex-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "content": "User asked about deployment status.",
    "tenant_id": "my-app"
  }'
# Returns: { "event_id": "evt_abc123" }
```

```bash
# recall() — retrieve relevant context
curl -X POST https://api.cortexdb.ai/v1/recall \
  -H "Authorization: Bearer your-cortex-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "What is the deployment status?",
    "tenant_id": "my-app"
  }'
# Returns: { "context": "...", "confidence": 0.89, "latency_ms": 15 }
```
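If you prefer to call the REST API directly from Python, the requests above can be assembled with a small helper. The sketch below only builds the URL, headers, and JSON body to mirror the curl examples (`build_request` is a hypothetical helper, not SDK code); sending the request is left to your HTTP client:

```python
import json

API_BASE = "https://api.cortexdb.ai/v1"

def build_request(endpoint, api_key, payload):
    """Build (url, headers, body) for a CortexDB REST call,
    mirroring the curl examples above."""
    url = f"{API_BASE}/{endpoint}"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps(payload)

# remember() equivalent
url, headers, body = build_request(
    "remember", "your-cortex-api-key",
    {"content": "User asked about deployment status.", "tenant_id": "my-app"},
)

# recall() equivalent
url, headers, body = build_request(
    "recall", "your-cortex-api-key",
    {"query": "What is the deployment status?", "tenant_id": "my-app"},
)
```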
## Configuration

| Parameter | Default | Description |
|---|---|---|
| `tenant_id` | *Required* | Tenant identifier |
| `source` | `"langgraph"` | Source identifier |
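The table maps onto keyword arguments of the node factories shown in Setup. A minimal sketch of how such defaults are typically resolved (`node_config` is a hypothetical helper for illustration, not SDK code):

```python
def node_config(tenant_id, source="langgraph"):
    """Resolve configuration per the table: tenant_id is required,
    source defaults to "langgraph"."""
    if not tenant_id:
        raise ValueError("tenant_id is required")
    return {"tenant_id": tenant_id, "source": source}
```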