# BeeAI Integration

Add long-term memory to IBM BeeAI Framework agents.
CortexDB provides persistent long-term memory for IBM BeeAI Framework agents, enabling them to remember context across sessions through semantic retrieval of past interactions and stored knowledge.
## Installation

```bash
pip install cortexdbai[beeai]
```
## Quick Start

```python
from cortexdb import Cortex
from cortexdb_beeai import CortexDBMemory

client = Cortex(base_url="https://api.cortexdb.ai", api_key="your-cortex-api-key")

memory = CortexDBMemory(
    client=client,
    tenant_id="my-app",
)

# Store a memory
memory.add("The deployment cadence is every two weeks on Tuesdays.")

# Search for relevant memories
result = memory.search("deployment schedule")
print(result.context)
```
## As Memory Backend

The `CortexDBMemory` class provides a full memory interface for BeeAI agents:

```python
memory = CortexDBMemory(
    client=client,
    tenant_id="my-app",
)

# Store individual memories
memory.add("Customer prefers email communication.")
memory.add("Account tier: Enterprise")

# Save conversation turns automatically
memory.save(
    input_text="What is our refund policy?",
    output_text="Refunds are available within 30 days of purchase.",
)

# Recall context as formatted text (for prompt injection)
result = memory.recall("refund policy")
print(result.context)

# Load memory variables (returns an object with .context)
result = memory.load("What are the account details?")
print(result.context)

# Clear all memories
memory.clear()
```
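The `.context` string returned by `recall` is meant to be injected into an agent's prompt. A minimal sketch of that pattern, using plain string assembly (the template, helper name, and role text below are illustrative assumptions, not part of `cortexdb_beeai`):

```python
# Hypothetical prompt-assembly helper; the template is an assumption,
# not part of the cortexdb_beeai API.
def build_prompt(context: str, question: str) -> str:
    """Prepend recalled memory context to the user's question."""
    return (
        "You are a support agent. Use the following memory context "
        "when it is relevant.\n\n"
        f"Memory context:\n{context}\n\n"
        f"User question: {question}"
    )

# With a real CortexDBMemory, the context string would come from
# memory.recall("refund policy").context
context = "Refunds are available within 30 days of purchase."
prompt = build_prompt(context, "What is our refund policy?")
print(prompt)
```

The prompt now carries the recalled facts, so the LLM can answer grounded in stored memory rather than relying on its own parametric knowledge.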
## As Agent Tools

Use the CortexDB tools to give BeeAI agents explicit memory operations:

```python
from cortexdb import Cortex
from cortexdb_beeai import CortexDBSearchTool, CortexDBStoreTool, CortexDBForgetTool

client = Cortex(base_url="https://api.cortexdb.ai", api_key="your-cortex-api-key")

# Create the tools
search_tool = CortexDBSearchTool(client=client, tenant_id="my-app")
store_tool = CortexDBStoreTool(client=client, tenant_id="my-app")
forget_tool = CortexDBForgetTool(client=client, tenant_id="my-app")

# Use the tools directly
result = search_tool.run("previous deployment issues")
print(result.context)

store_tool.run("Resolved DNS issue by updating the CNAME record.")
forget_tool.run("outdated server config", reason="Servers migrated to new infrastructure")

# Access tool metadata for agent registration
print(search_tool.name)          # "cortexdb_search"
print(search_tool.description)   # "Search CortexDB for relevant memories..."
print(search_tool.input_schema)  # JSON schema for tool inputs
```
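The name/description/schema triple is what most agent runtimes need to register a tool. As a sketch, it can be wrapped into an OpenAI-style function-calling spec; the metadata values below are hypothetical stand-ins for what the tool attributes might return, and the wrapper function is not part of `cortexdb_beeai`:

```python
# Hypothetical stand-ins for search_tool.name, search_tool.description,
# and search_tool.input_schema (not fetched from a real tool instance).
name = "cortexdb_search"
description = "Search CortexDB for relevant memories given a natural-language query."
input_schema = {
    "type": "object",
    "properties": {"query": {"type": "string"}},
    "required": ["query"],
}

def to_function_spec(name: str, description: str, input_schema: dict) -> dict:
    """Wrap tool metadata in an OpenAI-style function-calling spec."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": input_schema,
        },
    }

spec = to_function_spec(name, description, input_schema)
print(spec["function"]["name"])  # cortexdb_search
```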
## Configuration

| Parameter | Default | Description |
|---|---|---|
| `base_url` | `https://api.cortexdb.ai` | CortexDB server URL |
| `api_key` | `None` | CortexDB API key |
| `tenant_id` | `"default"` | Tenant identifier |
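A common pattern is to source these parameters from environment variables, falling back to the defaults in the table. The variable names below (`CORTEXDB_BASE_URL`, etc.) are an assumption for illustration, not documented settings:

```python
import os

# Hypothetical environment-variable names; the fallbacks match the
# defaults in the configuration table above.
base_url = os.environ.get("CORTEXDB_BASE_URL", "https://api.cortexdb.ai")
api_key = os.environ.get("CORTEXDB_API_KEY")  # None if unset
tenant_id = os.environ.get("CORTEXDB_TENANT_ID", "default")

print(base_url, tenant_id)
```

Keeping the API key out of source code and in the environment also avoids accidentally committing credentials.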
## Under the Hood

The integration wrapper maps each memory operation to CortexDB's REST API:

```bash
# memory.add("The deployment cadence is every two weeks on Tuesdays.")
curl -X POST https://api.cortexdb.ai/v1/remember \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "content": "The deployment cadence is every two weeks on Tuesdays.",
    "tenant_id": "my-app"
  }'
# Returns: { "event_id": "019d6359-d3cc-7671-9e4c-9151011fa016" }

# memory.recall("deployment schedule")
curl -X POST https://api.cortexdb.ai/v1/recall \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "deployment schedule",
    "tenant_id": "my-app"
  }'
# Returns: { "context": "...", "confidence": 0.89, "latency_ms": 10 }
```
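The same calls can be assembled from Python. The sketch below only builds the URLs, headers, and JSON bodies matching the curl examples above; it does not perform a network call, and the helper function is illustrative rather than part of the SDK:

```python
import json

API_BASE = "https://api.cortexdb.ai"
API_KEY = "your-api-key"

def build_request(endpoint: str, payload: dict) -> dict:
    """Assemble the URL, headers, and JSON body for a CortexDB call."""
    return {
        "url": f"{API_BASE}/v1/{endpoint}",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(payload),
    }

# Equivalent of memory.add(...)
remember = build_request("remember", {
    "content": "The deployment cadence is every two weeks on Tuesdays.",
    "tenant_id": "my-app",
})

# Equivalent of memory.recall(...)
recall = build_request("recall", {
    "query": "deployment schedule",
    "tenant_id": "my-app",
})

print(remember["url"])
print(recall["body"])
```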
## Complete Example

```python
from cortexdb import Cortex
from cortexdb_beeai import CortexDBMemory, CortexDBSearchTool, CortexDBStoreTool

client = Cortex(base_url="https://api.cortexdb.ai", api_key="your-cortex-api-key")

# Set up shared memory for a BeeAI agent
memory = CortexDBMemory(
    client=client,
    tenant_id="engineering",
)

# Pre-load knowledge
memory.add("Runbook: If CPU > 90% for 5 min, scale horizontally.")
memory.add("Runbook: If memory > 85%, check for memory leaks in Java services.")
memory.add("On-call rotation: Mon-Wed Alice, Thu-Fri Bob, Weekends Charlie.")

# Agent processes an incident
user_query = "CPU is spiking on the payment service"
result = memory.recall(user_query)

# The context now contains the relevant runbook entry
print(f"Retrieved context:\n{result.context}")
print(f"Confidence: {result.confidence}")

# Save the interaction for future reference
memory.save(
    input_text=user_query,
    output_text="Initiating horizontal scaling for payment service per runbook.",
)

# Also available as explicit tools for the agent
search = CortexDBSearchTool(client=client, tenant_id="engineering")
store = CortexDBStoreTool(client=client, tenant_id="engineering")

# The agent can use the tools during reasoning
result = search.run("payment service incidents")
print(result.context)
store.run("Payment service scaled to 6 replicas. CPU normalized at 45%.")
```