Give Claude, Cursor, Windsurf, VS Code Copilot, and any MCP-compatible AI tool persistent long-term memory via CortexDB.
MCP Server
CortexDB includes a Model Context Protocol (MCP) server that gives AI tools persistent long-term memory. Install it once, add your API key, and every conversation in your IDE has access to CortexDB's memory system — store decisions, recall context, explore your knowledge graph.
Works with: Claude Desktop, Claude Code, Cursor, Windsurf, VS Code (Copilot), and any MCP-compatible client.
Installation
pip install cortexdb-mcp
Verify it works:
cortexdb-mcp --help
Setup
Every MCP client uses a JSON config file. The config is the same everywhere — just the file path differs.
You need two values:
- CORTEXDB_URL — https://api.cortexdb.ai
- CORTEXDB_API_KEY — Your API key from cortexdb.ai/dashboard
Claude Desktop
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "cortexdb": {
      "command": "cortexdb-mcp",
      "env": {
        "CORTEXDB_URL": "https://api.cortexdb.ai",
        "CORTEXDB_API_KEY": "cx_live_your_key_here"
      }
    }
  }
}
Claude Code (CLI)
# One-command setup
claude mcp add cortexdb cortexdb-mcp \
  -e CORTEXDB_URL=https://api.cortexdb.ai \
  -e CORTEXDB_API_KEY=cx_live_your_key_here
Or edit ~/.claude/mcp.json manually with the same config as above.
Cursor
Edit ~/.cursor/mcp.json:
{
  "mcpServers": {
    "cortexdb": {
      "command": "cortexdb-mcp",
      "env": {
        "CORTEXDB_URL": "https://api.cortexdb.ai",
        "CORTEXDB_API_KEY": "cx_live_your_key_here"
      }
    }
  }
}
Or: Cursor Settings > MCP Servers > Add Server.
Windsurf
Edit ~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "cortexdb": {
      "command": "cortexdb-mcp",
      "env": {
        "CORTEXDB_URL": "https://api.cortexdb.ai",
        "CORTEXDB_API_KEY": "cx_live_your_key_here"
      }
    }
  }
}
VS Code (GitHub Copilot)
Add to .vscode/mcp.json in your project (or ~/.vscode/mcp.json globally):
{
  "servers": {
    "cortexdb": {
      "command": "cortexdb-mcp",
      "env": {
        "CORTEXDB_URL": "https://api.cortexdb.ai",
        "CORTEXDB_API_KEY": "cx_live_your_key_here"
      }
    }
  }
}
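Since the JSON payload is identical across all of these clients, a short script can stamp it out for whichever path your tool reads. This is a minimal sketch, not an official helper — the function names are illustrative, and it overwrites the target file outright, so merge by hand if you already have other servers configured:

```python
# Sketch: generate the client config shown above and write it to a path.
# Only VS Code names the top-level key "servers" instead of "mcpServers".
import json
from pathlib import Path

def make_config(api_key: str, url: str = "https://api.cortexdb.ai",
                top_key: str = "mcpServers") -> dict:
    return {
        top_key: {
            "cortexdb": {
                "command": "cortexdb-mcp",
                "env": {"CORTEXDB_URL": url, "CORTEXDB_API_KEY": api_key},
            }
        }
    }

def write_config(path: str, api_key: str, **kwargs) -> None:
    # Overwrites any existing file at `path` — back up first if needed.
    p = Path(path).expanduser()
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(json.dumps(make_config(api_key, **kwargs), indent=2))

# e.g. write_config("~/.cursor/mcp.json", "cx_live_your_key_here")
# e.g. write_config(".vscode/mcp.json", "cx_live_your_key_here", top_key="servers")
```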
Available Tools (20)
Once connected, your AI assistant has access to these tools:
Memory Operations
| Tool | Description |
|------|-------------|
| memory_store | Store a memory with optional source, type, tags, and TTL |
| memory_search | Search memories using natural language (hybrid BM25 + vector + graph retrieval) |
| memory_forget | Delete memories with audit trail (GDPR-compliant) |
| get_context | Deep contextual retrieval combining search and knowledge graph |
| advanced_search | Search with structured filters (source, type, time range) |
Episode Management
| Tool | Description |
|------|-------------|
| memory_list | List episodes with pagination and type filtering |
| memory_get | Get a specific episode by ID with full content and metadata |
| memory_update | Update episode content or metadata |
| memory_delete | Delete a specific episode |
| memory_bulk_delete | Bulk delete with query matching and dry-run preview |
Knowledge Graph
| Tool | Description |
|------|-------------|
| entity_list | List entities (people, services, projects, concepts) |
| entity_get | Get entity details with relationships and recent episodes |
| entity_edges | Get all relationships for an entity |
| entity_link | Create a relationship between two entities |
Admin & Observability
| Tool | Description |
|------|-------------|
| health_check | Check CortexDB server health |
| get_usage | View usage stats and tier limits |
| get_insights | Generate proactive insights (incident spikes, knowledge gaps) |
| get_ontology | View entity and relationship types in the schema |
| export_data | Export memories as JSON |
| import_data | Import memories from JSON |
Resources
Resources provide read-only data that AI tools can browse:
| Resource URI | Description |
|---|---|
| cortexdb://health | Server health status |
| cortexdb://metrics | Request metrics (total, active, errors) |
| cortexdb://usage | Usage statistics and tier limits |
| cortexdb://episodes | Recent 50 episodes |
| cortexdb://entities | Top 100 knowledge graph entities |
| cortexdb://insights | Proactive insights |
| cortexdb://ontology | Entity and relationship type schema |
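The cortexdb:// URIs above use a custom scheme; if you are routing them yourself (for example, in a custom MCP client), the standard library parses them without special handling. A small sketch:

```python
# Sketch: extract the resource name from a cortexdb:// URI so a client can
# dispatch on it. Stdlib only; the helper name is illustrative.
from urllib.parse import urlsplit

def resource_name(uri: str) -> str:
    parts = urlsplit(uri)
    if parts.scheme != "cortexdb":
        raise ValueError(f"not a CortexDB resource URI: {uri}")
    return parts.netloc

print(resource_name("cortexdb://episodes"))  # episodes
```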
Prompt Templates
Pre-built prompts for common workflows:
| Prompt | Description |
|---|---|
| investigate_incident | Investigate an incident using stored memories |
| summarize_knowledge | Summarize everything known about a topic |
| deployment_review | Pre-deployment safety review using historical context |
| onboard_to_codebase | Onboard to a codebase using stored knowledge |
| weekly_digest | Generate a weekly activity summary |
Usage Example
Once configured, your AI assistant uses CortexDB tools automatically:
You: Remember that we decided to migrate payments to Stripe v3 on March 15th. The main driver was PCI compliance.
Assistant: [calls memory_store] Stored. Event ID: evt_a1b2c3
You: What do we know about the payments migration?
Assistant: [calls memory_search] Based on CortexDB:
- You decided to migrate payments to Stripe v3 on March 15th
- The main driver was PCI compliance
- Confidence: 0.94
You: Show me all entities related to the payments service
Assistant: [calls entity_get] The payments-service has 3 relationships:
- DEPENDS_ON → stripe-gateway-v3
- OWNED_BY → backend-team
- RELATED_TO → billing-service
Configuration Reference
| Environment Variable | Default | Description |
|---|---|---|
| CORTEXDB_URL | https://api.cortexdb.ai | CortexDB server URL |
| CORTEXDB_API_KEY | (none) | API key from your dashboard |
| CORTEXDB_TIMEOUT | 30.0 | HTTP request timeout (seconds) |
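As a sketch of how these three variables resolve, the snippet below mirrors the defaults in the table; the exact parsing shown is an assumption about the bridge's behavior, not taken from its source:

```python
# Assumed settings resolution, matching the defaults in the table above.
def resolve_settings(environ: dict) -> dict:
    return {
        "url": environ.get("CORTEXDB_URL", "https://api.cortexdb.ai"),
        "api_key": environ.get("CORTEXDB_API_KEY"),  # no default; required
        "timeout": float(environ.get("CORTEXDB_TIMEOUT", "30.0")),
    }
```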
Architecture
Your Machine CortexDB Cloud
┌──────────┐ stdio ┌───────────┐ HTTPS ┌──────────┐
│ Cursor / │ ◄─────────► │ cortexdb │ ◄─────────► │ CortexDB │
│ Claude / │ MCP JSON │ -mcp │ REST API │ Server │
│ VS Code │ │ (bridge) │ │ │
└──────────┘ └───────────┘ └──────────┘
The MCP server is a lightweight bridge (~5 MB) that translates MCP protocol calls into CortexDB REST API calls. All data is stored on CortexDB's servers — nothing is stored locally.
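As a rough illustration of that translation, here is what one hop might look like in Python. The REST path and payload shape are hypothetical stand-ins — the actual CortexDB API routes are not documented here — so treat this as a conceptual sketch of the bridge, not its implementation:

```python
# Conceptual sketch: turn an MCP tool call into an HTTPS request against a
# REST API. The /v1/tools/{name} path is a hypothetical illustration.
import json
import urllib.request

def build_request(base_url: str, api_key: str, tool: str, arguments: dict):
    return urllib.request.Request(
        url=f"{base_url}/v1/tools/{tool}",  # hypothetical path
        data=json.dumps(arguments).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("https://api.cortexdb.ai", "cx_live_your_key_here",
                    "memory_store", {"content": "Migrated payments to Stripe v3"})
```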