# Flowise Integration

Use CortexDB as a memory node in Flowise visual workflows.
CortexDB provides a custom Flowise node for adding persistent memory to your visual AI workflows.
## Installation

1. Install the CortexDB Flowise node:

   ```bash
   cd ~/.flowise/nodes
   npm install cortexdbai
   ```

2. Restart Flowise.
3. The CortexDB Memory node then appears in the Memory category.
## Configuration

Drag the CortexDB Memory node onto your canvas and configure the following fields:
| Field | Description |
|---|---|
| Base URL | CortexDB server URL (e.g., https://api.cortexdb.ai) |
| API Key | Your CortexDB API key |
| Tenant ID | Tenant identifier |
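To see how these fields are used, note that the API key becomes a bearer token on every request and the tenant ID travels in each request body. A minimal Python sketch (header and payload names taken from the curl examples in Under the Hood; the helper names are illustrative):

```python
def cortexdb_headers(api_key: str) -> dict:
    # The "API Key" field is sent as a bearer token on every request.
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

def recall_payload(query: str, tenant_id: str) -> dict:
    # The "Tenant ID" field is included in each request body,
    # keeping memories scoped to one tenant.
    return {"query": query, "tenant_id": tenant_id}
```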
## Usage

Connect the CortexDB Memory node to any LLM chain or agent node. The node automatically:
- Recalls relevant context before the LLM processes the input
- Injects the recalled context into the system prompt
- Stores the conversation turn after the LLM responds
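A minimal sketch of that recall, inject, store cycle in Python (standard library only). The endpoints, payload fields, and auth header come from the REST calls shown in Under the Hood; `call_llm`, the `send` transport, and the shape of the recall response are assumptions for illustration:

```python
import json
import urllib.request

BASE_URL = "https://api.cortexdb.ai"  # "Base URL" field from the node config
API_KEY = "your-api-key"              # "API Key" field
TENANT_ID = "my-app"                  # "Tenant ID" field

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build the JSON-over-HTTP request the node sends to CortexDB."""
    return urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def chat_turn(user_input, call_llm, send=urllib.request.urlopen):
    # 1. Recall relevant context before the LLM processes the input.
    with send(build_request("/v1/recall",
                            {"query": user_input, "tenant_id": TENANT_ID})) as resp:
        memories = json.load(resp)
    # 2. Inject the recalled context into the system prompt
    #    (the shape of the recall response is an assumption here).
    system_prompt = "Relevant context:\n" + json.dumps(memories)
    answer = call_llm(system_prompt, user_input)
    # 3. Store the completed conversation turn after the LLM responds.
    with send(build_request("/v1/remember",
                            {"content": f"user: {user_input}\nassistant: {answer}",
                             "tenant_id": TENANT_ID})) as resp:
        resp.read()
    return answer
```

Passing the transport in as `send` keeps the sketch testable without a live server; the Flowise node performs the equivalent calls internally.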
### Example Flow

```
[Chat Input] → [CortexDB Memory] → [ChatOpenAI] → [Chat Output]
```
The memory node enriches the conversation with relevant historical context from CortexDB.
## Under the Hood

The Flowise node calls the CortexDB REST API:

```bash
# Recall (before LLM processing)
curl -X POST https://api.cortexdb.ai/v1/recall \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{"query": "user input text", "tenant_id": "my-app"}'

# Remember (after LLM response)
curl -X POST https://api.cortexdb.ai/v1/remember \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{"content": "conversation turn text", "tenant_id": "my-app"}'
```