⚡ 5-minute setup

Get Started with Memory Spine in 5 Minutes

Learn how to add persistent memory to your AI agent. Install the SDK, configure MCP, and store your first memory — all in under 5 minutes.

📝 Prerequisites

  • Python 3.8+ or Node.js 18+ installed
  • Any MCP-compatible AI model (Claude, GPT-4, Gemini, or open-source)
  • A terminal or command-line interface
Step 1: Install Memory Spine

Install the Memory Spine SDK in your project. Choose Python or Node.js:

Terminal (Python)
pip install memory-spine

Terminal (Node.js)
npm install @chaozcode/memory-spine

Expected output (Python)
Successfully installed memory-spine-3.0.0
Step 2: Configure MCP

Add Memory Spine to your MCP configuration file so your AI model can access all 32 memory tools. Create or update mcp-config.json:

mcp-config.json
{
  "mcpServers": {
    "memory-spine": {
      "command": "memory-spine",
      "args": ["--port", "YOUR_PORT"],
      "env": { "MEMORY_SPINE_DB": "./memories.db" }
    }
  }
}
What this does
Registers Memory Spine as an MCP server on your configured port. Your AI model will auto-discover all 32 memory tools (store, search, pin, tag, handoff, etc.).
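If you prefer to generate the config file from a setup script instead of editing it by hand, here is a minimal sketch using Python's standard json module. The port value 8765 is only an example placeholder, as is the database path — substitute your own values.

```python
import json

# Example MCP configuration for Memory Spine.
# The port (8765) and DB path are placeholders -- use your own values.
config = {
    "mcpServers": {
        "memory-spine": {
            "command": "memory-spine",
            "args": ["--port", "8765"],
            "env": {"MEMORY_SPINE_DB": "./memories.db"},
        }
    }
}

# Write mcp-config.json next to your project root
with open("mcp-config.json", "w") as f:
    json.dump(config, f, indent=2)
```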
Step 3: Store Your First Memory

Use memory_store to save persistent context. This memory will survive across sessions, conversations, and agent handoffs.

store_memory.py
from memory_spine import MemorySpine

# Connect to your Memory Spine instance
spine = MemorySpine(port=YOUR_PORT)

# Store a memory with content and tags
result = spine.memory_store(
    content="User prefers dark mode and Python code examples. "
            "They are building a chatbot for customer support.",
    tags=["user-preference", "project-context"]
)
print(f"Stored memory: {result['id']}")
storeMemory.ts
import { MemorySpine } from "@chaozcode/memory-spine";

// Connect to your Memory Spine instance
const spine = new MemorySpine({ port: YOUR_PORT });

// Store a memory with content and tags
const result = await spine.memoryStore({
  content:
    "User prefers dark mode and Python code examples. " +
    "They are building a chatbot for customer support.",
  tags: ["user-preference", "project-context"],
});
console.log(`Stored memory: ${result.id}`);
Expected output
Stored memory: mem_a1b2c3d4e5f6
Step 4: Search Memories

Use memory_search with a natural language query. Memory Spine returns semantically relevant results in under 25ms — no keyword matching required.

search_memory.py
# Search by meaning, not just keywords
results = spine.memory_search(
    query="what kind of project is the user working on?",
    limit=5
)
for memory in results:
    print(f"[{memory['score']:.2f}] {memory['content']}")
searchMemory.ts
// Search by meaning, not just keywords
const results = await spine.memorySearch({
  query: "what kind of project is the user working on?",
  limit: 5,
});
for (const memory of results) {
  console.log(`[${memory.score.toFixed(2)}] ${memory.content}`);
}
Expected output
[0.94] User prefers dark mode and Python code examples. They are building a chatbot for customer support.
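Semantic search of this kind is typically backed by vector similarity: the query and each stored memory are embedded as vectors, and results are ranked by how close those vectors are. A toy illustration of that ranking step using cosine similarity — this is not Memory Spine's actual internals, and the 3-dimensional "embeddings" below are made up for demonstration:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 3-d "embeddings" for a query and two memories
query_vec = [0.9, 0.1, 0.3]
memories = {
    "chatbot for customer support": [0.8, 0.2, 0.4],
    "grocery list": [0.1, 0.9, 0.0],
}

# Rank memories by similarity to the query, highest first
ranked = sorted(
    memories,
    key=lambda m: cosine_similarity(query_vec, memories[m]),
    reverse=True,
)
print(ranked[0])  # the chatbot memory outranks the unrelated one
```

Real systems use high-dimensional learned embeddings and approximate nearest-neighbor indexes, but the ranking principle is the same.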
Step 5: Use in Your Agent

Combine everything into a complete AI agent that loads context from Memory Spine before every response. This agent remembers past conversations and builds knowledge over time.

agent.py — Complete AI Agent with Memory
from memory_spine import MemorySpine

spine = MemorySpine(port=YOUR_PORT)

def agent_respond(user_message: str) -> str:
    # 1. Recall relevant context from memory
    context = spine.memory_search(
        query=user_message,
        limit=5
    )

    # 2. Build context string for the LLM
    memory_context = "\n".join(m["content"] for m in context)

    # 3. Generate response (use your LLM of choice)
    response = call_llm(
        system=f"You have this context:\n{memory_context}",
        user=user_message
    )

    # 4. Store the interaction as a new memory
    spine.memory_store(
        content=f"User asked: {user_message}\n"
                f"Agent replied: {response}",
        tags=["conversation", "auto-stored"]
    )
    return response

# Run it
reply = agent_respond("How's my chatbot project going?")
print(reply)
agent.ts — Complete AI Agent with Memory
import { MemorySpine } from "@chaozcode/memory-spine";

const spine = new MemorySpine({ port: YOUR_PORT });

async function agentRespond(userMessage: string): Promise<string> {
  // 1. Recall relevant context from memory
  const context = await spine.memorySearch({
    query: userMessage,
    limit: 5,
  });

  // 2. Build context string for the LLM
  const memoryContext = context.map((m) => m.content).join("\n");

  // 3. Generate response (use your LLM of choice)
  const response = await callLLM({
    system: `You have this context:\n${memoryContext}`,
    user: userMessage,
  });

  // 4. Store the interaction as a new memory
  await spine.memoryStore({
    content: `User asked: ${userMessage}\n` +
             `Agent replied: ${response}`,
    tags: ["conversation", "auto-stored"],
  });

  return response;
}

// Run it
const reply = await agentRespond("How's my chatbot project going?");
console.log(reply);
Expected output
Based on your previous conversations, your customer support chatbot project is progressing well. You mentioned preferring dark mode and Python examples. Would you like to continue where we left off?
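The agent above leaves call_llm (callLLM in TypeScript) undefined, since it depends on which model provider you use. A hypothetical stand-in that lets the loop run end-to-end locally — replace the body with a real chat-completion call from your provider's SDK, passing `system` as the system prompt and `user` as the user message:

```python
def call_llm(system: str, user: str) -> str:
    """Placeholder LLM call for local testing.

    Swap this body for a real provider SDK call (e.g. an OpenAI or
    Anthropic chat completion) that sends `system` as the system
    prompt and `user` as the user message.
    """
    # Echo a canned reply so agent_respond() can be exercised offline
    return f"(stub reply) Using {len(system)} chars of context for: {user}"

reply = call_llm(
    system="You have this context:\n...",
    user="How's my chatbot project going?",
)
print(reply)
```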

Frequently Asked Questions

How long does it take to set up Memory Spine?

Most developers are up and running in under 5 minutes. Install the SDK with pip install memory-spine or npm install @chaozcode/memory-spine, add the MCP config block, and you can store and search memories immediately. No database setup, no infrastructure provisioning — Memory Spine handles storage, indexing, and vector search out of the box.

Does Memory Spine work with GPT-4, Claude, and Gemini?

Yes. Memory Spine uses the open Model Context Protocol (MCP), which is supported by Claude, GPT-4, Gemini, and most modern AI models. Any model that can call MCP tools can use Memory Spine's 32 memory operations, including store, search, pin, tag, and agent handoff.

Is Memory Spine free?

Memory Spine offers a generous free tier with 5,000 memory vectors, core MCP tools, and semantic search — no credit card required. Paid plans start at $19/month for 25,000 vectors and the full 32-tool suite. See pricing.

Ready to give your AI a memory?

Start free. No credit card required. 5,000 vectors included.

Start Free →

From the Blog

Tutorials and guides to help you get the most out of Memory Spine.

How to Give Your AI Agent Persistent Memory

Step-by-step tutorial with Python code examples and architecture diagrams.

Read →

MCP Tools Explained: 32 Ways to Enhance Your Agent

A comprehensive guide to every MCP tool — store, search, pin, graph, and more.

Read →