🧠 AI Memory

Your AI conversations, organized and searchable

MCP v1.4.0 — Local: 12 Tools • API: 6 Tools

AI Memory MCP Server

Connect your AI assistants to your conversation history via the Model Context Protocol. Let Claude, Cursor, and other AI tools search your past conversations.

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external data sources and tools. With AI Memory's MCP Server, your AI assistant can search through your entire conversation history — across ChatGPT, Claude, DeepSeek, and Gemini.

Note: The tools below are for the Hosted API endpoint (aimemory.pro/api/mcp). For the standalone Local Server with 9 tools (including save_memory, update_memory, memory_stats, export_memories, import_memories), see the "Standalone MCP Server" section below.

🔍 search_memory

Full-text search across all your saved conversations with platform filtering.

📝 add_memory

Save new conversations or memory snippets to your knowledge base.

🧠 get_context

Retrieve relevant context from past conversations for a given topic.

📋 list_memories

Browse recent conversations with optional platform filtering and pagination.

💬 get_conversation

Retrieve a full conversation with all messages by ID.

🗑️ delete_memory

Delete a specific conversation or all data for your session.

Setup Guide: Claude Desktop

Add the following to your Claude Desktop MCP configuration file:

Config file location:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}

Restart Claude Desktop after saving the config. You can now ask Claude to search your conversation history!

Setup Guide: Cursor

In Cursor, go to Settings → MCP → Add New MCP Server and configure:

Name: AI Memory
Type: HTTP
URL: https://aimemory.pro/api/mcp

Once connected, Cursor can search your AI conversation history when writing code or answering questions.

Other MCP Clients

AI Memory works with any MCP-compatible client. Use these connection details:

Endpoint: https://aimemory.pro/api/mcp

Protocol: MCP 2024-11-05 (JSON-RPC 2.0)

Transport: Streamable HTTP (POST)

Authentication: Session cookie required (upload data first at aimemory.pro)

Compatible with: Windsurf, Cline, Continue, Zed, and 100+ other MCP clients.
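For clients without built-in MCP support, the connection details above boil down to a single JSON-RPC 2.0 POST. The sketch below builds and sends the initialize request using only the Python standard library; the cookie value is a placeholder you would copy from a logged-in aimemory.pro browser session, and the exact cookie name is an assumption.

```python
import json
import urllib.request

ENDPOINT = "https://aimemory.pro/api/mcp"

def build_initialize(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 initialize payload from the API reference."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "initialize"}

def post_jsonrpc(payload: dict, session_cookie: str) -> dict:
    """POST a JSON-RPC payload to the MCP endpoint (needs a valid session)."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Placeholder: copy the real cookie from your browser session.
            "Cookie": session_cookie,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_initialize()
print(payload)
# With a real session cookie:
# print(post_jsonrpc(payload, "session=<your-cookie>"))
```

The same `post_jsonrpc` helper works for every request in the API Reference section below; only the payload changes.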

Standalone MCP Server (Local)

Run AI Memory as a local MCP server on your machine — fully offline, no cloud dependency. Install via pip and connect to any MCP client.

🆕 New: Install via pip — one-command setup

# Install
pip install aimemory-mcp-server

# Run (stdio mode)
aimemory-mcp-server

Claude Desktop Config (Local)

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "env": {
        "AIMEMORY_DB": "/path/to/your/aimemory.db"
      }
    }
  }
}

Cursor Config (Local)

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "env": {
        "AIMEMORY_DB": "/path/to/your/aimemory.db"
      }
    }
  }
}

7 Local Tools

🔍 search_memories

Full-text search with FTS5 syntax

💾 save_memory

Save new memories with tags & source

📋 list_memories

Browse memories with tag filtering

📖 get_memory

Retrieve a single memory by ID

✏️ update_memory

Edit existing memory content & tags

🗑️ delete_memory

Permanently remove a memory

📊 memory_stats

Get total count, tags, recent activity

Source code: GitHub • Package: pip install aimemory-mcp-server
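Because search_memories accepts FTS5 syntax, queries can use boolean operators (AND, OR, NOT), column filters, and prefix matching. As an illustration of that syntax only — the table layout here is hypothetical, not the real aimemory.db schema — the same queries run against an in-memory SQLite FTS5 table:

```python
import sqlite3

# Hypothetical schema for illustration; the real aimemory.db layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(title, content, tags)")
conn.executemany(
    "INSERT INTO memories (title, content, tags) VALUES (?, ?, ?)",
    [
        ("RAG notes", "Chunking strategy matters for retrieval quality", "rag architecture"),
        ("Prompting tips", "Few-shot examples improve consistency", "prompting"),
    ],
)

# FTS5 boolean query: matches rows containing either term, best match first.
rows = conn.execute(
    "SELECT title FROM memories WHERE memories MATCH 'retrieval OR prompting' ORDER BY rank"
).fetchall()
print([r[0] for r in rows])  # both rows match
```

A query like `'title:rag AND NOT prompting'` would narrow the match to a single column, which is what makes FTS5 syntax more expressive than plain keyword search.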

API Reference

Initialize Connection

POST /api/mcp
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize"
}

Search Conversations

POST /api/mcp
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": {
      "query": "machine learning best practices",
      "platform": "chatgpt",
      "limit": 5
    }
  }
}

Add Memory

POST /api/mcp
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "add_memory",
    "arguments": {
      "title": "Key insights on RAG architecture",
      "content": "Today I learned that...",
      "platform": "manual",
      "tags": ["rag", "architecture", "insights"]
    }
  }
}
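The tools/call pattern is identical for every tool: only `name` and `arguments` change. A minimal sketch of a request builder (the helper function itself is illustrative, not part of the API):

```python
import json

def build_tool_call(request_id: int, name: str, arguments: dict) -> str:
    """Assemble a JSON-RPC 2.0 tools/call request body for the MCP endpoint."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Reproduces the search_memory request shown above.
body = build_tool_call(2, "search_memory",
                       {"query": "machine learning best practices",
                        "platform": "chatgpt", "limit": 5})
print(body)
```

Swapping in `"add_memory"` with a `title`/`content`/`platform`/`tags` dict produces the Add Memory request in the same way.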

Frequently Asked Questions

Is the MCP Server free?

Yes, the MCP Server is available on the free plan. All conversation search and retrieval features are free forever.

Is my data sent to a cloud server?

The MCP endpoint at aimemory.pro accesses conversations stored in your account. If you use the web upload feature, your data is stored server-side in an encrypted database. You can also run AI Memory locally for 100% offline usage.

Can I self-host the MCP Server?

Yes! AI Memory is open source. Clone the repository and run it locally. The MCP Server endpoint is available at /api/mcp on any deployment.

Which platforms are supported?

AI Memory supports ChatGPT, Claude, DeepSeek, and Gemini. You can import conversations from all platforms and search them through the MCP interface.

Ready to connect your AI to your memory?

Upload your conversations first, then connect via MCP.