⚡ pip install aimemory-mcp-server

Give your AI
persistent memory.

An open-source MCP server that connects Claude Desktop, Cursor, ChatGPT, and 113+ AI clients to your entire conversation history. One install. Works everywhere.

PyPI package coming soon — pip install aimemory-mcp-server will work once published

113+
MCP Clients
12
Tools Included
10s
Install Time
100%
Free & Open Source

Install in 10 seconds

One command. No configuration files. No accounts.

$pip install git+https://github.com/jingchang0623-crypto/aimemory.git#subdirectory=mcp-server
$aimemory-mcp-server

✓ Python 3.8+   ✓ Works on macOS, Linux, Windows   ✓ No API keys needed

📦 PyPI package coming soon — use GitHub install for now

12 powerful memory tools

Everything your AI assistant needs to remember, search, and manage your conversations.

🔍

search_memories

Full-text search across all your saved conversations with FTS5 syntax support. Filter by platform, date, or tags.
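For a sense of what FTS5 syntax buys you, here is a minimal sketch using Python's built-in sqlite3 module (the table layout and sample rows are illustrative, not the server's actual schema):

```python
import sqlite3

# Illustrative only: a tiny FTS5 table standing in for the server's store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(content, platform)")
conn.executemany(
    "INSERT INTO memories VALUES (?, ?)",
    [
        ("Use React.memo to avoid wasted re-renders", "chatgpt"),
        ("Prefer useCallback for stable event handlers", "claude"),
        ("Postgres VACUUM basics", "gemini"),
    ],
)
# FTS5 query syntax: uppercase OR/AND, prefix matching with *, quoted phrases.
rows = conn.execute(
    "SELECT content FROM memories WHERE memories MATCH ?",
    ('react OR "event handlers"',),
).fetchall()
print(len(rows))  # → 2
```

The same query language is what you pass through search_memories, so phrases, boolean operators, and prefix wildcards all work.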

💾

save_memory

Save new conversations, insights, or memory snippets with automatic tagging and source attribution.

📋

list_memories

Browse your memory library with tag filtering, date ranges, and pagination. See what your AI remembers.

📌

get_memory

Retrieve a specific memory by ID. Perfect for referencing exact conversations or insights your AI saved earlier.

โœ๏ธ

update_memory

Edit existing memories — update content, add tags, correct details. Keep your knowledge base accurate.

🗑️

delete_memory

Remove outdated or irrelevant memories permanently. Full control over your data.

📊

memory_stats

Get total memory count, recent activity, and tag distribution. Monitor what your AI remembers at a glance.

📤

export_memories

Backup all memories to JSON. Export your entire knowledge base for migration or safekeeping.

📥

import_memories

Import memories from JSON backup. Batch-load conversations with automatic duplicate detection.
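One common way such duplicate detection works is by fingerprinting content; this is a hypothetical sketch, not necessarily the server's actual algorithm:

```python
import hashlib

# Hypothetical duplicate detection: hash each memory's content and skip
# anything whose fingerprint is already present in the store.
existing = [{"content": "Tip A"}, {"content": "Tip B"}]
incoming = [{"content": "Tip B"}, {"content": "Tip C"}]

def fingerprint(memory):
    return hashlib.sha256(memory["content"].encode("utf-8")).hexdigest()

seen = {fingerprint(m) for m in existing}
imported = [m for m in incoming if fingerprint(m) not in seen]
print([m["content"] for m in imported])  # → ['Tip C']
```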

📦

batch_save_memories

Save multiple memories at once. Perfect for extracting key takeaways from a conversation in one call.

๐Ÿท๏ธ

get_all_tags

List all unique tags with usage counts. Discover what categories of memories you have stored.
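Conceptually this is a tag-frequency count; a sketch with made-up memories (the field names are assumptions, not the server's schema):

```python
from collections import Counter

# Made-up memories; each carries a list of tags.
memories = [
    {"tags": ["react", "performance"]},
    {"tags": ["react"]},
    {"tags": ["sql"]},
]
# Count how often each unique tag appears across all memories.
counts = Counter(tag for m in memories for tag in m["tags"])
print(counts.most_common(1))  # → [('react', 2)]
```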

🧹

clear_all_memories

Delete all memories. ⚠️ Use with caution — export first if you want to keep a backup.

Works with your favorite AI tools

One server, every client. Same config everywhere.

🟣

Claude Desktop

Most Popular

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}

Restart Claude Desktop. Ask: "Search my memory for React performance tips"

⚡

Cursor

Go to Settings → MCP → Add New MCP Server:

Name: AI Memory
Type: stdio
Command: aimemory-mcp-server

🌊

Windsurf

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}

💙

VS Code + Cline / Continue

In your MCP settings, add:

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}

Also works with: Zed, Continue, Augment, Aider, and 100+ more MCP clients

View full documentation →

Two ways to use it

Choose local or cloud — both give your AI persistent memory.

Recommended

๐Ÿ  Local Server

Runs entirely on your machine. Your data never leaves your computer.

✓ 100% offline — no internet required
✓ SQLite database on your disk
✓ stdio transport — zero config
✓ Full CRUD: search, save, update, delete
pip install aimemory-mcp-server
Cloud

โ˜๏ธ Hosted Server

Use our hosted endpoint. Upload conversations via the web UI, then search from any MCP client.

✓ No installation needed
✓ Access from any device
✓ HTTP transport (Streamable)
✓ Session-based security
✓ Extra tools: get_context, get_conversation
Endpoint: aimemory.pro/api/mcp
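Streamable HTTP is MCP's JSON-RPC 2.0 transport over POST, so any MCP client can talk to the endpoint directly. As a sketch, the opening initialize request looks roughly like this (the clientInfo values are placeholders, and your MCP client normally sends this handshake for you):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```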

Why MCP changes everything

The Model Context Protocol is the USB-C of AI — one standard, infinite connections.

🔌

Universal Standard

One server works with Claude, Cursor, ChatGPT, and 113+ clients. No custom integrations per platform.

🔗

Cross-Platform Memory

Your AI remembers conversations from ChatGPT, Claude, DeepSeek, and Gemini — all searchable from one place.

🚀

First-Mover Advantage

A cross-platform memory format is still an industry gap. AI Memory is building the "SMTP of AI memory" — and you can use it today.

How AI Memory MCP compares

The only MCP server with cross-platform conversation search.

| Feature | AI Memory MCP | Mem0 | Custom MCP |
| --- | --- | --- | --- |
| Setup Time | 10 seconds | 30+ minutes | Hours |
| Cross-Platform Search | ✅ 4 platforms | ❌ Single source | ⚠️ Manual |
| MCP Standard | ✅ Native | ❌ REST API only | ✅ DIY |
| Offline Support | ✅ Full | ❌ Cloud only | ⚠️ Depends |
| Pricing | Free | Free tier + $24M VC | Dev time |
| Consumer-Friendly | ✅ Zero config | ❌ Developer-only | ❌ Code required |

Frequently asked questions

What is an MCP server for AI memory?

An MCP (Model Context Protocol) server for AI memory is a tool that gives AI assistants like Claude Desktop and Cursor persistent memory by connecting them to your conversation history. AI Memory's MCP server lets any MCP-compatible AI search, save, and retrieve memories across all your AI conversations.

How do I install the AI Memory MCP server?

Install with one command: pip install aimemory-mcp-server (until the PyPI package is published, install from GitHub: pip install git+https://github.com/jingchang0623-crypto/aimemory.git#subdirectory=mcp-server). Then add the configuration to your MCP client (Claude Desktop, Cursor, etc.) and restart. The server runs locally on your machine with full offline support.

Which AI tools support MCP servers?

Over 113 AI clients support MCP servers, including Claude Desktop, Cursor, Windsurf, VS Code (with Cline/Continue), Zed, and many more. The Model Context Protocol is an open standard supported by the entire AI ecosystem.

Is the AI Memory MCP server free?

Yes, the AI Memory MCP server is completely free and open-source. You can install it via pip, run it locally, and connect it to any MCP client at no cost. There are no usage limits or hidden fees.

Does the MCP server work offline?

Yes, the standalone MCP server runs entirely on your local machine. Your conversation data stays in a local SQLite database — no cloud connection required. You can also use the hosted version at aimemory.pro/api/mcp for cloud-based access.

What is the difference between AI Memory MCP and Mem0?

AI Memory MCP server is a free, open-source tool focused on managing and searching existing AI conversations. Mem0 is a B2B API platform ($24M funded) for building custom memory layers in applications. AI Memory is consumer-friendly with zero setup, while Mem0 requires developer integration.

Give your AI persistent memory today

One pip install. Works with Claude, Cursor, ChatGPT, and 113+ tools.