MemoryLake MCP Server

Native Model Context Protocol support — connect any MCP-compatible AI to MemoryLake in under 2 minutes.

What is the MemoryLake MCP server?

Model Context Protocol is the open standard from Anthropic for connecting AI assistants to external data and tools. MemoryLake ships a native MCP server — meaning any MCP-compatible client gets persistent memory by adding one config block.

No SDK install. No wrappers. No custom CLI.

Which MCP clients work with this?

Every client that speaks MCP. Today that includes:

Claude Code (Anthropic)
Claude Desktop (Anthropic)
Cursor
Windsurf (Codeium)
Cline (formerly Claude Dev)
Continue.dev
Zed
Any custom MCP client built with the official SDKs

If a tool says "supports MCP," MemoryLake works in it.

What you get

One memory, every MCP tool

Claude Code in the morning, Cursor in the afternoon — same memory, picked up where you left off.

Multimodal under MCP

The protocol carries text, but our server handles file references, images, and structured data internally.

Auto-capture, auto-recall

Important context is captured automatically; relevant memory is loaded on session start.

Team workspaces

Multiple developers sharing the same project memory across their MCP clients.

Free for personal use

No credit card. Generous quota for individual developers.

Project-scoped tokens

Issue tokens scoped to a single project; revoke any time.

Install in 2 minutes

1. Sign up at MemoryLake

   Create a free account. Grab your MCP token from the dashboard.

2. Add the MCP server to your client

   Most MCP clients use a JSON config — add the block below.

3. Restart your client and start working

   Memory loads automatically at session start.

{
  "mcpServers": {
    "memorylake": {
      "url": "https://mcp.memorylake.ai",
      "headers": {
        "Authorization": "Bearer YOUR_MCP_TOKEN"
      }
    }
  }
}
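
Under the hood, when your client restarts it performs the standard MCP initialize handshake with the server before any memory is loaded. A minimal sketch of that JSON-RPC request, with field names taken from the MCP specification (the client name and version here are placeholders, not MemoryLake-specific values):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The client and server negotiate a protocol version and exchange capabilities in this step; your MCP client handles all of it for you.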

Specific config locations by client

Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json (macOS)
Claude Code: Settings → MCP Servers
Cursor: Settings → MCP
Windsurf: Settings → Extensions → MCP
Cline: VS Code settings → Cline: MCP Servers

What people use it for

"Cursor in the morning, Claude Code in the afternoon — same project memory across both. Game-changer."
"Onboarded a new dev on a complex codebase by sharing the team MemoryLake project. Their Cline picked up everything we taught Claude Code over months."
"Switched from Cursor to Windsurf to Cline while evaluating tools. Memory followed me, no migration needed."

Available MCP methods

The server exposes standard MCP resources/ and tools/ methods:

resources/list - List your memories as MCP resources
resources/read - Read a specific memory
tools/call: search_memory - Semantic search across your memory store
tools/call: add_memory - Store a new memory
tools/call: detect_conflicts - Surface conflicting memories
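
If you are building a custom MCP client, each of these is an ordinary JSON-RPC 2.0 message. A minimal sketch that builds a tools/call request body for search_memory — note the "query" argument name is an assumption for illustration; use tools/list to discover the actual input schema:

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request body for the MCP tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical search; the "query" argument name is assumed, not confirmed.
body = build_tool_call(1, "search_memory", {"query": "auth module design decisions"})
print(body)
```

In practice you would send this body to the server over the transport your client already negotiated, with the same Authorization header as in the config above.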

FAQ

Is the MCP server open source?

The MCP server reference implementation is open source. The underlying memory engine is a managed service.

What is the difference between the MCP server and the REST API?

The REST API is for any backend application. The MCP server is specifically for AI client tools that already speak MCP. Both talk to the same memory store.

Can I run the MCP server locally / self-host?

Self-hosted deployments are available for enterprise customers — contact us.

Does the MCP server support streaming?

Yes — standard MCP streaming for long retrievals.

Does my MCP client see my MemoryLake credentials?

The bearer token sits in the client's config. We recommend using project-scoped tokens (not master account tokens) for client configs.

Is this listed in Anthropic's official MCP servers directory?

Yes — see github.com/modelcontextprotocol/servers.

Docs and resources