huaqingxu b1198b343b add
2026-03-06 23:23:06 +08:00

Mem0 REST API Server

Mem0 provides a REST API server (written using FastAPI). Users can perform all operations through REST endpoints. The API also includes OpenAPI documentation, accessible at /docs when the server is running.

Features

  • Create memories: Create memories based on messages for a user, agent, or run.
  • Retrieve memories: Get all memories for a given user, agent, or run.
  • Search memories: Search stored memories based on a query.
  • Update memories: Update an existing memory.
  • Delete memories: Delete a specific memory or all memories for a user, agent, or run.
  • Reset memories: Reset all memories for a user, agent, or run.
  • OpenAPI Documentation: Accessible via /docs endpoint.

Technology Stack

  • Vector Store: ChromaDB - high-performance vector database
  • Graph Store: SQLite - lightweight graph storage
  • LLM & Embedder: Ollama - local LLM serving

Quick Start

Option 1: Docker Compose (simplest)

# 1. Copy the environment variable template
cp .env.example .env

# 2. Edit the .env file and adjust the settings as needed
# vi .env or nano .env

# 3. Start all services (including Ollama)
docker-compose --profile gpu up -d

# 4. Open the API docs
open http://localhost:8888/docs

Note: the first startup downloads the Ollama models; allow roughly 60 seconds.
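Rather than sleeping a fixed 60 seconds, a script can poll until the service answers. A minimal standard-library sketch (the function name and the choice of which URL to probe are illustrative, not part of mem0):

```python
import time
import urllib.error
import urllib.request

def wait_until_ready(url: str, timeout: float = 120.0, interval: float = 2.0) -> bool:
    """Poll `url` until it returns an HTTP response, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Any successful HTTP answer means the server is up.
            with urllib.request.urlopen(url, timeout=interval):
                return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)
    return False
```

For the compose setup above, `wait_until_ready("http://localhost:8888/docs")` would block until the API docs become reachable.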

Option 2: Docker commands

# 1. Build the image
docker build -t mem0-server .

# 2. Run the container (configuration is passed via environment variables)
docker run -d \
  --name mem0-server \
  -p 8000:8000 \
  -v chroma_data:/app/chroma_db \
  -v sqlite_data:/app/graph_store.db \
  -v history_data:/app/history \
  --add-host=host.docker.internal:host-gateway \
  -e CHROMA_DB_PATH=/app/chroma_db \
  -e CHROMA_COLLECTION_NAME=memories \
  -e SQLITE_DB_PATH=/app/graph_store.db \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  -e OLLAMA_LLM_MODEL=llama3.2 \
  -e OLLAMA_EMBEDDER_MODEL=nomic-embed-text \
  -e REDIS_HOST=host.docker.internal \
  -e REDIS_PORT=6379 \
  mem0-server

Environment Variables

All configuration can be supplied via environment variables:

Variable                 Description                 Default
CHROMA_DB_PATH           ChromaDB data path          /app/chroma_db
CHROMA_COLLECTION_NAME   ChromaDB collection name    memories
SQLITE_DB_PATH           SQLite database path        /app/graph_store.db
OLLAMA_HOST              Ollama service address      http://localhost:11434
OLLAMA_LLM_MODEL         LLM model name              llama3.2
OLLAMA_EMBEDDER_MODEL    Embedder model name         nomic-embed-text
HISTORY_DB_PATH          History database path       /app/history/history.db
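Server-side, a configuration like this is typically assembled from `os.environ` with the table's values as fallbacks. A sketch (the `DEFAULTS` dict and `load_config` helper are illustrative, not the server's actual code):

```python
import os

# Defaults mirror the table above; environment variables override them.
DEFAULTS = {
    "CHROMA_DB_PATH": "/app/chroma_db",
    "CHROMA_COLLECTION_NAME": "memories",
    "SQLITE_DB_PATH": "/app/graph_store.db",
    "OLLAMA_HOST": "http://localhost:11434",
    "OLLAMA_LLM_MODEL": "llama3.2",
    "OLLAMA_EMBEDDER_MODEL": "nomic-embed-text",
    "HISTORY_DB_PATH": "/app/history/history.db",
}

def load_config() -> dict:
    """Return the effective configuration: env vars win over defaults."""
    return {key: os.environ.get(key, default) for key, default in DEFAULTS.items()}
```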

For detailed deployment instructions, see DEPLOYMENT.md.

Prerequisites

Before running the server, ensure you have:

  1. Ollama installed and running on your host machine

    # Install Ollama from https://ollama.ai
    ollama pull llama3.2
    ollama pull nomic-embed-text
    
  2. Docker and Docker Compose installed

API Endpoints

  • POST /memories - Create new memories
  • GET /memories - Retrieve all memories
  • GET /memories/{memory_id} - Get a specific memory
  • POST /search - Search memories
  • PUT /memories/{memory_id} - Update a memory
  • DELETE /memories/{memory_id} - Delete a memory
  • DELETE /memories - Delete all memories for an identifier
  • POST /reset - Reset all memories
  • GET /docs - OpenAPI documentation

Verification

Test Ollama

curl http://localhost:11434/api/embeddings \
  -d '{ "model": "nomic-embed-text", "prompt": "hello world" }'

Test mem0
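The original section stops here; a hedged sketch of what a smoke test against a running server might look like (the payload field names `messages` and `user_id` are assumptions based on the Features list above, not confirmed request schemas; check /docs):

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # 8888 under the docker-compose setup

def build_create_payload(content: str, user_id: str) -> dict:
    """Assumed request body for POST /memories."""
    return {
        "messages": [{"role": "user", "content": content}],
        "user_id": user_id,
    }

def create_memory(content: str, user_id: str) -> dict:
    """POST one memory to a running mem0 server and return its JSON reply."""
    req = urllib.request.Request(
        BASE + "/memories",
        data=json.dumps(build_create_payload(content, user_id)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With the server up, `create_memory("I prefer tea over coffee.", "alice")` sends one memory and returns the server's response.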
