# Mem0 REST API Server
Mem0 provides a REST API server (written using FastAPI). Users can perform all operations through REST endpoints. The API also includes OpenAPI documentation, accessible at `/docs` when the server is running.
## Features
- **Create memories:** Create memories based on messages for a user, agent, or run.
- **Retrieve memories:** Get all memories for a given user, agent, or run.
- **Search memories:** Search stored memories based on a query.
- **Update memories:** Update an existing memory.
- **Delete memories:** Delete a specific memory or all memories for a user, agent, or run.
- **Reset memories:** Reset all memories for a user, agent, or run.
- **OpenAPI Documentation:** Accessible via `/docs` endpoint.
## Technology Stack
- **Vector Store**: ChromaDB - high-performance vector database
- **Graph Store**: SQLite - lightweight graph storage
- **LLM & Embedder**: Ollama - local LLM service
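For reference, a mem0 configuration matching this stack might look like the sketch below. The key names follow mem0's documented config schema; the model names and paths simply mirror the defaults documented later in this README and are adjustable. This is a sketch, not the server's actual config code.

```python
# Sketch of a mem0 config for this stack (ChromaDB + Ollama).
# Values mirror this README's defaults; adjust paths and models as needed.
config = {
    "vector_store": {
        "provider": "chroma",
        "config": {
            "collection_name": "memories",
            "path": "/app/chroma_db",
        },
    },
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.2",
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",
            "ollama_base_url": "http://localhost:11434",
        },
    },
}
```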
## Quick Start
### Option 1: Docker Compose (simplest)
```bash
# 1. Copy the environment variable template
cp .env.example .env

# 2. Edit the .env file and adjust the configuration as needed
# vi .env  or  nano .env

# 3. Start all services (including Ollama)
docker-compose --profile gpu up -d

# 4. Open the API docs
open http://localhost:8888/docs
```
**Note**: The first startup needs to download the Ollama models; allow about 60 seconds.
### Option 2: Docker CLI deployment
```bash
# 1. Build the image
docker build -t mem0-server .

# 2. Run the container (configuration passed via environment variables)
docker run -d \
  --name mem0-server \
  -p 8000:8000 \
  -v chroma_data:/app/chroma_db \
  -v sqlite_data:/app/graph_store.db \
  -v history_data:/app/history \
  --add-host=host.docker.internal:host-gateway \
  -e CHROMA_DB_PATH=/app/chroma_db \
  -e CHROMA_COLLECTION_NAME=memories \
  -e SQLITE_DB_PATH=/app/graph_store.db \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  -e OLLAMA_LLM_MODEL=llama3.2 \
  -e OLLAMA_EMBEDDER_MODEL=nomic-embed-text \
  -e REDIS_HOST=host.docker.internal \
  -e REDIS_PORT=6379 \
  mem0-server
```
## Environment Variables
All configuration can be supplied via environment variables:

| Variable | Description | Default |
|----------|-------------|---------|
| `CHROMA_DB_PATH` | ChromaDB data path | `/app/chroma_db` |
| `CHROMA_COLLECTION_NAME` | ChromaDB collection name | `memories` |
| `SQLITE_DB_PATH` | SQLite database path | `/app/graph_store.db` |
| `OLLAMA_HOST` | Ollama service address | `http://localhost:11434` |
| `OLLAMA_LLM_MODEL` | LLM model name | `llama3.2` |
| `OLLAMA_EMBEDDER_MODEL` | Embedder model name | `nomic-embed-text` |
| `HISTORY_DB_PATH` | History database path | `/app/history/history.db` |

For detailed deployment documentation, see [DEPLOYMENT.md](./DEPLOYMENT.md).
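The lookup pattern these variables imply can be sketched in Python as below. The variable names and defaults come straight from the table above; `load_settings` is a hypothetical helper name, not a function in this codebase.

```python
import os

# Environment variables and their documented defaults, as listed in the table above.
DEFAULTS = {
    "CHROMA_DB_PATH": "/app/chroma_db",
    "CHROMA_COLLECTION_NAME": "memories",
    "SQLITE_DB_PATH": "/app/graph_store.db",
    "OLLAMA_HOST": "http://localhost:11434",
    "OLLAMA_LLM_MODEL": "llama3.2",
    "OLLAMA_EMBEDDER_MODEL": "nomic-embed-text",
    "HISTORY_DB_PATH": "/app/history/history.db",
}

def load_settings() -> dict:
    """Return each setting from the environment, falling back to its default."""
    return {name: os.getenv(name, default) for name, default in DEFAULTS.items()}
```

Passing `-e OLLAMA_LLM_MODEL=...` to `docker run` (as in Option 2 above) overrides the corresponding default.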
## Prerequisites
Before running the server, ensure you have:
1. **Ollama installed and running** on your host machine
```bash
# Install Ollama from https://ollama.ai
ollama pull llama3.2
ollama pull nomic-embed-text
```
2. **Docker and Docker Compose** installed
## API Endpoints
- `POST /memories` - Create new memories
- `GET /memories` - Retrieve all memories
- `GET /memories/{memory_id}` - Get a specific memory
- `POST /search` - Search memories
- `PUT /memories/{memory_id}` - Update a memory
- `DELETE /memories/{memory_id}` - Delete a memory
- `DELETE /memories` - Delete all memories for an identifier
- `POST /reset` - Reset all memories
- `GET /docs` - OpenAPI documentation
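A minimal Python client for the endpoints above can be sketched as follows. The `/memories` and `/search` paths and the `user_id`/`text`/`query` fields mirror this README's examples; `build_request` and `send` are hypothetical helper names, and the port assumes the default `8000` mapping.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # adjust to your port mapping

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Compose a JSON POST request for one of the endpoints above."""
    return urllib.request.Request(
        f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send(req: urllib.request.Request) -> dict:
    """Send the request and decode the JSON response body."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (requires the server to be running):
# created = send(build_request("/memories", {"user_id": "test", "text": "hello"}))
# hits = send(build_request("/search", {"user_id": "test", "query": "hello"}))
```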
## Verification
### Testing Ollama
```bash
curl http://localhost:11434/api/embeddings \
  -d '{
    "model": "nomic-embed-text",
    "prompt": "hello world"
  }'
```
### Testing mem0
Health check:
```bash
curl http://localhost:8000/health
```
Returns:
```json
{"status":"ok"}
```
Create a memory:
```bash
curl -X POST http://localhost:8000/v2/memories \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "test",
    "text": "Test memory creation"
  }'
```
Search memories:
```bash
curl -X POST http://localhost:8000/v2/memories/search \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "test",
    "query": "test"
  }'
```
### Testing Neo4j