# Installation

## Quick Install

```bash
pip install robotmem
```
This installs the core library with ONNX embedding support (pure CPU, no external services needed).
## System Requirements
| Requirement | Version | Notes |
|---|---|---|
| Python | >= 3.10 | Required |
| SQLite | >= 3.35 | Bundled with Python; 3.43+ enables contentless_delete optimization |
| fastembed | >= 0.4 | Auto-installed; ONNX embedding (~67MB model, auto-downloaded) |
| sqlite-vec | >= 0.1.6 | Auto-installed; vector search extension |
| mcp | >= 1.0 | Auto-installed; Model Context Protocol SDK |
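The Python and SQLite rows above can be verified with a quick stdlib check. The thresholds come from the table; the script itself is just a convenience and not part of robotmem:

```python
# Verify the interpreter and its bundled SQLite meet robotmem's minimums
# (Python >= 3.10, SQLite >= 3.35, per the requirements table).
import sqlite3
import sys

py_ok = sys.version_info >= (3, 10)
sqlite_ok = tuple(map(int, sqlite3.sqlite_version.split("."))) >= (3, 35)

print(f"Python {sys.version.split()[0]}: {'OK' if py_ok else 'too old'}")
print(f"SQLite {sqlite3.sqlite_version}: {'OK' if sqlite_ok else 'too old'}")
```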
## Optional Dependencies

```bash
# Web management UI
pip install robotmem[web]

# Ollama embedding backend (instead of ONNX)
pip install robotmem[ollama]

# CJK (Chinese/Japanese/Korean) full-text search
pip install robotmem[cjk]

# Development (testing)
pip install robotmem[dev]
```

On zsh, quote the extras so the brackets aren't treated as a glob: `pip install 'robotmem[web]'`.
## Running as MCP Server

### Claude Code

Add to your Claude Code MCP settings (`~/.claude/settings.json`):
```json
{
  "mcpServers": {
    "robotmem": {
      "command": "python",
      "args": ["-m", "robotmem"]
    }
  }
}
```
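If you prefer to script the change, the entry can be merged into an existing settings file with the stdlib `json` module. The `SETTINGS` path below is a placeholder (point it at your real settings file); the key names match the snippet above:

```python
# Merge a robotmem server entry into an MCP settings file.
# SETTINGS is a placeholder; point it at your real settings.json.
import json
from pathlib import Path

SETTINGS = Path("settings.json")

config = json.loads(SETTINGS.read_text()) if SETTINGS.exists() else {}
config.setdefault("mcpServers", {})["robotmem"] = {
    "command": "python",
    "args": ["-m", "robotmem"],
}
SETTINGS.write_text(json.dumps(config, indent=2) + "\n")
print(SETTINGS.read_text())
```

Using `setdefault` preserves any MCP servers already configured in the file.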
### Claude Desktop

Add to Claude Desktop's config file (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "robotmem": {
      "command": "python",
      "args": ["-m", "robotmem"]
    }
  }
}
```
### Direct Python

```python
from robotmem import learn, recall, save_perception, start_session, end_session
```

Note: when imported directly, these functions are thin wrappers around the same logic as the MCP tools.
## Running the Web UI

```bash
# Default: http://127.0.0.1:6889
python -m robotmem web

# Custom port
python -m robotmem web --port 8080

# Custom host (expose to network)
python -m robotmem web --host 0.0.0.0 --port 6889
```
The Web UI provides:
- Memory browser with pagination and filtering
- Full-text search across all memories
- Session timeline view
- Collection management
- Health diagnostics (`/api/doctor`)
## Verify Installation

### 1. Check MCP Server

```bash
python -m robotmem
# Should start the MCP server (waiting for stdio input)
# Press Ctrl+C to exit
```
### 2. Check Web UI

```bash
python -m robotmem web
# Open http://127.0.0.1:6889 in a browser
# Visit http://127.0.0.1:6889/api/doctor for a health check
```
### 3. Check Embedding

```python
import asyncio

from robotmem.embed_onnx import FastEmbedEmbedder

emb = FastEmbedEmbedder()
ok = asyncio.run(emb.check_availability())
print(f"ONNX embedding available: {ok}")
# First run downloads the model (~67MB) to ~/.cache/fastembed/
```
## Data Location

All data is stored in `~/.robotmem/` by default:

```
~/.robotmem/
├── memory.db     # SQLite database (memories, sessions, tags)
└── config.json   # Configuration overrides (optional)
```

Override the location with the `ROBOTMEM_HOME` environment variable:

```bash
export ROBOTMEM_HOME=/path/to/custom/dir
```
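The resolution rule is simply: use `ROBOTMEM_HOME` when it is set, otherwise fall back to `~/.robotmem`. A small illustrative helper (a sketch of that rule; robotmem's internal resolution may differ in details):

```python
# Sketch of the data-directory resolution described above:
# $ROBOTMEM_HOME wins when set; otherwise ~/.robotmem is used.
import os
from pathlib import Path

def resolve_robotmem_home() -> Path:
    override = os.environ.get("ROBOTMEM_HOME")
    return Path(override) if override else Path.home() / ".robotmem"

print(resolve_robotmem_home())
```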
## Troubleshooting

### sqlite-vec fails to load

```
WARNING: sqlite-vec failed to load
```

This means vector search is unavailable; robotmem still works with BM25-only full-text search. Fix:

```bash
pip install --upgrade sqlite-vec
```
### fastembed model download fails

The first run downloads the ONNX model automatically. If the download fails (e.g. a network issue), trigger it manually:

```bash
python -c "from fastembed import TextEmbedding; TextEmbedding('BAAI/bge-small-en-v1.5')"
```
### Ollama embedding not connecting

```bash
# Ensure Ollama is running
ollama serve

# Pull the embedding model
ollama pull nomic-embed-text

# Test
curl http://localhost:11434/api/embed -d '{"model": "nomic-embed-text", "input": "test"}'
```