Robot Memory System

Your Robot Forgets Everything. Until Now.

The memory system that robots have been missing. Remember skills, learn from failures, build on experience.

pip install robotmem
Open Source (Apache 2.0)
9 Integrations · Cross-Session Recall · < 10ms Recall Latency · 100% Offline Capable

Built for Real Robots

Not another chatbot memory. RobotMem is designed for physical agents that interact with the real world.

Multi-Modal Storage

Store all 5 perception types — visual, tactile, auditory, proprioceptive, and procedural — with numeric parameters and trajectories.
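For illustration, the five perception types might be captured as records like these. The field names inside each `data` payload (`bbox`, `force_peak`, `joints`, and so on) are invented for this sketch; only `description`, `perception_type`, and `data` mirror the `save_perception()` call shown further down the page.

```python
import json

# One hypothetical record per perception type. In practice each dict's
# fields would be passed to save_perception(); "data" is a JSON string
# holding the structured numeric payload.
experiences = [
    {"perception_type": "visual", "description": "Red cup at bin center",
     "data": json.dumps({"bbox": [312, 128, 388, 220]})},
    {"perception_type": "tactile", "description": "Grip closed on cup",
     "data": json.dumps({"force_peak": 12.5})},
    {"perception_type": "auditory", "description": "Impact sound on place-down",
     "data": json.dumps({"peak_db": 62.0})},
    {"perception_type": "proprioceptive", "description": "Arm at pre-grasp pose",
     "data": json.dumps({"joints": [0.1, -0.3, 0.05, 1.2, 0.0, 0.4]})},
    {"perception_type": "procedural", "description": "Grasp sequence, 30 steps",
     "data": json.dumps({"actions": [[0.1, -0.3, 0.05]], "steps": 30})},
]
```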

Cross-Session Learning

Robot remembers across reboots. Skills learned today are available tomorrow. No retraining needed.

Model Agnostic

Works with any framework — ROS, Isaac Gym, MuJoCo, dm_control. Not locked to any specific model or vendor.

Semantic Search

Find relevant experiences by meaning, not keywords. "How did I grasp the red cup?" returns the right memory.

Auto Deduplication

dHash for visual similarity, Jaccard for text overlap. Keeps memory clean without manual curation.
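Both signals are standard techniques. A self-contained sketch of each — pure Python, operating on a pre-downscaled 8×9 grayscale grid rather than a full image pipeline, and not RobotMem's internal implementation:

```python
def dhash(pixels, size=8):
    """Difference hash: compare each pixel to its right-hand neighbor.

    pixels: grid of `size` rows x (size + 1) columns of grayscale values,
    e.g. a camera frame downscaled to 8x9. Returns a 64-bit int for size=8.
    """
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (pixels[row][col] > pixels[row][col + 1])
    return bits

def hamming(a, b):
    """Bit distance between two hashes; a small distance means near-duplicate frames."""
    return bin(a ^ b).count("1")

def jaccard(a, b):
    """Token-set overlap between two text descriptions, in [0, 1]."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta or tb else 1.0

# Near-identical descriptions score high and are candidates for merging.
score = jaccard("Grasped red cup force=12.5N", "Grasped red cup force=12.4N")
```

Thresholding (e.g. Hamming distance below a few bits, Jaccard above ~0.8) is what turns these similarity scores into a dedup decision.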

Trajectory & Parameters

Store force profiles, joint trajectories, and numeric parameters as structured data alongside natural language.

Plug Into Your Stack: 9 Integrations

Drop-in adapters for the tools you already use.

Three Steps to Robot Memory

Simple API. No infrastructure. Just save experiences and recall them when needed.

1. Experience

Robot performs an action. RobotMem captures the perception — what it saw, felt, and did — with full context.

2. Remember

Next session starts. RobotMem retrieves relevant past experiences via semantic search. Robot picks up where it left off.

3. Evolve

Over time, patterns emerge. Skills crystallize. The robot builds genuine expertise from accumulated experience.

Robot Memory in 13 Lines

Python API & MCP Server. Your robot remembers in minutes.

robot_controller.py
from robotmem import save_perception, recall

# Save a grasping experience
save_perception(
    description="Grasped red cup: force=12.5N, 30 steps",
    perception_type="procedural",
    data='{"actions": [[0.1, -0.3, 0.05]], "force_peak": 12.5}',
)

# Next session: recall similar experiences
memories = recall("how to grasp a cup")
for m in memories["memories"]:
    print(m["content"], m["_rrf_score"])
Terminal Output
$ python robot_controller.py
[recall] hybrid mode | 3 results | top score: 0.847
Grasped red cup: force=12.5N, 30 steps    score=0.847
FetchPush: success, dist 0.012m, 28 steps score=0.723
Push cube: force=11.8N, overshoot 0.03m   score=0.651
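The `_rrf_score` field suggests the hybrid mode fuses the BM25 and vector rankings with reciprocal rank fusion. A generic sketch of that fusion step — the `k = 60` constant is the conventional RRF default, not a documented RobotMem setting, and the memory IDs below are made up:

```python
def rrf(rankings, k=60):
    """Reciprocal rank fusion: merge several ranked lists of memory IDs.

    Each ID scores sum(1 / (k + rank)) over the lists it appears in, so a
    memory ranked highly by either BM25 or the vector index floats to the top.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical result lists from the two retrievers for one query.
bm25 = ["grasp_red_cup", "push_cube", "fetch_push"]
vector = ["grasp_red_cup", "fetch_push", "slide_puck"]
fused = rrf([bm25, vector])
```

A memory that both retrievers rank first dominates the fused list even though neither score is comparable on its own — that is the point of fusing by rank rather than by raw score.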

Built Different

Existing AI memory systems are designed for chatbots. RobotMem is designed for robots.

Capability                 RobotMem     Mem0       Zep        Letta
Target Use Case            Robots       Chatbots   Chatbots   Chatbots
Multi-modal perception     5 types
Trajectory storage         Yes
Numeric parameters         Yes
Model agnostic             Yes
Offline capable            Local ONNX   Cloud      Cloud
Visual dedup (dHash)       Yes
MCP protocol               Yes
Natural language storage   Yes
Score                      8 / 8        2 / 8      1 / 8      2 / 8

Real Experiments, Real Numbers

Tested in MuJoCo robotics environments. Not theoretical — measured.

+25%
FetchPush Success Rate
Baseline 42% → with memory 67%. Same policy, same environment — the only difference is persistent memory.
3×
Cross-Environment Transfer
Train in FetchPush, transfer to FetchSlide: baseline 4% → with memory 12%. No model fine-tuning — just memory.
< 10ms
Recall Latency
Hybrid BM25 + vector search over 10,000 memories. Fast enough for real-time robot control loops.
0
Cloud Dependencies
100% offline. Local SQLite + ONNX embeddings. No API keys, no internet, no vendor lock-in.

Start Building Robots That Remember

Open source. Free forever. One pip install away.

pip install robotmem