About RobotMem
Giving robots the ability to remember, learn from experience, and transfer knowledge across environments — so they never start from zero.
"Memory is the foundation of intelligence. Without it, every episode is the first."
Why We Built RobotMem
Every time a robot powers off, it forgets everything it learned. Place it in a new environment, and it starts from scratch — relearning the same obstacles, remapping the same terrain, rediscovering the same strategies. Decades of robotics research have produced remarkable planners, controllers, and perception systems, yet persistent memory has remained an afterthought.
RobotMem was born from a simple observation: biological agents remember. A dog that learned to navigate your living room doesn't relearn it every morning. A child who burned their hand on a stove remembers the lesson for life. We asked: what if robots could do the same?
We built a memory system specifically designed for embodied agents — one that stores multi-modal perceptions (visual, tactile, force, proprioceptive, spatial), retrieves relevant experiences in under 10 milliseconds, and transfers knowledge across different environments and even different robot morphologies.
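To make the shape of such a system concrete, here is a minimal sketch of an episodic memory store backed by an embedded database. The `MemoryStore` class, its methods, and the record fields are illustrative assumptions, not RobotMem's actual API:

```python
import json
import sqlite3
import time

class MemoryStore:
    """Illustrative episodic store: one row per experience,
    tagged with its perception modality and a JSON payload."""

    def __init__(self, path=":memory:"):
        # A file path instead of ":memory:" makes memories survive power-off.
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories ("
            " id INTEGER PRIMARY KEY,"
            " ts REAL, modality TEXT, payload TEXT)"
        )

    def remember(self, modality, observation):
        self.db.execute(
            "INSERT INTO memories (ts, modality, payload) VALUES (?, ?, ?)",
            (time.time(), modality, json.dumps(observation)),
        )
        self.db.commit()

    def recall(self, modality, limit=5):
        # Most recent experiences of the given modality first.
        rows = self.db.execute(
            "SELECT payload FROM memories WHERE modality = ?"
            " ORDER BY ts DESC LIMIT ?",
            (modality, limit),
        ).fetchall()
        return [json.loads(r[0]) for r in rows]

store = MemoryStore()
store.remember("tactile", {"gripper_force": 2.4, "slip": False})
print(store.recall("tactile"))  # most recent tactile experiences first
```

A schema along these lines keeps every experience queryable after a power cycle, which is the core of the persistence argument.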
Key Results
These results come from controlled experiments comparing robots with and without persistent memory, measured along four dimensions: task success with a memory-augmented policy, zero-shot generalization in unseen layouts, real-time retrieval at control frequency, and coverage of five perception modalities (visual, tactile, force, proprioceptive, spatial).
Technology Stack
RobotMem is designed for edge deployment — no cloud dependency, no GPU required. Everything runs locally on the robot's onboard computer.
SQLite + FTS5
Single-file embedded database with full-text search. Zero-config, battle-tested, works everywhere from Raspberry Pi to industrial PCs.
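As a sketch of why FTS5 fits this use case, the snippet below builds a full-text index over free-text experience notes and ranks matches with SQLite's built-in `bm25()` function. The notes are made-up examples, and FTS5 must be compiled into your SQLite build (standard CPython distributions include it):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# FTS5 virtual table: a full-text index over free-text experience notes.
db.execute("CREATE VIRTUAL TABLE notes USING fts5(body)")
db.executemany(
    "INSERT INTO notes (body) VALUES (?)",
    [
        ("doorway on the left is narrower than it looks",),
        ("carpet edge causes wheel slip near the kitchen",),
        ("charging dock is behind the couch",),
    ],
)
# MATCH runs the full-text query; bm25() scores rows so that
# smaller values mean better matches, hence the ascending ORDER BY.
rows = db.execute(
    "SELECT body FROM notes WHERE notes MATCH 'slip' ORDER BY bm25(notes)"
).fetchall()
print(rows)  # → [('carpet edge causes wheel slip near the kitchen',)]
```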
ONNX Runtime
Hardware-accelerated vector embeddings without framework lock-in. CPU-only inference with sub-millisecond encoding.
MCP Protocol
Model Context Protocol integration lets LLM-powered agents access robot memories natively through standardized tool calls.
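In MCP terms, an agent retrieves memories through a standard `tools/call` JSON-RPC request. The tool name `recall_experience` and its arguments below are hypothetical, shown only to illustrate the shape of the call:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "recall_experience",
    "arguments": {
      "query": "narrow doorway navigation",
      "limit": 3
    }
  }
}
```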
Hybrid Search
BM25 lexical search + vector similarity combined with reciprocal rank fusion for precise experience retrieval.
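Reciprocal rank fusion itself is only a few lines: each result list contributes 1/(k + rank) to a document's score, so experiences ranked highly by both the lexical and the vector search rise to the top. A minimal sketch, with invented episode IDs:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse ranked result lists: each list adds 1 / (k + rank)
    to a document's score, with ranks starting at 1."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits   = ["ep_7", "ep_2", "ep_9"]  # lexical (keyword) ranking
vector_hits = ["ep_2", "ep_4", "ep_7"]  # embedding-similarity ranking
fused = reciprocal_rank_fusion([bm25_hits, vector_hits])
print(fused)  # → ['ep_2', 'ep_7', 'ep_4', 'ep_9']
```

`k=60` is the constant used in the original RRF formulation; it damps the influence of any single list's top ranks so neither retriever dominates.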
Open Source
RobotMem is fully open-source under the Apache 2.0 license. No vendor lock-in, no usage fees, no telemetry. You own your robot's memories.
We believe persistent memory is a fundamental capability that every robot should have access to. By open-sourcing RobotMem, we aim to accelerate research and development in embodied AI memory systems.
Contributions are welcome — whether you're fixing a bug, adding support for a new perception modality, improving search algorithms, or sharing experiment results. Check our contribution guide to get started.
Get in Touch
Have questions, feedback, or want to collaborate? We'd love to hear from you.
Start Building Robots That Remember
Open source. Free forever. One pip install away.
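If you want to try it, installation is a single command (assuming the PyPI package name matches the project name; check the repository README for the canonical name):

```shell
pip install robotmem
```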