From a solo developer's laptop to a federation of state-government data centers — the same Rust binary runs the same way. Below is the value story for every audience size, with deployment patterns, ROI math, and procurement-ready specs.
One person. One laptop. Multiple AIs. Memory that vanishes every conversation.
"Claude forgets between sessions. Cursor doesn't know what ChatGPT learned. I re-paste the same project context 30 times a day. My most expensive cognitive asset — the context I've built with these tools — evaporates every 4 hours."
One brew install. Three lines in your MCP config. Every AI you use plugs into the same memory: persistent across reboots, machines, and AI vendors. Local-first; your data never leaves your laptop unless you explicitly federate.
```shell
brew install ai-memory
# Add ai-memory to your Claude / Cursor / Codex MCP config
# That's it.

# Optional: bump to the autonomous tier for self-curating memory
ollama pull gemma4:e2b
ai-memory mcp --tier autonomous
```
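The "three lines" in the MCP config look roughly like this. A sketch only: the file location and exact schema depend on your MCP host (this follows the common `mcpServers` convention used by Claude Desktop-style configs), and the server name is illustrative.

```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "ai-memory",
      "args": ["mcp"]
    }
  }
}
```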
A small group of humans + a growing fleet of AI sessions. Tribal knowledge dies when someone leaves.
"Senior engineer leaves. Six months of context goes with them. New hires re-discover landmines we already paid to find. Each engineer's Claude has different memory. The AI knows the codebase but can't share what it learned."
Federated 3-node ai-memory cluster on your VPC. Per-namespace shared memory: onboarding/, decisions/, infra/. Every team member's AI plugs in and shares understanding. mTLS handshake on every cross-node call. Zero per-seat cost.
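For a feel of the moving parts, here is what one node's federation config could look like. Every key name and path below is illustrative, not the shipped schema; consult the actual docs for the real config format.

```toml
# Node config for one member of a 3-node cluster -- illustrative only.
[node]
id     = "node-a"
listen = "0.0.0.0:7700"

[federation]
peers = ["node-b.internal:7700", "node-c.internal:7700"]

# mTLS on every cross-node call: each node presents a cert signed by
# your internal CA and verifies its peers against the same CA.
[federation.mtls]
ca_cert   = "/etc/ai-memory/ca.pem"
node_cert = "/etc/ai-memory/node-a.pem"
node_key  = "/etc/ai-memory/node-a.key"

# Per-namespace shared memory visible to every team member's AI.
[namespaces]
shared = ["onboarding/", "decisions/", "infra/"]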
Multi-team. Multi-namespace. Procurement caught on. Compliance is calling.
"AI usage exploded across teams. Each one signed up for a different SaaS memory product. Different vendors. Different DPAs. Different security postures. Audit asks 'where does our data live?' and we don't have one answer."
One ai-memory deployment per business unit. Per-team namespace. Governance hooks intercept destructive ops. Webhook integration ships to corporate SIEM. Single source of truth for data residency: "on this VPC, that's it."
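To make the audit story concrete, a governance event landing in your SIEM might look something like the payload below. The field names and event taxonomy are hypothetical, sketched to show the shape of the data, not the product's actual schema.

```json
{
  "event": "memory.delete.blocked",
  "namespace": "payments/decisions",
  "actor": "claude-session-7f3a",
  "policy": "destructive-ops-require-approval",
  "timestamp": "2025-06-01T14:02:11Z"
}
```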
Multi-region. Multi-jurisdiction. Multi-language compliance. Air-gapped business units.
"Each BU wants AI. Each lawyer wants air-gapped. Each auditor wants logs. Each region has different data-residency rules. Our AI vendors keep getting acquired. Every approved tool gets deprecated 18 months later."
Per-BU federated cluster. Cross-region quorum-aware sync. PII redaction hooks. Backup + restore + retention. Air-gap operable for sensitive BUs. Vendor-acquisition immune — Apache 2.0 OSS, you have the source.
AI mandate is real. Cloud ban is real. Foreign vendor scrutiny is real. Audit trail is mandatory.
"AI is a White House priority. Cloud is forbidden by policy. Most vendors are foreign. Every commercial product wants telemetry. Every offering needs FedRAMP authorization. We're caught between the modernization mandate and the security mandate."
Apache 2.0 OSS. Single Rust binary. Zero outbound calls. Public source code, auditable from git clone. Hardware-attested keys (v0.7 → HSM, TPM, Secure Enclave). FedRAMP / IL5 path via the AgenticMem Sovereign tier. Domestic supplier (US-based AlphaOne LLC) for procurement.
cargo audit runs in CI. Encrypted-storage builds are available via --features sqlcipher.

Some properties of ai-memory are non-negotiable benefits: they apply at every scale, with no upgrade or paid tier required.
SQLite file on disk. WAL-mode for concurrent reads. cp is a backup. Your data never leaves your hardware unless you explicitly turn on federation.
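Because the store is a single WAL-mode SQLite file, "cp is a backup" is literal. A self-contained sketch below uses a stand-in file so it runs anywhere; the real store path (e.g. somewhere under your home directory) is an assumption, and the checkpoint step shown in the comment matters only for hot backups while writes are in flight.

```shell
# Stand-in for the real store so this sketch runs standalone;
# substitute your actual ai-memory database path.
DB="./memories.db"
printf 'demo' > "$DB"

# For a hot backup, fold the -wal sidecar into the main file first, e.g.:
#   sqlite3 "$DB" "PRAGMA wal_checkpoint(TRUNCATE);"

# The backup itself really is a file copy.
mkdir -p ./backups
cp "$DB" "./backups/memories-$(date +%F).db"
```

Restore is the same operation in reverse: stop the server, copy the file back, start the server.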
42ms p95 measured. Published budget. CI guard fails any PR that breaks it. Latency contract enforced, not aspirational.
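The CI guard amounts to a numeric gate on benchmark output. A minimal sketch of the idea: P95_MS is stubbed here so the snippet runs standalone; in CI it would come from the benchmark harness.

```shell
BUDGET_MS=42   # the published p95 budget
P95_MS=41      # stub; in CI this comes from the benchmark run

# Fail the PR if the measured p95 exceeds the budget.
if [ "$P95_MS" -gt "$BUDGET_MS" ]; then
  echo "FAIL: p95 ${P95_MS}ms exceeds ${BUDGET_MS}ms budget" >&2
  exit 1
fi
echo "OK: p95 ${P95_MS}ms within ${BUDGET_MS}ms budget"
```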
Commercial-friendly license. Patent grant included. You can read every line, fork it, sell what you build on top.
1,886 lib tests + 49+ integration tests (v0.6.3.1; +281 net from v0.6.3 baseline). Zero ignored. CI matrix: macOS / Ubuntu / Windows. Every PR runs the full matrix.
No Python install. No Node toolchain. No Docker required. Just one ~30MB Rust binary. Five distribution targets (macOS x64/arm, Linux x64/arm, Windows x64).
Open Model Context Protocol. Every MCP-compliant host autodiscovers all 26 ai-memory tools. Future AI hosts work for free.
Soft-delete to archive tier. Restore anytime. Hard delete only via explicit purge with governance approval.
Curator daemon auto-tags, detects contradictions, consolidates similar memories. Your memory store stays tidy without manual hygiene.
Pending actions queue. Webhook event stream. (v0.7) Ed25519 signatures on every write. Compliance officers see what they need.
Whether you're a solo developer or a federal agency, the install path is the same.