Persistent cognitive memory for Claude Code. Cloud-first with semantic search, AI-powered extraction, and project scoping. Zero local databases.
Autonomous robots. Self-driving vehicles. Defense systems. Coding assistants. Any Ai system that needs to remember.
CogmemAi is a portable memory layer that gives any Ai system persistent recall across sessions, devices, users, and teams — and captures knowledge autonomously, even when your Ai forgets to save. 95.10% accuracy on LongMemEval — top published score on the field's hardest long-term memory benchmark. 91% on LoCoMo, above human performance (87.9%). Quantum-safe encryption. Works with Claude Code, Cursor, Windsurf, Cline, Continue, and any MCP-compatible tool. Switch editors, switch models, switch machines — your knowledge stays. Not just one score on a test — the most complete Ai memory system available.
Every memory system has the same hidden failure mode: the Ai has to choose to save, and under pressure it doesn't. You can bake instructions into system prompts. You can nudge. But when your Ai is head-down on a coding task, it forgets to save — and the decisions you made two hours ago vanish when the context compacts.
CogmemAi v3.15 moves the decision out of the Ai's hands entirely. Your coding sessions are captured at the infrastructure level — decisions, file changes, bug fixes, and deployments land in memory without a single prompt. At session end, an intelligence pass distills them into structured memories: the right types, the right importance scores, the right scopes. Your Ai never sees this happen.
The result: a day of heavy coding produces 15–20 quality memories instead of 3. Future sessions pick up seamlessly. Your Ai stops re-litigating architectural choices you already made. Stop reminding your Ai to remember. It just does.
CogmemAi now thinks before it speaks. Before your Ai assistant suggests any action, approach, or recommendation, CogmemAi checks its memory first — automatically, on every topic.
preflight tool — A fast, lightweight recall designed to be called before every suggestion. Your assistant checks what it already knows about a topic before opening its mouth. "Let's try approach X" → first checks whether X was already tried, rejected, or completed. Sub-200ms, near-zero cost.

The result: your Ai assistant stops suggesting things you've already tried, people you've already contacted, and approaches you've already rejected. Your brain is no longer the safety net for what your tools should already know.
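The gate this describes can be sketched in a few lines of Python. This is a toy illustration of the pattern only; the function and data names are invented for the example and are not CogmemAi's API:

```python
# Toy sketch of the "think before you speak" preflight pattern.
# PAST_OUTCOMES and preflight() are illustrative names, not the CogmemAi API.

# Outcomes the assistant has already recorded for this project.
PAST_OUTCOMES = {
    "use Redis for session cache": "rejected",   # tried earlier, rejected
    "migrate to pnpm": "completed",
}

def preflight(suggestion: str) -> str:
    """Return 'proceed' only if the suggestion has no recorded outcome."""
    status = PAST_OUTCOMES.get(suggestion)
    if status in ("rejected", "completed"):
        return f"skip: already {status}"
    return "proceed"

print(preflight("use Redis for session cache"))  # skip: already rejected
print(preflight("add rate limiting"))            # proceed
```

The real tool does this lookup semantically against your whole memory store rather than against an in-process dict, but the control flow is the same: check first, suggest second.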
CogmemAi now automatically detects patterns across your memories and extracts factual principles. While skills tell your Ai HOW to behave ("always use Zustand"), principles tell it what's TRUE about your project ("this codebase never validates inputs at service boundaries"). Principles are extracted from clusters of 5+ related memories, scored by confidence, and injected into every session. Use extract_principles to trigger manually or let it happen automatically.
CogmemAi now supports Streamable HTTP transport — connect from any MCP client without installing anything. No npm, no config files, no Node.js required. Just point your client to https://hifriendbot.com/mcp/ with your API key and start using persistent memory immediately. Same 35 tools, same Intelligence Engine, same benchmark-topping accuracy — zero setup friction.
CogmemAi is the first quantum-safe Ai memory system. All memories are encrypted at rest with quantum-resistant encryption — both in cloud mode and local mode. Your data is protected against today's threats and tomorrow's quantum computers. Encryption is automatic, zero-config, and enabled by default. No setup required.
CogmemAi now runs three ways — pick the one that fits your workflow:
| | Cloud (default) | Local | Hybrid |
|---|---|---|---|
| Best for | Full intelligence, team collaboration, cross-device portability | Zero-config start, offline-only environments | Local speed + cloud brains, travel/unreliable networks |
| Setup | `npx cogmemai-mcp setup` (choose Cloud) | `npx cogmemai-mcp setup` (choose Local) | `npx cogmemai-mcp setup` (choose Hybrid) |
| API key needed | Yes (free) | Yes (free), like a license key; your data stays local | Yes (free) |
| Search | Semantic (by meaning) | Full-text search (FTS5) | Semantic with local fallback |
| Intelligence Engine | Full: auto-linking, contradiction detection, memory decay, auto-skills, query synthesis | FTS5 search + CRUD; data stays on your machine | Full, with offline resilience |
| Team collaboration | Yes | No | Yes |
| Cross-device sync | Automatic | No; data stays on your machine | Automatic with local cache |
| Offline support | Requires internet | Full offline | Falls back to local when offline |
| Encryption | Quantum-safe (server) | Quantum-safe (local) | Quantum-safe (both) |
Cloud mode is the recommended experience. It gives you the full Intelligence Engine — semantic search that finds memories by meaning, auto-linking knowledge graph, contradiction detection, self-improving recall, auto-skills, query synthesis, and team collaboration. Everything that makes CogmemAi more than just a database.
Local mode keeps your data on your machine. A free API key is required for registration (like a software license key), but all your data stays local. Full-text search (FTS5) provides quality recall. Works offline after initial setup. When you're ready for semantic search and the full Intelligence Engine, upgrading to cloud takes one command.
Hybrid mode is for developers who travel or work on unreliable networks. Saves to both local and cloud simultaneously. Reads from cloud when available, falls back to local when offline. Unsynced memories automatically push to cloud when connectivity returns.
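The dual-write and fallback-read behavior can be sketched as follows. This is a simplified Python illustration, not CogmemAi's actual implementation; the stores are plain dicts and connectivity is just a flag:

```python
# Simplified sketch of hybrid mode: write to both stores, read from cloud
# when reachable, fall back to local, and push queued writes on reconnect.
# Not CogmemAi's implementation; names and structures are illustrative.

class HybridStore:
    def __init__(self):
        self.local = {}
        self.cloud = {}
        self.online = True          # simulated connectivity
        self.unsynced = []          # keys written while offline

    def save(self, key, value):
        self.local[key] = value     # always write locally
        if self.online:
            self.cloud[key] = value
        else:
            self.unsynced.append(key)

    def recall(self, key):
        if self.online and key in self.cloud:
            return self.cloud[key]  # prefer cloud when available
        return self.local.get(key)  # fall back to local

    def reconnect(self):
        self.online = True
        for key in self.unsynced:   # push memories saved while offline
            self.cloud[key] = self.local[key]
        self.unsynced = []

store = HybridStore()
store.save("stack", "Next.js + Postgres")
store.online = False
store.save("auth", "JWT in httpOnly cookies")   # offline write, queued
print(store.recall("auth"))                     # served from local
store.reconnect()                               # queued write lands in cloud
```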
CogmemAi now gets smarter every time you use it. The Intelligence Engine is a self-improving memory system that learns what matters, connects related knowledge automatically, and synthesizes answers from your entire memory. Auto-Skills takes it further — CogmemAi doesn't just remember, it learns how to behave.
CogmemAi scores 95.10% accuracy on LongMemEval — the top published score on the field's hardest long-term memory benchmark — and 91% accuracy on LoCoMo with a 100% retrieval hit rate, above human performance (87.9%). Two benchmarks, two #1-tier scores. CogmemAi finds the right memories when you need them.
Connect directly — no npm, no setup, no config files. Just add the remote endpoint to your MCP client with your API key:
Endpoint: https://hifriendbot.com/mcp/
Auth: Bearer token (your cm_ API key)
Get your free API key at hifriendbot.com/developer.
Works with any MCP client that supports Streamable HTTP transport (Claude Desktop, Cursor, and more).
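Under the hood, a Streamable HTTP client POSTs JSON-RPC messages to the endpoint with the API key in an `Authorization` header. A minimal Python sketch of the first message such a client would build (constructed here but never sent over the network; the protocol fields follow the MCP specification, and the key is a placeholder):

```python
import json

API_KEY = "cm_your_api_key_here"  # placeholder, not a real key

# Headers every request to the endpoint carries.
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# JSON-RPC 2.0 initialize request, per the MCP specification's handshake.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize)
print(headers["Authorization"])  # Bearer cm_your_api_key_here
```

In practice your MCP client assembles and sends these messages for you; the sketch only shows what "Bearer token auth" means at the wire level.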
npx cogmemai-mcp setup
The setup wizard walks you through three choices: Cloud (recommended — full Ai intelligence), Local (data stays on your machine), or Hybrid (both). Pick your mode, enter your API key if needed, and you're ready in under 60 seconds.
Don't have an API key yet? Get one free at hifriendbot.com/developer. A free key is required in every mode, including Local, where it acts only as a license key and your data never leaves your machine.
Every time you start a new session, you lose context. You re-explain your tech stack, your architecture decisions, your coding preferences. Built-in memory in tools like Claude Code is a flat file with no search, no structure, and no intelligence.
CogmemAi gives your Ai assistant a real memory system:
CogmemAi offers three storage modes, but cloud is where the magic happens. The Intelligence Engine — semantic search, auto-linking knowledge graph, contradiction detection, self-improving recall, auto-skills, and query synthesis — runs server-side. In cloud mode, your MCP server is a thin HTTP client with zero local databases, zero RAM issues, zero maintenance. All memories are encrypted at rest, so your data is just as secure as local storage — with cross-device portability and team features on top.
Your memory follows you everywhere. Memories created in Claude Code are instantly available in Cursor, Windsurf, Cline, and any MCP-compatible tool. Switch between Opus, Sonnet, Haiku, or any model your editor supports — your memories persist regardless. New laptop? New OS? Log in and your full project knowledge is waiting. A local SQLite file dies with your machine. Cloud memory is permanent.
The privacy argument is a myth. Some memory tools market "local-first" as a privacy advantage. But think about what happens next: every memory your Ai reads gets sent to the model provider (Anthropic, OpenAI, Google) as part of the prompt. Your data leaves your machine at inference time no matter where it's stored. A local SQLite file doesn't protect your memories — it just makes them harder to search, slower to access, and impossible to share. CogmemAi encrypts at rest, transmits over HTTPS, and adds intelligence that local storage simply can't match.
Teams and collaboration. Cloud memory is the only way to share project knowledge across teammates. When one developer saves an architecture decision or documents a bug fix, every team member's Ai assistant knows about it instantly. No syncing, no merge conflicts, no stale local databases. Whether it's two developers or twenty, everyone's assistant has the same up-to-date context. This is impossible with local-only memory solutions.
When your Ai assistant compacts your context, conversation history gets compressed and context is lost. CogmemAi handles this automatically — your context is preserved before compaction and seamlessly restored afterward. No re-explaining, no manual prompting.
The npx cogmemai-mcp setup command configures everything automatically.
CogmemAi includes a Claude Skill that teaches Claude best practices for memory management — when to save, importance scoring, memory types, and session workflows.
Claude Code:
/skill install https://github.com/hifriendbot/cogmemai-mcp/tree/main/skill/cogmemai-memory
Claude.ai: Upload the skill/cogmemai-memory folder in Settings > Skills.
npx cogmemai-mcp setup # Interactive setup wizard
npx cogmemai-mcp setup <key> # Setup with API key
npx cogmemai-mcp verify # Test connection and show usage
npx cogmemai-mcp --version # Show installed version
npx cogmemai-mcp help # Show all commands
If you prefer to configure manually instead of using npx cogmemai-mcp setup:
Option A — Per project (add .mcp.json to your project root):
{
"mcpServers": {
"cogmemai": {
"command": "cogmemai-mcp",
"env": {
"COGMEMAI_API_KEY": "cm_your_api_key_here"
}
}
}
}
For local mode (free API key required for registration, data stays local):
{
"mcpServers": {
"cogmemai": {
"command": "cogmemai-mcp",
"env": {
"COGMEMAI_MODE": "local",
"COGMEMAI_API_KEY": "cm_your_api_key_here"
}
}
}
}
Option B — Global (available in every project):
# Cloud (default)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here --scope user
# Local (free API key required, data stays local)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here -e COGMEMAI_MODE=local --scope user
# Hybrid (both)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here -e COGMEMAI_MODE=hybrid --scope user
Automatic setup:
npx cogmemai-mcp setup
Add to ~/.cursor/mcp.json:
{
"mcpServers": {
"cogmemai": {
"command": "npx",
"args": ["-y", "cogmemai-mcp"],
"env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
}
}
}
Add to ~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"cogmemai": {
"command": "npx",
"args": ["-y", "cogmemai-mcp"],
"env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
}
}
}
Open VS Code Settings > Cline > MCP Servers, add:
{
"cogmemai": {
"command": "npx",
"args": ["-y", "cogmemai-mcp"],
"env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
}
}
Add to ~/.continue/config.yaml:
mcpServers:
- name: cogmemai
command: npx
args: ["-y", "cogmemai-mcp"]
env:
COGMEMAI_API_KEY: cm_your_api_key_here
CogmemUI is a free multi-model Ai workspace with built-in CogmemAi memory. Add your CogmemAi API key in Settings > API Keys and your memory is instantly available. CogmemUI also supports connecting any MCP-compatible tool server via Settings > MCP Servers — add endpoints, auto-discover tools, and use them in chat.
Get your free API key at hifriendbot.com/developer.
CogmemAi provides 35 tools that your Ai assistant uses automatically:
| Tool | Description |
|---|---|
| `preflight` | Think Before You Speak: fast recall to check prior context before making any suggestion |
| `save_memory` | Store a fact explicitly (architecture decision, preference, etc.) |
| `recall_memories` | Search memories using natural language (semantic search) |
| `extract_memories` | Ai extracts facts from a conversation exchange automatically |
| `get_project_context` | Load top memories at session start (with smart ranking, health score, and session replay) |
| `list_memories` | Browse memories with filters (paginated, with untyped filter) |
| `update_memory` | Update content, importance, scope, type, category, subject, and tags |
| `delete_memory` | Permanently delete a memory |
| `bulk_delete` | Delete up to 100 memories at once |
| `bulk_update` | Update up to 50 memories at once (content, type, category, tags, etc.) |
| `get_usage` | Check your usage stats and tier info |
| `export_memories` | Export all memories as JSON for backup or transfer |
| `import_memories` | Bulk import memories from a JSON array |
| `ingest_document` | Feed in a document (README, API docs) to auto-extract memories |
| `save_session_summary` | Save a summary of what was accomplished in this session |
| `list_tags` | View all tags in use across your memories |
| `link_memories` | Connect related memories with named relationships |
| `get_memory_links` | Explore the knowledge graph around a memory |
| `get_memory_versions` | View the edit history of a memory |
| `get_analytics` | Memory health dashboard with self-tuning insights (filterable by project) |
| `promote_memory` | Promote a project memory to global scope |
| `consolidate_memories` | Merge related memories into comprehensive summaries using Ai |
| `save_task` | Create a persistent task with status and priority tracking |
| `get_tasks` | Retrieve tasks for the current project; pick up where you left off |
| `update_task` | Change task status, priority, or description as you work |
| `save_correction` | Store a "wrong approach → right approach" pattern to avoid repeated mistakes |
| `set_reminder` | Set a reminder that surfaces at the start of your next session |
| `get_stale_memories` | Find memories that may be outdated for review or cleanup |
| `get_file_changes` | See what files changed since your last session |
| `feedback_memory` | Signal whether a recalled memory was useful or irrelevant to improve future recall |
| `generate_skills` | Trigger skill generation from your corrections and preferences, or preview candidates with a dry run |
| `save_rule` | Save a mandatory rule that surfaces in every session; bypasses all scoring and decay |
| `list_rules` | List all mandatory rules for the current project and/or globally |
| `delete_rule` | Delete a mandatory rule by ID |
| `extract_principles` | Trigger the Wisdom Engine to detect factual patterns across memory clusters |
Build your own integrations with the CogmemAi API:
TypeScript/JavaScript SDK: `npm install cogmemai-sdk` (npm · GitHub)

Python SDK: `pip install cogmemai` (PyPI · GitHub)

Memories are categorized for better organization and retrieval.
| | Free | Pro | Team | Enterprise |
|---|---|---|---|---|
| Price | $0 | $14.99/mo | $39.99/mo | $99.99/mo |
| Memories | 500 | 2,000 | 10,000 | 50,000 |
| Extractions/mo | 500 | 2,000 | 5,000 | 20,000 |
| Projects | 5 | 20 | 50 | 200 |
Start free. Upgrade when you need more. Or pay per operation with USDC on-chain — no credit card required.
Read our full privacy policy.
| Variable | Required | Description |
|---|---|---|
| `COGMEMAI_API_KEY` | Yes | Your API key (starts with `cm_`). In local mode it acts only as a license key; your data stays on your machine. |
| `COGMEMAI_MODE` | No | Storage mode: `cloud` (default), `local` (data stays on your machine), or `hybrid` |
| `COGMEMAI_LOCAL_DB` | No | Path to the local database (default: `~/.cogmemai/local.db`). Used in local and hybrid modes. |
| `COGMEMAI_API_URL` | No | Custom API URL (default: `hifriendbot.com`) |
| `COGMEMAI_ENCRYPTION_KEY` | No | Custom encryption passphrase for local mode. If not set, a key is auto-generated. |
| `COGMEMAI_LOCAL_ENCRYPTION` | No | Set to `off` to disable local encryption (not recommended). |
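For example, a `.mcp.json` entry combining these variables for local mode with a custom database location (the key and the path are placeholders to replace with your own values):

```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "cogmemai-mcp",
      "env": {
        "COGMEMAI_MODE": "local",
        "COGMEMAI_API_KEY": "cm_your_api_key_here",
        "COGMEMAI_LOCAL_DB": "/path/to/your/local.db"
      }
    }
  }
}
```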
MIT — see LICENSE
Built by HiFriendbot — Better Friends, Better Memories, Better Ai. 🛡️ Quantum Safe.
Add this to claude_desktop_config.json and restart Claude Desktop.
{
  "mcpServers": {
    "hifriendbot-cogmemai-mcp": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": {
        "COGMEMAI_API_KEY": "cm_your_api_key_here"
      }
    }
  }
}