MCP server for Threadline — persistent memory and context layer for AI agents. inject() before your LLM call, update() after. Relevance-scored injection, grant-based access, user-owned context.
Use Threadline's persistent, user-consented memory in any MCP-compatible client: Cursor, Claude Desktop, or your own agent.
npm install -g threadline-mcp
Get your API key at threadline.to/dashboard.
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "threadline": {
      "command": "threadline-mcp",
      "env": {
        "THREADLINE_API_KEY": "tl_live_your_key_here"
      }
    }
  }
}
Add to your MCP config in Cursor settings:
{
  "threadline": {
    "command": "threadline-mcp",
    "env": {
      "THREADLINE_API_KEY": "tl_live_your_key_here"
    }
  }
}
THREADLINE_API_KEY=tl_live_your_key_here threadline-mcp
inject: Inject user context into a base system prompt before an LLM call.
{
  "userId": "user-uuid",
  "basePrompt": "You are a helpful assistant."
}
Returns an enriched prompt with relevant facts about the user automatically inserted.
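Conceptually, inject() ranks stored facts and prepends the best ones to the base prompt. The sketch below is purely illustrative and runs locally: the Fact shape, the scoring weights, and the injectSketch name are assumptions, not Threadline's actual implementation.

```typescript
// Illustrative sketch of inject()'s behavior -- not the real Threadline
// pipeline. Fact shape and the recency-decay constant are assumptions.
interface Fact {
  text: string;
  relevance: number; // 0..1, similarity to the current task
  ageDays: number;   // days since the fact was stored
}

// Score each fact by relevance with a mild exponential recency decay,
// then prepend the top-scoring facts to the base system prompt.
function injectSketch(basePrompt: string, facts: Fact[], topK = 3): string {
  const top = facts
    .map((f) => ({ f, score: f.relevance * Math.exp(-f.ageDays / 30) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map(({ f }) => `- ${f.text}`);
  if (top.length === 0) return basePrompt;
  return `${basePrompt}\n\nKnown about this user:\n${top.join("\n")}`;
}

const enriched = injectSketch("You are a helpful assistant.", [
  { text: "Prefers concise answers", relevance: 0.9, ageDays: 2 },
  { text: "Works in TypeScript", relevance: 0.8, ageDays: 10 },
]);
console.log(enriched);
```

The real server performs this enrichment server-side; your client only sees the tool's input and the enriched prompt it returns.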
update: Update a user's context after an LLM interaction. Extracts and stores structured facts for future sessions.
{
  "userId": "user-uuid",
  "userMessage": "I prefer concise answers and I'm building in TypeScript.",
  "agentResponse": "Got it, keeping it brief."
}
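The intended call pattern is inject() before the LLM call and update() after it. A minimal sketch of that loop, with `callTool` mocked so it runs locally; a real client would forward the call to the threadline-mcp server, and `fakeLLM` stands in for your model call:

```typescript
// Sketch of the inject -> LLM -> update loop. `callTool` is a mock, not a
// real MCP client method; replace it with your client's tool-call API.
type ToolArgs = Record<string, string>;

async function callTool(name: string, args: ToolArgs): Promise<string> {
  if (name === "inject") {
    return `${args.basePrompt}\n[context for ${args.userId}]`;
  }
  return "ok"; // mock response for "update"
}

// Stub LLM: echoes the user message, ignoring the enriched prompt.
function fakeLLM(_systemPrompt: string, msg: string): string {
  return `echo: ${msg}`;
}

async function chatTurn(userId: string, userMessage: string): Promise<string> {
  // 1. Enrich the system prompt before the LLM call.
  const prompt = await callTool("inject", {
    userId,
    basePrompt: "You are a helpful assistant.",
  });
  // 2. Call the model with the enriched prompt.
  const agentResponse = fakeLLM(prompt, userMessage);
  // 3. Persist new facts from this interaction for future sessions.
  await callTool("update", { userId, userMessage, agentResponse });
  return agentResponse;
}

chatTurn("user-uuid", "I prefer concise answers.").then(console.log);
```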
Your MCP client (Cursor / Claude Desktop)
│
▼
threadline-mcp (this package)
│
▼
api.threadline.to
│
┌────┴────┐
▼ ▼
Supabase Redis
(context) (<50ms)
inject() — fetches stored context, scores it by recency + relevance, and returns an enriched prompt
update() — two-stage extraction pipeline classifies and stores new facts across 7 scopes