A tiny bridge that exposes any LynxPrompt instance as an MCP server, enabling LLMs to browse, search, and manage AI configuration blueprints.
| Type | What for | MCP URI / Tool id |
|---|---|---|
| Resources | Read-only browsing of blueprints, hierarchies, and user info | `lynxprompt://blueprints`, `lynxprompt://blueprint/{id}`, `lynxprompt://hierarchies`, `lynxprompt://hierarchy/{id}`, `lynxprompt://user` |
| Tools | Create, update, and delete blueprints; manage hierarchies | `search_blueprints`, `create_blueprint`, `update_blueprint`, `delete_blueprint`, `create_hierarchy`, `delete_hierarchy` |
Everything is exposed over a single JSON-RPC endpoint (`/mcp`).
LLMs and agents follow the usual MCP flow: `initialize` -> `readResource` -> `listTools` -> `callTool`, and so on.
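For example, the first request body a client POSTs to `/mcp` is an `initialize` call like the sketch below (the protocol version and `clientInfo` values are illustrative). The server answers with its capabilities and a session id in the `Mcp-Session-Id` response header, which the client reuses on every subsequent call:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```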
services:
lynxprompt-mcp:
image: drumsergio/lynxprompt-mcp:latest
ports:
- "127.0.0.1:8080:8080"
environment:
- LYNXPROMPT_URL=https://lynxprompt.com
- LYNXPROMPT_TOKEN=lp_xxx
Security note: The HTTP transport listens on `127.0.0.1:8080` by default. If you need to expose it on a network, place it behind a reverse proxy with authentication.
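A minimal sketch of such a proxy, assuming nginx with HTTP basic auth; the hostname, certificate paths, and htpasswd file below are placeholders, not part of this project:

```nginx
server {
    listen 443 ssl;
    server_name mcp.example.com;                   # placeholder hostname
    ssl_certificate     /etc/ssl/mcp.pem;          # placeholder cert paths
    ssl_certificate_key /etc/ssl/mcp.key;

    location /mcp {
        auth_basic           "LynxPrompt MCP";
        auth_basic_user_file /etc/nginx/htpasswd;  # create with the htpasswd tool
        proxy_pass http://127.0.0.1:8080;          # the MCP server on loopback
        proxy_set_header Host $host;
    }
}
```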
npx lynxprompt-mcp
Or install globally:
npm install -g lynxprompt-mcp
lynxprompt-mcp
This downloads the pre-built Go binary from GitHub Releases for your platform and runs it with stdio transport. Requires at least one published release.
git clone https://github.com/GeiserX/lynxprompt-mcp
cd lynxprompt-mcp
# (optional) create .env from the sample
cp .env.example .env && $EDITOR .env
go run ./cmd/server
| Variable | Default | Description |
|---|---|---|
| `LYNXPROMPT_URL` | `https://lynxprompt.com` | LynxPrompt instance URL (without trailing `/`) |
| `LYNXPROMPT_TOKEN` | (required) | API token in `lp_xxx` format |
| `LISTEN_ADDR` | `127.0.0.1:8080` | HTTP listen address (Docker sets `0.0.0.0:8080`) |
| `TRANSPORT` | (empty = HTTP) | Set to `stdio` for stdio transport |
Put them in a .env file (from .env.example) or set them in the environment.
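An illustrative `.env` covering all four variables (the token is a placeholder):

```
LYNXPROMPT_URL=https://lynxprompt.com
LYNXPROMPT_TOKEN=lp_xxx
LISTEN_ADDR=127.0.0.1:8080
# TRANSPORT=stdio    # uncomment to use stdio instead of HTTP
```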
Tested with the MCP Inspector and currently fully working. Before opening a PR, make sure the server still behaves well when exercised through the Inspector.
{
"schema_version": "v1",
"name_for_human": "LynxPrompt-MCP",
"name_for_model": "lynxprompt_mcp",
"description_for_human": "Browse, search, and manage AI configuration blueprints from LynxPrompt.",
"description_for_model": "Interact with a LynxPrompt instance that stores AI configuration blueprints. First call initialize, then reuse the returned session id in header \"Mcp-Session-Id\" for every other call. Use readResource to fetch URIs that begin with lynxprompt://. Use listTools to discover available actions and callTool to execute them.",
"auth": { "type": "none" },
"api": {
"type": "jsonrpc-mcp",
"url": "http://localhost:8080/mcp",
"init_method": "initialize",
"session_header": "Mcp-Session-Id"
},
"logo_url": "https://lynxprompt.com/logo.png",
"contact_email": "[email protected]",
"legal_info_url": "https://github.com/GeiserX/lynxprompt-mcp/blob/main/LICENSE"
}
LynxPrompt -- AI configuration blueprint management
MCP-GO -- modern MCP implementation
GoReleaser -- painless multi-arch releases
Feel free to dive in! Open an issue or submit PRs.
LynxPrompt-MCP follows the Contributor Covenant Code of Conduct.
| Project | Description |
|---|---|
| LynxPrompt | Self-hosted platform for AI IDE/Tools Rules and Commands via WebUI and CLI |
| lynxprompt-vscode | VS Code extension for LynxPrompt AI configuration file management |
| lynxprompt-action | GitHub Action to sync and validate AI IDE configuration files with LynxPrompt |
| n8n-nodes-lynxprompt | n8n community node for LynxPrompt AI configuration blueprints |
| homebrew-lynxprompt | Homebrew tap for LynxPrompt CLI |
Add this to `claude_desktop_config.json` and restart Claude Desktop.
{
"mcpServers": {
"lynxprompt-mcp": {
"command": "npx",
      "args": ["lynxprompt-mcp"]
}
}
}