Enables LLMs to search, browse, and retrieve detailed information on 6,500+ AI applications from the HyperStore catalog.
Plug 6,500+ AI apps into any LLM via the Model Context Protocol.
PyPI Glama Smithery MCP Registry CI License: MIT
HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.
Ask your LLM:
"Find me a free AI tool that summarises PDFs." "Compare ChatGPT, Claude, and Gemini side-by-side." "Show me the top 5 image-generation apps with an API."
The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.
8 tools:

| Tool | Purpose |
|---|---|
| `search_apps` | Full-text keyword search |
| `ai_search` | Embedding-based semantic search |
| `get_app` | Full app detail (features, screenshots, pricing) |
| `list_apps` | Paginated apps with filters (category, pricing) |
| `list_categories` | Browse all 30+ categories |
| `category_apps` | Apps within a category |
| `browse_apps` | A–Z directory listing |
| `get_homepage` | Trending + top categories overview |
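Under the hood, each tool is invoked with a JSON-RPC 2.0 `tools/call` request, as specified by MCP. A minimal sketch of what a client sends for `search_apps` — the `query` argument name is an assumption here; a real client reads each tool's input schema from the server's `tools/list` response first:

```python
import json

# Illustrative MCP "tools/call" request for search_apps.
# The argument name "query" is an assumption, not the server's
# documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_apps",
        "arguments": {"query": "pdf summarizer"},
    },
}
print(json.dumps(request, indent=2))
```

Your MCP client builds and sends these messages for you; the sketch only shows what travels over the wire.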
3 resources:
- `hyperstore://app/{slug}` — markdown rendering of any app
- `hyperstore://category/{slug}` — top apps in a category
- `hyperstore://catalog` — full category index

3 prompts:

- `find_tool_for_task` — guided discovery for a task
- `compare_apps` — side-by-side app comparison
- `discover_category` — explore a topic

uvx (zero install, recommended)

Requires uv. One command and you're done:
uvx hyperstore-mcp
pipx

pipx install hyperstore-mcp
hyperstore-mcp
docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# Now MCP Streamable HTTP at http://localhost:8080/mcp
Use our managed Streamable HTTP server:
https://mcp.store.hypergpt.ai/mcp
Edit ~/Library/Application Support/Claude/claude_desktop_config.json
(macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
Restart Claude → tools appear in the 🛠 menu.
claude mcp add hyperstore -- uvx hyperstore-mcp
.cursor/mcp.json (project) or ~/.cursor/mcp.json (global):
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
settings.json:
{
  "cline.mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
~/.config/zed/settings.json:
{
  "context_servers": {
    "hyperstore": {
      "command": {
        "path": "uvx",
        "args": ["hyperstore-mcp"]
      }
    }
  }
}
~/.gemini/settings.json:
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
Settings → Connectors → Add custom connector:
https://mcp.store.hypergpt.ai/mcp

from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1",
    tools=[{
        "type": "mcp",
        "server_label": "hyperstore",
        "server_url": "https://mcp.store.hypergpt.ai/mcp",
        "require_approval": "never",
    }],
    input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)
from anthropic import Anthropic

client = Anthropic()

response = client.messages.create(
    model="claude-opus-4-1",
    max_tokens=1024,
    # The Messages API MCP connector is a beta feature and needs this header
    extra_headers={"anthropic-beta": "mcp-client-2025-04-04"},
    mcp_servers=[{
        "type": "url",
        "url": "https://mcp.store.hypergpt.ai/mcp",
        "name": "hyperstore",
    }],
    messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
print(response.content[0].text)
See examples/ for ready-to-paste configs for every supported client.
For self-hosting, use the Docker image.
For direct invocation without Docker, the CLI accepts --transport http|sse
(see hyperstore-mcp --help).
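For example, to serve Streamable HTTP without Docker (the transport flag is from the source above; the port defaults are shown in the table below):

```shell
# Serve MCP over Streamable HTTP on the default bind (0.0.0.0:8080)
uvx hyperstore-mcp --transport http
```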
When self-hosting, these environment variables can be set (see .env.example for the full list):
| Variable | Default | Purpose |
|---|---|---|
| `MCP_HOST` | `0.0.0.0` | Bind host (http/sse transports) |
| `MCP_PORT` | `8080` | Bind port (http/sse transports) |
| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR) |
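Putting the table together, a self-hosted Docker run overriding the defaults might look like this (the port and log level here are illustrative values, not recommendations):

```shell
# Override bind port and log level via environment variables
docker run --rm \
  -e MCP_PORT=9000 \
  -e LOG_LEVEL=DEBUG \
  -p 9000:9000 \
  ghcr.io/deficlow/hyperstore-mcp
```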
git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp # stdio mode for local testing
Inspect the running server with the official MCP Inspector:
npx @modelcontextprotocol/inspector uvx hyperstore-mcp
HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.
LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
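To illustrate the wrapper pattern in the diagram, a tool call reduces to building a read-only GET against the public API. The endpoint path and parameter names below are hypothetical, chosen for the sketch — they are not the server's actual routes:

```python
from urllib.parse import urlencode

API_BASE = "https://store.hypergpt.ai/api"  # base URL from the diagram above

def build_search_url(query: str, page: int = 1) -> str:
    """Sketch of how a search_apps call could map onto a REST request.
    The /apps/search path and the q/page parameter names are assumptions."""
    return f"{API_BASE}/apps/search?{urlencode({'q': query, 'page': page})}"

print(build_search_url("pdf summarizer"))
```

Because every tool is a pure read like this, the server needs no credentials and can never mutate catalog data.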
MIT © HyperGPT