The complete web data pipeline for AI agents. Search, Extract, Crawl in One API.
A Model Context Protocol (MCP) server with 15 tools covering web search, scraping, extraction, crawling, and autonomous data gathering via the SearchClaw API.
Run the server directly with npx:

npx searchclaw-mcp
Or install globally:
npm install -g searchclaw-mcp
Get your API key from searchclaw.dev and add the server to your MCP client config.
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
  "mcpServers": {
    "searchclaw": {
      "command": "npx",
      "args": ["-y", "searchclaw-mcp"],
      "env": {
        "SEARCHCLAW_API_KEY": "your-api-key"
      }
    }
  }
}
For Claude Code, register the server with:

claude mcp add searchclaw -- npx -y searchclaw-mcp
Then set your API key in your environment:
export SEARCHCLAW_API_KEY="your-api-key"
Add to .cursor/mcp.json in your project:
{
  "mcpServers": {
    "searchclaw": {
      "command": "npx",
      "args": ["-y", "searchclaw-mcp"],
      "env": {
        "SEARCHCLAW_API_KEY": "your-api-key"
      }
    }
  }
}
Add to ~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "searchclaw": {
      "command": "npx",
      "args": ["-y", "searchclaw-mcp"],
      "env": {
        "SEARCHCLAW_API_KEY": "your-api-key"
      }
    }
  }
}
| Tool | Description | Credits |
|---|---|---|
| `search` | Web search — organic results | 1 |
| `search_ai` | RAG-ready search with context and sources (primary tool for AI agents) | 2 |
| `news` | Search recent news articles | 1 |
| `images` | Search for images with metadata | 1 |
| `suggest` | Autocomplete / query suggestions | 1 |
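For context, an MCP `tools/call` request for `search_ai` might look like the sketch below. The outer `method`/`params` shape is standard MCP; the `query` argument name is an assumption, so check the tool's advertised input schema.

```json
{
  "method": "tools/call",
  "params": {
    "name": "search_ai",
    "arguments": {
      "query": "recent changes to the Model Context Protocol"
    }
  }
}
```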
| Tool | Description | Credits |
|---|---|---|
| `extract` | Extract structured data from a URL using a JSON schema | 5 |
| `markdown` | Convert any URL to clean markdown | 2 |
| `screenshot` | Capture a full-page screenshot of a URL | 2 |
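The `extract` tool takes a JSON Schema describing the fields to pull from the page. A minimal sketch of its arguments (the `url` and `schema` parameter names are assumptions; the schema body itself is standard JSON Schema):

```json
{
  "name": "extract",
  "arguments": {
    "url": "https://example.com/product",
    "schema": {
      "type": "object",
      "properties": {
        "title": { "type": "string" },
        "price": { "type": "number" }
      },
      "required": ["title"]
    }
  }
}
```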
| Tool | Description | Credits |
|---|---|---|
| `crawl` | Async bulk crawl with optional structured extraction | 1/page |
| `job_status` | Check status of an async crawl job | 0 |
| `map` | Discover all URLs on a domain (sitemap alternative) | 2 |
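Because `crawl` is asynchronous, a typical flow is to start a job and then poll `job_status` with the job ID returned by the crawl response. A sketch of the initial call (all argument names are assumptions):

```json
{
  "name": "crawl",
  "arguments": {
    "url": "https://example.com",
    "limit": 50
  }
}
```

A follow-up status check would then pass the returned ID to `job_status`, e.g. an arguments object like `{ "job_id": "<id from the crawl response>" }` (field name hypothetical).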
| Tool | Description | Credits |
|---|---|---|
| `pipeline` | Search + extract in one call — the killer feature for RAG pipelines | 3+ |
| `browse` | Interactive browser actions (click, type, scroll) | 5 |
| `agent` | Autonomous data gathering from natural language prompts | variable |
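Since `pipeline` combines a search query with schema-based extraction over the results, a call would plausibly pair both kinds of arguments. A sketch (the `query` and `schema` names are assumptions; verify against the tool's input schema):

```json
{
  "name": "pipeline",
  "arguments": {
    "query": "top open-source vector databases",
    "schema": {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "license": { "type": "string" }
      }
    }
  }
}
```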
| Tool | Description | Credits |
|---|---|---|
| `usage` | Check your remaining API credits | 0 |
| Feature | SearchClaw | Firecrawl | Tavily |
|---|---|---|---|
| Web search | ✅ | ❌ | ✅ |
| AI search (RAG) | ✅ | ❌ | ✅ |
| Extract with schema | ✅ | ✅ | ❌ |
| Crawl | ✅ | ✅ | ❌ |
| Pipeline (search + extract) | ✅ | ❌ | ❌ |
| Autonomous agent | ✅ | ❌ | ❌ |
| Browser actions | ✅ | ❌ | ❌ |
| News & image search | ✅ | ❌ | ✅ |
| Screenshot | ✅ | ✅ | ❌ |
| MCP server | ✅ | ✅ | ✅ |
| Free tier | ✅ 500 credits/mo | ❌ | ✅ 1000 credits/mo |
Get started at searchclaw.dev.
License: MIT