MCP server that lets Claude run Screaming Frog SEO Spider headless crawls, export data, and manage crawl storage — without anyone opening the GUI. 8 tools, cross-platform (Mac + Windows).
Type a URL into Claude. Screaming Frog runs in the background. You get the data back. That's it.
Forked from bzsasson/screaming-frog-mcp v0.1.0 with bug fixes. The original had issues that made it unusable in practice — pipe deadlocks that hung crawls, false GUI detection that blocked everything after the first run, a delete command that could wipe your entire crawl database. All fixed.
| Bug | Fix |
|---|---|
| Pipe deadlock | stdout/stderr redirected to log files instead of `PIPE`. Crawls no longer hang when SF produces large output. |
| GUI detection | Uses `psutil` instead of `ps aux`. Works on Mac and Windows. Headless CLI processes no longer get mistaken for the GUI. |
| Stale crawl cleanup | SF leaves a temp `crawl.seospider` file inside its own app bundle when a crawl gets interrupted. Every crawl after that fails. Now auto-cleaned before each run. |
| Delete safety | `delete_crawl(".")` used to resolve to the root data directory and wipe everything. Fixed. |
| Export dir leak | Failed exports left temp directories on disk. Now cleaned up. |
| Input validation | Stricter character allowlists for CLI arguments and `db_id`. |
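The pipe deadlock fix is worth a closer look. Here's a minimal sketch of the pattern (not the server's actual code; `run_headless` and the timeout value are illustrative):

```python
import subprocess
import sys
import tempfile

def run_headless(cmd, timeout=300):
    """Run a CLI command, capturing output to a real temp file.

    With stdout=subprocess.PIPE, a child that writes more than the
    OS pipe buffer (roughly 64 KB) blocks forever if the parent
    calls wait() without draining the pipe. A file has no such limit.
    """
    with tempfile.TemporaryFile(mode="w+b") as log:
        proc = subprocess.Popen(cmd, stdout=log, stderr=subprocess.STDOUT)
        proc.wait(timeout=timeout)
        log.seek(0)
        return proc.returncode, log.read().decode(errors="replace")

# A child that emits ~1 MB of output, enough to wedge a PIPE-based wait().
rc, out = run_headless(
    [sys.executable, "-c", "import sys; sys.stdout.write('x' * 1_000_000)"]
)
```

The same crawl that hangs indefinitely with `stdout=subprocess.PIPE` completes immediately when output goes to a file, which is exactly the failure mode large Screaming Frog crawls trigger.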
```bash
uvx --from git+https://github.com/marykovziridze/screaming-frog-mcp screaming-frog-mcp
```
On Mac, add to `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "screaming-frog": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/marykovziridze/screaming-frog-mcp", "screaming-frog-mcp"]
    }
  }
}
```
On Windows, install uv first:

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

Then add to `C:\Users\[name]\AppData\Roaming\Claude\claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "screaming-frog": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/marykovziridze/screaming-frog-mcp", "screaming-frog-mcp"],
      "env": {
        "SF_CLI_PATH": "C:\\Program Files (x86)\\Screaming Frog SEO Spider\\ScreamingFrogSEOSpiderCli.exe"
      }
    }
  }
}
```
Restart Claude Desktop after editing the config.
| Tool | What it does |
|---|---|
| `sf_check` | Verify SF is installed and licensed |
| `crawl_site` | Start a headless crawl |
| `crawl_status` | Check crawl progress |
| `list_crawls` | List saved crawls in SF's database |
| `export_crawl` | Export crawl data as CSV |
| `read_crawl_data` | Read and filter exported CSV data |
| `delete_crawl` | Delete a saved crawl |
| `storage_summary` | Show disk usage of crawl storage |
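To give a feel for the export-then-read flow, here's a sketch of the kind of filtering `read_crawl_data` performs on an exported CSV. The column names mimic a Screaming Frog "Internal: All" export, but the sample data and `filter_rows` helper are invented for illustration:

```python
import csv
import io

# Invented sample resembling an SF internal-URLs export;
# real exports carry many more columns.
SAMPLE = """Address,Status Code,Indexability
https://example.com/,200,Indexable
https://example.com/old,301,Non-Indexable
https://example.com/missing,404,Non-Indexable
"""

def filter_rows(csv_text, column, value):
    """Return rows where `column` equals `value` -- the sort of
    filtering applied to exported crawl CSVs."""
    return [row for row in csv.DictReader(io.StringIO(csv_text))
            if row[column] == value]

broken = filter_rows(SAMPLE, "Status Code", "404")
# one matching row: https://example.com/missing
```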
| Variable | Default | Notes |
|---|---|---|
| `SF_CLI_PATH` | Mac: auto-detected | Set manually on Windows or custom installs |
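A plausible resolution order for this variable, sketched below: the env var wins, then a default macOS location. The function name and the exact launcher path are assumptions, not the server's actual code; the path shown is the usual macOS install location but may differ on your machine.

```python
import os
from pathlib import Path

# Typical macOS location of the SF launcher (illustrative).
DEFAULT_MAC_CLI = Path(
    "/Applications/Screaming Frog SEO Spider.app"
    "/Contents/MacOS/ScreamingFrogSEOSpiderLauncher"
)

def resolve_sf_cli() -> str:
    """Return the SF CLI path: SF_CLI_PATH first, then the Mac default."""
    env = os.environ.get("SF_CLI_PATH")
    if env:
        return env
    if DEFAULT_MAC_CLI.exists():
        return str(DEFAULT_MAC_CLI)
    raise FileNotFoundError(
        "Screaming Frog CLI not found; set SF_CLI_PATH to its full path"
    )
```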
If crawls keep failing after an interrupted run, find the `crawl.seospider` file in your SF install directory and delete it manually.

MIT — see LICENSE
Original MCP server by Boaz Sasson.