MCP server for ScraperAPI web scraping with JavaScript rendering, geotargeting, premium proxies, and auto-parsing support.
The ScraperAPI MCP server enables LLM clients to perform web scraping requests through ScraperAPI's services.
┌───────────────┐ ┌───────────────────────┐ ┌───────────────┐
│ LLM Client │────▶│ Scraper MCP Server │────▶│ AI Model │
└───────────────┘ └───────────────────────┘ └───────────────┘
│
▼
┌──────────────────┐
│ ScraperAPI API │
└──────────────────┘
The ScraperAPI MCP Server is designed to run as a local server on your machine; your LLM client will launch it automatically once configured.
Install the package:
pip install scraperapi-mcp-server
Add this to your client configuration file:
{
  "mcpServers": {
    "ScraperAPI": {
      "command": "python",
      "args": ["-m", "scraperapi_mcp_server"],
      "env": {
        "API_KEY": "<YOUR_SCRAPERAPI_API_KEY>"
      }
    }
  }
}
Alternatively, to run the server with Docker, add this to your client configuration file:
{
  "mcpServers": {
    "ScraperAPI": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "-e",
        "API_KEY=${API_KEY}",
        "--rm",
        "scraperapi-mcp-server"
      ]
    }
  }
}
[!TIP]
If your command is not working (for example, you see a "package not found" error when trying to start the server), double-check the path you are using. To find the correct path, activate your virtual environment first, then run: which <YOUR_COMMAND>
scrape
- url (string, required): URL to scrape
- render (boolean, optional): Whether to render the page using JavaScript. Defaults to False. Set to True only if the page requires JavaScript rendering to display its content.
- country_code (string, optional): Activate country geotargeting (ISO 2-letter code)
- premium (boolean, optional): Activate premium residential and mobile IPs
- ultra_premium (boolean, optional): Activate advanced bypass mechanisms. Cannot be combined with premium
- device_type (string, optional): Set the request to use a mobile or desktop user agent
- output_format (string, optional): Instructs the API on what the response file type should be
- autoparse (boolean, optional): Activate auto parsing for select websites. Defaults to False. Set to True only if you want the output format in CSV or JSON.

Example prompts:
- Scrape <URL>. If you receive a 500 server error, identify the website's geo-targeting and add the corresponding country_code to overcome geo-restrictions. If errors continue, upgrade the request to use premium proxies by adding premium=true. For persistent failures, activate ultra_premium=true to use enhanced anti-blocking measures.
- Scrape <URL> to extract <SPECIFIC_DATA>. If the request returns missing/incomplete <SPECIFIC_DATA>, set render=true to enable JS rendering.

Environment variables:
- API_KEY: Your ScraperAPI API key.

Claude Desktop:
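The optional flags above map onto ScraperAPI query parameters. As a rough illustration of that mapping (a sketch, not the project's actual code; the helper name and boolean serialization are assumptions), the server might assemble a request URL like this:

```python
from urllib.parse import urlencode

SCRAPERAPI_ENDPOINT = "https://api.scraperapi.com/"

def build_request_url(api_key: str, target_url: str, **options) -> str:
    """Assemble a ScraperAPI GET URL from a target URL and optional flags."""
    params = {"api_key": api_key, "url": target_url}
    for key, value in options.items():
        if value is None:
            continue  # skip optional flags that were not set
        # Booleans become lowercase "true"/"false" query-string values.
        params[key] = str(value).lower() if isinstance(value, bool) else str(value)
    return SCRAPERAPI_ENDPOINT + "?" + urlencode(params)
```

For example, `build_request_url("key", "https://example.com", render=True, country_code="us")` yields a URL containing `render=true&country_code=us`, with the target URL percent-encoded.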
Claude Code:
Update .claude/settings.json with the JSON configuration above, or run: claude mcp add scraperapi -e API_KEY=<YOUR_SCRAPERAPI_API_KEY> -- python -m scraperapi_mcp_server
More here
The mcp_config.json file will open. More here
Cline: edit cline_mcp_settings.json. More here
Clone the repository:
git clone https://github.com/scraperapi/scraperapi-mcp
cd scraperapi-mcp
Install dependencies:
poetry install
# Create virtual environment and activate it
python -m venv .venv
source .venv/bin/activate # macOS/Linux
# OR
.venv/Scripts/activate # Windows
# Install the local package in editable mode
pip install -e .
# Build the Docker image locally
docker build -t scraperapi-mcp-server .
python -m scraperapi_mcp_server
# Run the Docker container with your API key
docker run -e API_KEY=<YOUR_SCRAPERAPI_API_KEY> scraperapi-mcp-server
# Run the server in debug mode
python3 -m scraperapi_mcp_server --debug
This project uses pytest for testing.
poetry install --with dev
pip install -e .
pip install pytest pytest-mock pytest-asyncio
# Run All Tests
pytest
# Run Specific Test
pytest <TEST_FILE_PATH>
Add this to claude_desktop_config.json and restart Claude Desktop.
{
  "mcpServers": {
    "ScraperAPI": {
      "command": "python",
      "args": ["-m", "scraperapi_mcp_server"],
      "env": {
        "API_KEY": "<YOUR_SCRAPERAPI_API_KEY>"
      }
    }
  }
}