Direct access to 40+ scraping and search tools. Extract structured data from Google (Search, Maps, Trends), Amazon, Airbnb, Social Media, and any web page directly into your AI agent.
Model Context Protocol server for HasData scraping and search APIs. Connect any MCP-compatible AI client to 40 ready-to-use data tools.
Claude Desktop. Add to claude_desktop_config.json:
```json
{
  "mcpServers": {
    "hasdata": {
      "type": "http",
      "url": "https://mcp.hasdata.com/api/mcp",
      "headers": {
        "x-api-key": "<your-api-key>"
      }
    }
  }
}
```
Claude Code:
```shell
claude mcp add hasdata -t http https://mcp.hasdata.com/api/mcp --header "x-api-key: <your-api-key>"
```
Cursor. Add to ~/.cursor/mcp.json or .cursor/mcp.json:
```json
{
  "mcpServers": {
    "hasdata": {
      "url": "https://mcp.hasdata.com/api/mcp",
      "headers": {
        "x-api-key": "<your-api-key>"
      }
    }
  }
}
```
Any other MCP client that supports streamable HTTP with custom headers:
| Field | Value |
|---|---|
| URL | https://mcp.hasdata.com/api/mcp |
| Transport | HTTP (streamable) |
| Header | x-api-key: <your-api-key> |
Get your API key from the HasData dashboard. Requests without a valid key return 401 Unauthorized.
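Under the hood, each request to the endpoint is a JSON-RPC message POSTed over HTTP. Here is a rough sketch of what a `tools/list` request looks like on the wire, assuming the standard MCP streamable HTTP shape; note that a real client performs an `initialize` handshake first, and the helper name here is ours, not part of HasData's documentation:

```python
import json

MCP_URL = "https://mcp.hasdata.com/api/mcp"

def build_tools_list_request(api_key: str) -> tuple[dict, str]:
    """Build the headers and JSON-RPC body for an MCP tools/list call."""
    headers = {
        "Content-Type": "application/json",
        # Streamable HTTP servers may answer with plain JSON or an SSE stream.
        "Accept": "application/json, text/event-stream",
        "x-api-key": api_key,  # missing or invalid -> 401 Unauthorized
    }
    body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
    return headers, body

headers, body = build_tools_list_request("<your-api-key>")
# To actually send it (requires `pip install requests`):
#   import requests
#   resp = requests.post(MCP_URL, headers=headers, data=body)
```

In practice you rarely build these by hand; any MCP client library handles the handshake and message framing for you.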
"Find the cheapest flights from NYC to London next month and compare prices."
"Monitor iPhone 16 prices on Amazon across the last 7 days."
"Pull all 5-star reviews for this product ASIN and summarize the common themes."
"Get me all software engineer job postings in Berlin from Indeed and Glassdoor."
"Find the top-rated Italian restaurants in Chicago using Google Maps."
"What are people searching for related to 'AI agents' on Google Trends this week?"
No scraping code. No proxies to manage. No parsing logic. Just ask.
| Client | Supported |
|---|---|
| Claude Desktop | ✅ |
| Claude Code | ✅ |
| Claude.ai (web) | ✅ |
| Cursor | ✅ |
| Windsurf | ✅ |
| VS Code (GitHub Copilot) | ✅ |
| Gemini CLI | ✅ |
| Custom agents (OpenAI, LangChain, etc.) | ✅ |
Any client that supports the MCP streamable HTTP transport with custom headers will work.
40 tools across search, e-commerce, maps, travel, real estate, and more.
| Tool | Description |
|---|---|
| web_scraping_web_scraping | Scrape any URL with optional parameters |
| Tool | Description |
|---|---|
| google_serp_serp | Google Search results |
| google_serp_serp_light | Google Search results (lightweight) |
| google_serp_news | Google News results |
| google_serp_shopping | Google Shopping results |
| google_serp_images_images | Google Image Search results |
| google_serp_events | Google Events results |
| google_serp_product | Google product details |
| google_serp_immersive_product | Google immersive product details |
| google_serp_ai_overview | Google AI Overview results |
| google_serp_ai_mode | Google AI Mode results |
| google_maps_search | Google Maps search |
| google_maps_place | Place details by placeId |
| google_maps_reviews | Place reviews |
| google_maps_photos | Place photos |
| google_maps_contributor_reviews | Reviews by contributor ID |
| google_trends_search | Google Trends data |
| google_travel_flights | Google Flights results |
| bing_serp | Bing Search results |
| Tool | Description |
|---|---|
| amazon_search | Amazon search results |
| amazon_product | Amazon product details by ASIN |
| amazon_reviews | Amazon product reviews |
| shopify_products | Shopify store products |
| shopify_collections | Shopify store collections |
| Tool | Description |
|---|---|
| zillow_listing | Zillow listing search |
| zillow_property | Zillow property details |
| redfin_listing | Redfin listing search |
| redfin_property | Redfin property details |
| Tool | Description |
|---|---|
| indeed_listing | Indeed job listings |
| indeed_job | Indeed job details |
| glassdoor_listing | Glassdoor job listings |
| glassdoor_job | Glassdoor job details |
| Tool | Description |
|---|---|
| airbnb_listing | Airbnb listings by location and dates |
| airbnb_property | Airbnb listing details |
| yelp_search | Yelp search results |
| yelp_place | Yelp place details |
| yellowpages_search | YellowPages search results |
| yellowpages_place | YellowPages place details |
| Tool | Description |
|---|---|
| instagram_profile | Instagram public profile details |
A lot of scraping APIs hand you back raw HTML and call it a day. With JavaScript-heavy sites, half of them just time out. HasData's tools return parsed JSON specific to each site, so the model gets structured fields it can actually work with, not a blob of markup to wade through.
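Structured output also means post-processing stays trivial. A sketch of the idea, using an entirely hypothetical response shape (the `organicResults` field and the sample payload below are fabricated for illustration; check the actual tool output for real field names):

```python
# Fabricated sample payload, standing in for a parsed SERP-style response.
sample = {
    "organicResults": [
        {"title": "Result A", "link": "https://a.example", "position": 1},
        {"title": "Result B", "link": "https://b.example", "position": 2},
    ]
}

def top_links(payload: dict, n: int = 5) -> list[str]:
    """Pull the first n result links out of a (hypothetical) parsed response."""
    results = sorted(payload.get("organicResults", []),
                     key=lambda r: r.get("position", 0))
    return [r["link"] for r in results[:n]]

print(top_links(sample))  # plain dict access, no HTML parsing involved
```

The point is not this particular helper but that the model (or your code) works with named fields rather than selectors and regexes.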
A few things worth knowing before you start:
Tools are site-specific. amazon_product knows about ASINs, variants, and seller data. google_maps_place knows about hours, coordinates, and ratings. You're not calling a generic scraper and hoping for the best.

Authentication is a single x-api-key header on every call. There's no session or token exchange.

Competitive intelligence. Point amazon_search and google_serp_shopping at the same query and let the model compare pricing, ranking, and review counts across both. Runs in one prompt.
Lead generation. google_maps_search by keyword and city returns business names, addresses, phone numbers, and ratings. Stack it with yellowpages_search for broader coverage.
Market research. google_trends_search plus google_serp_news in the same session gives you both the search volume signal and the editorial context behind it.
Real estate. zillow_listing accepts filters for beds, baths, price range, and listing type. The model can pull a filtered list and summarize it without you touching a browser.
Recruiting. indeed_listing and glassdoor_listing both take location and keyword. Run both and the model can deduplicate and rank by recency.
Travel. google_travel_flights returns structured itineraries with prices, stops, and duration. Combine it with airbnb_listing for a full trip plan in one conversation.
Each tool call uses HasData credits the same way a direct API call would. See Credits and Concurrency for details.
Add this to claude_desktop_config.json and restart Claude Desktop:

```json
{
  "mcpServers": {
    "hasdata-mcp": {
      "type": "http",
      "url": "https://mcp.hasdata.com/api/mcp",
      "headers": {
        "x-api-key": "<your-api-key>"
      }
    }
  }
}
```