Enables conversion of webpages to clean markdown with content quality scoring and multi-page crawling for documentation sites. Supports Claude Code, Cursor, and Windsurf with native LangChain and LlamaIndex export formats.
MCP server for converting webpages to markdown in Claude Code, Cursor, and Windsurf. Content quality scoring. Multi-page crawl. LangChain + LlamaIndex exports.
Get your free API key at app.unweb.info (500 credits/month, no credit card).
Add to ~/.claude/settings.json:
```json
{
  "mcpServers": {
    "unweb": {
      "command": "npx",
      "args": ["-y", "@mbsoftsystems/unweb-mcp"],
      "env": { "UNWEB_API_KEY": "unweb_your_key_here" }
    }
  }
}
```
Add to .cursor/mcp.json:
```json
{
  "mcpServers": {
    "unweb": {
      "command": "npx",
      "args": ["-y", "@mbsoftsystems/unweb-mcp"],
      "env": { "UNWEB_API_KEY": "unweb_your_key_here" }
    }
  }
}
```
Same format in your Windsurf MCP configuration file.
| Tool | Description | Credits |
|---|---|---|
| convert_url | Convert a webpage URL to clean markdown with quality score | 1 |
| convert_html | Convert raw HTML string to markdown | 1 |
| crawl_start | Start crawling a documentation site (path-bounded BFS) | 1/page |
| crawl_status | Check crawl job progress | 0 |
| crawl_download | Download all crawled pages as concatenated markdown | 0 |
Convert any webpage to clean CommonMark markdown:
"Convert https://docs.stripe.com/api/charges to markdown"
Returns the markdown content plus a quality score (0-100) indicating extraction confidence. Scores below 40 indicate the page likely needs JavaScript rendering.
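The score makes a cheap gate for fallback logic. A minimal sketch of that check; the `quality_score` field name is an assumption for illustration, not a documented part of the UnWeb response:

```python
def needs_js_rendering(result: dict) -> bool:
    """Per the docs, a score below 40 means the page likely
    needs JavaScript rendering before conversion."""
    # "quality_score" is a hypothetical field name.
    return result.get("quality_score", 0) < 40

# Hypothetical result shape:
page = {"markdown": "# Charges\n...", "quality_score": 35}
low_quality = needs_js_rendering(page)
```

A client could use this to route low-scoring pages through a headless browser before retrying the conversion.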
Convert HTML you already have — API responses, scraped content, generated markup:
"Convert this HTML to markdown:
<h1>Title</h1><p>Content</p>"
Crawl entire documentation sites:
"Crawl https://docs.example.com starting from /guides/ and get all pages as markdown"
The crawler runs a path-bounded BFS, converting each page. Use crawl_status to check progress and crawl_download to get all pages concatenated with separators:
```
--- Page: guides/getting-started.md ---
# Getting Started
Content here...
--- Page: guides/authentication.md ---
# Authentication
Content here...
```
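The separator format lends itself to simple post-processing. A sketch, based only on the `--- Page: <path> ---` delimiter shown above, that splits a download back into per-page markdown:

```python
import re

def split_crawl_download(text: str) -> dict[str, str]:
    """Split crawl_download output on '--- Page: <path> ---'
    lines into a {path: markdown} mapping."""
    parts = re.split(r"^--- Page: (.+?) ---$", text, flags=re.M)
    # re.split with a capture group interleaves separators and
    # bodies: [preamble, path1, body1, path2, body2, ...]
    it = iter(parts[1:])
    return {path: body.strip() for path, body in zip(it, it)}

sample = (
    "--- Page: guides/getting-started.md ---\n"
    "# Getting Started\nContent here...\n"
    "--- Page: guides/authentication.md ---\n"
    "# Authentication\nContent here...\n"
)
pages = split_crawl_download(sample)
```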
Export formats: raw-md (default), langchain (JSONL for LangChain), llamaindex (JSON for LlamaIndex).
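A langchain JSONL export is one JSON object per line. A hedged sketch of reading such a file; the `page_content`/`metadata` keys follow LangChain's Document convention but are an assumption about UnWeb's exact output schema:

```python
import json

def read_jsonl(lines):
    """Parse JSONL records into (text, metadata) pairs.
    Field names are assumed, not confirmed by the UnWeb docs."""
    docs = []
    for line in lines:
        rec = json.loads(line)
        docs.append((rec.get("page_content", ""), rec.get("metadata", {})))
    return docs

# Hypothetical export line:
lines = ['{"page_content": "# Auth", "metadata": {"source": "guides/authentication.md"}}']
docs = read_jsonl(lines)
```

Each pair can then be wrapped in whatever document type the downstream framework expects.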
| Feature | UnWeb | Firecrawl | Jina Reader |
|---|---|---|---|
| Content quality score | 0-100 on every response | No | No |
| Multi-page crawl | Yes | Yes | No |
| LangChain/LlamaIndex export | Native | No | No |
| Convert raw HTML | Yes | No (URL only) | No (URL only) |
| Free tier | 500 credits/month (recurring) | 500 credits (one-time) | Rate-limited |
| Cheapest paid | $12/month | $16/month | Token-based |
| Plan | Credits/month | Price |
|---|---|---|
| Free | 500 | $0 |
| Starter | 2,000 | $12/month |
| Pro | 15,000 | $39/month |
| Scale | 60,000 | $99/month |
```shell
pip install unweb
npm install @mbsoftsystems/unweb
```

License: MIT
Add this to claude_desktop_config.json and restart Claude Desktop:

```json
{
  "mcpServers": {
    "unweb": {
      "command": "npx",
      "args": ["-y", "@mbsoftsystems/unweb-mcp"],
      "env": { "UNWEB_API_KEY": "unweb_your_key_here" }
    }
  }
}
```