AI-powered code generator for the Arbitrum ecosystem. 19 MCP tools for Stylus smart contracts, SDK bridging, full-stack dApps, and Orbit chain deployment, backed by RAG-based retrieval over Arbitrum documentation.
AI-powered development assistant for the Arbitrum ecosystem. ARBuilder transforms natural language prompts into Stylus smart contracts, SDK bridging code, full-stack dApp scaffolds, and Orbit chain deployments.
Hosted (no setup):
# Claude Code
claude mcp add arbbuilder -- npx -y mcp-remote https://arbuilder.app/mcp --header "Authorization: Bearer YOUR_API_KEY"
Or add to ~/.cursor/mcp.json (Cursor / VS Code):
{
"mcpServers": {
"arbbuilder": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://arbuilder.app/mcp",
"--header", "Authorization: Bearer YOUR_API_KEY"]
}
}
}
Get your API key at arbuilder.app
Self-hosted — see Setup below.
ARBuilder uses a Retrieval-Augmented Generation (RAG) pipeline with hybrid search (vector + BM25 + cross-encoder reranking) to provide context-aware code generation. Available as a hosted service at arbuilder.app or self-hosted via MCP server.
┌─────────────────────────────────────────────────────────────────────────┐
│ ARBuilder │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ DATA PIPELINE │
│ ┌──────────┐ ┌──────────┐ ┌───────────┐ ┌──────────────────┐ │
│ │ Scraper │───▶│Processor │───▶│ Embedder │───▶│ ChromaDB │ │
│ │ crawl4ai │ │ 3-layer │ │ BGE-M3 │ │ (local vectors) │ │
│ │ + GitHub │ │ filters │ │ 1024-dim │ │ │ │
│ └──────────┘ └──────────┘ └───────────┘ └────────┬─────────┘ │
│ │ │
│ RETRIEVAL │ │
│ ┌──────────────────────────────────────────────────────────▼─────────┐ │
│ │ Hybrid Search Engine │ │
│ │ ┌──────────┐ ┌──────────┐ ┌────────────┐ │ │
│ │ │ Vector │ │ BM25 │ │CrossEncoder│ RRF Fusion │ │
│ │ │ Search │───▶│ Keywords │───▶│ Reranker │──▶ + MMR │ │
│ │ └──────────┘ └──────────┘ └────────────┘ │ │
│ └────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ GENERATION ▼ │
│ ┌───────────────────────────────────────────────────────────────────┐ │
│ │ MCP Server (19 tools) │ │
│ │ │ │
│ │ Stylus Contracts Arbitrum-SDK Full dApp Builder │ │
│ │ ┌──────────────┐ ┌─────────────┐ ┌──────────────────────┐ │ │
│ │ │ generate_ │ │ generate_ │ │ generate_backend │ │ │
│ │ │ stylus_code │ │ bridge_code │ │ generate_frontend │ │ │
│ │ │ ask_stylus │ │ generate_ │ │ generate_indexer │ │ │
│ │ │ get_context │ │ messaging │ │ generate_oracle │ │ │
│ │ │ gen_tests │ │ ask_bridging│ │ orchestrate_dapp │ │ │
│ │ │ get_workflow │ │ │ │ │ │ │
│ │ │ validate_code│ │ │ │ │ │ │
│ │ └──────────────┘ └─────────────┘ └──────────────────────┘ │ │
│ │ │ │
│ │ Orbit Chain │ │
│ │ ┌──────────────────────┐ │ │
│ │ │ generate_orbit_config│ │ │
│ │ │ generate_orbit_deploy│ │ │
│ │ │ gen_validator_setup │ │ │
│ │ │ ask_orbit │ │ │
│ │ │ orchestrate_orbit │ │ │
│ │ └──────────────────────┘ │ │
│ └───────────────────────────────────────────────────────────────────┘ │
│ │ │
│ IDE INTEGRATION ▼ │
│ ┌───────────────────────────────────────────────────────────────────┐ │
│ │ Cursor / VS Code / Claude Desktop / Any MCP Client │ │
│ │ <- via local stdio or remote mcp-remote proxy -> │ │
│ └───────────────────────────────────────────────────────────────────┘ │
│ │
│ HOSTED SERVICE (Cloudflare Workers) │
│ ┌──────────────┐ ┌──────────┐ ┌──────────┐ ┌──────────────────┐ │
│ │ Workers AI │ │ Vectorize│ │ D1 │ │ KV │ │
│ │ BGE-M3 + │ │ 1024-dim │ │ Users │ │ Source registry│ │
│ │ Reranker │ │ index │ │ API keys│ │ + Ingest state │ │
│ └──────────────┘ └──────────┘ └──────────┘ └──────────────────┘ │
│ │
│ INGESTION PIPELINE (Worker-native, cron every 6h) │
│ ┌──────────┐ ┌──────────┐ ┌───────────┐ ┌──────────────┐ │
│ │ scraper │───▶│ chunker │───▶│ Workers AI│───▶│ Vectorize │ │
│ │ HTML/ │ │ doc+code │ │ BGE-M3 │ │ upsert │ │
│ │ GitHub │ │ splitter │ │ embedding │ │ │ │
│ └──────────┘ └──────────┘ └───────────┘ └──────────────┘ │
│ │ ▲ │
│ │ >30 files │ embed messages │
│ ▼ │ │
│ ┌─────────────────────────┴───┐ │
│ │ CF Queue (async path) │ │
│ │ embed │ continue │finalize │ │
│ └─────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────────────┘
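The RRF Fusion step in the retrieval stage merges the vector and BM25 rankings before reranking. As a rough illustration of the idea (function name and constant are ours, not ARBuilder's actual API):

```python
def rrf_fuse(rankings, k=60):
    """Merge several ranked lists of doc IDs via Reciprocal Rank Fusion.

    Each document scores 1/(k + rank) per list it appears in; k damps
    the influence of any single ranking's top hits.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["doc_a", "doc_b", "doc_c"]  # from vector search
bm25_hits = ["doc_b", "doc_d", "doc_a"]    # from keyword search
print(rrf_fuse([vector_hits, bm25_hits]))  # doc_b ranks first (top-2 in both)
```

Documents that rank well in both lists (like `doc_b` above) rise to the top even when neither ranker put them first.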
ArbBuilder/
├── sources.json # Single source of truth for all data sources
├── scraper/ # Data collection module
│ ├── config.py # Thin wrapper around sources.json (backward-compat helpers)
│ ├── scraper.py # Web scraping with crawl4ai
│ ├── github_scraper.py # GitHub repository cloning
│ └── run.py # Pipeline entry point
├── src/
│ ├── preprocessing/ # Text cleaning and chunking
│ │ ├── cleaner.py # Text normalization
│ │ ├── chunker.py # Document chunking with token limits
│ │ └── processor.py # Main preprocessing pipeline
│ ├── embeddings/ # Embedding and vector storage
│ │ ├── embedder.py # OpenRouter embedding client
│ │ ├── vectordb.py # ChromaDB wrapper with hybrid search (BM25 + vector)
│ │ └── reranker.py # CrossEncoder, MMR, LLM reranking
│ ├── templates/ # Code generation templates
│ │ ├── stylus_templates.py # M1: Stylus contract templates
│ │ ├── backend_templates.py # M3: NestJS/Express templates
│ │ ├── frontend_templates.py # M3: Next.js + wagmi templates
│ │ ├── indexer_templates.py # M3: Subgraph templates
│ │ ├── oracle_templates.py # M3: Chainlink templates
│ │ └── orbit_templates.py # M4: Orbit chain deployment templates
│ ├── utils/ # Shared utilities
│ │ ├── version_manager.py # SDK version management
│ │ ├── env_config.py # Centralized env var configuration
│ │ ├── abi_extractor.py # Stylus ABI extraction from Rust code
│ │ └── compiler_verifier.py # Docker-based cargo check verification
│ ├── mcp/ # MCP server for IDE integration
│ │ ├── server.py # MCP server (tools, resources, prompts)
│ │ ├── tools/ # MCP tool implementations (19 tools)
│ │ │ ├── get_stylus_context.py # M1
│ │ │ ├── generate_stylus_code.py # M1
│ │ │ ├── ask_stylus.py # M1
│ │ │ ├── generate_tests.py # M1
│ │ │ ├── get_workflow.py # M1
│ │ │ ├── validate_stylus_code.py # M1
│ │ │ ├── generate_bridge_code.py # M2
│ │ │ ├── generate_messaging_code.py # M2
│ │ │ ├── ask_bridging.py # M2
│ │ │ ├── generate_backend.py # M3
│ │ │ ├── generate_frontend.py # M3
│ │ │ ├── generate_indexer.py # M3
│ │ │ ├── generate_oracle.py # M3
│ │ │ ├── orchestrate_dapp.py # M3
│ │ │ ├── generate_orbit_config.py # M4
│ │ │ ├── generate_orbit_deployment.py # M4
│ │ │ ├── generate_validator_setup.py # M4
│ │ │ ├── ask_orbit.py # M4
│ │ │ └── orchestrate_orbit.py # M4
│ │ ├── resources/ # Static knowledge (11 resources)
│ │ │ ├── stylus_cli.py # M1
│ │ │ ├── workflows.py # M1
│ │ │ ├── networks.py # M1
│ │ │ ├── coding_rules.py # M1
│ │ │ ├── sdk_rules.py # M2
│ │ │ ├── backend_rules.py # M3
│ │ │ ├── frontend_rules.py # M3
│ │ │ ├── indexer_rules.py # M3
│ │ │ └── oracle_rules.py # M3
│ │ └── prompts/ # Workflow templates
│ └── rag/ # RAG pipeline (TBD)
├── tests/
│ ├── mcp_tools/ # MCP tool test cases and benchmarks
│ │ ├── test_get_stylus_context.py
│ │ ├── test_generate_stylus_code.py
│ │ ├── test_ask_stylus.py
│ │ ├── test_generate_tests.py
│ │ ├── test_m2_e2e.py # M2 end-to-end tests
│ │ ├── test_m3_tools.py # M3 full dApp tests
│ │ ├── test_orbit_tools.py # M4 orbit tests
│ │ └── benchmark.py # Evaluation framework
│ └── test_retrieval.py # Retrieval quality tests
├── docs/
│ └── mcp_tools_spec.md # MCP tools specification
├── apps/web/ # Hosted service (Cloudflare Workers + Next.js)
│ ├── src/app/
│ │ ├── layout.tsx # Root layout + SEO meta + JSON-LD structured data
│ │ ├── page.tsx # Landing page (M1-M4 feature sections)
│ │ ├── robots.ts # robots.txt generation
│ │ ├── sitemap.ts # sitemap.xml generation
│ │ ├── llms.txt/ # LLM discovery endpoint
│ │ └── playground/ # Interactive tool playground (18 hosted tools)
│ ├── src/lib/
│ │ ├── scraper.ts # Web doc scraping (HTMLRewriter)
│ │ ├── github.ts # GitHub repo scraping (Trees/Contents API)
│ │ ├── chunker.ts # Document + code chunking
│ │ ├── ingestPipeline.ts # Ingestion orchestrator (sync + async queue paths)
│ │ └── vectorize.ts # Search + embedding utilities
│ ├── src/app/api/admin/ # Admin APIs (sources, ingest, migrate)
│ ├── worker.ts # Worker entry + cron + queue consumer handler
│ └── wrangler.prod.jsonc # Production config (D1, KV, Vectorize, Queue)
├── scripts/
│ ├── run_benchmarks.py # Benchmark runner
│ ├── diff-migrate.ts # Push chunks to CF Vectorize
│ ├── sync_sources.ts # Sync sources.json to CF KV registry
│ └── ingest_m3_sources.py # M3 source ingestion
├── data/
│ ├── raw/ # Raw scraped data (docs + curated repos)
│ ├── processed/ # Pre-processed chunks
│ └── chroma_db/ # ChromaDB vector store (generated locally, not in repo)
├── environment.yml # Conda environment specification
├── pyproject.toml # Project metadata and dependencies
└── .env # Environment variables (not committed)
# Create and activate the environment
conda env create -f environment.yml
conda activate arbbuilder
Note: If you plan to refresh the knowledge base by scraping (optional), also install playwright:
playwright install chromium
Copy the example environment file and configure your API keys:
cp .env.example .env
Edit .env with your credentials:
OPENROUTER_API_KEY=your-api-key
NVIDIA_API_KEY=your-nvidia-api-key
DEFAULT_MODEL=deepseek/deepseek-v3.2
DEFAULT_EMBEDDING=baai/bge-m3
DEFAULT_CROSS_ENCODER=nvidia/llama-3.2-nv-rerankqa-1b-v2
The repository includes all data needed:
- Raw data (`data/raw/`): Documentation pages + curated GitHub repos
- Processed chunks (`data/processed/`): Chunks ready for embedding

**Important:** The ChromaDB vector database must be generated locally (it's not included in the repo due to binary compatibility issues across systems).
# Generate the vector database (required before using MCP tools)
python -m src.embeddings.vectordb
Test that the MCP server starts correctly:
# Run the MCP server directly (press Ctrl+C to exit)
python -m src.mcp.server
You should see:
ARBuilder MCP Server started
Capabilities: 19 tools, 11 resources, 5 prompts
If you want to re-scrape the latest documentation and code:
# Run full pipeline (web scraping + GitHub cloning)
python -m scraper.run
# Then preprocess the raw data
python -m src.preprocessing.processor
# And re-ingest into ChromaDB
python -m src.embeddings.vectordb --reset
Data Quality Filters: The pipeline applies a 3-layer filtering system to remove junk data (vendored crates, auto-generated TypeChain files, hex bytecode, lock files, and cross-repo duplicates). See docs/DATA_CURATION_POLICY.md for details.
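A sketch of what such layered filters can look like (paths, names, and the hex threshold below are illustrative assumptions, not the actual policy in `docs/DATA_CURATION_POLICY.md`):

```python
import re

# Assumed patterns for illustration only
SKIP_DIRS = ("vendor/", "node_modules/", "typechain/", "typechain-types/")
SKIP_NAMES = ("Cargo.lock", "package-lock.json", "yarn.lock")

def is_junk(path: str, text: str) -> bool:
    # Layer 1: vendored crates and auto-generated TypeChain output
    if any(seg in path for seg in SKIP_DIRS):
        return True
    # Layer 2: lock files
    if path.rsplit("/", 1)[-1] in SKIP_NAMES:
        return True
    # Layer 3: long runs of raw hex bytecode
    stripped = re.sub(r"\s|0x", "", text)
    if len(stripped) > 256 and all(c in "0123456789abcdefABCDEF" for c in stripped):
        return True
    return False
```

Cross-repo deduplication (the remaining filter mentioned above) would typically hash chunk contents and drop repeats; it is omitted here for brevity.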
Audit and clean up data sources:
# Audit: compare repos on disk vs config
python scripts/audit_data.py
# Show what orphan repos would be deleted
python scripts/audit_data.py --prune
# Actually delete orphan repos
python scripts/audit_data.py --prune --confirm
# Include ChromaDB stats in audit
python scripts/audit_data.py --chromadb
# GitHub scraper also supports audit/prune
python -m scraper.github_scraper --audit
python -m scraper.github_scraper --prune --dry-run
Fork community Stylus repos and migrate them to SDK 0.10.0:
# Dry run: show what would change without modifying anything
python scripts/fork_and_migrate.py --all --dry-run
# Migrate all 13 Stylus repos
python scripts/fork_and_migrate.py --all
# Migrate a specific repo
python scripts/fork_and_migrate.py --repo OffchainLabs/stylus-hello-world
# Re-verify already-forked repos after manual fixes
python scripts/fork_and_migrate.py --all --verify-only
Reports are saved to reports/fork_migration_*.json.
Run ARBuilder locally with your own API keys. No rate limits.
Step 1: Configure your IDE
Add the following to your MCP configuration file:
Cursor (~/.cursor/mcp.json):
{
"mcpServers": {
"arbbuilder": {
"command": "/path/to/miniconda3/envs/arbbuilder/bin/python3",
"args": ["-m", "src.mcp.server"],
"env": {
"OPENROUTER_API_KEY": "your-api-key",
"PYTHONPATH":"/path/to/ArbBuilder"
}
}
}
}
Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"arbbuilder": {
"command": "python",
"args": ["-m", "src.mcp.server"],
"cwd": "/path/to/ArbBuilder",
"env": {
"OPENROUTER_API_KEY": "your-api-key"
}
}
}
}
Step 2: Restart your IDE
After saving the configuration, restart Cursor or Claude Desktop. The ARBuilder tools will be available to the AI assistant.
Step 3: Start building!
Ask your AI assistant, for example: "Create an ERC20 token called MyToken with 1 million supply".
Use our hosted API - no local setup required. Available at arbuilder.app.
{
"mcpServers": {
"arbbuilder": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://arbuilder.app/mcp",
"--header", "Authorization: Bearer YOUR_API_KEY"]
}
}
}
The hosted service includes Workers AI (BGE-M3 embeddings + reranker), a managed Vectorize index, D1 storage for users and API keys, and a KV source registry.
Hosted (Worker-native): The hosted service at arbuilder.app has a built-in ingestion pipeline that runs automatically via cron (every 6 hours). Sources can also be manually ingested via the admin UI at /admin.
The pipeline uses two paths based on source size:
- Sync path: small sources are scraped, chunked, embedded, and upserted within a single invocation.
- Async queue path: sources with more than 30 files send `embed` messages through a CF Queue, `continue` messages for additional file batches, and a `finalize` message to update source status. This stays within the 50 subrequest/invocation limit on the Free plan.

Local (Python pipeline): For self-hosted setups, run the full data collection pipeline:
conda activate arbbuilder
# Run full pipeline (web scraping + GitHub cloning)
python -m scraper.run
# Preprocess and push to CF Vectorize
python -m src.preprocessing.processor
AUTH_SECRET=xxx npx tsx scripts/diff-migrate.ts --full
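The size-based routing between the two ingestion paths can be sketched as follows (the 30-file threshold comes from the description above; the batch size and message names are illustrative assumptions):

```python
SYNC_FILE_LIMIT = 30  # above this, the Worker takes the async queue path

def plan_ingest(file_count: int, batch_size: int = 10) -> list[str]:
    """Return the sequence of steps/messages for ingesting one source."""
    if file_count <= SYNC_FILE_LIMIT:
        # Small source: embed and upsert within a single invocation
        return ["sync"]
    # Large source: first batch via an embed message, the rest via
    # continue messages, then finalize to update source status
    batches = -(-file_count // batch_size)  # ceiling division
    return ["embed"] + ["continue"] * (batches - 1) + ["finalize"]

print(plan_ingest(12))  # ['sync']
print(plan_ingest(45))  # embed, 4x continue, finalize
```

Splitting work into queue messages keeps each Worker invocation under the platform's subrequest budget.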
All data sources are defined in sources.json — the single source of truth for both the local Python pipeline and the hosted CF Worker ingestion. The file contains 84 curated sources (53 documentation pages + 31 GitHub repos) across 4 milestones.
Versioned repos (with multiple SDK branches) use a versions array:
{
"url": "https://github.com/ARBuilder-Forks/stylus-hello-world",
"versions": [
{ "sdkVersion": "0.10.0", "branch": "main" },
{ "sdkVersion": "0.9.0", "branch": "v0.9.0" }
]
}
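Resolving such an entry to a concrete git branch for a requested SDK version might look like this (the helper is hypothetical; only the `url`/`versions`/`sdkVersion`/`branch` field names come from `sources.json`):

```python
# Example entry mirroring the sources.json schema shown above
source = {
    "url": "https://github.com/ARBuilder-Forks/stylus-hello-world",
    "versions": [
        {"sdkVersion": "0.10.0", "branch": "main"},
        {"sdkVersion": "0.9.0", "branch": "v0.9.0"},
    ],
}

def branch_for(entry: dict, sdk_version: str) -> str:
    """Pick the branch matching sdk_version, else the first (default) one."""
    for v in entry.get("versions", []):
        if v["sdkVersion"] == sdk_version:
            return v["branch"]
    return entry["versions"][0]["branch"]

print(branch_for(source, "0.9.0"))  # v0.9.0
```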
Sync to hosted service:
ARBBUILDER_ADMIN_SECRET=xxx npx tsx scripts/sync_sources.ts
ARBBUILDER_ADMIN_SECRET=xxx npx tsx scripts/sync_sources.ts --dry-run
ARBBUILDER_ADMIN_SECRET=xxx npx tsx scripts/sync_sources.ts --remove-stale
**Stylus Contracts/Projects** — 17 docs + 19 repos
Curation Policy:
- All data sources declared in `sources.json`
- `forkedFrom` provenance tracking for forked repos

Stylus SDK Version Support:
| Version | Status | Notes |
|---|---|---|
| 0.10.0 | Main (default) | Latest stable, recommended for new projects |
| 0.9.x | Supported | Separate branches in forked repos |
| 0.8.x | Supported | Minimum supported version |
| < 0.8.0 | Deprecated | Excluded from knowledge base |
Multi-Version Strategy:
- One branch per SDK version in forked repos (`main` for 0.10.0, `v0.9.0` for the original)
- `generate_stylus_code` and `ask_stylus` accept `target_version` to produce code for any supported SDK version

**Arbitrum SDK** — 6 docs + 5 repos
**Full dApp Builder** — 30 docs + 11 repos
**Orbit SDK** — 5 Python MCP tools + 9 TypeScript templates

- Tools: `generate_orbit_config`, `generate_orbit_deployment`, `generate_validator_setup`, `ask_orbit`, `orchestrate_orbit`
- Built on `@arbitrum/chain-sdk` ^0.25.0 + viem ^1.20.0 for `prepareChainConfig()`, `createRollup()`, `createTokenBridge()`, `prepareNodeConfig()`
- Crash-proof `deployment.json` output — downstream scripts (token bridge, node config) chain automatically
- `deployment.json` is saved BEFORE the receipt fetch, with try/catch around the block number lookup
- `approve-token.ts` with correct RollupCreator addresses and ERC-20 deploy guidance
- Node setup uses `offchainlabs/nitro-node:v3.9.4-7f582c3`, bind mounts (`./data/arbitrum`), no `user: root`
- `--validation.wasm.allowed-wasm-module-roots` prevents crash-loops on startup
- DAC keys generated via `datool keygen` from the nitro-node image

The MCP endpoint at `/mcp` is free to use and designed for IDE integration:
https://arbuilder.app/mcp
- Authenticate with an `arb_` API key from the dashboard

View all ingested sources and code templates at arbuilder.app/transparency.
This public page provides:
Public API endpoints (no authentication required):
- `GET /api/public/sources` - List all active sources
- `GET /api/public/templates` - List all code templates
- `GET /api/public/templates?code=true` - Templates with full source code

Direct API routes at `/api/v1/tools/*` are for internal testing only:

- Require `AUTH_SECRET` in the Authorization header

ARBuilder exposes a full MCP server with 19 tools, 11 resources, and 5 prompts for Cursor/VS Code integration.
Stylus Development (6 tools)
| Tool | Description |
|---|---|
| `get_stylus_context` | RAG retrieval for docs and code examples |
| `generate_stylus_code` | Generate Stylus contracts from prompts |
| `ask_stylus` | Q&A, debugging, concept explanations |
| `generate_tests` | Generate unit/integration/fuzz tests |
| `get_workflow` | Build/deploy/test workflow guidance |
| `validate_stylus_code` | Compile-check code via Docker `cargo check` with Stylus-specific fix guidance |
Arbitrum SDK - Bridging & Messaging (3 tools)
| Tool | Description |
|---|---|
| `generate_bridge_code` | Generate ETH/ERC20 bridging code (L1<->L2, L1->L3, L3->L2) |
| `generate_messaging_code` | Generate cross-chain messaging code (L1<->L2, L2<->L3) |
| `ask_bridging` | Q&A about bridging patterns and SDK usage |
Full dApp Builder (5 tools)
| Tool | Description |
|---|---|
| `generate_backend` | Generate NestJS/Express backends with Web3 integration |
| `generate_frontend` | Generate Next.js + wagmi + RainbowKit frontends |
| `generate_indexer` | Generate The Graph subgraphs for indexing |
| `generate_oracle` | Generate Chainlink oracle integrations |
| `orchestrate_dapp` | Scaffold complete dApps with multiple components |
Orbit Chain Integration (5 tools)
| Tool | Description |
|---|---|
| `generate_orbit_config` | Generate Orbit chain configuration (prepareChainConfig, AnyTrust, custom gas tokens) |
| `generate_orbit_deployment` | Generate rollup and token bridge deployment scripts (createRollup, createTokenBridge) |
| `generate_validator_setup` | Manage validators, batch posters, and AnyTrust DAC keysets |
| `ask_orbit` | Q&A about Orbit chain deployment, configuration, and operations |
| `orchestrate_orbit` | Scaffold complete Orbit chain deployment projects with all scripts |
Example `get_workflow` call:
{
"workflow_type": "deploy",
"network": "arbitrum_sepolia",
"include_troubleshooting": true
}
Returns step-by-step commands:
# Check balance
cast balance YOUR_ADDRESS --rpc-url https://sepolia-rollup.arbitrum.io/rpc
# Deploy contract
cargo stylus deploy --private-key-path=./key.txt --endpoint=https://sepolia-rollup.arbitrum.io/rpc
MCP Resources provide static knowledge that AI IDEs can load automatically:
Stylus Resources
| Resource URI | Description |
|---|---|
| `stylus://cli/commands` | Complete cargo-stylus CLI reference |
| `stylus://workflows/build` | Step-by-step build workflow |
| `stylus://workflows/deploy` | Deployment workflow with network configs |
| `stylus://workflows/test` | Testing workflow (unit, integration, fuzz) |
| `stylus://config/networks` | Arbitrum network configurations |
| `stylus://rules/coding` | Stylus coding guidelines and patterns |
Arbitrum SDK Resources
| Resource URI | Description |
|---|---|
| `arbitrum://rules/sdk` | Arbitrum SDK bridging and messaging guidelines |
Full dApp Builder Resources
| Resource URI | Description |
|---|---|
| `dapp://rules/backend` | NestJS/Express Web3 backend patterns |
| `dapp://rules/frontend` | Next.js + wagmi + RainbowKit patterns |
| `dapp://rules/indexer` | The Graph subgraph development patterns |
| `dapp://rules/oracle` | Chainlink oracle integration patterns |
MCP Prompts provide reusable templates for common workflows:
| Prompt | Description | Arguments |
|---|---|---|
| `build-contract` | Build workflow guidance | project_path, release_mode |
| `deploy-contract` | Deploy workflow guidance | network, key_method |
| `debug-error` | Error diagnosis workflow | error_message, context |
| `optimize-gas` | Gas optimization workflow | contract_code, focus |
| `generate-contract` | Contract generation workflow | description, contract_type |
User: "Deploy my contract to Arbitrum Sepolia"
↓
AI IDE calls get_workflow(workflow_type="deploy", network="arbitrum_sepolia")
↓
Returns structured commands + troubleshooting
↓
AI IDE presents commands to user (user executes locally)
The MCP server provides knowledge about commands, not command execution. This keeps the user in control: the AI suggests commands, and the user runs them locally.
See docs/mcp_tools_spec.md for full specification.
ARBuilder uses template-based code generation to ensure generated code compiles correctly. Instead of generating from scratch, it customizes verified working templates from official Stylus examples.
Available Templates:
| Template | Type | Description |
|---|---|---|
| Counter | utility | Simple storage with getter/setter operations |
| VendingMachine | defi | Mappings with time-based rate limiting |
| SimpleERC20 | token | Basic ERC20 with transfer, approve, transferFrom |
| AccessControl | utility | Owner-only functions with ownership transfer |
| DeFiVault | defi | Cross-contract calls (sol_interface!), transfer_eth, Call::new_in(self) |
| NftRegistry | nft | Dynamic arrays (push), sol! events with camelCase, mint/transfer |
Stylus SDK Version Support:
| Version | Status | Notes |
|---|---|---|
| 0.10.0 | Main (default) | Recommended for new projects |
| 0.9.x | Supported | Use target_version: "0.9.0" for 0.9.x output. Separate branches in forks |
| 0.8.x | Supported | Minimum supported version |
| < 0.8.0 | Deprecated | Warning shown, may not compile |
Pass target_version to tools for version-specific output:
User: "Generate a counter contract for SDK 0.9.0"
AI uses: generate_stylus_code(prompt="...", target_version="0.9.0")
Returns: Code using msg::sender(), .getter(), print_abi() patterns
Ask your AI assistant to generate contracts:
User: "Create an ERC20 token called MyToken with 1 million supply"
AI uses: generate_stylus_code tool
Returns: Complete Rust contract based on SimpleERC20 template with proper imports, storage, and methods
Search the knowledge base for documentation and code examples:
User: "Show me how to implement a mapping in Stylus"
AI uses: get_stylus_context tool
Returns: Relevant documentation and code snippets from official examples
Ask questions about Stylus development:
User: "Why am I getting 'storage not initialized' error?"
AI uses: ask_stylus tool
Returns: Explanation with solution based on documentation context
Create test suites for your contracts:
User: "Write unit tests for this counter contract: [paste code]"
AI uses: generate_tests tool
Returns: Comprehensive test module with edge cases
Get step-by-step deployment guidance:
User: "How do I deploy to Arbitrum Sepolia?"
AI uses: get_workflow tool
Returns: Commands for checking balance, deploying, and verifying
AI-powered Stylus contract development with RAG-based context retrieval:
- Compile validation via `cargo check` with up to 3 auto-fix attempts and Stylus-specific error guidance
- Targets the SDK 0.10.0 API (`self.vm()` API)

# Example: Generate a Stylus contract
echo '{"method": "tools/call", "id": 1, "params": {"name": "generate_stylus_code", "arguments": {"prompt": "Create an ERC20 token with mint and burn"}}}' | python -m src.mcp.server
# Example: Ask a Stylus question
echo '{"method": "tools/call", "id": 1, "params": {"name": "ask_stylus", "arguments": {"question": "How do I use mappings in Stylus?"}}}' | python -m src.mcp.server
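The same requests can be built programmatically; a minimal sketch that constructs the JSON-RPC payload used in the echo examples above (the helper itself is ours, only the payload shape comes from the examples):

```python
import json

def tool_call(name: str, arguments: dict, req_id: int = 1) -> str:
    """Serialize an MCP tools/call request like the shell examples above."""
    return json.dumps({
        "method": "tools/call",
        "id": req_id,
        "params": {"name": name, "arguments": arguments},
    })

payload = tool_call("ask_stylus", {"question": "How do I use mappings in Stylus?"})
# Pipe `payload` into `python -m src.mcp.server` on stdin, as in the examples
print(payload)
```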
Cross-chain bridging and messaging support:
# Example: Generate ETH deposit code
echo '{"method": "tools/call", "id": 1, "params": {"name": "generate_bridge_code", "arguments": {"bridge_type": "eth_deposit", "amount": "0.5"}}}' | python -m src.mcp.server
Complete dApp scaffolding with all components:
- `cargo check` loop catches and auto-fixes compilation errors
- `setup.sh`, `deploy.sh`, and `start.sh` for one-command workflows
- `setup.sh` uses a scaffold-first, backfill pattern with official CLI tools (`cargo stylus new`, `create-next-app`, `@nestjs/cli`) to fill in config files our templates don't generate, with graceful fallback if tools aren't installed

Backend Templates:
Frontend Templates:
Indexer Templates:
Oracle Templates:
# Example: Generate full dApp scaffold
echo '{"method": "tools/call", "params": {"name": "orchestrate_dapp", "arguments": {"prompt": "Create a token staking dApp", "components": ["contract", "backend", "frontend", "indexer"]}}}' | python -m src.mcp.server
# Example: Generate backend only
echo '{"method": "tools/call", "params": {"name": "generate_backend", "arguments": {"prompt": "Create a staking API", "framework": "nestjs"}}}' | python -m src.mcp.server
# Example: Generate frontend with contract ABI
echo '{"method": "tools/call", "params": {"name": "generate_frontend", "arguments": {"prompt": "Create token dashboard", "contract_abi": "[...]"}}}' | python -m src.mcp.server
Orbit chain deployment and management support:
- `prepareChainConfig()` scripts for Rollup or AnyTrust chains
- `createRollup()` scripts with crash-proof `deployment.json` output (saves before receipt fetch)
- `createTokenBridge()` scripts with automatic ERC-20 approval for custom gas token chains
- DAC key generation (`generate-das-keys.sh`), keyset encoding, UpgradeExecutor-routed `setValidKeyset()`, hash verification
- `prepareNodeConfig()` with post-processing — private key restoration, staker disable for single-key setups, deployed-at injection, DAS URL fix
- Governance management (`manage-governance.ts`)
- Chain testing (`test-chain.ts`)

# Example: Scaffold a complete Orbit chain deployment project
echo '{"method": "tools/call", "params": {"name": "orchestrate_orbit", "arguments": {"prompt": "Deploy an AnyTrust chain on Arbitrum Sepolia", "chain_name": "my-orbit-chain", "chain_id": 412346, "is_anytrust": true, "parent_chain": "arbitrum-sepolia"}}}' | python -m src.mcp.server
# Example: Generate chain configuration
echo '{"method": "tools/call", "params": {"name": "generate_orbit_config", "arguments": {"prompt": "Configure a custom gas token chain", "native_token": "0x...", "parent_chain": "arbitrum-sepolia"}}}' | python -m src.mcp.server
# Example: Ask about Orbit deployment
echo '{"method": "tools/call", "params": {"name": "ask_orbit", "arguments": {"question": "How do I deploy an Orbit chain with a custom gas token?"}}}' | python -m src.mcp.server
# Run all unit tests
pytest tests/ -m "not integration"
# Run retrieval quality tests
pytest tests/test_retrieval.py -v
# Run MCP tool tests (requires tool implementations)
pytest tests/mcp_tools/ -v
# Run template selection and validation tests
pytest tests/test_templates.py -v -m "not integration"
# Run template compilation tests (requires Rust toolchain + cargo-stylus)
pytest tests/test_templates.py -v -m integration
Template compilation tests require:
- `rustup install 1.87.0`
- `rustup target add wasm32-unknown-unknown --toolchain 1.87.0`
- `cargo install --locked cargo-stylus`

# Run all benchmarks
python scripts/run_benchmarks.py
# Run only P0 (critical) tests
python scripts/run_benchmarks.py --priority P0
# Run benchmarks for a specific tool
python scripts/run_benchmarks.py --tool get_stylus_context
Benchmark reports are saved to benchmark_results/.
Code formatting and linting:
black .
ruff check .
If you encounter errors like Error generating embeddings: RetryError or KeyError during vector database ingestion:
1. Check OpenRouter API Key
# Verify your .env file has a valid API key
cat .env | grep OPENROUTER_API_KEY
Ensure:
- Your `OPENROUTER_API_KEY` is set and valid
- The embedding model `baai/bge-m3` is available on OpenRouter

2. Rate Limiting Issues
If you see HTTPStatusError with status 429, you're being rate limited. Solutions:
# Run with smaller batch size
python -m src.embeddings.vectordb --batch-size 25
# Or modify max_workers in vectordb.py to 1 for sequential processing
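Another common mitigation is exponential backoff around the embedding calls; a sketch of the pattern (this is an assumption about how one might wrap the client, not `vectordb.py`'s actual code — a `RuntimeError` stands in for the real `HTTPStatusError`):

```python
import time

def with_backoff(fn, retries=5, base_delay=1.0):
    """Retry fn on 429 errors, doubling the delay each attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except RuntimeError as exc:  # stand-in for httpx.HTTPStatusError
            if "429" not in str(exc) or attempt == retries - 1:
                raise  # not rate limiting, or out of retries
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Combined with a smaller `--batch-size`, this usually keeps long ingestion runs under the provider's rate limit.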
3. Enable Debug Logging
Add this to your script or at the start of your session to see detailed logs:
import logging
logging.basicConfig(level=logging.INFO)
# For more verbose output:
# logging.basicConfig(level=logging.DEBUG)
"Execution context was destroyed" errors
This is a browser navigation issue during scraping. The scraper will automatically retry. If it persists:
Git clone failures
If repository cloning fails:
# Check your network connection
ping github.com
# Try cloning manually to diagnose
git clone --depth 1 https://github.com/OffchainLabs/stylus-hello-world
# If behind a proxy, configure git
git config --global http.proxy http://proxy:port
Timeout errors
For slow connections, increase timeouts in the scraper config or reduce concurrent requests:
python -m scraper.run --max-concurrent 1
"Collection is empty" error
If you see collection is empty when using get_stylus_context tool:
# The vector database must be generated locally (it's not included in the repo)
# Run this command to populate the database:
python -m src.embeddings.vectordb
# If that doesn't work, try resetting first:
python -m src.embeddings.vectordb --reset
Import errors with opentelemetry
If you see TypeError: 'NoneType' object is not subscriptable when importing chromadb:
# This is usually a conda environment issue
# Make sure you're in the correct environment
conda activate arbbuilder
# Or reinstall chromadb
pip uninstall chromadb
pip install chromadb
Database corruption
If the vector database seems corrupted:
# Reset and re-ingest
python -m src.embeddings.vectordb --reset
| Workflow | Trigger | Purpose |
|---|---|---|
| `qa.yml` | PRs to main, push to main | TypeScript type check, Python lint, Python tests |
| `maintenance.yml` | Weekly (Mon 6AM UTC) + manual | SDK monitoring, health checks, discovery, re-verification, auto-remediation |
| `refresh-rag.yml` | Manual | Full RAG refresh: scrape, process, migrate to Vectorize |
| `deploy-staging.yml` | Manual | Deploy to staging environment |
| `release-chunks.yml` | GitHub release | Build and publish pre-processed chunks + embeddings |
| Job | Trigger | What It Does |
|---|---|---|
| `sdk-monitor` | Weekly + manual | Checks crates.io/npm for new SDK versions |
| `health-check` | Weekly + manual | Checks all repos for archived/deleted status |
| `discover` | Manual only | Searches GitHub for new community repos |
| `reverify` | On SDK update or manual | Re-verifies all repos with `verify_source.py --all` |
| `remediate` | Manual only | Auto-removes archived/deleted repos from sources.json |
| `sync-sources` | Weekly + manual | Syncs sources.json to CF KV registry |
| `create-issue` | When problems found | Creates GitHub issue with `maintenance` label |
We welcome contributions! See CONTRIBUTING.md for guidelines on how to get started.
MIT License - see LICENSE for details.