An MCP server that allows AI assistants to query databases using natural language by leveraging Vanna AI. It supports secure SSH tunneling, model training on database schemas, and persistent storage of SQL patterns via ChromaDB.
Purpose:
A Model Context Protocol (MCP) server that provides a natural language interface to SQL databases using Vanna AI. It allows AI assistants (like Claude or Gemini) to generate and execute SQL queries based on plain English questions.
Why this exists:
To enable seamless natural language data exploration. By bridging LLMs with your private SQL databases (via an SSH tunnel), this server allows you to "chat with your data" without manual SQL writing, all while keeping your database credentials secure on your remote server (like a DGX).
This project is designed as a completely free, local, open-source alternative to expensive enterprise data platforms (comparable to the AI-to-SQL features found in Databricks or Snowflake).
> [!WARNING]
> **Ollama Performance Note:** Running with local Ollama (`LLM_TYPE=ollama`) can be slow and potentially unreliable depending on your hardware. If you experience timeouts or inconsistent SQL generation, consider using a cloud provider like Gemini or OpenAI for production use.
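Switching providers is a one-line change in your `.env` (the variable names below match the configuration section; the API key value is a placeholder):

```
LLM_TYPE=gemini
GEMINI_API_KEY=your_key_here
GEMINI_MODEL=gemini-1.5-flash
```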
Follow the Mac & Local Ollama Setup Guide (50 Steps) for a detailed walkthrough from zero to a running server.
```bash
git clone https://github.com/lilgreml1n/vanna-mcp
cd vanna-mcp
uv pip install -e .
```
Create a `.env` file in the root directory:

```env
# --- LLM CONFIG ---
LLM_TYPE=ollama  # choices: ollama, lmstudio, claude, gemini, openai

# OLLAMA CONFIG
OLLAMA_MODEL=codellama

# CLAUDE CONFIG
# ANTHROPIC_API_KEY=your_key_here
# CLAUDE_MODEL=claude-3-5-sonnet-20241022

# GEMINI CONFIG
# GEMINI_API_KEY=your_key_here
# GEMINI_MODEL=gemini-1.5-flash

# OPENAI CONFIG
# OPENAI_API_KEY=your_key_here
# OPENAI_MODEL=gpt-4o
```
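To make the provider selection concrete, here is a minimal illustrative sketch of dispatching on `LLM_TYPE` — the function name and mapping are hypothetical, not the project's actual code, and the mapping only covers providers that have a model variable in the sample config:

```python
# Hypothetical mapping from LLM_TYPE to the env var holding its model name.
# (lmstudio is omitted here because the sample .env defines no model var for it.)
MODEL_VARS = {
    "ollama": "OLLAMA_MODEL",
    "claude": "CLAUDE_MODEL",
    "gemini": "GEMINI_MODEL",
    "openai": "OPENAI_MODEL",
}

def resolve_model(env: dict) -> tuple[str, str]:
    """Return (provider, model) from an .env-style mapping.

    Raises ValueError for unknown or incomplete configs so a
    misconfiguration fails fast at startup instead of mid-query.
    """
    provider = env.get("LLM_TYPE", "ollama").lower()
    if provider not in MODEL_VARS:
        raise ValueError(f"Unsupported LLM_TYPE: {provider!r}")
    model = env.get(MODEL_VARS[provider], "")
    if not model:
        raise ValueError(f"{MODEL_VARS[provider]} must be set for {provider}")
    return provider, model

print(resolve_model({"LLM_TYPE": "ollama", "OLLAMA_MODEL": "codellama"}))
# → ('ollama', 'codellama')
```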
- `ask_database`: Convert a question to SQL and optionally execute it.
- `train_vanna`: Provide DDL or example SQL to "teach" the model your schema.
- `get_tables`: List all available tables.
- `get_schema`: Get column details for a specific table.
- `execute_sql`: Run manual SQL for verification.

If you don't have a database ready, you can set up a sample inventory database for testing:
```bash
mysql -u root -p < setup_test_db.sql
```
This creates an `inventory_db` database with an `inventory` table and 20 sample items. The script pre-loads diverse items so you can test immediately.
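To get a feel for the kind of queries the sample data supports, here is a self-contained SQLite mock of a bin-lookup query — the column names and rows below are assumptions for illustration; the real schema and items come from `setup_test_db.sql`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical columns; the actual schema is defined in setup_test_db.sql.
conn.execute("CREATE TABLE inventory (id INTEGER PRIMARY KEY, name TEXT, bin TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO inventory (name, bin, qty) VALUES (?, ?, ?)",
    [("USB-C cable", "B-12", 40), ("Raspberry Pi 5", "A-03", 5)],
)

# The kind of SQL ask_database might generate for "items in bin B-12".
rows = conn.execute("SELECT name FROM inventory WHERE bin = 'B-12'").fetchall()
print(rows)  # → [('USB-C cable',)]
```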
Point your `.env` to this new database:

```env
MYSQL_DATABASE=inventory_db
# ... rest of your MySQL credentials ...
```
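Conceptually, `train_vanna` and `ask_database` form a retrieve-then-generate pipeline: trained DDL and example SQL are retrieved (Vanna does this with ChromaDB embeddings) and passed to the LLM as context for SQL generation. A dependency-free toy sketch of that retrieval shape — not the actual Vanna implementation:

```python
class ToyTrainer:
    """Illustrates the train/ask pattern: store schema knowledge,
    then retrieve the most relevant pieces as LLM context."""

    def __init__(self):
        self.docs: list[str] = []  # trained DDL / example SQL

    def train(self, ddl: str) -> None:
        self.docs.append(ddl)

    def retrieve(self, question: str, k: int = 2) -> list[str]:
        # Real Vanna ranks by embedding similarity in ChromaDB; here we
        # rank by naive keyword overlap just to show the flow's shape.
        words = set(question.lower().split())
        scored = sorted(
            self.docs,
            key=lambda d: len(words & set(d.lower().split())),
            reverse=True,
        )
        return scored[:k]

trainer = ToyTrainer()
trainer.train("CREATE TABLE inventory (id INT, name TEXT, bin TEXT)")
trainer.train("CREATE TABLE orders (id INT, item_id INT)")
context = trainer.retrieve("Show me all items in bin B-12")
print(context[0])  # the inventory DDL ranks first
```

The retrieved snippets would then be prepended to the LLM prompt so it generates SQL against the right tables.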
This project has been rigorously tested across two primary environments to ensure scalability from local development to production-grade AI compute.
Our Goal: You should be able to point any AI (Claude, Gemini, or Copilot) at this README and be running in minutes.
Add this to your `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "vanna-ai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--network=host", "-v", "vanna_chroma_data:/app/chroma_data", "--env-file", "/path/to/your/vanna-mcp/.env", "vanna-mcp-server"]
    }
  }
}
```
In your gemini-cli-mcp project, update `main_mcp.py` to point to this server:

```python
MCP_SERVERS = {
    "vanna-ai": {
        "command": "docker",
        "args": ["run", "-i", "--rm", "--network=host", "vanna-mcp-server"]
    }
}
```
```bash
docker run -i --rm --network=host vanna-mcp-server
```

Once connected, try a prompt like:

```
@mcp /ask_database "Show me all items in bin B-12"
```

This server is typically deployed on a DGX or high-performance server as part of the SparkForge ecosystem. It serves as the "brain" for other MCP proxies like `inventory-mcp`.