Enables AI assistants to perform comprehensive Snowflake database operations including DDL, DML, and warehouse management. It allows users to query data, manage database objects, and configure permissions using natural language commands.
A powerful Model Context Protocol (MCP) server that provides comprehensive Snowflake database operations, Cortex AI services, and data management tools for AI assistants like Claude.
1. Clone the repository

```bash
git clone https://github.com/mcp-tg/snowflake-developer.git
cd snowflake-developer
```

2. Set up the environment

```bash
# Copy the environment template
cp .env.example .env

# Edit .env with your Snowflake credentials
# Required: SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PAT (or SNOWFLAKE_PASSWORD)
```
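For reference, a filled-in `.env` might look like the following; every value here is a placeholder, and the exact account identifier format depends on your Snowflake region and organization:

```env
SNOWFLAKE_ACCOUNT=xy12345.us-east-1
SNOWFLAKE_USER=jdoe
SNOWFLAKE_PAT=your-programmatic-access-token
# Or, if using password authentication instead of a PAT:
# SNOWFLAKE_PASSWORD=your-password
```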
3. Install UV (if not already installed)

```bash
# On macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# On Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```
The easiest way to test your setup is with the MCP Inspector:

```bash
# Run the development inspector script
./dev-inspector.sh
```

This launches the MCP Inspector in your browser, where you can browse the server's tools and invoke them interactively.

Note: The script automatically handles UV package installation, so you don't need to manually install dependencies.

To verify your credentials, select the test_snowflake_connection tool and click Run.

Option 1: Direct from GitHub (no local clone needed)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}
```
Option 2: Local installation
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/snowflake-developer",
        "python",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}
```
Setup Instructions:
1. Clone the repository: `git clone https://github.com/mcp-tg/snowflake-developer.git`
2. Add the configuration to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS)
3. Replace `/path/to/snowflake-developer` with your actual path

Note: Cursor doesn't support environment variables in MCP configuration. You'll need to use the local installation option or set environment variables globally on your system.
Option 1: Direct from GitHub (requires global env vars)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ]
    }
  }
}
```
Requires setting SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, and SNOWFLAKE_PAT as system environment variables.
Option 2: Local installation (recommended for Cursor)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": ["run", "/path/to/snowflake-developer/main.py"]
    }
  }
}
```
Use a local .env file in the project directory with your credentials.
Tools for managing database structure:
| Tool | Description | Example in Inspector | Natural Language Query |
|---|---|---|---|
| `alter_database` | Rename databases | `database_name: OLD_DB`<br>`new_name: NEW_DB` | "Rename database OLD_DB to NEW_DB" |
| `alter_schema` | Rename or move schemas | `schema_name: TEST_DB.OLD_SCHEMA`<br>`new_name: NEW_SCHEMA` | "Rename OLD_SCHEMA to NEW_SCHEMA in TEST_DB" |
| `alter_table` | Modify table structure | `table_name: TEST_DB.PUBLIC.USERS`<br>`alter_type: ADD`<br>`column_name: created_at`<br>`data_type: TIMESTAMP` | "Add a created_at timestamp column to TEST_DB.PUBLIC.USERS table" |
| `create_database` | Create a new database | `database_name: TEST_DB` | "Create a new database called TEST_DB" |
| `create_schema` | Create a schema in a database | `database_name: TEST_DB`<br>`schema_name: ANALYTICS` | "Create a schema named ANALYTICS in TEST_DB database" |
| `create_table` | Create a table with columns | `database_name: TEST_DB`<br>`schema_name: PUBLIC`<br>`table_name: USERS`<br>`columns: [{"name": "id", "type": "INT"}, {"name": "email", "type": "VARCHAR(255)"}]` | "Create a USERS table in TEST_DB.PUBLIC with id as INT and email as VARCHAR(255)" |
| `drop_database_object` | Drop any database object | `object_type: TABLE`<br>`object_name: TEST_DB.PUBLIC.OLD_TABLE` | "Drop the table TEST_DB.PUBLIC.OLD_TABLE" |
| `execute_ddl_statement` | Run custom DDL SQL | `ddl_statement: CREATE VIEW TEST_DB.PUBLIC.ACTIVE_USERS AS SELECT * FROM TEST_DB.PUBLIC.USERS WHERE status = 'active'` | "Create a view called ACTIVE_USERS that shows only active users" |
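To make the parameter-to-SQL mapping concrete, here is a hypothetical sketch of how a tool like `create_table` could compose a DDL statement from its parameters. The function name `build_create_table` is mine, not the server's actual internals:

```python
def build_create_table(database_name: str, schema_name: str,
                       table_name: str, columns: list) -> str:
    """Compose a CREATE TABLE statement from a list of column specs.

    Hypothetical sketch: the real tool's implementation may differ.
    """
    # Turn [{"name": "id", "type": "INT"}, ...] into "id INT, ..."
    col_defs = ", ".join(f'{c["name"]} {c["type"]}' for c in columns)
    return f"CREATE TABLE {database_name}.{schema_name}.{table_name} ({col_defs})"


sql = build_create_table("TEST_DB", "PUBLIC", "USERS",
                         [{"name": "id", "type": "INT"},
                          {"name": "email", "type": "VARCHAR(255)"}])
# "CREATE TABLE TEST_DB.PUBLIC.USERS (id INT, email VARCHAR(255))"
```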
Tools for working with data:
| Tool | Description | Example in Inspector | Natural Language Query |
|---|---|---|---|
| `delete_data` | Delete rows from a table | `table_name: TEST_DB.PUBLIC.USERS`<br>`where_clause: status = 'deleted'` | "Delete all users with status 'deleted'" |
| `execute_dml_statement` | Run custom DML SQL | `dml_statement: UPDATE TEST_DB.PUBLIC.USERS SET last_login = CURRENT_TIMESTAMP() WHERE id = 1` | "Update the last login timestamp for user with id 1" |
| `insert_data` | Insert rows into a table | `table_name: TEST_DB.PUBLIC.USERS`<br>`data: {"id": 1, "email": "[email protected]", "name": "John Doe"}` | "Insert a new user with id 1, email [email protected], and name John Doe into the USERS table" |
| `merge_data` | Synchronize data between tables | `target_table: TEST_DB.PUBLIC.USERS`<br>`source_table: TEST_DB.STAGING.NEW_USERS`<br>`merge_condition: target.id = source.id`<br>`match_actions: [{"action": "UPDATE", "columns": ["email", "name"], "values": ["source.email", "source.name"]}]`<br>`not_match_actions: [{"action": "INSERT", "columns": ["id", "email", "name"], "values": ["source.id", "source.email", "source.name"]}]` | "Merge new users from staging table into production users table, updating existing records and inserting new ones" |
| `query_data` | Query data from tables | `table_name: TEST_DB.PUBLIC.USERS`<br>`columns: ["id", "email", "name"]`<br>`where_clause: status = 'active'`<br>`limit: 10` | "Show me the first 10 active users with their id, email, and name" |
| `update_data` | Update existing rows | `table_name: TEST_DB.PUBLIC.USERS`<br>`data: {"status": "inactive"}`<br>`where_clause: last_login < '2023-01-01'` | "Set status to inactive for all users who haven't logged in since January 2023" |
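As an illustration of how a DML tool can keep values out of the SQL text, here is a hypothetical sketch of an `insert_data`-style helper that builds a parameterized statement. The `%s` placeholder style matches the default of the Snowflake Python connector, though this is a sketch rather than the server's actual code:

```python
def build_insert(table_name: str, data: dict) -> tuple:
    """Build a parameterized INSERT: values are bound, never string-interpolated.

    Hypothetical sketch; returns (sql, params) for a DB-API execute() call.
    """
    columns = ", ".join(data)                       # "id, email"
    placeholders = ", ".join(["%s"] * len(data))    # "%s, %s"
    sql = f"INSERT INTO {table_name} ({columns}) VALUES ({placeholders})"
    return sql, list(data.values())


sql, params = build_insert("TEST_DB.PUBLIC.USERS",
                           {"id": 1, "email": "a@example.com"})
# sql    == "INSERT INTO TEST_DB.PUBLIC.USERS (id, email) VALUES (%s, %s)"
# params == [1, "a@example.com"]
```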
Tools for Snowflake-specific operations:
| Tool | Description | Example in Inspector | Natural Language Query |
|---|---|---|---|
| `alter_warehouse` | Modify warehouse settings | `warehouse_name: COMPUTE_WH`<br>`warehouse_size: MEDIUM`<br>`auto_suspend: 300` | "Change COMPUTE_WH to MEDIUM size and auto-suspend after 5 minutes" |
| `describe_database_object` | Get object details | `object_name: TEST_DB.PUBLIC.USERS` | "Describe the structure of TEST_DB.PUBLIC.USERS table" |
| `execute_sql_query` | Run any SQL query | `query: SELECT CURRENT_USER(), CURRENT_WAREHOUSE()` | "Show me my current user and warehouse" |
| `grant_privileges` | Grant permissions | `privileges: ["SELECT", "INSERT"]`<br>`on_type: TABLE`<br>`on_name: TEST_DB.PUBLIC.USERS`<br>`to_type: ROLE`<br>`to_name: ANALYST_ROLE` | "Grant SELECT and INSERT on TEST_DB.PUBLIC.USERS table to ANALYST_ROLE" |
| `revoke_privileges` | Revoke permissions | `privileges: ["SELECT"]`<br>`on_type: TABLE`<br>`on_name: TEST_DB.PUBLIC.USERS`<br>`from_type: ROLE`<br>`from_name: ANALYST_ROLE` | "Revoke SELECT on TEST_DB.PUBLIC.USERS table from ANALYST_ROLE" |
| `set_context` | Set database/schema/warehouse/role | `context_type: DATABASE`<br>`context_name: TEST_DB` | "Use TEST_DB as the current database" |
| `show_database_objects` | List database objects | `object_type: DATABASES` | "Show me all databases" |
| `test_snowflake_connection` | Test connection to Snowflake | (no parameters) | "Test my Snowflake connection" |
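For intuition, the permission tools map almost directly onto Snowflake's `GRANT`/`REVOKE` syntax. A hypothetical sketch (the helper name `build_grant` is mine, not the server's):

```python
def build_grant(privileges: list, on_type: str, on_name: str,
                to_type: str, to_name: str) -> str:
    """Compose a GRANT statement from grant_privileges-style parameters.

    Hypothetical sketch of the parameter-to-SQL mapping.
    """
    return (f"GRANT {', '.join(privileges)} "
            f"ON {on_type} {on_name} TO {to_type} {to_name}")


stmt = build_grant(["SELECT", "INSERT"], "TABLE",
                   "TEST_DB.PUBLIC.USERS", "ROLE", "ANALYST_ROLE")
# "GRANT SELECT, INSERT ON TABLE TEST_DB.PUBLIC.USERS TO ROLE ANALYST_ROLE"
```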
The server uses a simple per-operation connection pattern:
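In outline, each tool call opens a fresh connection, executes its statement, and closes the connection again; no state is pooled between calls. A minimal sketch of that pattern, with a pluggable `connect` factory standing in for `snowflake.connector.connect(**params)` (sqlite3 is used below purely as a stand-in so the example is runnable):

```python
import sqlite3


def run_operation(sql: str, connect) -> list:
    """Per-operation pattern: open a connection, run one statement, close it.

    `connect` is a zero-argument factory returning a DB-API connection;
    in the Snowflake case this would wrap snowflake.connector.connect().
    """
    conn = connect()
    try:
        cur = conn.cursor()
        try:
            cur.execute(sql)
            return cur.fetchall()
        finally:
            cur.close()
    finally:
        conn.close()  # always released, even if execute() raises


# Demonstration with an in-memory SQLite database as the stand-in backend
rows = run_operation("SELECT 1", lambda: sqlite3.connect(":memory:"))
# rows == [(1,)]
```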
Never commit `.env` files to version control.

Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
This MCP server is built using FastMCP, a modern Python framework that simplifies building Model Context Protocol servers. FastMCP provides:
```python
# Traditional MCP server setup
class MyMCPServer:
    def __init__(self):
        self.tools = {}

    def register_tool(self, name, handler, schema):
        # Manual registration and validation
        pass
```

```python
# FastMCP - clean and simple
from fastmcp import FastMCP, Context

mcp = FastMCP("MyServer")

@mcp.tool()
def my_tool(param: str) -> str:
    """Tool with automatic type validation and documentation."""
    return f"Result: {param}"

@mcp.resource("my://resource/{id}")
async def my_resource(id: str, ctx: Context) -> dict:
    """Resource with built-in async support and context."""
    return {"data": f"Resource {id}"}
```
```bash
# Install FastMCP
pip install fastmcp

# Or with UV (recommended)
uv add fastmcp
```