A Snowflake MCP server — SQL queries, schema exploration, and data insights for AI assistants
PyPI • codecov • PyPI Downloads • License: MIT
lint test MCP Compatible made-with-python python-3.13+ Ruff Checked with mypy prek oxfmt Ask DeepWiki
A Model Context Protocol (MCP) server that connects AI assistants to Snowflake — enabling SQL queries, schema exploration, and data insights directly from your LLM client.
Highlights:
- Multiple connection profiles (e.g. production, staging, and development environments) in one TOML file
- --exclude-json-results flag — reduces LLM context window usage
- --exclude_tools flag — disable individual tools

The fastest way to try it — using uvx with a TOML connection file:
# 1. Create a connections file
cat > ~/snowflake_connections.toml << 'EOF'
[myconn]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MYROLE"
EOF
# 2. Run the server
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
--connections-file ~/snowflake_connections.toml \
--connection-name myconn
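Before pointing the server at a connections file, it can help to sanity-check it. A minimal sketch using the standard-library tomllib (the profile name and required keys mirror the example above; adjust them to your setup):

```python
import tomllib

# Keys every profile in the example file above provides; not an official schema.
REQUIRED = {"account", "user", "warehouse", "database", "schema"}

def check_profiles(toml_text: str) -> dict[str, set[str]]:
    """Return the missing required keys for each connection profile."""
    profiles = tomllib.loads(toml_text)
    return {name: REQUIRED - params.keys() for name, params in profiles.items()}

sample = """
[myconn]
account = "your_account"
user = "your_user"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
"""
print(check_profiles(sample))  # an empty set means the profile is complete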
Add to your MCP client config (e.g. claude_desktop_config.json) using snowflake_connections.toml:
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "myconn"
]
}
}
Install in VS Code • Install in VS Code Insiders
Or add manually to your MCP client config (e.g. .vscode/mcp.json) using a .env file (see Authentication):
"snowflake": {
// Snowflake MCP server
"type": "stdio",
"command": "uvx",
"args": [
"--from", "mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server"
],
"envFile": "${workspaceFolder}/.env"
}
Add to your MCP client config (e.g. opencode.jsonc) with a .env file (see Authentication):
"snowflake": {
"type": "local",
"command": [
"uvx",
"--from",
"mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server",
],
"enabled": true,
"timeout": 300000,
}
| URI | Description |
|---|---|
| `memo://insights` | A continuously updated memo aggregating data insights appended via `append_insight`. |
| `context://table/{table_name}` | (Prefetch mode only) Per-table schema summaries including columns and comments. |
| Tool | Description | Requires |
|---|---|---|
| `read_query` | Execute SELECT queries. Input: `query` (string). | — |
| `write_query` | Execute INSERT, UPDATE, or DELETE queries. Input: `query` (string). | `--allow_write` |
| `create_table` | Execute CREATE TABLE statements. Input: `query` (string). | `--allow_write` |
| Tool | Description | Input |
|---|---|---|
| `list_databases` | List all databases in the Snowflake instance. | — |
| `list_schemas` | List all schemas within a database. | `database` (string) |
| `list_tables` | List all tables within a database and schema. | `database`, `schema` (strings) |
| `describe_table` | Describe columns of a table (name, type, nullability, default, comment). | `table_name` as `database.schema.table` |
| Tool | Description | Input |
|---|---|---|
| `append_insight` | Add a data insight to the `memo://insights` resource. | `insight` (string) |
Set credentials via environment variables or CLI flags (see Configuration Reference):
SNOWFLAKE_USER="[email protected]"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_USER="[email protected]"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
SNOWFLAKE_PRIVATE_KEY_FILE="/absolute/path/to/key.p8"
SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase" # Optional — only if key is encrypted
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
Or via CLI: --private_key_file /path/to/key.p8 --private_key_file_pwd passphrase
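Key-pair auth needs a PKCS#8 private key whose public half is registered on your Snowflake user. A sketch of generating an encrypted key with openssl (filenames and the passphrase are placeholders):

```shell
# Generate a 2048-bit RSA key and wrap it as encrypted PKCS#8 (.p8)
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc \
  -passout pass:passphrase -out key.p8

# Extract the public key; register its body (without the BEGIN/END lines)
# in Snowflake via: ALTER USER your_user SET RSA_PUBLIC_KEY='...';
openssl pkey -in key.p8 -passin pass:passphrase -pubout -out key.pub
```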
Browser-based SSO (opens your identity provider in a browser; no password stored):
SNOWFLAKE_AUTHENTICATOR="externalbrowser"
Or in a TOML connection entry: authenticator = "externalbrowser"
Use the OAuth 2.0 client credentials flow to authenticate with a client ID and secret (no user interaction required):
SNOWFLAKE_AUTHENTICATOR="oauth_client_credentials"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_OAUTH_CLIENT_ID="your_client_id"
SNOWFLAKE_OAUTH_CLIENT_SECRET="your_client_secret"
SNOWFLAKE_OAUTH_TOKEN_REQUEST_URL="https://your-idp.example.com/oauth/token"
SNOWFLAKE_OAUTH_SCOPE="session:role:MY_ROLE" # Optional
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
Use a pre-fetched OAuth bearer token:
SNOWFLAKE_AUTHENTICATOR="oauth"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_TOKEN="eyJhbGciOiJSUzI1NiJ9..."
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
Manage multiple environments in a single file. See example_connections.toml for a full template.
[production]
account = "your_account"
user = "your_user"
password = "your_password"
authenticator = "snowflake"
warehouse = "COMPUTE_WH"
database = "PROD_DB"
schema = "PUBLIC"
role = "ACCOUNTADMIN"
[development]
account = "your_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"
role = "DEVELOPER"
[reporting]
account = "your_account"
user = "reporting_user"
authenticator = "snowflake_jwt"
private_key_file = "/path/to/private_key.pem"
private_key_file_pwd = "passphrase" # Optional
warehouse = "REPORTING_WH"
database = "REPORTING_DB"
schema = "REPORTS"
role = "REPORTING_ROLE"
[analytics_oauth]
account = "your_account"
authenticator = "oauth_client_credentials"
oauth_client_id = "your_client_id"
oauth_client_secret = "your_client_secret"
oauth_token_request_url = "https://your-idp.example.com/oauth/token"
oauth_scope = "session:role:ANALYTICS_ROLE" # Optional
warehouse = "ANALYTICS_WH"
database = "ANALYTICS_DB"
schema = "PUBLIC"
role = "ANALYTICS_ROLE"
Pass the file with --connections-file and select a profile with --connection-name. Both flags are required together.
The package is published on PyPI as mcp-snowflake-server-nsp.
"mcpServers": {
"snowflake_production": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "production"
// Optional flags — see Configuration Reference
]
},
"snowflake_staging": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "staging"
]
}
}
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--account", "your_account",
"--warehouse", "your_warehouse",
"--user", "your_user",
"--password", "your_password",
"--role", "your_role",
"--database", "your_database",
"--schema", "your_schema"
// Optional: "--private_key_file", "/absolute/path/key.p8"
// Optional: "--private_key_file_pwd", "passphrase"
// Optional flags — see Configuration Reference
]
}
}
Install Visual Studio Code
Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):
SNOWFLAKE_USER="[email protected]"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
# Key-pair alternative:
# SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
# SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
# SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
# Browser SSO alternative:
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
(Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
Test locally:
uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
Add to .vscode/mcp.json:
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
],
}
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
// Optional flags — see Configuration Reference / .env.example file
],
"envFile": "/absolute/path/to/.env"
}
Install Claude AI Desktop App
Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):
SNOWFLAKE_USER="[email protected]"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
# Key-pair alternative:
# SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
# SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
# SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
# Browser SSO alternative:
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
(Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
Test locally:
uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
Add to claude_desktop_config.json:
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
]
}
}
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server"
// Optional flags — see Configuration Reference
]
}
}
A Dockerfile is included for containerised deployments:
# Build
docker build -t mcp-snowflake-server .
# Run (pass credentials as environment variables)
docker run --rm \
-e SNOWFLAKE_USER="[email protected]" \
-e SNOWFLAKE_ACCOUNT="myaccount" \
-e SNOWFLAKE_AUTHENTICATOR="snowflake" \
-e SNOWFLAKE_PASSWORD="secret" \
-e SNOWFLAKE_WAREHOUSE="COMPUTE_WH" \
-e SNOWFLAKE_DATABASE="MY_DB" \
-e SNOWFLAKE_SCHEMA="PUBLIC" \
-e SNOWFLAKE_ROLE="MYROLE" \
mcp-snowflake-server
# Or override the entrypoint arguments directly
docker run --rm mcp-snowflake-server \
--account your_account \
--user your_user \
--authenticator snowflake \
--password your_password \
--warehouse COMPUTE_WH \
--database MY_DB \
--schema PUBLIC \
--role MYROLE
All connection parameters can also be set as environment variables (SNOWFLAKE_<PARAM_UPPER>).
| Flag | Env var | Default | Description |
|---|---|---|---|
| `--account` | `SNOWFLAKE_ACCOUNT` | — | Snowflake account identifier |
| `--user` | `SNOWFLAKE_USER` | — | Snowflake username |
| `--password` | `SNOWFLAKE_PASSWORD` | — | Password (not required for key-pair / SSO) |
| `--warehouse` | `SNOWFLAKE_WAREHOUSE` | — | Virtual warehouse to use |
| `--database` | `SNOWFLAKE_DATABASE` | (required) | Default database |
| `--schema` | `SNOWFLAKE_SCHEMA` | (required) | Default schema |
| `--role` | `SNOWFLAKE_ROLE` | — | Role to assume |
| `--private_key_file` | `SNOWFLAKE_PRIVATE_KEY_FILE` | — | Absolute path to `.p8` private key file |
| `--private_key_file_pwd` | `SNOWFLAKE_PRIVATE_KEY_FILE_PWD` | — | Passphrase for encrypted private key |
| `--connections-file` | — | — | Path to TOML connections file |
| `--connection-name` | — | — | Connection profile name in TOML file (required with `--connections-file`) |
| `--allow_write` | — | false | Enable `write_query` and `create_table` tools |
| `--prefetch` / `--no-prefetch` | — | false | Pre-load table schemas as `context://table/*` resources (disables `list_tables` / `describe_table`) |
| `--exclude_tools` | — | [] | Space-separated list of tool names to disable |
| `--exclude-json-results` | — | false | Omit embedded JSON resources from responses (reduces context window usage) |
| `--log_dir` | — | — | Directory for log file output |
| `--log_level` | — | INFO | Log verbosity: DEBUG, INFO, WARNING, ERROR, CRITICAL |
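The SNOWFLAKE_&lt;PARAM_UPPER&gt; naming convention can be sketched in a couple of lines (a hypothetical helper for illustration, not part of the package):

```python
def env_var_for(flag: str) -> str:
    """Map a CLI flag to its environment-variable name: SNOWFLAKE_<PARAM_UPPER>."""
    return "SNOWFLAKE_" + flag.lstrip("-").replace("-", "_").upper()

print(env_var_for("--private_key_file"))  # SNOWFLAKE_PRIVATE_KEY_FILE
print(env_var_for("--warehouse"))         # SNOWFLAKE_WAREHOUSE
```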
Edit runtime_config.json to exclude databases, schemas, or tables from all discovery tools. Patterns are matched case-insensitively as substrings.
{
"exclude_patterns": {
"databases": ["temp"],
"schemas": ["temp", "information_schema"],
"tables": ["temp"]
}
}
The server loads this file automatically at startup from the working directory.
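"Matched case-insensitively as substrings" can be sketched as follows; this is an illustrative re-implementation of the rule, not the server's actual code:

```python
def is_excluded(name: str, patterns: list[str]) -> bool:
    """True if any pattern appears, case-insensitively, as a substring of name."""
    lowered = name.lower()
    return any(pattern.lower() in lowered for pattern in patterns)

# With "schemas": ["temp", "information_schema"] from the config above:
print(is_excluded("TEMP_STAGING", ["temp", "information_schema"]))        # True
print(is_excluded("INFORMATION_SCHEMA", ["temp", "information_schema"]))  # True
print(is_excluded("PUBLIC", ["temp", "information_schema"]))              # False
```

Note that because matching is by substring, a broad pattern like "temp" also hides names such as TEMPLATE_DB; prefer the most specific pattern that works.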
# Install all dependencies (uv + bun) and set up Git hooks
make install
# Reinstall Git hooks if needed
make hooks
# Run all prek hooks across the repo (includes oxfmt, ruff, mypy)
make hooks-run
# Check formatting with oxfmt (non-destructive)
make fmt-check
# Auto-format all files with oxfmt
make fmt
# Lint & auto-fix with Ruff
make ruff
# Run tests
make test
# Run tests with terminal coverage report
make coverage
# Run tests and open HTML coverage report
make coverage-html
# Run the server locally
make run
Requires uv and bun. Python dev dependencies include ruff, mypy, pytest, pytest-asyncio, pytest-cov, and prek. The multi-language formatter oxfmt is managed via bun (package.json / bun.lock). Hook configuration lives in prek.toml; formatter configuration in .oxfmtrc.json.
Full AI-generated documentation: Ask DeepWiki
Test coverage sunburst:
This project is licensed under the MIT License. See the LICENSE file for the full text.
This repository is a fork of isaacwasserman/mcp-snowflake-server.
MseeP.ai Security Assessment Badge
Maintained by nsphung. Add this to claude_desktop_config.json and restart Claude Desktop:
{
  "mcpServers": {
    "mcp-snowflake-server-nsp": {
      "command": "uvx",
      "args": [
        "--python=3.13",
        "--from", "mcp-snowflake-server-nsp",
        "mcp_snowflake_server"
      ]
    }
  }
}