MCP server for DeepSeek AI models (Chat + Reasoner). Supports multi-turn sessions, model fallback with circuit breaker, function calling, thinking mode, JSON output, multimodal input, and cost tracking.
Compatible with Claude Code, Gemini CLI, Cursor, Windsurf, and any MCP-compatible client.
Officially listed on the MCP Registry, Smithery, Glama, LobeHub, and Frontier AI.
Use the hosted endpoint directly — no npm install, no Node.js required. Bring your own DeepSeek API key:
Claude Code:
claude mcp add --transport http deepseek \
https://deepseek-mcp.tahirl.com/mcp \
--header "Authorization: Bearer YOUR_DEEPSEEK_API_KEY"
Cursor / Windsurf / VS Code:
{
"mcpServers": {
"deepseek": {
"url": "https://deepseek-mcp.tahirl.com/mcp",
"headers": {
"Authorization": "Bearer ${DEEPSEEK_API_KEY}"
}
}
}
}
Claude Code:
claude mcp add -s user deepseek npx @arikusi/deepseek-mcp-server -e DEEPSEEK_API_KEY=your-key-here
Gemini CLI:
gemini mcp add deepseek npx @arikusi/deepseek-mcp-server -e DEEPSEEK_API_KEY=your-key-here
Scope options (Claude Code):
- `-s user`: Available in all your projects (recommended)
- `-s local`: Only in current project (default)
- `-s project`: Project-specific .mcp.json file

Get your API key: https://platform.deepseek.com
- Multi-turn conversations via the session_id parameter
- MCP resources: deepseek://models, deepseek://config, deepseek://usage — query model info, config, and usage stats
- Thinking mode via thinking: {type: "enabled"}
- JSON output via json_mode: true
- Session management via the deepseek_sessions tool
- Multimodal (image) input (ENABLE_MULTIMODAL=true)
- Hosted endpoint at deepseek-mcp.tahirl.com/mcp — BYOK (Bring Your Own Key), no install needed
- HTTP transport (TRANSPORT=http)

If you prefer to install manually:
npm install -g @arikusi/deepseek-mcp-server
git clone https://github.com/arikusi/deepseek-mcp-server.git
cd deepseek-mcp-server
npm install
npm run build
Once configured, your MCP client will have access to deepseek_chat and deepseek_sessions tools, plus 3 MCP resources.
Example prompts:
"Use DeepSeek to explain quantum computing"
"Ask DeepSeek Reasoner to solve: If I have 10 apples and buy 5 more..."
Your MCP client will automatically call the deepseek_chat tool.
If your MCP client doesn't support the add command, manually add to your config file:
{
"mcpServers": {
"deepseek": {
"command": "npx",
"args": ["@arikusi/deepseek-mcp-server"],
"env": {
"DEEPSEEK_API_KEY": "your-api-key-here"
}
}
}
}
Config file locations:
~/.claude.json (add to the projects["your-project-path"].mcpServers section)

deepseek_chat
Chat with DeepSeek AI models with automatic cost tracking and function calling support.
Parameters:
- messages (required): Array of conversation messages
  - role: "system" | "user" | "assistant" | "tool"
  - content: Message text
  - tool_call_id (optional): Required for tool role messages
- model (optional): "deepseek-chat" (default) or "deepseek-reasoner"
- temperature (optional): 0-2, controls randomness (default: 1.0). Ignored when thinking mode is enabled.
- max_tokens (optional): Maximum tokens to generate (deepseek-chat: max 8192, deepseek-reasoner: max 65536)
- stream (optional): Enable streaming mode (default: false)
- tools (optional): Array of tool definitions for function calling (max 128)
- tool_choice (optional): "auto" | "none" | "required" | {type: "function", function: {name: "..."}}
- thinking (optional): Enable thinking mode with {type: "enabled"}
- json_mode (optional): Enable JSON output mode (supported by both models)
- session_id (optional): Session ID for multi-turn conversations. Previous context is automatically prepended.

Response includes cost_usd and tool_calls fields.

Example:
{
"messages": [
{
"role": "user",
"content": "Explain the theory of relativity in simple terms"
}
],
"model": "deepseek-chat",
"temperature": 0.7,
"max_tokens": 1000
}
DeepSeek Reasoner Example:
{
"messages": [
{
"role": "user",
"content": "If I have 10 apples and eat 3, then buy 5 more, how many do I have?"
}
],
"model": "deepseek-reasoner"
}
The reasoner model will show its thinking process in <thinking> tags followed by the final answer.
Function Calling Example:
{
"messages": [
{
"role": "user",
"content": "What's the weather in Istanbul?"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City name"
}
},
"required": ["location"]
}
}
}
],
"tool_choice": "auto"
}
When the model decides to call a function, the response includes tool_calls with the function name and arguments. You can then send the result back using a tool role message with the matching tool_call_id.
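As a sketch of that round-trip (the tool_call_id value and tool output below are hypothetical, and the assistant message follows the OpenAI-compatible tool_calls shape DeepSeek uses), the follow-up request appends the assistant's tool call and your tool result to the message history:

```json
{
  "messages": [
    { "role": "user", "content": "What's the weather in Istanbul?" },
    {
      "role": "assistant",
      "content": "",
      "tool_calls": [
        {
          "id": "call_0",
          "type": "function",
          "function": {
            "name": "get_weather",
            "arguments": "{\"location\": \"Istanbul\"}"
          }
        }
      ]
    },
    {
      "role": "tool",
      "tool_call_id": "call_0",
      "content": "{\"temp_c\": 18, \"condition\": \"sunny\"}"
    }
  ]
}
```

The model then produces a natural-language answer grounded in the tool result.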
Thinking Mode Example:
{
"messages": [
{
"role": "user",
"content": "Analyze the time complexity of quicksort"
}
],
"model": "deepseek-chat",
"thinking": { "type": "enabled" }
}
When thinking mode is enabled, temperature, top_p, frequency_penalty, and presence_penalty are automatically ignored.
JSON Output Mode Example:
{
"messages": [
{
"role": "user",
"content": "Return a json object with name, age, and city fields for a sample user"
}
],
"model": "deepseek-chat",
"json_mode": true
}
JSON mode ensures the model outputs valid JSON. Include the word "json" in your prompt for best results. Supported by both deepseek-chat and deepseek-reasoner.
Multi-Turn Session Example:
{
"messages": [
{
"role": "user",
"content": "What is the capital of France?"
}
],
"session_id": "my-session-1"
}
Use the same session_id across requests to maintain conversation context. Messages are stored in memory and prepended automatically. In HTTP transport each connected MCP session has its own isolated session store — a session_id created by one HTTP client is not visible to another (see HTTP Transport below).
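A follow-up turn in the same session only needs the new message — the earlier question and answer are prepended automatically, so references like "its" resolve against the stored context:

```json
{
  "messages": [
    {
      "role": "user",
      "content": "What is its population?"
    }
  ],
  "session_id": "my-session-1"
}
```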
deepseek_sessions
Manage conversation sessions.
Parameters:
- action (required): "list" | "clear" | "delete"
- session_id (optional): Required when action is "delete"

Examples:
{"action": "list"}
{"action": "delete", "session_id": "my-session-1"}
{"action": "clear"}
MCP Resources provide read-only data about the server:
| Resource URI | Description |
|---|---|
| deepseek://models | Available models with capabilities, context limits, and pricing |
| deepseek://config | Current server configuration (API key masked) |
| deepseek://usage | Real-time usage statistics (requests, tokens, costs, sessions) |
When a model fails with a retryable error (429, 503, timeout), the server automatically falls back to the other model:
- deepseek-chat fails → tries deepseek-reasoner
- deepseek-reasoner fails → tries deepseek-chat

The circuit breaker protects against cascading failures:

- After CIRCUIT_BREAKER_THRESHOLD consecutive failures (default: 5), the circuit opens (fast-fail mode)
- After CIRCUIT_BREAKER_RESET_TIMEOUT ms (default: 30000), it enters half-open state and sends a probe request

Fallback can be disabled with FALLBACK_ENABLED=false.
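For example, a minimal sketch using these environment variables to switch fallback off and make the breaker more sensitive before starting the server:

```shell
# Disable automatic model fallback on errors
export FALLBACK_ENABLED=false
# Open the circuit after 3 consecutive failures instead of the default 5
export CIRCUIT_BREAKER_THRESHOLD=3
# Send a probe request after 15 seconds instead of 30
export CIRCUIT_BREAKER_RESET_TIMEOUT=15000
```

Run `npm start` (or your usual launch command) in the same shell afterwards so the server picks these up.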
Prompt templates (12 total):
Each prompt is optimized for the DeepSeek Reasoner model to provide detailed reasoning.
Both models run DeepSeek-V3.2 with unified pricing.
The server is configured via environment variables. All settings except DEEPSEEK_API_KEY are optional.
| Variable | Default | Description |
|---|---|---|
| DEEPSEEK_API_KEY | (required) | Your DeepSeek API key |
| DEEPSEEK_BASE_URL | https://api.deepseek.com | Custom API endpoint |
| DEFAULT_MODEL | deepseek-chat | Default model for requests |
| SHOW_COST_INFO | true | Show cost info in responses |
| REQUEST_TIMEOUT | 60000 | Request timeout in milliseconds |
| MAX_RETRIES | 2 | Maximum retry count for failed requests |
| SKIP_CONNECTION_TEST | false | Skip startup API connection test |
| MAX_MESSAGE_LENGTH | 100000 | Maximum message content length (characters) |
| SESSION_TTL_MINUTES | 30 | Session time-to-live in minutes |
| MAX_SESSIONS | 100 | Maximum number of concurrent sessions |
| FALLBACK_ENABLED | true | Enable automatic model fallback on errors |
| CIRCUIT_BREAKER_THRESHOLD | 5 | Consecutive failures before circuit opens |
| CIRCUIT_BREAKER_RESET_TIMEOUT | 30000 | Milliseconds before circuit half-opens |
| MAX_SESSION_MESSAGES | 200 | Max messages per session (sliding window) |
| ENABLE_MULTIMODAL | false | Enable multimodal (image) input support |
| TRANSPORT | stdio | Transport mode: stdio or http |
| HTTP_PORT | 3000 | HTTP server port (when TRANSPORT=http) |
Example with custom config:
claude mcp add -s user deepseek npx @arikusi/deepseek-mcp-server \
-e DEEPSEEK_API_KEY=your-key \
-e SHOW_COST_INFO=false \
-e REQUEST_TIMEOUT=30000
deepseek-mcp-server/
├── worker/ # Cloudflare Worker (remote BYOK endpoint)
│ ├── src/index.ts # Worker entry point
│ ├── wrangler.toml # Cloudflare config
│ └── package.json
├── src/
│ ├── index.ts # Entry point, bootstrap
│ ├── server.ts # McpServer factory (auto-version)
│ ├── deepseek-client.ts # DeepSeek API wrapper (circuit breaker + fallback)
│ ├── config.ts # Centralized config with Zod validation
│ ├── cost.ts # Cost calculation and formatting
│ ├── schemas.ts # Zod input validation schemas
│ ├── types.ts # TypeScript types + type guards
│ ├── errors.ts # Custom error classes
│ ├── session.ts # In-memory session store (multi-turn)
│ ├── circuit-breaker.ts # Circuit breaker pattern
│ ├── usage-tracker.ts # Usage statistics tracker
│ ├── transport-http.ts # Streamable HTTP transport (Express)
│ ├── tools/
│ │ ├── deepseek-chat.ts # deepseek_chat tool (sessions + fallback)
│ │ ├── deepseek-sessions.ts # deepseek_sessions tool
│ │ └── index.ts # Tool registration aggregator
│ ├── resources/
│ │ ├── models.ts # deepseek://models resource
│ │ ├── config.ts # deepseek://config resource
│ │ ├── usage.ts # deepseek://usage resource
│ │ └── index.ts # Resource registration aggregator
│ └── prompts/
│ ├── core.ts # 5 core reasoning prompts
│ ├── advanced.ts # 5 advanced prompts
│ ├── function-calling.ts # 2 function calling prompts
│ └── index.ts # Prompt registration aggregator
├── dist/ # Compiled JavaScript
├── llms.txt # AI discoverability index
├── llms-full.txt # Full docs for LLM context
├── vitest.config.ts # Test configuration
├── package.json
├── tsconfig.json
└── README.md
npm run build
npm run watch
# Run all tests
npm test
# Watch mode
npm run test:watch
# With coverage report
npm run test:coverage
# Set API key
export DEEPSEEK_API_KEY="your-key"
# Run the server
npm start
The server will start and wait for MCP client connections via stdio.
A hosted BYOK (Bring Your Own Key) endpoint is available at:
https://deepseek-mcp.tahirl.com/mcp
Send your DeepSeek API key as Authorization: Bearer <key>. No server-side API key stored — your key is used directly per request. Powered by Cloudflare Workers (global edge, zero cold start).
Note: The deepseek-reasoner model may take over 30 seconds for complex queries. Some MCP clients (e.g. Claude Code) have built-in tool call timeouts that may interrupt long-running requests. For complex tasks, deepseek-chat is recommended.
# Test health
curl https://deepseek-mcp.tahirl.com/health
# Test MCP (requires auth)
curl -X POST https://deepseek-mcp.tahirl.com/mcp \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_KEY" \
-d '{"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{}},"id":1}'
Run your own HTTP endpoint:
TRANSPORT=http HTTP_PORT=3000 DEEPSEEK_API_KEY=your-key node dist/index.js
Test the health endpoint:
curl http://localhost:3000/health
The MCP endpoint is available at POST /mcp (Streamable HTTP protocol).
Session isolation (1.7.0+): In HTTP transport each connected MCP session
gets its own McpServer instance and its own SessionStore. Conversation
history, session listings, and deletions are scoped to the MCP session that
created them — one client cannot read, enumerate, or wipe another client's
sessions. STDIO transport is single-tenant by nature and unaffected.
# Build
docker build -t deepseek-mcp-server .
# Run
docker run -d -p 3000:3000 -e DEEPSEEK_API_KEY=your-key deepseek-mcp-server
# Or use docker-compose
DEEPSEEK_API_KEY=your-key docker compose up -d
The Docker image defaults to HTTP transport on port 3000 with a built-in health check.
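The repo's actual compose file isn't reproduced here, but a minimal docker-compose.yml matching the docker run command above would look roughly like this (the service name and build context are assumptions):

```yaml
services:
  deepseek-mcp:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY}
```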
Option 1: Use the correct installation command
# Make sure to include -e flag with your API key
claude mcp add deepseek npx @arikusi/deepseek-mcp-server -e DEEPSEEK_API_KEY=your-key-here
Option 2: Manually edit the config file
If you already installed without the API key, edit your config file:
1. Open ~/.claude.json (Windows: C:\Users\USERNAME\.claude.json)
2. Find the "mcpServers" section under your project path
3. Add an env field with your API key:

"deepseek": {
"type": "stdio",
"command": "npx",
"args": ["@arikusi/deepseek-mcp-server"],
"env": {
"DEEPSEEK_API_KEY": "your-api-key-here"
}
}
- Check that the path to dist/index.js is correct
- Rebuild with npm run build
- Make the file executable:
chmod +x dist/index.js
To share this MCP server with others:
1. npm login
2. npm publish --access public

Users can then install with:
npm install -g @arikusi/deepseek-mcp-server
Contributions are welcome! Please read our Contributing Guidelines before submitting PRs.
Found a bug or have a feature request? Please open an issue using our templates.
# Clone the repo
git clone https://github.com/arikusi/deepseek-mcp-server.git
cd deepseek-mcp-server
# Install dependencies
npm install
# Build in watch mode
npm run watch
# Run tests
npm test
# Lint
npm run lint
See CHANGELOG.md for version history and updates.
MIT License - see LICENSE file for details
Made by @arikusi
This is an unofficial community project and is not affiliated with DeepSeek.