Enables local task management through the Model Context Protocol with tools for creating, updating, and deleting tasks. It also provides read-only resources for viewing task summaries and full task lists.
An educational full-stack project that extends a Model Context Protocol (MCP) task-manager server with a React front-end, teaching MCP concepts hands-on. The React app acts as an MCP client, exposing Resources, Tools, and Prompts in a single UI with in-app explanations. AG Grid is the primary way tabular data is displayed, making this a learning exercise in both MCP and AG Grid.
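To show what the AG Grid side amounts to, here is a minimal sketch of a task grid (the TaskRow fields are illustrative rather than the server's actual task schema, and the module registration assumes AG Grid v33+):

```tsx
import { AgGridReact } from "ag-grid-react";
import { ModuleRegistry, AllCommunityModule, type ColDef } from "ag-grid-community";

// AG Grid v33+ requires registering modules once at startup.
ModuleRegistry.registerModules([AllCommunityModule]);

// Illustrative row shape; the real task fields live in the MCP server.
interface TaskRow {
  title: string;
  deadline?: string;
  done: boolean;
}

const columnDefs: ColDef<TaskRow>[] = [
  { field: "title" },
  { field: "deadline" },
  { field: "done" },
];

export function TaskGrid({ rows }: { rows: TaskRow[] }) {
  // The grid fills its container, so give it an explicit height.
  return (
    <div style={{ height: 400 }}>
      <AgGridReact<TaskRow> rowData={rows} columnDefs={columnDefs} />
    </div>
  );
}
```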
```
┌─────────────────────────────────────┐
│             MCP Clients             │
│ Claude Desktop  Cursor React App    │
└────────┬──────────┬────────┬────────┘
         │          │        │
       STDIO      STDIO  HTTP/SSE
         │          │        │
         │       ┌──┴────────┴──┐
         │       │  HTTP Proxy  │ :3001
         │       │  (Node/tsx)  │
         │       └──────┬───────┘
         │              │ STDIO
         └──────────────┤
                 ┌──────┴───────┐
                 │  MCP Server  │
                 │  (Node/TS)   │
                 └──────────────┘
```
- **MCP Server** (src/) — TypeScript/Node, STDIO transport. Exposes Resources, Tools, and Prompts for task management with deadlines.
- **HTTP/SSE Proxy** (proxy/) — Spawns the MCP server as a subprocess and bridges it to the browser over HTTP + Server-Sent Events.
- **React Client** (client/) — MCP client UI with AG Grid as the primary data presentation layer, educational copy, and a natural-language chat interface.
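As a taste of what lives in src/, here is roughly how a tool and a read-only resource are registered with the MCP TypeScript SDK. This is a sketch only: the create_task name, its schema, and the tasks://list URI are illustrative, not the server's actual definitions.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const tasks: { title: string; deadline?: string }[] = [];

const server = new McpServer({ name: "task-manager", version: "1.0.0" });

// Tools mutate state: here, creating a task with an optional deadline.
server.tool(
  "create_task",
  { title: z.string(), deadline: z.string().optional() },
  async ({ title, deadline }) => {
    tasks.push({ title, deadline });
    return { content: [{ type: "text", text: `Created task: ${title}` }] };
  }
);

// Resources are read-only: a full task list exposed at a fixed URI.
server.resource("task-list", "tasks://list", async (uri) => ({
  contents: [{ uri: uri.href, text: JSON.stringify(tasks, null, 2) }],
}));

// STDIO transport: this is what Claude Desktop and the HTTP proxy speak.
await server.connect(new StdioServerTransport());
```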
First, build the MCP server (required once, or after server changes):

```bash
npm run build
```
Then start all three processes in parallel:
```bash
npm run dev
```
If you already have stale local processes from an earlier run, use:
```bash
npm run dev:clean
```
This kills anything currently bound to ports 3001 (proxy) and 5173 (client), then starts everything.
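If you are wondering how a cleanup script like this can be implemented, one common pattern uses the third-party kill-port package. The snippet below is illustrative only, not necessarily this repo's exact script:

```json
{
  "scripts": {
    "dev:clean": "npx kill-port 3001 5173 && npm run dev"
  }
}
```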
Either way, the dev workflow runs three processes:
| Process | URL | Description |
|---|---|---|
| MCP Server | — | Runs via STDIO (spawned by proxy) |
| HTTP/SSE Proxy | http://localhost:3001 | Bridges browser ↔ MCP server |
| React Client | http://localhost:5173 | Educational MCP client UI |
Run each in a separate terminal if you prefer:
```bash
# Terminal 1 – MCP server (STDIO, consumed by proxy)
npm run start

# Terminal 2 – HTTP/SSE proxy
npm run dev --prefix proxy

# Terminal 3 – React client
npm run dev --prefix client
```
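With the proxy and client up, the browser side is an ordinary MCP client pointed at the proxy. A minimal sketch of that connection follows; the /sse endpoint path is an assumption (check the proxy code for the real route), and create_task is an illustrative tool name:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client({ name: "react-client", version: "1.0.0" });

// Connect to the proxy over SSE; the proxy forwards every message
// to the STDIO MCP server it spawned. The /sse path is an assumption.
await client.connect(
  new SSEClientTransport(new URL("http://localhost:3001/sse"))
);

// Discover what the server exposes, then invoke a tool.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "create_task",
  arguments: { title: "Learn MCP", deadline: "2025-12-31" },
});
console.log(result.content);
```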
The chat tab requires Ollama running locally. If you skip this, the rest of the app works fine — only the AI chat tab is affected.
```bash
brew install ollama
```
Or download from ollama.com/download for other platforms.
```bash
ollama serve
```
On macOS, Ollama may already be running as a menu bar app after installation. You can verify with:
```bash
curl http://localhost:11434/v1/models
```
Then pull the default model:

```bash
ollama pull llama3.1
```
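Conceptually, the chat feature works by exposing the MCP tools to the LLM through OpenAI-style function calling, which Ollama serves at its /v1 endpoint. A rough sketch of that bridge, not the proxy's actual code (the create_task definition is illustrative):

```typescript
// Sketch: forward a user message to an OpenAI-compatible chat endpoint,
// advertising MCP tools in function-calling format.
const response = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.1",
    messages: [{ role: "user", content: "Create a task to buy milk by Friday" }],
    tools: [
      {
        type: "function",
        function: {
          name: "create_task", // illustrative: derived from the MCP tool list
          description: "Create a task with an optional deadline",
          parameters: {
            type: "object",
            properties: {
              title: { type: "string" },
              deadline: { type: "string" },
            },
            required: ["title"],
          },
        },
      },
    ],
  }),
});

const data = await response.json();
// If the model chose to call a tool, relay that call to the MCP server;
// otherwise show the plain text reply.
console.log(data.choices[0].message.tool_calls ?? data.choices[0].message.content);
```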
The proxy reads LLM settings from proxy/.env. Copy the example and edit as needed:
```bash
cp proxy/.env.example proxy/.env
```
The defaults point to Ollama (http://localhost:11434, model llama3.1). To use a different model or provider, just edit proxy/.env — any OpenAI-compatible API works (Ollama, OpenAI, Anthropic-compatible, etc.). See proxy/.env.example for examples.
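For orientation, a working Ollama setup might look like the following; the variable names here are illustrative, and proxy/.env.example is authoritative:

```
# Illustrative values only; see proxy/.env.example for the real variable names
LLM_BASE_URL=http://localhost:11434/v1
LLM_MODEL=llama3.1
# LLM_API_KEY=...   # only needed for hosted providers
```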
The MCP server also works standalone with any native MCP client that speaks STDIO, such as Claude Desktop.
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
```json
{
  "mcpServers": {
    "task-manager": {
      "command": "node",
      "args": ["/absolute/path/to/todo-mcp-server/build/index.js"]
    }
  }
}
```
Replace /absolute/path/to/ with your actual path.
Alternatively, to run the server via npx instead of a local build, add this to claude_desktop_config.json and restart Claude Desktop.
```json
{
  "mcpServers": {
    "task-manager-mcp-server": {
      "command": "npx",
      "args": ["-y", "task-manager-mcp-server"]
    }
  }
}
```

Here the npx package name is assumed to match the server key; substitute the actual published package name if it differs.