Demonstrates an MCP server with calculate and text_stats tools over Streamable HTTP, integrating with LangChain/LangGraph and n8n agents.
This project recreates the demo objective and can be exercised with @modelcontextprotocol/inspector.
Security note: the OpenAI key pasted in the original prompt was exposed in chat. Revoke it and create a new key before running the agent. Put the new key in .env; do not commit it.
Copy-Item .env.example .env
npm install
Edit .env and set:
OPENAI_API_KEY=your_new_key
npm run server
The MCP endpoint is:
http://127.0.0.1:3000/mcp
The health endpoint is:
http://127.0.0.1:3000/health
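Before driving the MCP endpoint, it can be handy to probe /health programmatically. A minimal sketch, assuming only that the endpoint answers HTTP GET with a 2xx status when the server is up (the exact response body is not specified in this README):

```typescript
// Sketch: probe the demo server's health endpoint (Node 18+, global fetch).
// Returns true when the server answers with a 2xx status, false otherwise.
const HEALTH_URL = "http://127.0.0.1:3000/health";

async function checkHealth(url: string): Promise<boolean> {
  try {
    const res = await fetch(url);
    return res.ok;
  } catch {
    // Connection refused, DNS failure, etc. -> server not reachable.
    return false;
  }
}

checkHealth(HEALTH_URL).then((up) => {
  console.log(up ? "server is up" : "server is not reachable");
});
```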
In one terminal, keep the server running:
npm run server
In another terminal, list the tools with the Inspector CLI:
npm run inspector:list-tools
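Under the hood, the Inspector first performs an MCP initialize handshake and then sends tools/list, both as JSON-RPC 2.0 messages posted to the Streamable HTTP endpoint. A sketch of those two payloads (the protocolVersion string and clientInfo values are assumptions; the client and server negotiate the actual version):

```typescript
// Sketch of the JSON-RPC messages exchanged before tools are listed.
// The protocolVersion below is an assumption; it is negotiated in practice.
const initialize = {
  jsonrpc: "2.0" as const,
  id: 0,
  method: "initialize",
  params: {
    protocolVersion: "2025-03-26", // assumed; the SDK negotiates this
    capabilities: {},
    clientInfo: { name: "readme-sketch", version: "0.0.1" }, // illustrative
  },
};

const listTools = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/list",
  params: {},
};

// A successful tools/list response for this demo should name both
// "calculate" and "text_stats".
console.log(initialize.method, listTools.method);
```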
On Windows, the current Inspector CLI can print the correct JSON response and then exit with a Node/libuv assertion. If you see the tools JSON containing calculate and text_stats, the MCP call itself succeeded.
Call a tool with the Inspector CLI:
npx --yes @modelcontextprotocol/inspector --cli http://127.0.0.1:3000/mcp --transport http --method tools/call --tool-name calculate --tool-arg operation=add --tool-arg "numbers=[2,3,4]"
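The CLI invocation above maps to a single JSON-RPC tools/call request. This sketch builds that payload; the tool name and arguments come from this README, and the envelope is standard MCP:

```typescript
// Sketch: the JSON-RPC 2.0 payload behind the Inspector's tools/call.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = 1
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Equivalent of: --tool-name calculate --tool-arg operation=add
//                --tool-arg "numbers=[2,3,4]"
const req = buildToolCall("calculate", { operation: "add", numbers: [2, 3, 4] });

// POST this to http://127.0.0.1:3000/mcp with Content-Type: application/json
// and Accept: application/json, text/event-stream (Streamable HTTP).
console.log(JSON.stringify(req));
```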
You can also open the Inspector UI:
npm run inspector
Then select:
Transport: Streamable HTTP
URL: http://127.0.0.1:3000/mcp
npm run smoke
This lists the MCP tools and calls calculate.
Make sure the server is running and .env contains a valid rotated OPENAI_API_KEY.
npm run agent
Custom prompt:
npm run agent -- "Calculate 42 / 6 then analyze the text: Hello from MCP."
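Everything after the `--` separator is forwarded to the agent script as its prompt. A minimal sketch of how such an entrypoint might read it (the argv handling and the fallback text are assumptions about this repo's agent script, not taken from it):

```typescript
// Sketch: read an optional custom prompt from the command line.
// npm strips the "--" separator, so the prompt text arrives in process.argv.
// The default string below is illustrative only.
const prompt: string =
  process.argv.slice(2).join(" ") ||
  "Calculate 42 / 6 then analyze the text: Hello from MCP.";

console.log(`Prompt: ${prompt}`);
```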
Follow docs/n8n-agent.md.
The n8n MCP Client Tool configuration is:
Endpoint: http://127.0.0.1:3000/mcp
Server Transport: HTTP Streamable
Authentication: None
Tools to Include: All
Add this to claude_desktop_config.json and restart Claude Desktop.
{
"mcpServers": {
"mcp-streamable-http-demo": {
"command": "npx",
"args": []
}
}
}
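The `args` array above was left empty. Claude Desktop launches MCP servers over stdio, so connecting it to this Streamable HTTP endpoint needs a stdio-to-HTTP bridge; one plausible completion uses the community `mcp-remote` package (not part of this repo — an assumption, so verify the package and flags before relying on it):

```json
{
  "mcpServers": {
    "mcp-streamable-http-demo": {
      "command": "npx",
      "args": ["mcp-remote", "http://127.0.0.1:3000/mcp"]
    }
  }
}
```

The local server must already be running (npm run server) when Claude Desktop starts, since the bridge only proxies requests to the HTTP endpoint.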