An MCP-compatible server providing tools for retrieving order statuses, searching knowledge bases, and managing support tickets. It facilitates automated customer service interactions by exposing internal CRM and database functions through a JSON-RPC interface.
This project is a production-style starter for an AI customer support system built with FastAPI, a lightweight MCP-compatible tool server, and an OpenAI-compatible LLM client.
ai-support-agent/
├── backend/
│   ├── agent.py
│   ├── main.py
│   ├── mcp_server.py
│   ├── database/
│   │   └── models.py
│   └── tools/
│       ├── crm_tool.py
│       ├── kb_tool.py
│       ├── order_tool.py
│       └── ticket_tool.py
├── frontend/
│   └── simple_chat_ui.html
├── requirements.txt
└── README.md
Components:

- FastAPI backend: serves the /chat API, health endpoint, and the simple browser UI.
- SupportAgent: orchestrates LLM responses and decides when to call tools.
- MCP server: exposes get_order_status, search_knowledge_base, create_support_ticket, and get_customer_details over JSON-RPC at /mcp.
- Connectors: isolated tool classes for orders, CRM, knowledge base, and ticketing.
- Database: SQLAlchemy models for customers, orders, and support_tickets.

Requirements include the openai Python package. Install dependencies:

pip install -r requirements.txt
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_MODEL="gpt-4.1-mini"
export OPENAI_BASE_URL="https://api.openai.com/v1"
If OPENAI_API_KEY is not set, the app still runs in a rules-based fallback mode so you can test the flows locally.
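The mode switch can be pictured with a small sketch; `resolve_mode` is a hypothetical helper for illustration, not a function from the project's code:

```python
def resolve_mode(env: dict) -> str:
    """Hypothetical helper mirroring the behavior described above:
    with no OPENAI_API_KEY set, the app falls back to rules-based replies."""
    return "llm" if env.get("OPENAI_API_KEY") else "rules"

# With a key present the agent uses the LLM; without one, the fallback runs.
print(resolve_mode({"OPENAI_API_KEY": "sk-example"}))  # -> llm
print(resolve_mode({}))                                # -> rules
```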
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000
Then open http://localhost:8000.
POST /chat

Request:
{
  "customer_id": 1,
  "message": "Where is my order #45231?",
  "conversation_history": []
}
Response:
{
  "response": "Order #45231 for Noise-Cancelling Headphones is currently shipped. Carrier: FedEx. Tracking: ZX991245US. Estimated delivery: 2026-03-15.",
  "used_tools": [
    {
      "name": "get_order_status",
      "arguments": {
        "order_id": "45231"
      },
      "result": {
        "order_id": 45231,
        "customer_id": 1,
        "item_name": "Noise-Cancelling Headphones",
        "status": "shipped",
        "tracking_number": "ZX991245US",
        "shipping_carrier": "FedEx",
        "estimated_delivery": "2026-03-15",
        "total_amount": 199.99
      }
    }
  ],
  "escalated": false,
  "conversation_summary": "Customer 1 asked: Where is my order #45231?. Agent responded: ...",
  "llm_mode": false
}
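Client code can inspect `used_tools` to surface details such as tracking numbers. A hedged sketch over the response shape shown above (the helper name is an assumption, not part of the project's API):

```python
def tracking_numbers(chat_response: dict) -> list[str]:
    # Collect tracking numbers from any get_order_status tool calls
    # recorded in the response's used_tools list.
    return [
        tool["result"]["tracking_number"]
        for tool in chat_response.get("used_tools", [])
        if tool["name"] == "get_order_status"
        and tool["result"].get("tracking_number")
    ]

response = {
    "used_tools": [
        {
            "name": "get_order_status",
            "arguments": {"order_id": "45231"},
            "result": {"order_id": 45231, "status": "shipped",
                       "tracking_number": "ZX991245US"},
        }
    ]
}
print(tracking_numbers(response))  # -> ['ZX991245US']
```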
POST /mcp

Example initialization request:
{
  "jsonrpc": "2.0",
  "id": "1",
  "method": "initialize",
  "params": {}
}

Example tool list request:

{
  "jsonrpc": "2.0",
  "id": "2",
  "method": "tools/list",
  "params": {}
}

Example tool call request:

{
  "jsonrpc": "2.0",
  "id": "3",
  "method": "tools/call",
  "params": {
    "name": "get_order_status",
    "arguments": {
      "order_id": "45231"
    }
  }
}
The app creates and seeds a local SQLite database file named support_agent.db on startup with example customers, orders, and support tickets.
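The seeding step can be pictured with the stdlib sqlite3 module; the real project uses SQLAlchemy models, and the table and column names below are assumptions:

```python
import sqlite3

# Illustrative sketch of startup seeding; the actual schema lives in
# backend/database/models.py and may differ.
conn = sqlite3.connect(":memory:")  # the app would use support_agent.db
conn.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
    "customer_id INTEGER, status TEXT)"
)
conn.execute("INSERT INTO orders VALUES (45231, 1, 'shipped')")
row = conn.execute(
    "SELECT status FROM orders WHERE order_id = 45231"
).fetchone()
print(row[0])  # -> shipped
```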
To migrate to PostgreSQL, set DATABASE_URL in backend/database/models.py.

User message:

Where is my order #45231?

Expected flow: the agent extracts the order id from the message and calls get_order_status(order_id).

To use this server with Claude Desktop, add the following to claude_desktop_config.json and restart Claude Desktop.
{
  "mcpServers": {
    "ai-customer-support-agent": {
      "command": "npx",
      "args": []
    }
  }
}