A Model Context Protocol (MCP) server that enables AI assistants to query Tideways performance monitoring data and provide conversational performance insights for PHP applications.
About Tideways: Tideways is a powerful application performance monitoring (APM) platform designed specifically for PHP applications. For technical details, see the REST API documentation.
Forked from abuhamza/tideways-mcp-server by Mouhammed Diop.
Repository: 5hahiL/tideways-mcp-server License: MIT
The API token requires the metrics, issues, and traces scopes - see the API documentation.

This is an MCP (Model Context Protocol) server designed exclusively for AI assistants. It cannot be used as a standalone CLI tool.
The server integrates with AI assistants through MCP configuration using the npm package tideways-mcp.
| Variable | Required | Default | Description |
|---|---|---|---|
| `TIDEWAYS_TOKEN` | ✅ | - | Tideways API access token (see Security section) |
| `TIDEWAYS_ORG` | ✅ | - | Tideways organization name |
| `TIDEWAYS_PROJECT` | ✅ | - | Tideways project name |
| `TIDEWAYS_BASE_URL` | ❌ | `https://app.tideways.io/apps/api` | Tideways API base URL |
| `TIDEWAYS_RATE_LIMIT` | ❌ | `2500` | API requests per hour — match to your plan (Team/Pro: 2500, Standard: 1000, Basic: 250) |
| `TIDEWAYS_MAX_RETRIES` | ❌ | `3` | Maximum API retry attempts |
| `TIDEWAYS_REQUEST_TIMEOUT` | ❌ | `30000` | API request timeout (ms) |
| `LOG_LEVEL` | ❌ | `info` | Log level (`debug`, `info`, `warn`, `error`) |
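The required/optional split in the table above can be enforced at startup by failing fast on missing variables. A minimal sketch of such environment-based loading, with the defaults from the table (this is an illustration, not the actual `src/config/index.ts`):

```typescript
// Hypothetical sketch of environment-based config loading with the
// defaults from the table above; the real src/config/index.ts may differ.
interface TidewaysConfig {
  token: string;
  org: string;
  project: string;
  baseUrl: string;
  rateLimit: number;
  maxRetries: number;
  requestTimeoutMs: number;
  logLevel: string;
}

function loadConfig(env: Record<string, string | undefined>): TidewaysConfig {
  // Fail fast when a required variable is missing.
  const required = (name: string): string => {
    const value = env[name];
    if (!value) throw new Error(`Missing required environment variable: ${name}`);
    return value;
  };

  return {
    token: required("TIDEWAYS_TOKEN"),
    org: required("TIDEWAYS_ORG"),
    project: required("TIDEWAYS_PROJECT"),
    baseUrl: env.TIDEWAYS_BASE_URL ?? "https://app.tideways.io/apps/api",
    rateLimit: Number(env.TIDEWAYS_RATE_LIMIT ?? 2500),
    maxRetries: Number(env.TIDEWAYS_MAX_RETRIES ?? 3),
    requestTimeoutMs: Number(env.TIDEWAYS_REQUEST_TIMEOUT ?? 30000),
    logLevel: env.LOG_LEVEL ?? "info",
  };
}
```

In a Node.js process you would call `loadConfig(process.env)` once at startup, so a misconfigured server exits immediately instead of failing on its first API call.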
This server only works with MCP-compatible AI assistants. It uses stdio transport.
Add to your Claude Desktop MCP configuration file:
Location:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/claude/claude_desktop_config.json`

Configuration (Recommended - using npx):
```json
{
  "mcpServers": {
    "tideways": {
      "command": "npx",
      "args": ["tideways-mcp"],
      "env": {
        "TIDEWAYS_TOKEN": "your_token",
        "TIDEWAYS_ORG": "your_org",
        "TIDEWAYS_PROJECT": "your_project"
      }
    }
  }
}
```
Alternative (if installed globally):
```json
{
  "mcpServers": {
    "tideways": {
      "command": "tideways-mcp",
      "env": {
        "TIDEWAYS_TOKEN": "your_token",
        "TIDEWAYS_ORG": "your_org",
        "TIDEWAYS_PROJECT": "your_project"
      }
    }
  }
}
```
Cursor supports MCP through its settings. Add the server configuration in Cursor's MCP settings:
```json
{
  "mcpServers": {
    "tideways": {
      "command": "tideways-mcp",
      "env": {
        "TIDEWAYS_TOKEN": "your_token",
        "TIDEWAYS_ORG": "your_org",
        "TIDEWAYS_PROJECT": "your_project"
      }
    }
  }
}
```
If using VS Code with an MCP-compatible extension:
```json
{
  "mcp.servers": {
    "tideways": {
      "command": "npx",
      "args": ["tideways-mcp"],
      "env": {
        "TIDEWAYS_TOKEN": "your_token",
        "TIDEWAYS_ORG": "your_org",
        "TIDEWAYS_PROJECT": "your_project"
      }
    }
  }
}
```
Once configured, you can ask your AI assistant questions like:
- "Analyze the /api/users/{id} endpoint and identify bottlenecks"
- "Review /dashboard and recommend code optimizations"

All tools return raw JSON from the Tideways API. The AI assistant (Claude, Cursor, etc.) performs the actual analysis and interpretation of this data.
`get_performance_metrics`: Retrieve aggregate performance metrics and system-wide statistics.
Parameters:
- `ts` (optional): End timestamp in `Y-m-d H:i` format (e.g., "2025-08-12 18:30")
- `m` (optional): Number of minutes backward from the timestamp (e.g., 60 for 1 hour, 1440 for 24 hours)
- `env` (optional): Filter by a specific environment
- `s` (optional): Filter by a specific service name

Conversational Examples:
"What's the current performance of my application?"
"Show me performance metrics for the last 6 hours"
"Get metrics for the API service in production"
"How is my web service performing in the staging environment?"
"Compare today's metrics with the last 24 hours"
Returns: Raw performance data from Tideways including response times, throughput, error rates, and transaction breakdowns.
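A question like "the last 6 hours" maps onto the `ts`/`m` pair: an end timestamp plus a window of minutes counting backward. A sketch of that conversion (the helper name here is illustrative, not part of the server):

```typescript
// Illustrative helper: build the ts (Y-m-d H:i) and m arguments for a
// "last N hours ending now" query. Not part of the actual server code.
function lastHoursWindow(hours: number, now: Date = new Date()): { ts: string; m: number } {
  const pad = (n: number): string => String(n).padStart(2, "0");
  const ts =
    `${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())} ` +
    `${pad(now.getHours())}:${pad(now.getMinutes())}`;
  return { ts, m: hours * 60 }; // m counts minutes backward from ts
}
```

For example, `lastHoursWindow(6)` at 18:30 on 2025-08-12 produces `{ ts: "2025-08-12 18:30", m: 360 }`.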
`get_performance_summary`: Retrieve time-series performance summary data in 15-minute intervals for trend analysis.
Parameters:
- `s` (optional): Service name to filter by (e.g., "web", "api", "worker"). Default: "web"

Conversational Examples:
"Show me performance trends over the last few hours"
"Get the performance summary for my API service"
"How has my web service been performing recently?"
"Display trends for the worker service"
"Show me response time patterns for today"
Returns: Raw time-series data with 15-minute intervals showing response times, request counts, and error rates.
`get_issues`: Retrieve and analyze recent errors, exceptions, and performance issues.
Parameters:
- `issue_type` (optional): "error", "slowsql", "deprecated", "all" (default: "all")
- `status` (optional): "open", "new", "resolved", "not_error", "ignored", "all" (default: "open")
- `page` (optional): Page number for pagination (default: 1)

Conversational Examples:
"What errors are currently happening in my application?"
"Show me all open errors from the last 24 hours"
"Get slow SQL queries that need attention"
"Are there any new performance issues I should know about?"
"List all deprecated function calls in my code"
"Show me resolved errors to understand what was fixed"
Returns: Raw issue data from Tideways including error types, occurrence counts, affected endpoints, and stack traces where available.
`get_traces`: Analyze individual trace samples for detailed bottleneck identification and performance debugging.
Parameters:
- `env` (optional): Environment name (e.g., "production", "staging")
- `s` (optional): Service name (e.g., "web", "api", "worker")
- `transaction_name` (optional): Filter by specific transaction/endpoint name
- `has_callgraph` (optional): Only return traces with detailed callgraph data
- `search` (optional): Word-based search on transaction_name, host, and URL
- `min_date` (optional): Minimal date in `YYYY-MM-DD HH:MM` format (requires `max_date`)
- `max_date` (optional): Maximal date in `YYYY-MM-DD HH:MM` format (requires `min_date`)
- `min_response_time_ms` (optional): Minimum response time filter
- `max_response_time_ms` (optional): Maximum response time filter
- `sort_by` (optional): "response_time", "date", "memory" (default: "response_time")
- `sort_order` (optional): "ASC", "DESC" (default: "DESC")

Conversational Examples:
"Analyze traces for the /api/products endpoint and find bottlenecks"
"Show me the slowest requests from the last hour with details"
"Find traces with callgraph data for the checkout process"
"What's causing slow response times in my user registration flow?"
"Detect N+1 query problems in my product listing page"
"Analyze memory usage patterns in my API endpoints"
"Find database bottlenecks in the /dashboard endpoint"
"Show me traces where response time is over 2 seconds"
Returns: Raw trace data from Tideways including per-request timing, layer breakdown (SQL, Redis, HTTP, etc.), bottleneck flags, and callgraph data when has_callgraph: true is set. Use has_callgraph: true for the deepest debugging detail.
`get_historical_data`: Retrieve historical performance data for specific dates with configurable granularity.
Parameters:
- `date` (required): Date in `YYYY-MM-DD` format
- `granularity` (optional): "day", "week", "month" (default: "day")

Conversational Examples:
"Get historical performance data for August 1st, 2025"
"Show me weekly performance trends for last Monday"
"Compare this month's performance with last month"
"How did my application perform on 2025-07-15?"
"Get daily performance data for the past week"
"Show me monthly trends for the last quarter"
Returns: Raw historical performance data from Tideways for the specified date and granularity.
```
├── src/
│   ├── config/                  # Configuration management
│   ├── lib/                     # Core libraries
│   │   ├── errors.ts            # Error handling utilities
│   │   ├── logger.ts            # Structured logging
│   │   └── tideways-client.ts   # Tideways API client
│   ├── tools/                   # MCP tool implementations
│   │   ├── definitions.ts       # Tool schema definitions
│   │   ├── registry.ts          # Tool execution registry
│   │   └── handlers/            # Individual tool handlers
│   ├── types/                   # TypeScript type definitions
│   ├── utils/                   # Utility functions
│   ├── server.ts                # Main MCP server implementation
│   └── index.ts                 # Application entry point
├── tests/                       # Test suites
└── dist/                        # Compiled JavaScript (generated)
```
```shell
# Run all tests
npm test

# Run tests with coverage
npm run test:coverage

# Run tests in watch mode
npm run test:watch

# Run type checking
npm run typecheck

# Build TypeScript to JavaScript
npm run build

# Clean build artifacts
npm run clean

# Run linter
npm run lint

# Fix linting issues
npm run lint:fix

# Format code
npm run format
```
- Server (`src/server.ts`): Main server implementing the MCP protocol; handles tool definitions and routing
- API client (`src/lib/tideways-client.ts`): HTTP client with rate limiting, retry logic, and security measures
- Tools (`src/tools/`): Modular tool system with individual handlers for each MCP tool
- Error handling (`src/lib/errors.ts`): Centralized error handling with user-friendly messages
- Logging (`src/lib/logger.ts`): Structured JSON logging for monitoring and debugging
- Configuration (`src/config/index.ts`): Environment-based configuration management

Request flow:

```
AI Assistant ←→ MCP Protocol (stdio) ←→ TidewaysMCPServer → TidewaysClient → Tideways API
                                                                   ↓
                                               Raw JSON Response → AI Assistant
```
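The registry/handler split can be pictured as a map from tool name to an async handler that returns raw JSON text. A simplified sketch (the types and class name here are illustrative, not the actual `src/tools/registry.ts`):

```typescript
// Simplified sketch of a tool registry: each MCP tool name maps to a
// handler that fetches from the Tideways API and returns raw JSON text.
// Illustrative only; the real src/tools/registry.ts may differ.
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

class ToolRegistry {
  private handlers = new Map<string, ToolHandler>();

  register(name: string, handler: ToolHandler): void {
    this.handlers.set(name, handler);
  }

  async execute(name: string, args: Record<string, unknown>): Promise<string> {
    const handler = this.handlers.get(name);
    if (!handler) throw new Error(`Unknown tool: ${name}`);
    // Handlers return raw JSON; the AI assistant does the interpretation.
    return handler(args);
  }
}
```

Each tool (e.g., `get_performance_metrics`) would be registered with a handler that calls the Tideways client and serializes the response, keeping the server itself free of analysis logic.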
This server uses a raw JSON approach for optimal performance:
- API responses are returned as `JSON.stringify(apiData, null, 2)` without additional formatting
- Set `TIDEWAYS_RATE_LIMIT` to match your Tideways plan (default: 2500/hr)
- API tokens are never written to logs; the Authorization header is redacted as `Bearer [REDACTED]`

The server provides structured JSON logs for monitoring:
```json
{
  "timestamp": "2025-08-09T10:00:00.000Z",
  "level": "info",
  "message": "Tool called",
  "context": {
    "toolName": "get_performance_metrics",
    "arguments": {"time_range": "24h"}
  }
}
```
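A minimal structured logger producing entries of this shape, including bearer-token redaction, might look like the following (a sketch under assumed names, not the actual `src/lib/logger.ts`):

```typescript
// Sketch of a structured JSON logger that redacts bearer tokens before
// emitting. Illustrative only; the real src/lib/logger.ts may differ.
function redactTokens(value: string): string {
  // Replace any bearer token (stopping at whitespace or a closing quote)
  // so secrets never reach the log output.
  return value.replace(/Bearer\s+[^"\s]+/g, "Bearer [REDACTED]");
}

function logEntry(level: string, message: string, context: Record<string, unknown>): string {
  const entry = {
    timestamp: new Date().toISOString(),
    level,
    message,
    context,
  };
  // Redact after serialization so tokens nested anywhere in context are caught.
  return redactTokens(JSON.stringify(entry));
}
```

Redacting the serialized string (rather than individual fields) is a simple way to catch tokens wherever they appear in the context object.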
Authentication Error
Error: Authentication failed. Please check your API token.
- Verify `TIDEWAYS_TOKEN` is correct and has the required scopes (metrics, issues, traces)

Rate Limit Exceeded
Error: Rate limit exceeded. Please try again later.
- Set `TIDEWAYS_RATE_LIMIT` to match your actual plan limit

Connection Issues
Error: Network error: Unable to connect to Tideways API.
- Check network connectivity to `app.tideways.io`
- Test the token directly: `curl -H "Authorization: Bearer YOUR_TOKEN" https://app.tideways.io/apps/api/_token`

MCP Integration Issues
Error: MCP server not responding or connection failed
- Test the server manually: `npx tideways-mcp`

Enable debug logging for detailed troubleshooting:
```shell
# When running directly
LOG_LEVEL=debug npx tideways-mcp
```

In the MCP configuration, add to `env`:

```json
{
  "env": {
    "LOG_LEVEL": "debug",
    "TIDEWAYS_TOKEN": "your_token",
    ...
  }
}
```
Contributions welcome!
1. Create a feature branch: `git checkout -b your-feature`
2. Make your changes and run the tests: `npm test`

This project is licensed under the MIT License - see the LICENSE file for details.
Add this to `claude_desktop_config.json` and restart Claude Desktop:

```json
{
  "mcpServers": {
    "tideways-mcp-server": {
      "command": "npx",
      "args": ["tideways-mcp"]
    }
  }
}
```