Enables AI agents and LLMs to interact with the ICON mcp v105 API through standardized tools. It supports Model Context Protocol integration for seamless access to ICON endpoints and asynchronous operations.
## Overview

This is an MCP (Model Context Protocol) server that provides access to the ICON mcp v105 API. It enables AI agents and LLMs to interact with ICON mcp v105 through standardized tools.
## Tools

This server provides the following tools:

- `example_tool`: Placeholder tool (to be implemented)

Note: Replace `example_tool` with actual ICON mcp v105 API tools based on the documentation.
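As a sketch of the shape such a tool might take (the function body and return value are placeholders, not the real ICON API):

```python
# Sketch only: in server.py this function would carry the @mcp.tool()
# decorator so the server exposes it; shown undecorated here.
def example_tool(query: str) -> str:
    """Placeholder tool: echo the query until real ICON endpoints are wired in."""
    return f"ICON mcp v105 received: {query}"
```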
## Installation

### Running with Docker

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/icon-mcp-v105-mcp-server.git
   cd icon-mcp-v105-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```

### Running with Docker Compose

1. Create a `.env` file with your configuration:

   ```
   PORT=8000
   ```

2. Start the server:

   ```bash
   docker-compose up
   ```

### Running Locally

1. Install dependencies using uv:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Usage
### Health Check
Test if the server is running:
```bash
python mcp_health_check.py
```
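If the helper script is not at hand, a rough liveness check can be sketched with the standard library (the URL and port below are the defaults assumed elsewhere in this README; note that a plain GET against a streamable-http MCP endpoint may return an error status, which still proves the server is up):

```python
import urllib.error
import urllib.request


def check_health(url: str, timeout: float = 2.0) -> bool:
    """Return True if anything answers HTTP at `url`, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server responded with an error status (e.g. 405 for GET on an
        # MCP endpoint), which still means it is up and reachable.
        return True
    except (urllib.error.URLError, OSError):
        return False
```

For example, `check_health("http://localhost:8000/mcp/")` should return `True` once the server from the steps above is running.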
### Using the Tools

```python
import asyncio

from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter


async def main():
    # Connect to the MCP server
    with create_mcp_adapter(
        url="http://localhost:8000/mcp/"
    ) as tools:
        # Use the tools
        for tool in tools:
            print(f"Available tool: {tool.name}")
            # Example usage
            result = await tool.example_tool(query="test")
            print(result)


asyncio.run(main())
```
## Adding New Tools

To add new tools, edit `server.py` and:

1. Add new `@mcp.tool()` decorated functions
2. Update `deployment_params.json` with the tool names in the `capabilities` array

## Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server:
```json
{
  "github_url": "https://github.com/Traia-IO/icon-mcp-v105-mcp-server",
  "mcp_server": {
    "name": "icon-mcp-v105-mcp",
    "description": "V105 icon mcp",
    "server_type": "streamable-http",
    "capabilities": [
      // List all implemented tool names here
      "example_tool"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["icon mcp v105", "api"],
  "ref": "main"
}
```
Important: Always update the capabilities array when you add or remove tools!
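To guard against the `capabilities` array drifting out of sync with the code, a small consistency check can be sketched (the file layout follows the example above; the helper name and the demo tool names are hypothetical):

```python
def missing_capabilities(params: dict, implemented: set) -> dict:
    """Compare the declared capabilities against the tools actually implemented.

    Returns names that are declared but not implemented, and vice versa.
    """
    declared = set(params["mcp_server"]["capabilities"])
    return {
        "declared_not_implemented": sorted(declared - implemented),
        "implemented_not_declared": sorted(implemented - declared),
    }


# Hypothetical example: one stale declared name, one tool not yet declared.
demo = missing_capabilities(
    {"mcp_server": {"capabilities": ["example_tool", "old_tool"]}},
    {"example_tool", "new_tool"},
)
```

In practice the first argument would come from `json.load(open("deployment_params.json"))` and the second from the `@mcp.tool()` functions registered in `server.py`.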
## Deployment

This server is designed to be deployed on Google Cloud Run. The deployment will:

- Expose a `/mcp` endpoint for client connections

### Environment Variables

- `PORT`: Server port (default: 8000)
- `STAGE`: Environment stage (default: MAINNET; options: MAINNET, TESTNET)
- `LOG_LEVEL`: Logging level (default: INFO)

### Troubleshooting

Check the container logs with `docker logs <container-id>`.

### Claude Desktop

Add this to `claude_desktop_config.json` and restart Claude Desktop:
```json
{
  "mcpServers": {
    "icon-mcp-v105": {
      "command": "npx",
      "args": []
    }
  }
}
```
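The environment variables listed above could be read with standard-library defaults; a minimal sketch (the helper name is hypothetical, the variable names and defaults are taken from this README):

```python
import os


def server_config() -> dict:
    """Read the server's configuration from the environment, with README defaults."""
    stage = os.environ.get("STAGE", "MAINNET")
    if stage not in {"MAINNET", "TESTNET"}:
        raise ValueError(f"Unsupported STAGE: {stage}")
    return {
        "port": int(os.environ.get("PORT", "8000")),
        "stage": stage,
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```

Failing fast on an unknown `STAGE` keeps a typo in the deployment configuration from silently running against the wrong network.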