A demo server that allows AI models to manage a personal reading list stored in a local SQLite database. It provides tools for searching, adding, and updating books while demonstrating core Model Context Protocol features like resources and tools.
This is a functional Model Context Protocol (MCP) server built with the FastMCP framework. It provides a structured interface for an AI model to interact with a local SQLite database that tracks a personal reading list.
This project is a very simple demo designed to see the Model Context Protocol (MCP) in action. It serves as a minimal, "Hello World"-style example to help you nail the basics of tools, resources, and stdio transport.

Initialize the Database: Create the SQLite database and populate it with sample data:
uv run python init_db.py
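If you're curious what this step does, the setup script can be imagined roughly like this (the schema and sample data shown here are illustrative assumptions; the real ones live in `init_db.py`):

```python
import sqlite3

# Hypothetical sample data; the real init_db.py defines its own.
SAMPLE_BOOKS = [
    ("Dune", "Frank Herbert", "finished"),
    ("The Pragmatic Programmer", "Andrew Hunt", "reading"),
]


def init_db(path: str = "books.db") -> sqlite3.Connection:
    """Create the books table and seed it with sample rows."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS books (
               id INTEGER PRIMARY KEY,
               title TEXT NOT NULL,
               author TEXT NOT NULL,
               status TEXT DEFAULT 'to-read'
           )"""
    )
    conn.executemany(
        "INSERT INTO books (title, author, status) VALUES (?, ?, ?)",
        SAMPLE_BOOKS,
    )
    conn.commit()
    return conn
```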
Run the Smoke Test: This script acts as a smoke test for your MCP server. It starts the server in the background and simulates how an AI model would interact with it (reading resources, calling tools) without needing an actual AI model connected:
uv run python main.py
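Under the hood, the smoke test and the server speak JSON-RPC 2.0 over stdin/stdout. The first message a client sends is an `initialize` request, roughly like the following (exact fields and the protocol version depend on your SDK version):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "smoke-test", "version": "0.1.0" }
  }
}
```

The FastMCP framework handles this handshake for you on the server side, which is why `server.py` never deals with raw JSON-RPC.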
server.py: The MCP server implementation using FastMCP.
main.py: A smoke test script that demonstrates how to interact with the server.
init_db.py: A setup script to create the local books.db SQLite database.
pyproject.toml: Project configuration and dependencies (managed by uv).
books.db: The local SQLite database (created after running init_db.py).

You can connect this server to any MCP-compatible client. The examples below use /Users/sanka/Documents/workspace/mcp-demo as the project path; replace it with the absolute path to your own clone of this repository.
You can add the server automatically using the Gemini CLI:
gemini mcp add --scope project personal-library uv --directory $(pwd) run python server.py
Or manually add this to .gemini/settings.json:
{
"mcpServers": {
"personal-library": {
"command": "uv",
"args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"],
"trust": true
}
}
}
Add this to your claude_desktop_config.json (typically in ~/Library/Application Support/Claude/ on macOS):
{
"mcpServers": {
"personal-library": {
"command": "uv",
"args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
}
}
}
Open the MCP Settings in Cline or edit cline_mcp_settings.json:
{
"mcpServers": {
"personal-library": {
"command": "uv",
"args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
}
}
}
Once you've connected the server to your favorite AI assistant, try asking it to search, add, or update books in your reading list.
A note on names: the server key in settings.json (e.g., "personal-library") is a unique identifier your AI client uses to manage multiple servers, while the name passed to FastMCP("Personal Library Manager") in server.py is what appears in the UI of apps like Claude Desktop. Tool and resource names (add_book, library://...) must match exactly between server.py and main.py. You don't need to use these names in your prompts, though: the AI assistant automatically discovers all available tools and resources once the server is connected.