A fully local, free AI-powered file reader that uses MCP (Model Context Protocol) to connect a Python tool server with a local LLM (Ollama + Mistral). No API keys, no cloud, no cost — runs entirely on your machine.
The server exposes two MCP tools: `list_files` and `read_file`.

| Technology | Description |
|---|---|
| Python | Core language for server and client |
| MCP | Model Context Protocol (tool server) |
| Ollama | Local LLM runtime (free, offline) |
| Mistral | Local AI model for summarization |
| asyncio | Async communication between client/server |
| requests | HTTP calls to Ollama API |
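Stripped of the MCP plumbing, the core logic of the two tools is small. A minimal sketch (the function names match the tools; the `base_dir` parameter is added here for illustration and testability):

```python
from pathlib import Path

def list_files(base_dir: str = "my_files") -> list[str]:
    """Return the names of regular files inside the allowed folder."""
    base = Path(base_dir)
    return sorted(p.name for p in base.iterdir() if p.is_file())

def read_file(filename: str, base_dir: str = "my_files") -> str:
    """Return a file's contents, refusing paths that escape base_dir."""
    base = Path(base_dir).resolve()
    target = (base / filename).resolve()
    # Block traversal attempts such as ../../etc/passwd
    if base not in target.parents:
        raise ValueError(f"Access denied: {filename}")
    return target.read_text()
```

The actual `server.py` wraps these behaviors as MCP tools so the client can call them over the protocol.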
```bash
git clone https://github.com/JaneKarunyaJ/MCP-File-Reader.git
cd MCP-File-Reader
pip install mcp requests
```
Install Ollama from https://ollama.com, then pull the model:
```bash
ollama pull mistral
```
Make sure Ollama is running (it starts automatically after installation), then:
```bash
python client.py
```
The client will:

1. Call `list_files` to discover the files in `my_files/`
2. Call `read_file` for each file found
3. Send the real contents to Ollama (Mistral) for summarization

Project structure:

```
MCP-File-Reader/
│
├── server.py          # MCP server — exposes list_files and read_file tools
├── client.py          # MCP client — calls tools and queries Ollama
├── requirements.txt   # Python dependencies
└── my_files/          # Folder the AI is allowed to read
    ├── project_ideas.txt
    └── wishlist.txt
```
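The client's asynchronous call pattern can be sketched without the MCP SDK; the two coroutines below are stand-ins for real MCP tool calls over stdio:

```python
import asyncio

# Stand-ins for MCP tool calls; the real client awaits an MCP session instead.
async def call_list_files() -> list[str]:
    return ["project_ideas.txt", "wishlist.txt"]

async def call_read_file(name: str) -> str:
    return f"(contents of {name})"

async def main() -> str:
    files = await call_list_files()                     # Step 1: discover files
    bodies = [await call_read_file(f) for f in files]   # Step 2: read each file
    # Step 3 would send the joined contents to Ollama for summarization.
    return "\n".join(bodies)

combined = asyncio.run(main())
```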
The server only reads files inside the `my_files/` directory; path traversal attempts (e.g. `../../etc/passwd`) are automatically blocked. The whole flow is driven by `client.py`:
```
client.py
│
├── Step 1: Calls MCP tool → list_files()
│        ↓
│        Returns filenames from my_files/
│
├── Step 2: Calls MCP tool → read_file(filename)
│        ↓
│        Returns actual file contents
│
└── Step 3: Sends real content to Ollama (Mistral)
         ↓
         Returns AI summary
```
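Step 3 is a plain HTTP call to Ollama's local REST API. A sketch, assuming Ollama's default port 11434 and the `/api/generate` endpoint (the prompt wording here is illustrative, not the exact one in `client.py`):

```python
import requests

MODEL = "mistral"
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(question: str, file_contents: str) -> dict:
    """Assemble a non-streaming generate request for Ollama."""
    return {
        "model": MODEL,
        "prompt": f"{question}\n\nFiles:\n{file_contents}",
        "stream": False,  # one JSON response instead of a token stream
    }

def summarize(question: str, file_contents: str) -> str:
    """Send the real file contents to the local Mistral model."""
    resp = requests.post(OLLAMA_URL,
                         json=build_payload(question, file_contents),
                         timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]
```

Because everything stays on localhost, no API key is involved at any point.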
Try it out:

- Drop any `.txt` file into `my_files/` and run again
- Add new tools to `server.py` (e.g. `search_in_file`, `write_file`)
- Change `user_question` in `client.py` to ask anything about your files
- Change `MODEL = "mistral"` in `client.py` to any model you have pulled in Ollama (`ollama pull mistral`)

Requirements: Ollama with the Mistral model pulled (`ollama pull mistral`), plus the `mcp` and `requests` Python packages.

To use the server from Claude Desktop, add this to `claude_desktop_config.json` and restart Claude Desktop:
Note that `npx` with empty `args` cannot launch this Python server; a working entry points at `server.py` (adjust the path for your machine):

```json
{
  "mcpServers": {
    "mcp-file-reader": {
      "command": "python",
      "args": ["/absolute/path/to/MCP-File-Reader/server.py"]
    }
  }
}
```
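As an example of extending `server.py`, the `search_in_file` tool suggested above could return the lines of a file matching a query. This implementation is a hypothetical sketch, reusing the same traversal guard as `read_file`:

```python
from pathlib import Path

def search_in_file(filename: str, query: str,
                   base_dir: str = "my_files") -> list[str]:
    """Hypothetical extra tool: return lines of the file containing query."""
    base = Path(base_dir).resolve()
    target = (base / filename).resolve()
    if base not in target.parents:  # same path-traversal guard as read_file
        raise ValueError(f"Access denied: {filename}")
    return [line for line in target.read_text().splitlines()
            if query.lower() in line.lower()]
```

Registered as an MCP tool, it would let the client search files without shipping their full contents to the model.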