Provides AI agents with access to file metadata, vector search, and workflow metrics. It enables operations such as file metadata retrieval and semantic search over file embeddings using pgvector.
MCP Server for Istedlal AI Agents - file metadata, vector search, workflow metrics access.
Dependencies are listed in `requirements.txt`.

```shell
# Create virtual environment
python -m venv venv
source venv/bin/activate   # Linux/macOS
venv\Scripts\activate      # Windows

# Install dependencies
pip install -r requirements.txt

# Create .env with required variables (see docs/ENV_SETUP.md)
```
Terminal testing (use streamable-http to avoid "Invalid JSON: EOF" errors):

```shell
# .env: MCP_TRANSPORT=streamable-http
python -m src.main
# Server at http://localhost:8000/mcp
```
Cursor/IDE integration (stdio – Cursor spawns the process, don't run it manually):

```shell
# .env: MCP_TRANSPORT=stdio
# Add server to Cursor MCP settings; Cursor will start it automatically
```
- `get_file_metadata` – Fetch metadata for a file by ID (real DB when `VECTOR_PROVIDER=pgvector`)
- `search_files` – Search files by metadata filters (real DB when pgvector)
- `semantic_search_files` – Semantic search over file embeddings (Ollama + pgvector)

See docs/MCP_INSPECTOR_GUIDE.md for the complete step-by-step guide.
```shell
npx -y @modelcontextprotocol/inspector
```
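Once the server is running over streamable-http, a tool call is a JSON-RPC 2.0 POST to `/mcp`. A minimal sketch of building such a request body — the tool names come from the list above, but the argument names (`query`, `top_k`, `file_id`) are illustrative assumptions, not confirmed by this README:

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 'tools/call' request body for the /mcp endpoint."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical arguments -- the server's real parameter names may differ
req = build_tool_call("semantic_search_files", {"query": "contract terms", "top_k": 5})
body = json.dumps(req)
# POST `body` to http://localhost:8000/mcp with headers:
#   Authorization: Bearer <MCP_BEARER_TOKEN>
#   Content-Type: application/json
```

MCP Inspector (above) sends the same shape of request for you, which makes it the easier way to explore the tools interactively.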
| Item | Required | Notes |
|---|---|---|
| Dockerfile | Yes | Build container image |
| .dockerignore | Yes | Exclude venv, .env, `__pycache__` |
| Production .env | Yes | Set on server (never commit) |
| Port 8000 | Yes | Expose for MCP endpoint |
| PostgreSQL + pgvector | Phase 2 | document_metadata, document_embeddings (see data/vectordb_schema_documentation.pdf) |
| Ollama | Phase 2 | For semantic search query embeddings |
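The Phase 2 semantic-search path typically embeds the query via Ollama and then runs a nearest-neighbour lookup with pgvector's cosine-distance operator `<=>`. A sketch of the SQL: the `document_metadata`/`document_embeddings` table names come from the schema doc referenced above, but the `embedding`, `file_id`, and `filename` column names are assumptions.

```python
def semantic_search_sql(top_k: int) -> str:
    """pgvector cosine-distance search; bind the query embedding as a
    parameter (%(query_vec)s) via psycopg, never by string formatting."""
    return (
        "SELECT m.file_id, m.filename, "
        "e.embedding <=> %(query_vec)s::vector AS distance "
        "FROM document_embeddings e "
        "JOIN document_metadata m ON m.file_id = e.file_id "
        "ORDER BY distance ASC "
        f"LIMIT {int(top_k)}"
    )
```

Ordering by `<=>` distance ascending returns the closest embeddings first; with an HNSW or IVFFlat index on the embedding column, the same query can use approximate search at scale.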
Files excluded from deployment:

- `.cursor/` – Cursor IDE config only, not needed on server
- `venv/` – Create fresh on server or use Docker
- `.env` – Contains secrets; set separately on server
- `__pycache__/` – Python cache, auto-generated
- `data/` – Reference docs only, not runtime

Production `.env` example:

```shell
MCP_TRANSPORT=streamable-http
HTTP_HOST=0.0.0.0
HTTP_PORT=8000
DATABASE_URL=postgresql://user:password@db-host:5432/dbname
VECTOR_PROVIDER=pgvector            # mock | pgvector | chromadb
OLLAMA_URL=https://your-ollama:11433
OLLAMA_EMBEDDING_MODEL=llama3.2
OLLAMA_USERNAME=                    # if Basic Auth required
OLLAMA_PASSWORD=
LOG_LEVEL=INFO
MCP_BEARER_TOKEN=your-secret-token  # Required – Bearer token auth for /mcp
```
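The server reads this configuration at startup. A minimal sketch of how such loading and validation might look — the actual loader in `src/` may differ, and the function and key names here are illustrative:

```python
import os

def load_settings(env=None) -> dict:
    """Read the variables above. MCP_BEARER_TOKEN is mandatory (it guards /mcp);
    the rest fall back to the defaults shown in the .env example."""
    env = os.environ if env is None else env
    token = env.get("MCP_BEARER_TOKEN")
    if not token:
        raise RuntimeError("MCP_BEARER_TOKEN is required (Bearer auth for /mcp)")
    return {
        "transport": env.get("MCP_TRANSPORT", "streamable-http"),
        "host": env.get("HTTP_HOST", "0.0.0.0"),
        "port": int(env.get("HTTP_PORT", "8000")),
        "vector_provider": env.get("VECTOR_PROVIDER", "mock"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        "bearer_token": token,
    }
```

Failing fast on a missing `MCP_BEARER_TOKEN` is deliberate: a server that silently starts without auth on `/mcp` would be exposed to anyone who can reach port 8000.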
Dockerfile:

```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ ./src/
ENV MCP_TRANSPORT=streamable-http
ENV PYTHONUNBUFFERED=1
EXPOSE 8000
CMD ["python", "-m", "src.main"]
```
`.dockerignore`:

```
venv/
.env
.git/
.cursor/
__pycache__/
*.pyc
data/
docs/
scripts/
tests/
infra/
```
```shell
# Build and run
docker build -t istedlal-mcp .
docker run -p 8000:8000 -e DATABASE_URL=... -e MCP_BEARER_TOKEN=your-secret istedlal-mcp

# Smoke test
curl http://localhost:8000/   # info page
```

Endpoints:

- `/` – returns JSON with status (info page)
- `http://your-server:8000/mcp` – MCP endpoint (for MCP clients only)
- Kubernetes manifests live in `infra/k8s/`

Add this to `claude_desktop_config.json` and restart Claude Desktop:
```json
{
  "mcpServers": {
    "istedlal-mcp-server": {
      "command": "npx",
      "args": []
    }
  }
}
```
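The `args` array above is empty in the source. Since Claude Desktop talks stdio while this server is deployed over streamable-http, one common bridging pattern (an assumption, not confirmed by this README) is the `mcp-remote` npm package; the URL and token below are placeholders to replace with your own:

```json
{
  "mcpServers": {
    "istedlal-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://your-server:8000/mcp",
        "--header",
        "Authorization: Bearer your-secret-token"
      ]
    }
  }
}
```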