A Model Context Protocol (MCP) server that provides local Retrieval-Augmented Generation (RAG) using Ollama for embeddings and ChromaDB for vector storage. It lets MCP-compatible clients ingest and perform semantic searches across PDF, Markdown, and TXT documents.
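To illustrate the embedding side of this pipeline, here is a minimal sketch of requesting a vector from Ollama's HTTP API, using the `ollamaUrl` and `embeddingModel` values from the config below. The helper names (`buildEmbedRequest`, `embedText`) are illustrative, not the server's actual code:

```typescript
// Build the request payload for Ollama's /api/embeddings endpoint.
// URL and model mirror config.json (ollamaUrl, embeddingModel).
function buildEmbedRequest(ollamaUrl: string, model: string, text: string) {
  return {
    url: `${ollamaUrl}/api/embeddings`,
    body: JSON.stringify({ model, prompt: text }),
  };
}

// Fetch one embedding vector for a chunk of text.
async function embedText(text: string): Promise<number[]> {
  const { url, body } = buildEmbedRequest(
    "http://localhost:11434",
    "nomic-embed-text",
    text,
  );
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { embedding: number[] };
  return data.embedding;
}
```

The returned vector is what gets stored in ChromaDB alongside the chunk text, so the same model must be used at ingest time and at query time.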
npm run setup
This will set up Ollama and ChromaDB, build the project, and ingest the documents in ./docs.

# Start MCP server
npm run dev
# Ingest new documents
npm run ingest
# Stop all services
npm run stop
The server uses a config.json file for configuration:
{
"documentsPath": "./docs",
"chunkSize": 1000,
"chunkOverlap": 200,
"ollamaUrl": "http://localhost:11434",
"embeddingModel": "nomic-embed-text",
"chromaUrl": "http://localhost:8001",
"collectionName": "rag_documents",
"mcpServer": {
"name": "mcp-rag-server",
"version": "1.0.0"
}
}
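The chunkSize and chunkOverlap settings control how documents are split before embedding: each chunk is at most chunkSize characters, and consecutive chunks share chunkOverlap characters so sentences on chunk boundaries are not lost. A minimal sketch of this scheme (`chunkText` is an illustrative name, not the server's actual implementation):

```typescript
// Split text into overlapping fixed-size chunks.
// Defaults mirror config.json (chunkSize: 1000, chunkOverlap: 200).
function chunkText(text: string, chunkSize = 1000, chunkOverlap = 200): string[] {
  const step = chunkSize - chunkOverlap; // advance 800 chars per chunk by default
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

For a 2,500-character document with the default settings this yields three chunks, where the last 200 characters of one chunk repeat as the first 200 characters of the next.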
Tools:

ingest_docs({path?}) - Ingest documents from a directory
search({query, k?}) - Search for relevant document chunks
get_chunk({id}) - Retrieve a specific chunk by ID
refresh_index() - Clear and refresh the entire index

Resources:

rag://collection/summary - Collection statistics and metadata
rag://doc/<filename>#<chunk_id> - Individual document chunks

Add to your Cursor MCP settings:
{
"mcpServers": {
"rag-server": {
"command": "node",
"args": ["/Users/luizsoares/Documents/buildaz/mcp_rag/dist/index.js"],
"env": {}
}
}
}
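The rag:// resource URIs listed above follow a simple scheme that a client can dispatch on. A small sketch of parsing them (`parseRagUri` and the `RagResource` type are hypothetical helpers for illustration, not part of the server):

```typescript
// Dispatch on the server's two resource URI shapes:
//   rag://collection/summary           -> collection statistics
//   rag://doc/<filename>#<chunk_id>    -> one document chunk
type RagResource =
  | { kind: "summary" }
  | { kind: "chunk"; filename: string; chunkId: string };

function parseRagUri(uri: string): RagResource {
  if (uri === "rag://collection/summary") return { kind: "summary" };
  const m = uri.match(/^rag:\/\/doc\/([^#]+)#(.+)$/);
  if (!m) throw new Error(`Unrecognized RAG URI: ${uri}`);
  return { kind: "chunk", filename: m[1], chunkId: m[2] };
}
```

For example, `parseRagUri("rag://doc/guide.md#7")` identifies the chunk with ID 7 of guide.md, which could then be fetched with the get_chunk tool.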
npm run setup - Complete setup (Ollama + ChromaDB + build + ingest)
npm run dev - Start MCP server in development mode
npm run ingest - Ingest documents
npm run build - Build the project
npm run test - Run tests
npm run stop - Stop all services

Run ollama pull nomic-embed-text to install the embedding model.

Add this to claude_desktop_config.json and restart Claude Desktop:
{
"mcpServers": {
"mcp-rag-server": {
"command": "npx",
"args": []
}
}
}