MCP server for document ingestion and semantic search on Qdrant. Enables ingesting local documents, generating embeddings with OpenAI, and performing vector search with metadata filters.
qdrant-mcp provides tools to:

- `ingest_documents` — converts docx, pptx, and pdf files to Markdown via MarkItDown, splits them into chunks (controlled by `chunk_size` and `overlap_ratio`), and embeds each chunk with OpenAI (`text-embedding-3-small` by default)
- `search_documents` — returns the top `k` matches from Qdrant, with optional metadata filters on `category` and `path`

Requirements: `uv`, a running Qdrant instance (default `http://localhost:6333`), and an `OPENAI_API_KEY`. Install dependencies with `uv sync`.
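The combination of metadata filtering and vector ranking that `search_documents` exposes can be sketched in plain Python. This is a conceptual illustration only, not the server's code — in qdrant-mcp the filtering and similarity search run inside Qdrant itself:

```python
# Conceptual sketch: filter candidate points by metadata (category/path),
# then rank the survivors by cosine similarity to the query vector.
import math


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def search(points, query_vec, top_k=5, category=None, path=None):
    # Metadata filters mirror the optional category/path parameters.
    hits = [
        p for p in points
        if (category is None or p["category"] == category)
        and (path is None or p["path"] == path)
    ]
    hits.sort(key=lambda p: cosine(p["vector"], query_vec), reverse=True)
    return hits[:top_k]


points = [
    {"vector": [1.0, 0.0], "category": "docs", "path": "a.md", "text": "alpha"},
    {"vector": [0.0, 1.0], "category": "docs", "path": "b.md", "text": "beta"},
    {"vector": [0.9, 0.1], "category": "notes", "path": "c.md", "text": "gamma"},
]
print([p["text"] for p in search(points, [1.0, 0.0], category="docs")])
# → ['alpha', 'beta']  (the "notes" point is filtered out before ranking)
```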
```toml
[mcp_servers.qdrant-mcp]
command = "uv"
args = ["run", "qdrant-mcp"]
cwd = "/sandbox/qdrant-mcp"

[mcp_servers.qdrant-mcp.env]
OPENAI_API_KEY = "sk-..."
QDRANT_URL = "http://127.0.0.1:6333"
QDRANT_API_KEY = "QDRANT_API_KEY"
QDRANT_COLLECTION = "codex_collection"
CHUNK_HEADER_MODEL = "gpt-5.4-mini"
```
Set `OPENAI_API_KEY`, `QDRANT_URL`, and `QDRANT_API_KEY` in `.env`, then run:

```shell
uv run python -m unittest tests/integration/test_qdrant_integration.py
```
`ingest_documents`

Parameters:
- `paths: list[str]`
- `category: str`
- `chunk_size: int = 1200`
- `overlap_ratio: float = 0.15`
- `embedding_model: str = "text-embedding-3-small"`
- `chunk_header_mode: Literal["enabled", "disabled"] = "enabled"`

Returns: `collection`, `embedding_model`, `ingested_files`, `ingested_points`, `failed_files`

`search_documents`

Parameters:
- `query: str`
- `top_k: int = 5`
- `category: str | None = None`
- `path: str | None = None`
- `embedding_model: str = "text-embedding-3-small"`

Returns: `collection`, `embedding_model`, `query`, `count`, `results` (score, path, category, chunk_index, text)

`delete_documents_by_path`

Parameters:
- `path: str`
- `category: str | None = None`

Returns: `collection`, `path`, `category`, `status`, `operation_id`

`list_category`

Parameters:
- `limit: int = 100`

Returns: `collection`, `count`, `categories`

`list_path`

Parameters:
- `category: str`
- `limit: int = 1000`

Returns: `collection`, `category`, `count`, `paths`

Notes:
- If a `category` and `path` do not exist, they are created during ingestion.
- A chunk header (max 64 chars), derived from the first 4096 bytes of the document, is added to every chunk.
- The header is generated by `CHUNK_HEADER_MODEL` when `chunk_header_mode` is enabled (default: gpt-5.4-mini).
- The target collection is set via `QDRANT_COLLECTION` (not via MCP tool parameters).
- With `text-embedding-3-small`, the vector size is 1536.

See LICENSE.
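The interplay of `chunk_size` and `overlap_ratio` can be illustrated with a minimal character-based splitter. This is a hypothetical sketch of the parameter semantics, not the server's actual splitter, which may differ (for example, by splitting on token or paragraph boundaries):

```python
# Hypothetical illustration of chunk_size / overlap_ratio:
# successive windows of chunk_size characters advance by
# chunk_size * (1 - overlap_ratio), so adjacent chunks share
# roughly overlap_ratio of their content.
def chunk_text(text, chunk_size=1200, overlap_ratio=0.15):
    step = max(1, int(chunk_size * (1 - overlap_ratio)))
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # the final window already covers the end of the text
    return chunks


chunks = chunk_text("x" * 3000, chunk_size=1200, overlap_ratio=0.15)
print(len(chunks), [len(c) for c in chunks])
# → 3 [1200, 1200, 960]
```

With the defaults, each window advances by 1020 characters (1200 × 0.85), so neighbouring chunks overlap by 180 characters.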
Add this to `claude_desktop_config.json` and restart Claude Desktop:

```json
{
  "mcpServers": {
    "qdrant-mcp": {
      "command": "uv",
      "args": ["run", "--directory", "/sandbox/qdrant-mcp", "qdrant-mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "QDRANT_URL": "http://127.0.0.1:6333"
      }
    }
  }
}
```