# MCP-based AI Research Assistant (RAG + LangChain + Claude)

An MCP-based AI agent that retrieves and processes documents to answer queries using a RAG pipeline with LangChain and Claude models. It enables document indexing, context-aware retrieval, and multi-tool orchestration for research and knowledge-base applications.
## What it does

An AI agent that retrieves documents, processes context, and answers queries using an MCP (Model Context Protocol) architecture with RAG (Retrieval-Augmented Generation).
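The retrieve → augment → generate flow described above can be sketched with a toy in-memory retriever. The corpus, the keyword-overlap scorer, and the `call_claude` stub are illustrative assumptions, not the project's actual implementation:

```python
# Minimal RAG sketch: retrieve top-k documents by naive keyword overlap,
# then build an augmented prompt for the model. Corpus, scoring, and the
# call_claude stub are illustrative assumptions only.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by keyword overlap with the query and keep the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the query with retrieved context (the 'A' in RAG)."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Context:\n{context}\n\nQuestion: {query}"

def call_claude(prompt: str) -> str:
    """Stub for the model call; a real pipeline would hit the Claude API here."""
    return f"[answer grounded in {prompt.count('- ')} retrieved documents]"

if __name__ == "__main__":
    corpus = [
        "MCP lets an agent expose tools to a client.",
        "LangChain chains retrieval and generation steps.",
        "Claude models generate the final answer.",
    ]
    query = "How does the agent combine retrieval and generation?"
    print(call_claude(build_prompt(query, retrieve(query, corpus))))
```

In the real pipeline, the keyword scorer would be replaced by vector-store similarity search and `call_claude` by an authenticated API call; the control flow stays the same.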
## Tech stack

## Features

## Demo
See `/app/demo_output.md` for an example run showing Input → Retrieved documents → Final AI response. Include screenshots or short GIFs in the `presentation/` folder if available.
## How to run (quick)

```shell
python -m venv .venv
.venv\Scripts\activate         # Windows
source .venv/bin/activate      # macOS/Linux
pip install -r requirements.txt

export OPENAI_API_KEY=...
export CLAUDE_API_KEY=...
# For Windows PowerShell:
$env:CLAUDE_API_KEY = '...'

python -m rag_pipeline.run     # pipeline entry (if present)
python -m mcp_server.server    # MCP server (if present)
```
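Before starting either entry point, it helps to fail fast when a key is missing. A minimal check could look like this (variable names taken from the commands above; the `check_env` helper itself is hypothetical, not part of the project):

```python
# Report any required API keys missing from the environment.
# Variable names follow the README; this helper is illustrative.
import os

REQUIRED_KEYS = ("OPENAI_API_KEY", "CLAUDE_API_KEY")

def check_env(keys=REQUIRED_KEYS) -> list[str]:
    """Return the names of required environment variables that are unset or empty."""
    return [k for k in keys if not os.environ.get(k)]

missing = check_env()
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
```

Calling this at the top of the pipeline entry point turns a confusing mid-run auth failure into an immediate, readable error.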
## Notes
Add the following to `claude_desktop_config.json` and restart Claude Desktop:
```json
{
  "mcpServers": {
    "ai-research-assistant-mcp-server": {
      "command": "npx",
      "args": []
    }
  }
}
```
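The `args` array is left empty in the source, so the exact server package is not specified here. When a `claude_desktop_config.json` already registers other servers, the new entry should be merged rather than pasted over the file; a small stdlib-only sketch (the `merge_entry` helper is illustrative):

```python
# Merge the server entry under "mcpServers" without overwriting other
# registered servers. Illustrative helper; the empty "args" list
# mirrors the snippet above, which leaves the package unspecified.
import json

ENTRY = {"ai-research-assistant-mcp-server": {"command": "npx", "args": []}}

def merge_entry(config: dict, entry: dict) -> dict:
    """Return a copy of config with entry merged into its 'mcpServers' map."""
    merged = dict(config)
    merged["mcpServers"] = {**config.get("mcpServers", {}), **entry}
    return merged

existing = {"mcpServers": {"other-server": {"command": "node", "args": ["server.js"]}}}
print(json.dumps(merge_entry(existing, ENTRY), indent=2))
```

After writing the merged JSON back to disk, restart Claude Desktop so it picks up the new server.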