MCP server that exposes RocketRide AI pipelines as tools for Claude, Cursor, and Windsurf. Self-hosted, open-source pipeline tool with multi-LLM support.
Open-source, developer-native AI pipeline tool.
Build, debug, and deploy production AI workflows - without leaving your IDE.
RocketRide is an open-source data pipeline builder and runtime built for AI and ML workloads. With 50+ pipeline nodes spanning 13 LLM providers, 8 vector databases, OCR, NER, and more — pipelines are defined as portable JSON, built visually in VS Code, and executed by a multithreaded C++ runtime. From real-time data processing to multimodal AI search, RocketRide runs entirely on your own infrastructure.

Design, test, and ship complex AI workflows from a visual canvas, right where you write code.

Drop pipelines into any Python or TypeScript app with a few lines of code, no infrastructure glue required.
Install the extension for your IDE. Search for RocketRide in the extension marketplace:
Click the RocketRide extension in your IDE.
Deploy a server - you'll be prompted to choose how to run the server. Pick the option that fits your setup:
All pipelines are recognized by the *.pipe extension. Each pipeline and its configuration is a JSON object, but the extension in your IDE renders it in our visual builder canvas.
All pipelines begin with a source node: webhook, chat, or dropper. For usage details, examples, and inspiration on how to build pipelines, check out our guides and documentation.
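To make the format concrete, here is a rough sketch of what a minimal *.pipe object might contain. The field names (`nodes`, `edges`, `id`, `type`) are illustrative assumptions, not the documented schema; only the facts that pipelines are portable JSON and start with a source node (here, `chat`) come from this page.

```python
import json

# Hypothetical *.pipe structure: "nodes"/"edges" and their fields are
# assumed for illustration; the real schema may differ.
pipeline = {
    "name": "demo-chat-pipeline",
    "nodes": [
        {"id": "in", "type": "chat"},    # source node (chat)
        {"id": "llm", "type": "llm"},    # downstream LLM node
    ],
    "edges": [
        {"from": "in", "to": "llm"},     # wire source output to LLM input
    ],
}

# Since pipelines are portable JSON, a .pipe file is just this object
# serialized to disk.
pipe_text = json.dumps(pipeline, indent=2)
```

Saving `pipe_text` to `demo.pipe` would then let the visual builder pick it up, assuming the schema above were accurate.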
Connect input lanes and output lanes by type to wire your pipeline correctly. Some nodes, such as agents or LLMs, can also be invoked as tools by a parent node.
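The type-matched wiring rule can be sketched as a small check. The node names and lane types below are invented for illustration; the real node catalog and lane types live in the extension.

```python
# Hypothetical lane-typing sketch: each node exposes typed input/output
# lanes, and an edge is valid only when the output type matches the
# input type of the node it connects to.
NODE_LANES = {
    "chat":      {"out": "text"},                   # source node
    "llm":       {"in": "text", "out": "text"},
    "embedder":  {"in": "text", "out": "vector"},
    "vector_db": {"in": "vector"},
}

def edge_ok(src_node: str, dst_node: str) -> bool:
    """True if src's output lane type matches dst's input lane type."""
    return NODE_LANES[src_node].get("out") == NODE_LANES[dst_node].get("in")
```

Under this sketch, `edge_ok("embedder", "vector_db")` holds (vector to vector) while `edge_ok("chat", "vector_db")` does not (text into a vector lane), which is the kind of mismatch the canvas prevents you from wiring.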
You can run a pipeline from the canvas by pressing the ▶️ button on the source node or from the Connection Manager directly.
Deploy your pipelines on your own infrastructure.
Docker - Download the RocketRide server image and create a container. Requires Docker to be installed.
docker pull ghcr.io/rocketride-org/rocketride-engine:latest
docker create --name rocketride-engine -p 5565:5565 ghcr.io/rocketride-org/rocketride-engine:latest
Local deployment - Download the runtime of your choice and run it as a standalone process from the 'Deploy' page of the Connection Manager.
Run your pipelines as standalone processes or integrate them into your existing Python and TypeScript/JS applications using our SDKs.
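As a sketch of what such an integration could look like, here is a thin HTTP client against a locally deployed engine. The port 5565 comes from the Docker command above; the endpoint path and method names are assumptions, not the documented SDK surface.

```python
import json
from urllib import request

class RocketRideClient:
    """Hypothetical minimal client for a local RocketRide engine.

    The REST route and method names below are illustrative assumptions;
    the real Python SDK's API may look different.
    """

    def __init__(self, host: str = "localhost", port: int = 5565):
        self.base = f"http://{host}:{port}"

    def run_url(self, pipeline: str) -> str:
        # Assumed route for triggering a pipeline by name.
        return f"{self.base}/pipelines/{pipeline}/run"

    def run(self, pipeline: str, payload: dict) -> dict:
        # POST the input payload as JSON and return the decoded response.
        req = request.Request(
            self.run_url(pipeline),
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.load(resp)
```

With a running engine, `RocketRideClient().run("my-pipeline", {"text": "hello"})` would then hand the payload to the pipeline's source node - again, assuming such an HTTP route exists.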
Selecting a running pipeline opens in-depth analytics. Trace call trees, token usage, memory consumption, and more to optimize your pipelines before scaling and deploying. Find the models, agents, and tools best suited to your task.
RocketRide is built by a growing community of contributors. Whether you've fixed a bug, added a node, improved docs, or helped someone on Discord, thank you. New contributions are always welcome - check out our contributing guide to get started.
Made with 🤍 in SF & EU
Add this to claude_desktop_config.json and restart Claude Desktop.
{
  "mcpServers": {
    "rocketride-org-rocketride-server": {
      "command": "npx",
      "args": []
    }
  }
}