Provides a standalone MCP interface for analyzing and querying code repositories using Trailmark, supporting multiple graphs and snapshot management.
Trailmark MCP Server is a standalone MCP wrapper around trailofbits/trailmark.
While I understand Trail of Bits' usage with Claude skills, my use case requires an MCP server that can analyze and serve multiple graphs: the server can scan multiple repositories, and the LLM can request information from each separately.
Mostly created with OpenAI GPT-5.5 via GitHub Copilot in VS Code. Point your LLM to the `ai-docs` directory for documentation and development support.
Built with `uv`. The package and CLI entrypoint are both named `trailmark-mcp`.

Install runtime and development dependencies:

```shell
uv sync --group dev
```
Start the server over stdio:

```shell
uv run trailmark-mcp serve --transport stdio
```
Smoke-test the direct scan path without an MCP client:

```shell
uv run trailmark-mcp scan /path/to/repo
```
Skip preanalysis during scan when needed:
```shell
uv run trailmark-mcp scan /path/to/repo --skip-preanalysis
```
The primary lifecycle entrypoint is `open_repository(...)`.
Behavior summary:
- With `rescan=False`, the server reloads the latest snapshot into a live session.
- With `rescan=True`, the server rebuilds from source and saves a fresh snapshot.

This means the common flow is:
1. Call `open_repository`.
2. Call `save_snapshot` after meaningful in-memory mutations when you want persistence.

`session_id` is MCP wrapper state, not Trailmark core state.
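In MCP terms, that flow is two `tools/call` requests. A minimal sketch that builds the corresponding JSON-RPC messages — the argument names `path`, `rescan`, and `session_id` follow the descriptions in this README but are assumptions about the exact tool schemas:

```python
import json

def tool_call(request_id: int, name: str, arguments: dict) -> dict:
    """Build an MCP tools/call JSON-RPC request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Open (or reload) a repository; rescan=False reuses the latest snapshot.
open_req = tool_call(1, "open_repository", {"path": "/path/to/repo", "rescan": False})

# Persist in-memory mutations (annotate_node, augment_findings, ...).
save_req = tool_call(2, "save_snapshot", {"session_id": "<session-id-from-open>"})

print(json.dumps(open_req, indent=2))
```

Each message would be sent over the stdio transport by your MCP client; the session id comes back from the `open_repository` response.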
Current semantics:
- Each `open_repository(...)` call creates a new session id.
- Pass `session_id` to target a specific graph.
- Omitting `session_id` uses the most recently opened still-open session.

Use `current_repository(session_id=...)` to verify which repository a session points to.
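As an illustration of those semantics only (not the actual code in `services/registry.py`), a toy registry with the same most-recently-opened fallback:

```python
class SessionRegistry:
    """Illustrative model of the session-id semantics described above."""

    def __init__(self):
        self._sessions = {}   # session_id -> repository path
        self._order = []      # open order, newest last
        self._counter = 0

    def open_repository(self, path):
        # Every open creates a fresh session id.
        self._counter += 1
        session_id = f"session-{self._counter}"
        self._sessions[session_id] = path
        self._order.append(session_id)
        return session_id

    def close_repository(self, session_id):
        self._sessions.pop(session_id, None)
        self._order.remove(session_id)

    def current_repository(self, session_id=None):
        # No explicit id: fall back to the most recently opened still-open session.
        if session_id is None:
            session_id = self._order[-1]
        return self._sessions[session_id]

reg = SessionRegistry()
a = reg.open_repository("/repo/a")
b = reg.open_repository("/repo/b")
print(reg.current_repository())    # /repo/b (newest still-open)
reg.close_repository(b)
print(reg.current_repository())    # /repo/a
print(reg.current_repository(a))   # explicit session id
```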
Lifecycle:
- `open_repository`
- `current_repository`
- `close_repository`
- `save_snapshot`

Navigation:

- `graph_summary`
- `diff_graphs`
- `search_nodes`
- `callers_of`
- `callees_of`
- `ancestors_of`
- `reachable_from`
- `paths_between`
- `entrypoint_paths_to`
- `attack_surface`
- `complexity_hotspots`
- `functions_that_raise`

Context and mutation:

- `subgraph`
- `annotations_of`
- `findings`
- `nodes_with_annotation`
- `run_preanalysis`
- `annotate_node`
- `clear_annotations`
- `augment_findings`

Notes:
- `diff_graphs(before_session_id, after_session_id)` treats `after` as the new state.
- `search_nodes` supports `contains`, `exact`, and `suffix`.
- `scan_repository` and `tool_manifest` are intentionally not part of the public runtime anymore.

Snapshots are written under the analyzed repository, not under this server repository:
```
<target-repo>/.trailmark/snapshots/<timestamp>/
```
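The three `search_nodes` match modes behave like ordinary string predicates; this sketch shows the assumed semantics (the `mode` parameter name is illustrative, not the server's actual signature):

```python
def matches(name: str, query: str, mode: str = "contains") -> bool:
    """Assumed semantics for the search_nodes match modes."""
    if mode == "contains":
        return query in name
    if mode == "exact":
        return name == query
    if mode == "suffix":
        return name.endswith(query)
    raise ValueError(f"unknown mode: {mode}")

nodes = ["pkg.module.handle_request", "pkg.module.handler", "handle_request"]
print([n for n in nodes if matches(n, "handle_request", "suffix")])
# → ['pkg.module.handle_request', 'handle_request']
```

`suffix` is handy for qualified names: it matches a bare function name regardless of its module prefix, while `exact` would miss the qualified entries.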
Current snapshot artifacts include:
- `graph.json`
- `summary.json`
- `entrypoints.json`
- `hotspots.json`
- `subgraphs.json`
- `scan-metadata.json`

Snapshots support reload into a live session. Use `rescan=True` when you explicitly need a fresh rebuild from source.
Key files:
- `src/trailmark_mcp/cli.py`: CLI entrypoint for scan and serve
- `src/trailmark_mcp/mcp_app.py`: MCP tool registration
- `src/trailmark_mcp/tool_catalog.py`: declarative metadata for exposed tools
- `src/trailmark_mcp/services/registry.py`: session tracking
- `src/trailmark_mcp/services/runtime.py`: main Trailmark-backed runtime behavior

Run the focused test suite:
```shell
uv run --group dev pytest tests/test_tool_catalog.py tests/test_registry.py tests/test_stdio_server.py
```
Current CI runs that same focused suite on Python 3.12.
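To mirror that CI setup in a fork, a workflow along these lines would reproduce it (file name and action versions are assumptions, not copied from this repository's actual CI):

```yaml
# .github/workflows/ci.yml (illustrative)
name: tests
on: [push, pull_request]
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
        with:
          python-version: "3.12"
      - run: uv sync --group dev
      - run: uv run --group dev pytest tests/test_tool_catalog.py tests/test_registry.py tests/test_stdio_server.py
```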
Extension rule: when adding a new tool, register it in `mcp_app.py` and add its metadata to `tool_catalog.py`.

VS Code can launch this server directly through MCP using a workspace-level `mcp.json` file.
Typical setup:
1. Run `uv sync --group dev`.
2. Add a `.vscode/mcp.json` with a `stdio` server entry.

This repository already includes `.vscode/mcp.json` for local use.
Example mcp.json:
```json
{
  "servers": {
    "trailmark-mcp": {
      "type": "stdio",
      "command": "uv",
      "args": [
        "run",
        "trailmark-mcp",
        "serve",
        "--transport",
        "stdio"
      ]
    }
  }
}
```
If you use this server from a larger multi-project workspace, copy the same definition into that workspace root's .vscode/mcp.json and make sure the command runs in an environment where uv and this project are available.
Add this to `claude_desktop_config.json` (adjust the `--directory` value to your local checkout of this project) and restart Claude Desktop:

```json
{
  "mcpServers": {
    "trailmark-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/trailmark-mcp",
        "trailmark-mcp",
        "serve",
        "--transport",
        "stdio"
      ]
    }
  }
}
```