Atlassian Jira/Confluence MCP server with schema-validated Markdown↔ADF roundtrips — surfaces content-model violations at conversion time instead of silently dropping panels, mentions, and layouts.
An intelligent Git commit message toolkit with AI-powered contextual intelligence. Transform messy commit histories into professional, conventional commit formats with project-aware suggestions.
# Install from crates.io
cargo install omni-dev
# Install with Nix
nix profile install github:rust-works/omni-dev
# Install with Nix flakes (development)
nix run github:rust-works/omni-dev
# Enable binary cache for faster builds (optional)
cachix use omni-dev
Next step: see Getting Started — a 10-minute walkthrough from authentication to your first AI-improved commit. (For just the API-key reference, see Authentication.)
For faster Nix builds, you can use the binary cache:
# Install cachix if you don't have it
nix profile install nixpkgs#cachix
# Enable the omni-dev binary cache
cachix use omni-dev
# Now Nix installations will use pre-built binaries instead of compiling from source
nix profile install github:rust-works/omni-dev
Watch omni-dev transform messy commits into professional ones with AI-powered analysis
Transform your commit messages and create professional PRs with AI intelligence:
# Analyze and improve commit messages in your current branch
omni-dev git commit message twiddle 'origin/main..HEAD' --use-context
# Before: "fix stuff", "wip", "update files"
# After: "feat(auth): implement OAuth2 authentication system"
# "docs(api): add comprehensive endpoint documentation"
# "fix(ui): resolve mobile responsive layout issues"
# Create a professional PR with AI-generated description
omni-dev git branch create pr
# 🎉 Generates comprehensive PR with detailed description, testing info, and more
`twiddle` — the star feature: intelligently improve your commit messages with real-time model information display:
# Improve commits with contextual intelligence
omni-dev git commit message twiddle 'origin/main..HEAD' --use-context
# Process large commit ranges with parallel processing
omni-dev git commit message twiddle 'HEAD~20..HEAD' --concurrency 5
# Save suggestions to file for review
omni-dev git commit message twiddle 'HEAD~5..HEAD' \
--save-only suggestions.yaml
# Auto-apply improvements without confirmation
omni-dev git commit message twiddle 'HEAD~3..HEAD' --auto-apply
# Analyze commits in detail (YAML output)
omni-dev git commit message view 'HEAD~3..HEAD'
# Analyze current branch vs main
omni-dev git branch info main
# Get comprehensive help
omni-dev help-all
Create professional pull requests with AI-generated descriptions:
# Generate and create PR with AI-powered description
omni-dev git branch create pr
# Create PR with specific base branch
omni-dev git branch create pr main
# Save PR details to file without creating
omni-dev git branch create pr --save-only pr-description.yaml
# Auto-create without confirmation
omni-dev git branch create pr --auto-apply
Read, write, and manage JIRA issues and Confluence pages from the command line:
# Authenticate with Atlassian Cloud
omni-dev atlassian auth login
# Check authentication status
omni-dev atlassian auth status
# Fetch a JIRA issue as markdown
omni-dev atlassian jira read PROJ-123
# Fetch as raw ADF JSON
omni-dev atlassian jira read PROJ-123 --format adf
# Push markdown changes back to JIRA
omni-dev atlassian jira write PROJ-123 issue.md
# Interactive edit: fetch, edit in $EDITOR, push
omni-dev atlassian jira edit PROJ-123
# Search issues with JQL
omni-dev atlassian jira search --project PROJ --status Open
# Create an issue
omni-dev atlassian jira create issue.md --project PROJ --summary "Fix bug"
# Transition an issue
omni-dev atlassian jira transition PROJ-123 "In Progress"
# Confluence: read, search, create pages
omni-dev atlassian confluence read 12345
omni-dev atlassian confluence search --space ENG --title auth
omni-dev atlassian confluence create page.md --space ENG --title "New Page"
# Convert markdown to ADF JSON (offline)
omni-dev atlassian convert to-adf input.md
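To make the Markdown↔ADF roundtrip concrete, here is a minimal sketch of the ADF envelope that converters like `convert to-adf` produce. This is not omni-dev's implementation — it only handles blank-line-separated paragraphs and shows the standard `doc`/`paragraph`/`text` node shape; real conversion also covers headings, panels, mentions, and layouts.

```python
import json

def paragraphs_to_adf(markdown: str) -> dict:
    """Convert plain Markdown paragraphs (blank-line separated) into a
    minimal ADF document. Illustrative only: real converters handle far
    more of the content model than bare paragraphs."""
    paragraphs = [p.strip() for p in markdown.split("\n\n") if p.strip()]
    return {
        "version": 1,  # ADF documents are always version 1
        "type": "doc",
        "content": [
            {"type": "paragraph", "content": [{"type": "text", "text": p}]}
            for p in paragraphs
        ],
    }

doc = paragraphs_to_adf("First paragraph.\n\nSecond paragraph.")
print(json.dumps(doc, indent=2))
```

Schema validation at conversion time (as the tagline describes) means a node that violates this content model is rejected with an error instead of being silently dropped.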
Authenticate against the Datadog API and query metrics, monitors, dashboards, logs, events, SLOs, hosts, and downtimes. See the Datadog integration guide for the full subcommand reference, authentication setup, rate-limit behaviour, and troubleshooting.
# Configure Datadog API credentials (prompts for API key, APP key, and site)
omni-dev datadog auth login
# Verify the credentials by calling /api/v1/validate
omni-dev datadog auth status
# Query metrics, monitors, dashboards, logs, and SLOs
omni-dev datadog metrics query --query 'avg:system.cpu.user{*}' --from 15m
omni-dev datadog monitor list --tags env:prod
omni-dev datadog dashboard list
omni-dev datadog logs search --filter 'service:api status:error' --from 1h
omni-dev datadog slo list --tags team:platform
DATADOG_SITE defaults to datadoghq.com. Other regions (datadoghq.eu,
us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ddog-gov.com)
are recognised without warning. Environment variables DATADOG_API_KEY,
DATADOG_APP_KEY, DATADOG_SITE override the stored settings. For on-prem
or proxied installs, set DATADOG_API_URL to override the site-derived URL.
All Datadog subcommands are also exposed as MCP tools (datadog_*) — see
docs/mcp.md. For the full guide covering
every family with worked examples, see docs/datadog.md.
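The site-to-URL precedence described above can be sketched as follows. Datadog's regional API hosts follow the `api.<site>` pattern (e.g. `api.datadoghq.com`, `api.datadoghq.eu`); the exact resolution logic inside omni-dev is an assumption here, but the override order matches the documented behaviour: `DATADOG_API_URL` wins, otherwise the URL is derived from the site.

```python
import os

def datadog_api_base(site: str = "datadoghq.com") -> str:
    """Resolve the Datadog API base URL: an explicit DATADOG_API_URL
    override wins; otherwise derive api.<site> from the configured site."""
    override = os.environ.get("DATADOG_API_URL")
    if override:
        return override.rstrip("/")
    return f"https://api.{site}"

os.environ.pop("DATADOG_API_URL", None)  # ensure the derivation path for the demo
print(datadog_api_base())                # https://api.datadoghq.com
print(datadog_api_base("datadoghq.eu"))  # https://api.datadoghq.eu
```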
Pull captions and transcripts from external media platforms. YouTube is the first supported source; the CLI namespace and library are designed so additional sources (Vimeo, podcast RSS, generic VTT/SRT URLs) can be added without restructuring. See docs/transcript.md for the full reference and the recipe for adding a new source.
# Fetch captions for a YouTube video as SubRip (default).
omni-dev transcript youtube fetch https://www.youtube.com/watch?v=jNQXAC9IVRw
# WebVTT to a file, falling through to auto-generated captions if needed.
omni-dev transcript youtube fetch jNQXAC9IVRw \
--format vtt --auto --output me-at-the-zoo.vtt
# Synthesise a translated track when no native French track exists.
omni-dev transcript youtube fetch <url> --lang fr --translate fr
# List available caption tracks (manual + auto-generated).
omni-dev transcript youtube list-langs <url>
# Show video metadata (title, channel, duration, languages).
omni-dev transcript youtube info <url> --output json
--format accepts srt, vtt, txt, or json. Locators may be a
watch?v= URL, a youtu.be/ short URL, a /shorts/ or /embed/ URL,
or a bare 11-character video ID. Age-gated and login-required videos
surface as a typed PlayabilityRefused error carrying YouTube's status
code rather than a generic HTTP failure.
# Apply specific amendments from YAML file
omni-dev git commit message amend amendments.yaml
Generate ready-to-use Claude Code slash-command templates into the
project's .claude/commands/ directory. Each template is a self-contained
workflow that drives a multi-step omni-dev operation from inside a Claude
Code session.
# Generate all templates: commit-twiddle, pr-create, pr-update
omni-dev commands generate all
# Or individually
omni-dev commands generate commit-twiddle
omni-dev commands generate pr-create
omni-dev commands generate pr-update
Each subcommand writes .claude/commands/<name>.md. Commit the files to
share the workflows with collaborators — Claude Code picks them up
automatically, so anyone in the repo can invoke /commit-twiddle,
/pr-create, or /pr-update inside a Claude Code session. See the
user guide
for the full reference.
Export your Claude Code chat history to a directory of .jsonl files for
behavioural analysis, work-log generation, or downstream tooling. Re-running
acts as an idempotent sync: new chats are added, modified chats are
overwritten, unchanged chats are skipped.
# Mirror ~/.claude/projects to ./history/ (one .jsonl per chat, grouped by project slug)
omni-dev ai claude history sync --target ./history
# Limit to one project (encoded slug or decoded cwd path)
omni-dev ai claude history sync --target ./history --project /Users/me/work/repo
# Only sessions touched in the last week
omni-dev ai claude history sync --target ./history --since 7d
# Preview without writing, then prune target files for sessions removed upstream
omni-dev ai claude history sync --target ./history --dry-run --prune
# Render LLM-friendly markdown alongside the raw jsonl (one .md per session)
omni-dev ai claude history sync --target ./history --output-format jsonl,markdown
# Markdown only — suitable for piping into a coaching LLM
omni-dev ai claude history sync --target ./history --output-format markdown
The export is a behavioural transcript, not a faithful archive. The top-level session jsonl captures all prompts, responses, thinking blocks, tool calls, and tool-result metadata — the signal needed for analysis. Sub-agent internal turns, large tool-output sidecars, PDF page rasters, and Claude's auto-memory are deliberately excluded; they would bloat any LLM-ingested corpus without adding interaction-pattern signal.
In-progress chats produce a valid jsonl prefix (the source size is captured
once at the start of the copy), so you can sync safely while a chat is open.
The target layout mirrors the source — <target>/<slug>/<uuid>.jsonl — and
source mtime is preserved on each target file so downstream tooling can
sort sessions chronologically without parsing every file.
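Because mtimes are preserved, downstream tooling can order sessions without opening a single file. A minimal sketch (the `<target>/<slug>/<uuid>.jsonl` layout is as described above; the helper name is ours, not omni-dev's):

```python
import os
import tempfile
from pathlib import Path

def sessions_by_time(target: str) -> list[Path]:
    """List synced session files oldest-first using the preserved
    mtimes, without parsing any jsonl content."""
    files = Path(target).glob("*/*.jsonl")
    return sorted(files, key=lambda p: p.stat().st_mtime)

# Demo against a throwaway layout: <target>/<slug>/<uuid>.jsonl
with tempfile.TemporaryDirectory() as target:
    slug = Path(target) / "-Users-me-work-repo"
    slug.mkdir()
    for name, mtime in [("b.jsonl", 200.0), ("a.jsonl", 100.0)]:
        f = slug / name
        f.write_text("{}\n")
        os.utime(f, (mtime, mtime))  # stand-in for the preserved source mtime
    ordered = [p.name for p in sessions_by_time(target)]
    print(ordered)  # ['a.jsonl', 'b.jsonl']
```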
--output-format markdown writes a derived <target>/<slug>/<uuid>.md
alongside (or instead of) the jsonl. Each markdown file has YAML frontmatter
with session metadata followed by ## User / ## Assistant turns; tool calls
render as ### Tool call: <name> blocks, thinking blocks collapse into
<details>, and sub-agent (Agent) calls render the prompt argument only.
Agent-to-user interactions are surfaced as first-class structured events so the analyst LLM sees what was actually asked and how the user responded:
AskUserQuestion calls render as `### Agent question: <header>` with the question text and a bulleted list of options (with descriptions); the paired user reply renders as `## User response`.

Tool calls the user denied render as `**Tool result (<tool>, denied by user):**`, detected by the canonical "The user doesn't want to proceed with this tool use" sentinel Claude Code stuffs into the next tool_result. Tool calls the user interrupted render as `**Tool result (<tool>, interrupted by user):**` with an `error` label; successes use `ok`.

System reminders, attachments, and permission-mode events are included by default — pass `--exclude-system` to drop them. Markdown idempotency keys off source mtime alone (the rendered length differs from the source length), and `--prune` only deletes artifacts whose extension matches one of the formats listed in `--output-format`.
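That prune rule can be sketched as a single predicate. This is illustrative, not omni-dev's code — the format-to-extension mapping assumes `jsonl` produces `.jsonl` and `markdown` produces `.md`, as described above:

```python
from pathlib import Path

def prunable(path: Path, formats: set[str], live_uuids: set[str]) -> bool:
    """A target artifact is pruned only when its session no longer exists
    upstream AND its extension matches a format listed in --output-format."""
    ext = {"jsonl": ".jsonl", "markdown": ".md"}
    wanted = {ext[f] for f in formats}
    return path.suffix in wanted and path.stem not in live_uuids

live = {"keep-uuid"}
print(prunable(Path("slug/gone-uuid.jsonl"), {"jsonl"}, live))  # True
print(prunable(Path("slug/gone-uuid.md"), {"jsonl"}, live))     # False: .md not listed
print(prunable(Path("slug/keep-uuid.jsonl"), {"jsonl"}, live))  # False: still upstream
```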
See docs/user-guide.md#ai-claude-history-sync--export-conversation-history
for the in-depth reference, and the broader Claude Code Integration
section for related commands (ai chat, ai claude skills).
omni-dev ships an optional Model Context Protocol server so AI assistants
(Claude Desktop, Claude Code, the MCP Inspector, custom agents) can call
omni-dev over stdio instead of shelling out to the CLI. The server is
delivered as a second binary, omni-dev-mcp, gated behind the mcp Cargo
feature (see ADR-0021).
Tools cover six domains:
| Domain | Examples |
|---|---|
| Git (5) | git_view_commits, git_branch_info, git_check_commits, git_twiddle_commits, git_create_pr |
| JIRA (28) | core read/write/search/transition/comment/link/dev/delete; sprints, boards, watchers, worklogs, fields, attachments, projects, changelog |
| Confluence (13) | read/write/search/create/delete/download/children, comments, labels, user search |
| Atlassian shared (2) | atlassian_auth_status, atlassian_convert (offline JFM ↔ ADF) |
| Datadog (14) | metrics, monitors, dashboards, logs, events, SLOs, hosts, downtimes, metrics catalog |
| AI / Config (5) | ai_chat (one-shot chat), claude_skills_* (sync / clean / status for .claude/skills/ distribution), config_models_show |
Resources exposed via URI templates:
| URI template | Returns |
|---|---|
| `git://repo/commits/{range}` | YAML commit analysis |
| `jira://issue/{key}` | JIRA issue as JFM |
| `jira://issue/{key}.adf` | JIRA issue body as ADF |
| `confluence://page/{id}` | Confluence page as JFM |
| `confluence://page/{id}.adf` | Confluence page body as ADF |
| `omni-dev://specs/{name}` | Embedded reference specs (e.g. jfm) |
See docs/mcp.md for the full tool catalog, resource
reference, cross-cutting parameters (output_file, confirm), and
troubleshooting.
cargo install omni-dev --features mcp
This adds a second binary, omni-dev-mcp, alongside the regular omni-dev
CLI. The default cargo install omni-dev build is unchanged — no MCP
dependencies are pulled in unless the mcp feature is enabled.
Edit ~/Library/Application Support/Claude/claude_desktop_config.json on
macOS (or %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
"mcpServers": {
"omni-dev": {
"command": "omni-dev-mcp"
}
}
}
Per-project — create .mcp.json at the repo root:
{
"mcpServers": {
"omni-dev": {
"command": "omni-dev-mcp"
}
}
}
Or register globally with the Claude Code CLI:
claude mcp add omni-dev omni-dev-mcp
npx @modelcontextprotocol/inspector omni-dev-mcp
The Inspector opens a browser UI where you can list tools and resources, call any tool interactively, and fetch resources against the current working directory.
For troubleshooting (stderr logs, RUST_LOG=debug, "failed to open git
repository"), see docs/mcp.md#troubleshooting.
# Show supported AI models and their specifications
omni-dev config models show
# View model information with token limits and capabilities
omni-dev config models show | grep -A5 "claude-opus-4.1"
omni-dev understands your project context to provide better suggestions:
Create .omni-dev/ directory in your repo root:
mkdir .omni-dev
Define project scopes (`.omni-dev/scopes.yaml`):

scopes:
- name: "auth"
description: "Authentication and authorization systems"
examples: ["auth: add OAuth2 support", "auth: fix token validation"]
file_patterns: ["src/auth/**", "auth.rs"]
- name: "api"
description: "REST API endpoints and handlers"
examples: ["api: add user endpoints", "api: improve error responses"]
file_patterns: ["src/api/**", "handlers/**"]
Add commit guidelines (`.omni-dev/commit-guidelines.md`):

# Project Commit Guidelines
## Format
- Use conventional commits: `type(scope): description`
- Keep subject line under 50 characters
- Use imperative mood: "Add feature" not "Added feature"
## Our Scopes
- `auth` - Authentication systems
- `api` - REST API changes
- `ui` - Frontend/UI components
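The format rules above are mechanically checkable. A minimal linter sketch (the allowed type list here is illustrative, not omni-dev's exact set; imperative mood is left to the AI since it cannot be checked with a regex):

```python
import re

# type(scope): description -- scope is optional, description must be non-empty.
SUBJECT = re.compile(r"^(feat|fix|docs|refactor|test|chore)(\([a-z0-9-]+\))?: \S.*$")

def check_subject(subject: str) -> list[str]:
    """Return a list of guideline violations for a commit subject line."""
    problems = []
    if not SUBJECT.match(subject):
        problems.append("not in type(scope): description form")
    if len(subject) > 50:
        problems.append("subject line over 50 characters")
    return problems

print(check_subject("feat(auth): implement OAuth2 authentication system"))  # []
print(check_subject("fix stuff"))
```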
omni-dev automatically detects project context — context files in `.omni-dev/`, conventions in `CONTRIBUTING.md`, and branch naming patterns such as `feature/auth-system`.

Large commit ranges are automatically split into manageable batches:
# Processes 50 commits in batches of 4 (default)
omni-dev git commit message twiddle 'HEAD~50..HEAD' --use-context
# Custom concurrency for very large ranges
omni-dev git commit message twiddle 'main..HEAD' --concurrency 2
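The batching described above — 50 commits processed in batches of 4 by default — can be sketched as a simple chunking step. This illustrates the arithmetic only; omni-dev's actual scheduler is not shown here:

```python
def batches(commits: list[str], concurrency: int = 4) -> list[list[str]]:
    """Split a commit range into batches of size `concurrency`,
    the unit processed in parallel."""
    return [commits[i:i + concurrency]
            for i in range(0, len(commits), concurrency)]

work = batches([f"commit-{n}" for n in range(50)])
print(len(work), len(work[0]), len(work[-1]))  # 13 4 2
```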
| Option | Description | Example |
|---|---|---|
| `--use-context` | Enable contextual intelligence | `--use-context` |
| `--concurrency N` | Number of parallel commit processors (default: 4) | `--concurrency 3` |
| `--no-coherence` | Skip cross-commit coherence refinement pass | `--no-coherence` |
| `--context-dir PATH` | Custom context directory | `--context-dir ./config` |
| `--auto-apply` | Apply without confirmation | `--auto-apply` |
| `--save-only FILE` | Save to file without applying | `--save-only fixes.yaml` |
Before: Messy commit history
e4b2c1a fix stuff
a8d9f3e wip
c7e1b4f update files
9f2a6d8 more changes
After: Professional commit messages
e4b2c1a feat(auth): implement JWT token validation system
a8d9f3e docs(api): add comprehensive OpenAPI documentation
c7e1b4f fix(ui): resolve mobile responsive layout issues
9f2a6d8 refactor(core): optimize database query performance
# 1. Work on your feature branch
git checkout -b feature/user-dashboard
# 2. Make commits (don't worry about perfect messages)
git commit -m "wip"
git commit -m "fix stuff"
git commit -m "add more features"
# 3. Before merging, improve all commit messages
omni-dev git commit message twiddle 'main..HEAD' --use-context
# 4. Create professional PR with AI-generated description
omni-dev git branch create pr
# ✅ Professional commit history + comprehensive PR description ready for review
We welcome contributions! Please see our Contributing Guidelines for details.
Clone the repository:
git clone https://github.com/rust-works/omni-dev.git
cd omni-dev
Install Rust (if you haven't already):
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
Build the project:
cargo build
Run the build script (includes tests, linting, and formatting):
./scripts/build.sh
Or run individual steps:
cargo test # Run tests
cargo clippy # Run linting
cargo fmt # Format code
Keep API keys out of your shell history and repository — use environment variables, `.env` files, or CI/CD secrets. Check supported models with `omni-dev config models show`, and select one via `~/.omni-dev/settings.json` or the `ANTHROPIC_MODEL` environment variable. Authenticate Atlassian with `omni-dev atlassian auth login` and Datadog with `omni-dev datadog auth login`.

omni-dev supports five AI backends, selected by env var or the `--ai-backend` flag (priority order, first match wins):

1. `--ai-backend claude-cli` / `OMNI_DEV_AI_BACKEND=claude-cli` — sandboxed `claude -p` subprocess that reuses your Claude Code session.
2. `USE_OLLAMA=true` — local Ollama or LM Studio server.
3. `USE_OPENAI=true` — OpenAI Chat Completions API.
4. `CLAUDE_CODE_USE_BEDROCK=true` — AWS Bedrock.
5. Otherwise, the direct Anthropic API.

See the AI Backends Guide for required env vars,
model selection, the Claude CLI sandbox and its escape hatches
(--claude-cli-allow-tools, --claude-cli-allow-mcp), the
--claude-cli-max-budget-usd spending cap, and per-backend troubleshooting.
For troubleshooting and detailed logging, use the RUST_LOG environment variable:
# Enable debug logging for omni-dev components
RUST_LOG=omni_dev=debug omni-dev git commit message twiddle ...
# Debug specific modules (e.g., context discovery)
RUST_LOG=omni_dev::claude::context::discovery=debug omni-dev git commit message twiddle ...
# Show only errors and warnings
RUST_LOG=warn omni-dev git commit message twiddle ...
See Troubleshooting Guide for detailed debugging information.
See CHANGELOG.md for a list of changes in each version.
This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.