A crash hits Sentry — Helix diagnoses it, writes a failing test, generates a fix, and opens a pull request. Your team approves in Slack. Nothing merges without your sign-off.
$ git clone https://github.com/88hours/helix-community
$ cp .env.example .env # add your keys
$ docker compose up --build
Works with your existing stack
See it in action
From Sentry alert to merged PR — fully autonomous, always human-approved.
Each agent hands off to the next automatically. You only step in when the fix is ready.
Receives Sentry or Rollbar webhooks, classifies severity, detects language, produces structured crash report.
Deduplicates against open issues, clones the repo, and generates a failing test asserting correct behaviour; the test is retried until the assertion is right.
Reads source files, calls the LLM to generate a minimal fix, posts it as a comment on the GitHub Issue.
Sends a Slack notification with the fix suggestion, GitHub Issue link, and full crash context.
Helix posts the fix suggestion and GitHub Issue link to Slack. Your team reviews and applies it. Nothing changes in production without your decision.
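The hand-off between the four agents can be pictured as events flowing over a bus. Here is a minimal in-memory sketch of that pattern; the channel names and payload keys are assumptions for illustration, and the real Helix Community edition runs this over Redis Pub/Sub rather than an in-process object.

```python
import json
from collections import defaultdict
from typing import Callable

class EventBus:
    """In-memory stand-in for the Redis Pub/Sub bus described below."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, channel: str, handler: Callable[[dict], None]) -> None:
        self._subs[channel].append(handler)

    def publish(self, channel: str, payload: dict) -> None:
        message = json.loads(json.dumps(payload))  # enforce JSON-serializable events
        for handler in self._subs[channel]:
            handler(message)

bus = EventBus()
log: list[dict] = []

# Each agent consumes one channel and publishes to the next (channel names hypothetical).
bus.subscribe("crash.reported", lambda e: bus.publish("test.generated", {**e, "test": "test_divide"}))
bus.subscribe("test.generated", lambda e: bus.publish("fix.suggested", {**e, "fix": "guard zero"}))
bus.subscribe("fix.suggested", lambda e: log.append(e))

bus.publish("crash.reported", {"error": "ZeroDivisionError"})
print(log[0]["fix"])  # the fix suggestion reaches the final stage
```

Each stage only knows the channel it reads and the channel it writes, which is what lets the pipeline run unattended until the Slack approval step.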
Every architectural decision in Helix prioritises correctness and safety over speed.
All webhook payloads are HMAC-verified. Your credentials never leave your infrastructure. No code merges without explicit human approval in Slack.
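HMAC verification of an incoming webhook body can be sketched in a few lines with the standard library. The exact header name and digest encoding are provider-specific (check your monitor's webhook docs); this sketch assumes a hex-encoded SHA-256 digest over the raw request body.

```python
import hashlib
import hmac

def verify_signature(secret: str, body: bytes, signature: str) -> bool:
    """Constant-time check that `signature` is HMAC-SHA256(secret, body)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

body = b'{"event": "error"}'
sig = hmac.new(b"shared-secret", body, hashlib.sha256).hexdigest()
print(verify_signature("shared-secret", body, sig))  # True
print(verify_signature("wrong-secret", body, sig))   # False
```

`hmac.compare_digest` avoids timing side channels; comparing digests with `==` would leak how many leading characters match.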
The QA Agent writes tests that assert correct behaviour — not just that the crash disappears. Tests are validated and retried until the assertion is provably right.
Run entirely on your own infrastructure. Use Anthropic, Ollama (local), or any LLM provider. One Docker Compose command. No vendor lock-in.
Ship autonomous incident response in one Docker Compose command. No dashboard required.
Asserts what the function should return after the fix — not that the crash occurs. Validated and retried until correct.
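The distinction matters in practice. Below is a hypothetical example (the function, bug, and fix are all invented for illustration): a test that merely asserts the crash reproduces would pass once the exception stops, even if the function now returns garbage, while a behavioural assertion pins down the correct result.

```python
# Hypothetical crash: parse_price("$1,299") raised ValueError on the comma.
# A crash-reproduction test would only assert that ValueError is raised;
# the behavioural test below asserts the correct post-fix value instead.

def parse_price(s: str) -> float:
    # Minimal fix a Dev Agent might propose: strip currency symbol and separators.
    return float(s.lstrip("$").replace(",", ""))

def test_parse_price_handles_thousands_separator():
    # Asserts the correct value, not merely the absence of an exception.
    assert parse_price("$1,299") == 1299.0

test_parse_price_handles_thousands_separator()
print("test passed")
```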
Python, JavaScript, TypeScript, Ruby, Java, Kotlin, and Go. Stack trace parsing and fix generation are all language-aware.
Every fix is posted to Slack before anything changes. Your team decides. Nothing merges without a sign-off.
Checks open GitHub Issues before running the pipeline. Known bugs skip straight to a Slack nudge.
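One common way to implement this kind of deduplication is to fingerprint each crash from stable fields (exception type plus the top application frame, say) and compare against fingerprints of already-filed issues. This is a simplified sketch; the fields and keying scheme Helix actually uses may differ.

```python
import hashlib

def crash_fingerprint(exc_type: str, top_frame: str) -> str:
    """Stable short hash used as a dedup key (illustrative scheme)."""
    return hashlib.sha256(f"{exc_type}|{top_frame}".encode()).hexdigest()[:12]

# Fingerprints of crashes that already have an open GitHub Issue.
open_issues = {crash_fingerprint("KeyError", "app/cart.py:42")}

incoming = crash_fingerprint("KeyError", "app/cart.py:42")
if incoming in open_issues:
    print("known bug: skip pipeline, send Slack nudge")
else:
    print("new bug: run full pipeline")
```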
Plug in your existing monitors. HMAC-verified webhooks. No code changes in your app required.
Docker Compose brings up all four agents and Redis. Deploy to Railway, Fly.io, ECS, or bare metal.
Open source and self-hosted — or fully managed with a dashboard, per-project credentials, and enterprise observability.
| Feature | Community<br>Open source · Self-hosted | Helix Cloud<br>Managed · Multi-project |
|---|---|---|
| Core pipeline | | |
| Crash Handler Agent (Sentry + Rollbar) | ✓ | ✓ |
| QA Agent — test generation & deduplication | ✓ | ✓ |
| Dev Agent — fix suggestion posted to GitHub Issue | ✓ | ✓ |
| Notifier Agent — Slack approval & escalation | ✓ | ✓ |
| Multi-language support (Python, JS/TS, Ruby, Java, Kotlin, Go) | ✓ | ✓ |
| Redis Pub/Sub event bus | ✓ | ✓ |
| Redis Streams — durable event log & incident replay | — | ✓ |
| Ollama — local LLM support (zero API cost) | ✓ | — |
| TDD loop — iterate to passing suite | — | ✓ |
| Automated PR creation from passing fix | — | ✓ |
| OpenRouter & multi-provider LLM support | — | ✓ |
| Hosting & configuration | | |
| Self-hosted (Docker Compose, Railway, Fly.io, ECS) | ✓ | — |
| Global env var configuration | ✓ | ✓ |
| Managed hosting (zero ops) | — | ✓ |
| Per-project credentials & webhook URLs | — | ✓ |
| GitHub App integration (scoped tokens) | — | ✓ |
| Dashboard & observability | | |
| React dashboard with live incident stream | — | ✓ |
| Auth0 authentication (GitHub OAuth) | — | ✓ |
| Structured JSON logging | ✓ | ✓ |
| OpenTelemetry distributed tracing | — | ✓ |
| LangSmith LLM tracing & eval suite | — | ✓ |
| Automated CI evals (fails if score < 80%) | — | ✓ |
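The "Redis Streams" row above is the durability upgrade over plain Pub/Sub: Pub/Sub is fire-and-forget (a subscriber that is down misses events), while a stream is an append-only log that consumers can replay from any offset. An in-memory illustration of the idea (the real feature uses Redis Streams commands such as XADD/XREAD):

```python
class Stream:
    """Append-only event log, illustrating why Streams enable incident replay."""
    def __init__(self) -> None:
        self.log: list[dict] = []

    def add(self, event: dict) -> int:
        self.log.append(event)
        return len(self.log) - 1      # entry id doubles as an offset

    def read(self, since: int = 0) -> list[dict]:
        return self.log[since:]       # replay from any point in history

stream = Stream()
stream.add({"incident": "A", "stage": "crash.reported"})
stream.add({"incident": "A", "stage": "fix.suggested"})

# A consumer that was offline replays the full incident history:
print([e["stage"] for e in stream.read(0)])
```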
Detected automatically from the Sentry or Rollbar payload.
| Language | Test framework |
|---|---|
| Python | pytest |
| JavaScript | Jest |
| TypeScript | Jest |
| Ruby | RSpec |
| Java | JUnit |
| Kotlin | JUnit |
| Go | go test |
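The mapping in the table above boils down to a small lookup keyed on the language reported by the crash payload. Sentry event payloads carry a `platform` field; whether Helix reads exactly that field is an assumption here, but the shape of the logic is the same either way:

```python
# Language -> test framework mapping from the table above.
FRAMEWORKS = {
    "python": "pytest", "javascript": "jest", "typescript": "jest",
    "ruby": "rspec", "java": "junit", "kotlin": "junit", "go": "go test",
}

def pick_framework(payload: dict) -> str:
    """Choose a test framework from a crash payload's platform field."""
    platform = payload.get("platform", "").lower()
    return FRAMEWORKS.get(platform, "unknown")

print(pick_framework({"platform": "Python"}))  # pytest
print(pick_framework({"platform": "go"}))      # go test
```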
Helix ships a built-in Claude Code skill. Open the repo and type /setup — Claude walks you through prerequisites, environment config, and running the test suite interactively.
you /setup
Checking prerequisites... Python 3.12 ✓, Docker ✓, uv ✓
Copying .env.example → .env
Which LLM provider? Anthropic or Ollama?
you Anthropic
Set ANTHROPIC_API_KEY in .env, then run:
docker compose up --build
Once up, run pytest to verify — 18 tests, all passing.
Questions, feedback, or just want to say hi — we're here.
Found a bug or have a question? Open an issue on GitHub — the best place to get help and track progress.
Open an issue

Interested in managed hosting, the dashboard, or enterprise features? We'd love to chat.
Get in touch

Open source. Self-hosted. Human-approved. One Docker Compose command away.