
iloom


At a Glance

CLI + VS Code extension that decomposes natural-language feature requests into tracked issues and deploys parallel Claude Code agents in isolated git worktrees, persisting AI reasoning permanently in GitHub, Linear, or Jira rather than in ephemeral chat sessions.

Type: source-available (BSL-1.1)
Pricing: free
License: BSL-1.1
Adoption fit: small teams

What It Does

iloom is a CLI and VS Code extension that sits above Claude Code as an orchestration layer. Users describe a feature or project in natural language; iloom’s internal “Planner” agent decomposes it into discrete issues — each with requirements, dependencies, and acceptance criteria — and writes them to GitHub Issues, Linear, or Jira. In “swarm” mode, iloom then launches a parallel Claude Code agent per issue, each running in an isolated git worktree with its own filesystem path, database branch (Neon integration), and port assignment. The VS Code extension surfaces each agent’s reasoning, assumptions, risks, and decisions inline during code review.

The tool’s key architectural bet is that AI reasoning should be durable, not ephemeral. Rather than losing the context of why the agent made a decision when a terminal session ends, iloom writes the full analysis trail to the issue tracker. This makes the AI’s reasoning a first-class artifact in the team’s existing review workflow.
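The per-issue isolation described above is built on standard git worktrees, so the mechanism is easy to reproduce by hand. A minimal sketch, using a throwaway repo and illustrative branch and path names rather than iloom's actual commands:

```shell
set -eu

# Stand-in project repo (illustrative; iloom operates on your real repo)
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.email=dev@example.com -c user.name=dev \
  commit -q --allow-empty -m "init"

# One isolated working copy per issue, each on its own branch --
# the same mechanism behind the ~/project-looms/issue-N/ layout
mkdir -p "$repo-looms"
git -C "$repo" worktree add -q -b issue-1 "$repo-looms/issue-1"
git -C "$repo" worktree add -q -b issue-2 "$repo-looms/issue-2"

# All checkouts coexist: agents can edit and commit concurrently,
# with no branch switching and no file conflicts
git -C "$repo" worktree list
```

Each worktree is a full checkout sharing one object store, which is why the model adds no clone overhead; tearing a loom down maps onto `git worktree remove`.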

Key Features

  • Epic decomposition: A single natural-language description is broken into GitHub/Linear/Jira issues with dependency tracking before any code is written; user can edit the plan before implementation begins
  • Parallel swarm execution: Up to five concurrent “looms” per session, each in a fully isolated git worktree at ~/project-looms/issue-N/ with dedicated DB branch and port — no branch-switching overhead or file conflicts
  • Multi-agent roles: Internal agents — Enhancer, Evaluator, Analyzer, Planner, Implementer — handle distinct phases of the workflow rather than using a single monolithic agent
  • Issue tracker persistence: All reasoning, implementation strategy, risk assessments, and decisions are written as issue comments or linked artifacts; no ephemeral chat logs
  • VS Code extension: Real-time visibility into assumptions, risks, and decisions surfaced alongside the diff during code review
  • Integrations: GitHub, Linear, and Jira Cloud issue trackers; Bitbucket VCS support
  • Neon database integration: Database branch per worktree for schema-safe parallel development
  • npm distribution: npm install -g @iloom/cli and VS Code Marketplace extension — no custom runtime or infrastructure required
  • Zero direct cost: iloom is free; API costs are billed directly from Anthropic (Claude Max recommended for swarm mode)
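The dedicated port and database branch per loom can be sketched in plain shell. Everything below is an assumption for illustration: the variable names, the deterministic port scheme, and the commented `neonctl branches create` call describe how such isolation could be wired up, not iloom's actual implementation:

```shell
set -eu

ISSUE=3          # loom number (illustrative)
BASE_PORT=3000

# Deterministic dedicated port per loom: issue 1 -> 3001, issue 2 -> 3002, ...
PORT=$((BASE_PORT + ISSUE))
export PORT

# Dedicated database branch per loom. The Neon CLI call is hypothetical
# and would require a configured Neon project and auth token:
#   neonctl branches create --name "issue-$ISSUE"
export DATABASE_BRANCH="issue-$ISSUE"

echo "loom $ISSUE -> port $PORT, db branch $DATABASE_BRANCH"
```

The point of the scheme is schema safety: two agents migrating the same table never collide, because each loom's app process binds its own port and talks to its own branch of the database.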

Use Cases

  • Parallel feature development on macOS: Solo developer or small team who wants to implement multiple independent issues simultaneously with Claude Code, without managing worktrees manually or losing agent reasoning between sessions
  • AI-augmented sprint planning: Tech lead who wants to decompose a feature request into trackable issues before handing off to agents, retaining full audit trail in the project’s existing issue tracker
  • Code review with AI reasoning context: Engineering team using GitHub/Linear/Jira who wants agent-generated analysis (risks, decisions, assumptions) attached to the issue before reviewing the resulting PR
  • OSS contribution onboarding: Contributor who runs il contribute to let iloom automatically set up the project environment and identify good first issues

Adoption Level Analysis

Small teams (<20 engineers): Fits for macOS-first teams already paying for Claude Max who use GitHub, Linear, or Jira as their system of record. The zero-infrastructure model (npm install, git worktrees, no Docker) keeps operational overhead minimal. However, 131 open GitHub issues, documented Linux instability, and explicit “early-stage product” self-classification mean early-adopter friction is real. Teams must be comfortable triaging issues and accepting rough edges.

Medium orgs (20–200 engineers): Does not fit today. No multi-user support, no RBAC, no audit logging beyond what iloom writes to issues, no enterprise authentication, and no SLA. The BSL 1.1 license requires legal review before organizational deployment — commercial use restrictions apply until 2030. Docker Compose stacks are unsupported, limiting applicability to projects without containerized dev environments.

Enterprise (200+ engineers): Does not fit. BSL 1.1 license terms, absence of governance features, macOS-centric design, and early-stage quality level are disqualifying. Enterprises should evaluate OpenHands or dedicated orchestration platforms.

Alternatives

  • Vibe Kanban: Apache-2.0, agent-agnostic (10+ agents), embedded browser + diff review UI, no issue tracker integration. Prefer when you want agent-agnostic orchestration with a richer review UI and aren't locked into Claude Code.
  • Claude Code (direct): First-party Anthropic CLI, single agent, no orchestration overhead, most polished UX. Prefer when you don't need parallel agents and want the best single-session experience.
  • OpenHands: Model-agnostic, Docker-isolated sandbox runtime, cloud + enterprise tiers, 70k+ stars. Prefer when you need sandboxed execution, multi-model support, or an enterprise offering.
  • claude-flow: Claude-native swarm with 314 MCP tools and 16+ agent roles; heavier framework. Prefer when you want a more comprehensive multi-agent framework rather than a workflow orchestration layer.
  • Composio Agent Orchestrator: Structured agentic workflows with tool integrations beyond issue trackers. Prefer when you want agent orchestration connected to a broader SaaS tool ecosystem.

Notes & Caveats

  • BSL 1.1 license, not open source. Despite the “free” marketing framing, iloom is source-available under Business Source License 1.1. Commercial use beyond the permitted free-use grant is restricted until April 17, 2030, when it converts to Apache 2.0. Enterprise legal review is required before organizational deployment. Forks for competitive commercial use are prohibited during the BSL term.
  • Claude Code hard dependency. iloom is explicitly built on top of Claude Code and requires a Claude Max subscription (recommended) or Claude Code API access. This is a significant vendor lock-in to Anthropic’s toolchain. If Claude Code changes its API, pricing, or availability, iloom’s core functionality is affected.
  • Linux support was broken at launch. All 8–9 agent templates (215 KB of JSON) were passed as a single CLI argument, immediately exceeding Linux's MAX_ARG_STRLEN limit of 128 KB per argument and crashing with E2BIG. Community contributions added Linux/WSL terminal backends in v0.13.x, but Linux should be treated as beta-quality.
  • Docker Compose not supported. Projects relying on multi-service Docker Compose stacks cannot use iloom’s worktree isolation model today. This is a documented roadmap item (#332 on GitHub), not a near-term fix.
  • Encrypted secret formats incompatible. Rails credentials, ASP.NET User Secrets, and SOPS-managed secrets cannot be copied into worktrees via iloom’s environment variable system. Teams relying on these formats will need workarounds.
  • One-way file copying for gitignored files. Files excluded from git are copied into looms but not synced back on merge. This can cause state inconsistency for workflows that rely on untracked configuration files being modified during development.
  • No verified team or funding information. The company behind iloom (iloom.ai, © 2026) has not disclosed founders, investors, team size, or funding status publicly. This is a sustainability risk for a tool positioned as a production workflow dependency.
  • Early-stage quality level. 131 open GitHub issues and 32 releases across approximately three months of active development suggest rapid iteration over stability. Teams should expect breaking changes and should pin versions.
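The per-argument kernel limit behind the Linux launch failure is easy to reproduce on a stock Linux shell. The 200 KB payload below is illustrative (the actual agent templates were ~215 KB):

```shell
# A ~200 KB string is fine as a shell variable...
big=$(head -c 200000 /dev/zero | tr '\0' 'x')
echo "${#big}"    # 200000

# ...but passing it to an external program as ONE argument exceeds
# MAX_ARG_STRLEN (32 pages = 128 KiB on 4 KiB-page Linux), so execve
# fails with E2BIG before /bin/echo ever runs
if ! /bin/echo "$big" > /dev/null 2>&1; then
  echo "E2BIG: argument list too long"
fi
```

Note the limit applies per argument, independently of the larger ARG_MAX total; splitting the payload into multiple arguments, or passing it via a file or stdin, avoids it, which is essentially what the v0.13.x fix had to do.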
