What It Does
HyperFrames is an open-source video rendering framework that converts HTML-based compositions into MP4, MOV, or WebM video output. Compositions are plain HTML files annotated with data attributes (`data-start`, `data-duration`, `data-track-index`) that define timing; the engine uses Puppeteer to capture frames from a headless browser and FFmpeg to encode the resulting video.
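A minimal composition might look like the sketch below. The attribute names come from the description above; the units (seconds) and track-numbering semantics are illustrative assumptions, not confirmed documentation:

```html
<!-- Hypothetical HyperFrames composition: two elements on separate tracks.
     Timing units (seconds here) are an assumption for illustration. -->
<!DOCTYPE html>
<html>
  <head>
    <style>
      .title { font: bold 64px sans-serif; color: #fff; }
      .subtitle { font: 32px sans-serif; color: #aaa; }
    </style>
  </head>
  <body>
    <!-- Visible from t=0 for 3s on track 0 -->
    <div class="title" data-start="0" data-duration="3" data-track-index="0">
      Launch Day
    </div>
    <!-- Enters at t=1, lasts 2s, on track 1 -->
    <div class="subtitle" data-start="1" data-duration="2" data-track-index="1">
      Now in open beta
    </div>
  </body>
</html>
```

Because the file is plain HTML, an agent can generate or edit it with no framework-specific tooling beyond knowing the three data attributes.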
The framework is explicitly designed for AI agent workflows. It ships as an installable Agent Skills package via `npx skills add heygen-com/hyperframes`, which registers slash commands (`/hyperframes`, `/hyperframes-cli`, `/gsap`) in Claude Code, Cursor, Gemini CLI, and other compatible agents. The premise is that language models already understand HTML and CSS natively, making HTML-based composition authoring more accessible to agents than React-based or DSL-based alternatives. The project is published by HeyGen, the AI video generation company ($69M raised, ~$95M ARR as of late 2025).
Key Features
- Plain HTML compositions: No React, no JSX, no proprietary DSL — compositions are vanilla HTML files with data attributes for timing and track assignment
- Puppeteer + FFmpeg rendering engine: `@hyperframes/engine` captures frames from headless Chromium and encodes them via FFmpeg; Node.js 22+ and FFmpeg must be installed
- Deterministic output guarantee: Prohibits `Math.random()` and requires synchronous GSAP timeline construction to ensure identical output from identical input
- Agent Skills integration: Registers as slash commands in Claude Code and 7+ other coding agents; the skill documents teach GSAP animation patterns, caption syntax, and composition constraints
- GSAP animation runtime: Native GSAP support with vocabulary mappings — “snappy” maps to `power4.out` easing, “bouncy” to `back.out` — so agents can use natural language to describe motion
- 50+ installable blocks: Social overlays (Instagram follow, TikTok hooks), data charts, cinematic effects, and WebGL shader transitions via `npx hyperframes add [component-name]`
- Browser-based studio editor: `@hyperframes/studio` provides a live-reload browser preview; `@hyperframes/player` is an embeddable web component for playback
- WebGL shader transitions: `@hyperframes/shader-transitions` provides GPU-accelerated transition effects
- Monorepo architecture: Bun-managed monorepo with `cli`, `core`, `engine`, `player`, `producer`, `shader-transitions`, and `studio` packages
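The GSAP vocabulary mapping can be sketched in a composition's inline script. The ease-string mappings (“snappy” → `power4.out`, “bouncy” → `back.out`) come from the feature list above; the script structure, element names, and CDN URL are illustrative assumptions, not the documented runtime API:

```html
<!-- Hypothetical composition script: the timeline is built synchronously
     at load time, as the determinism guarantee requires. -->
<div id="card" data-start="0" data-duration="4" data-track-index="0">Hello</div>
<script src="https://cdn.jsdelivr.net/npm/gsap@3/dist/gsap.min.js"></script>
<script>
  // "snappy" entrance: fast start, hard stop (power4.out).
  gsap.from("#card", { y: 80, opacity: 0, duration: 0.6, ease: "power4.out" });
  // "bouncy" emphasis: overshoot past the target, then settle (back.out).
  gsap.to("#card", { scale: 1.1, delay: 0.8, duration: 0.4, ease: "back.out" });
  // No Math.random() anywhere: identical input must yield identical frames.
</script>
```

The point of the vocabulary layer is that an agent prompted for a "snappy entrance" never has to know the ease string at all; the skill documents supply the mapping.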
Use Cases
- AI-generated marketing video: An AI agent receives a product brief, writes an HTML composition with GSAP animations and brand colors, and HyperFrames renders a polished MP4 without human involvement
- Automated social content at scale: Generate format-specific videos (9:16 TikTok, 1:1 Instagram, 16:9 YouTube) from data sources (CSVs, APIs) in a CI/CD pipeline
- Programmatic video tooling: A SaaS product that embeds video generation as a feature — users configure content via a UI, and the backend renders via HyperFrames
- Developer experimentation: Prototyping HTML-based video compositions before committing to a full production rendering pipeline
Adoption Level Analysis
Small teams (<20 engineers): Fits well for teams building AI-native video pipelines or experimenting with programmatic video generation. Apache 2.0 license removes commercial friction. FFmpeg and Node.js 22 are manageable dependencies at small scale. The agent skills integration reduces prompt engineering effort significantly.
Medium orgs (20–200 engineers): Fits for dedicated media/video engineering teams. Puppeteer-based rendering at scale requires careful resource management (headless browser pool, memory, concurrency limits). The lack of a managed cloud rendering tier means teams must operate their own rendering infrastructure. HyperFrames does not document horizontal scaling patterns as of April 2026.
Enterprise (200+ engineers): Does not fit yet. No enterprise support tier, no SLA, no managed cloud option, no independent governance body. Single vendor (HeyGen) controls the roadmap. License is permissive but operational maturity is early-stage. Organizations generating video at enterprise scale would need to build significant infrastructure around HyperFrames to meet reliability requirements.
Alternatives
| Alternative | Key Difference | Prefer when… |
|---|---|---|
| Remotion | React-based programmatic video; richer ecosystem; commercial license required for companies >3 employees | Existing React team; needs mature component library; willing to pay commercial license fees |
| WebVideoCreator | Also Puppeteer + FFmpeg under the hood; less agent-focused; less maintained | Simple HTML-to-video without agent integration requirement |
| FFmpeg directly | Raw encoding, no composition layer | Simple media processing pipelines without animation/composition needs |
| HeyGen API | Managed cloud rendering with avatar/voice features; proprietary | Need managed infrastructure, HeyGen avatar features, or enterprise SLA |
Evidence & Sources
- GitHub repository (2.5k stars, 189 forks as of review)
- HyperFrames documentation and prompting guide
- HeyGen Series A announcement — $60M raised, $500M valuation
- Community reception on X — Rohan Paul thread
- Comparison of HTML-to-video rendering approaches
Notes & Caveats
- Single-vendor risk: HeyGen controls the roadmap, block ecosystem, and skills registry. There is no independent governance (no Linux Foundation, no AAIF). If HeyGen pivots or fails, community continuity is uncertain. Apache 2.0 mitigates legal risk but not ecosystem risk.
- No managed cloud tier (yet): All rendering is local or self-hosted. Teams must provision their own Puppeteer + FFmpeg environment. HeyGen’s commercial cloud rendering is a likely future product that could create a freemium split between local and managed tiers.
- Puppeteer operational complexity: Headless Chromium rendering is resource-intensive, sensitive to font/OS differences between environments, and requires careful memory management under load. Byte-identical output across different OS/browser versions is not guaranteed despite the “deterministic” claim.
- Node.js 22+ hard requirement: Excludes teams on older Node.js runtimes. Node.js 22 entered LTS in late 2024, so most organizations can adopt it, but teams pinned to earlier LTS lines (18 or 20) must upgrade before adopting HyperFrames.
- Agent skills quality controlled by HeyGen: The skill documents that teach agents how to use HyperFrames are maintained by HeyGen alone. Errors or gaps in skill documents directly affect agent output quality.
- Very early project (April 2026 launch): No independent production case studies exist at review time. Community adoption is nascent (2.5k stars). The framework should be treated as early-stage despite the professional packaging.
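One way a composition can reconcile the `Math.random()` prohibition with a need for visual variation is a seeded PRNG, making "randomness" a pure function of the composition source. A sketch using the generic public-domain mulberry32 generator (not a HyperFrames API; the particle-layout use is hypothetical):

```javascript
// mulberry32: a tiny seeded PRNG (public-domain algorithm, not part of
// HyperFrames). The same seed yields the same sequence on every render,
// which preserves the deterministic-output guarantee.
function mulberry32(seed) {
  return function () {
    seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// A composition would hard-code the seed in its source, so pseudo-random
// layout (particle positions, stagger jitter) is reproducible.
const rand = mulberry32(42);
const positions = Array.from({ length: 5 }, () => ({
  x: +(rand() * 100).toFixed(2),
  y: +(rand() * 100).toFixed(2),
}));
console.log(positions.length); // 5
```

Byte-identical frames still depend on the rendering environment (fonts, Chromium version), so seeding addresses only the scripted-randomness half of the determinism caveat.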