HyperFrames: An Open-source Rendering Framework for Generating Video with HTML

May 2, 2026

HyperFrames demo

One-line Positioning

HyperFrames is HeyGen’s open-source HTML video rendering framework. It lets you define video scenes, timelines, and animations the same way you write web pages, then preview them in a browser and render them to MP4.

Basic Info

Item             Info
GitHub           https://github.com/heygen-com/hyperframes
Docs             https://hyperframes.heygen.com/introduction
NPM package      hyperframes
Main language    TypeScript
License          Apache-2.0
Current version  0.4.41
Requirements     Node.js >= 22, FFmpeg
Positioning      HTML-native, AI-first, deterministic video rendering

What Problem Does It Solve?

Video automation usually takes one of two paths. One is traditional editing tools, which are great for manual production but poorly suited to batch generation. The other is code-based video frameworks, which are automation-friendly but often require developers to buy into a specific component system or DSL.

HyperFrames takes a route that feels closer to normal web development: the video itself is HTML.

It puts clips, subtitles, images, audio, timing, and track relationships into HTML elements and data-* attributes. You can organize scenes like a web page, animate them with familiar frontend tools such as CSS, GSAP, Lottie, and Three.js, then render the result to MP4 from the command line.

This approach is also friendly to AI agents. Large language models are already good at generating HTML, CSS, and simple scripts. HyperFrames adds skills, plugins, and CLI workflows for agents, so an agent can go from “describe a video” to “create the project, preview it, inspect it, and render it.”

Core Features

1. Describe Video Structure with HTML

The core input to HyperFrames is an HTML document. Each element uses attributes such as data-start, data-duration, and data-track-index to describe when it appears, how long it lasts, and where it sits in the timeline.

A simple video can be composed from ordinary HTML elements such as video clips, titles, images, and background music:

<div id="stage" data-composition-id="my-video" data-start="0" data-width="1920" data-height="1080">
  <video
    id="clip-1"
    data-start="0"
    data-duration="5"
    data-track-index="0"
    src="intro.mp4"
    muted
    playsinline
  ></video>

  <h1
    class="clip"
    data-start="1"
    data-duration="4"
    data-track-index="1"
  >
    Welcome to HyperFrames
  </h1>

  <audio
    data-start="0"
    data-duration="5"
    data-track-index="2"
    data-volume="0.5"
    src="music.wav"
  ></audio>
</div>

The benefit is a low learning curve. You do not need to learn a new video-editing format first, and you do not have to rewrite everything as React components.
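The timing model behind these attributes is simple: each clip occupies the interval from data-start to data-start + data-duration on its track, and the composition lasts until the last clip ends. A minimal sketch of that model in TypeScript (illustrative only, not the actual @hyperframes/core parser):

```typescript
// Illustrative timeline model for data-start / data-duration / data-track-index.
// This is NOT the real @hyperframes/core parser, just a sketch of the idea.
interface Clip {
  start: number;      // seconds, from data-start
  duration: number;   // seconds, from data-duration
  trackIndex: number; // from data-track-index
}

// A clip's end time on the timeline.
function clipEnd(clip: Clip): number {
  return clip.start + clip.duration;
}

// The composition lasts until the last clip finishes.
function compositionDuration(clips: Clip[]): number {
  return clips.reduce((max, c) => Math.max(max, clipEnd(c)), 0);
}

const clips: Clip[] = [
  { start: 0, duration: 5, trackIndex: 0 }, // intro.mp4
  { start: 1, duration: 4, trackIndex: 1 }, // title
  { start: 0, duration: 5, trackIndex: 2 }, // music.wav
];
console.log(compositionDuration(clips)); // 5
```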

2. Browser Preview, Local MP4 Rendering

HyperFrames provides a CLI for project initialization, preview, and rendering:

npx hyperframes init my-video
cd my-video
npx hyperframes preview
npx hyperframes render

During preview, you inspect the composition in a browser. During rendering, headless Chrome captures frames and passes them to FFmpeg to produce the final video. The official docs emphasize deterministic rendering: the same input should produce the same output, which matters for automation pipelines.
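Determinism comes from capturing frames at fixed timestamps rather than sampling a live playback clock. A sketch of the underlying idea, assuming a simple fps-based frame grid (not the engine's actual implementation):

```typescript
// Deterministic frame grid: frame i is always captured at exactly i / fps
// seconds, so the same composition always yields the same sequence of frames.
// Illustrative only; the real @hyperframes/engine logic differs in detail.
function frameTimestamps(durationSec: number, fps: number): number[] {
  const frameCount = Math.ceil(durationSec * fps);
  return Array.from({ length: frameCount }, (_, i) => i / fps);
}

console.log(frameTimestamps(2, 30).length); // 60 frames for 2 s at 30 fps
```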

3. AI-agent-oriented Workflow

HyperFrames is not just a CLI. It also provides supporting workflows for AI coding tools. The project README recommends installing skills first:

npx skills add heygen-com/hyperframes

These skills teach agents how to write valid compositions, GSAP timelines, Tailwind v4 browser-runtime styles, and adapters for different animation runtimes.

The project also provides instructions or plugin entry points for tools such as Claude Code, Cursor, Codex, and Gemini CLI. In other words, “AI support” is not just a marketing phrase. Agent-friendly initialization, linting, previewing, and rendering are part of the framework’s design.

4. Support for Multiple Animation and Asset Runtimes

HyperFrames uses a Frame Adapter pattern to integrate animation runtimes. You can build animations with GSAP, Lottie, CSS animation, Three.js, or the Web Animations API, as long as they can seek frame by frame and participate in deterministic rendering.

This matters for video automation. Video rendering is not screen recording: the renderer needs a deterministic frame at every timestamp. HyperFrames abstracts the relationship between “how the page plays” and “how the renderer captures frames”, so different animation technologies can feed the same video-generation pipeline.

5. Built-in Component Registry and Multiple Packages

The project provides 50+ ready-to-use blocks and components, such as social-media overlays, shader transitions, data visualization, and cinematic effects. They can be added through commands:

npx hyperframes add flash-through-white
npx hyperframes add instagram-follow
npx hyperframes add data-chart

From the repository structure, HyperFrames is a monorepo with several key packages:

Package                          Purpose
hyperframes / @hyperframes/cli   CLI for init, preview, lint, and render
@hyperframes/core                Types, parser, generator, linter, runtime, and frame adapters
@hyperframes/engine              Puppeteer + FFmpeg based page-to-video rendering engine
@hyperframes/producer            Full rendering pipeline, including capture, encoding, and audio mixing
@hyperframes/studio              Browser-side composition editor UI
@hyperframes/player              Embeddable <hyperframes-player> Web Component
@hyperframes/shader-transitions  WebGL shader transitions for compositions

Official Video Examples

npx hyperframes init my-video --example warm-grain

Example        Type                            Video link
Warm Grain     Brand / lifestyle               https://static.heygen.ai/hyperframes-oss/docs/images/templates/warm-grain.mp4
Play Mode      Social media / product launch   https://static.heygen.ai/hyperframes-oss/docs/images/templates/play-mode.mp4
Swiss Grid     Enterprise / technology / data  https://static.heygen.ai/hyperframes-oss/docs/images/templates/swiss-grid.mp4
Kinetic Type   Title card / promo              https://static.heygen.ai/hyperframes-oss/docs/images/templates/kinetic-type.mp4
Decision Tree  Explainer video / tutorial      https://static.heygen.ai/hyperframes-oss/docs/images/templates/decision-tree.mp4
Product Promo  Product showcase                https://static.heygen.ai/hyperframes-oss/docs/images/templates/product-promo.mp4
NYT Graph      Data story / chart video        https://static.heygen.ai/hyperframes-oss/docs/images/templates/nyt-graph.mp4
Vignelli       Vertical title / announcement   https://static.heygen.ai/hyperframes-oss/docs/images/templates/vignelli.mp4

Who Is It For?

HyperFrames is a good fit for several scenarios:

  1. Frontend developers who want to generate video with code
    If you already know HTML, CSS, and JavaScript, the mental model is easy to pick up.

  2. Teams that need batch-generated marketing, tutorial, or data videos
    For example, turning CSV files into animated charts, product docs into short videos, or templates into many personalized variants.

  3. People building AI-agent content production workflows
    Its CLI, skills, and plugin design all lean toward automation, making it suitable as an execution layer for agent-generated video.

  4. Teams that need reproducible video generation pipelines
    Compared with manual editing tools, HyperFrames is easier to run from scripts, CI, Docker, or backend jobs.
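For the batch scenarios above, the practical pattern is to treat the composition HTML as a template and stamp out one document per data record, then render each with the CLI. A hedged sketch (the record fields are made up; the data-* attributes mirror the composition format shown earlier):

```typescript
// Stamp out one HyperFrames composition per data record.
// Field names (product, tagline) are hypothetical; the data-* attributes
// follow the composition format shown earlier in this article.
interface PromoRecord {
  product: string;
  tagline: string;
}

function renderComposition(rec: PromoRecord): string {
  return `<div id="stage" data-composition-id="${rec.product}" data-start="0" data-width="1920" data-height="1080">
  <h1 class="clip" data-start="0" data-duration="3" data-track-index="0">${rec.product}</h1>
  <p class="clip" data-start="1" data-duration="2" data-track-index="1">${rec.tagline}</p>
</div>`;
}

const records: PromoRecord[] = [
  { product: "Widget A", tagline: "Ships today" },
  { product: "Widget B", tagline: "Now 20% lighter" },
];
// One HTML document per record; each can then be rendered from a script or CI job.
const docs = records.map(renderComposition);
console.log(docs.length); // 2
```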

Quick Start

If you want an AI agent to participate in video generation, the official recommendation is to install the skills first:

npx skills add heygen-com/hyperframes

Then you can ask the agent to create a video with /hyperframes context, for example:

Using /hyperframes, create a 10-second product intro with a fade-in title, a background video, and background music.

If you want to create a project manually, use the CLI:

npx hyperframes init my-video
cd my-video
npx hyperframes preview
npx hyperframes render --output output.mp4

Prepare these locally:

  • Node.js >= 22
  • FFmpeg
  • an environment that can run Chromium / Puppeteer

Conclusion

The core value of HyperFrames is not “another video editor.” It brings video generation back into the web stack: HTML handles structure, CSS and animation libraries handle presentation, the browser handles rendering, and FFmpeg handles output.

For developers, that means videos can be generated, versioned, and automated like web pages. For AI agents, it provides a target format that is easier to understand and operate.

If you are working on content automation, batch video generation, data videos, or text-to-renderable-video workflows for AI agents, HyperFrames is an open-source project worth watching.
