---
title: Quickstart
description: Get up and running with varg in 2 minutes
---

## The fastest path: AI agent + skill

The recommended way to use varg is through an AI coding agent (Claude Code, Cursor, Windsurf, etc.) with the varg skill installed. The skill gives your agent full context on components, models, prompting, and rendering.

1. Sign up at [app.varg.ai](https://app.varg.ai) and copy your API key.

2. Install the varg skill:

   ```bash
   npx -y skills add vargHQ/skills --all --copy -y
   ```

   This installs the varg skill into Claude Code, Cursor, Windsurf, and 40+ other agents automatically.

3. Export your API key:

   ```bash
   export VARG_API_KEY=varg_live_xxx
   ```

4. Ask your agent for a video:

   ```bash
   claude "create a 10-second product video for white sneakers, 9:16, UGC style, with captions and background music"
   ```
Or with Cursor, Windsurf, or any other agent — just prompt it the same way:

*"Create a 10-second TikTok video of a cute robot waving hello with upbeat music and captions"*

That's it. The agent handles all the TSX code, rendering, and file management for you.

The first render takes 1-2 minutes (AI generation). Subsequent runs with the same prompts are instant, served from cache at $0.

## Pricing

1 credit = 1 cent. Typical costs:

| Action     | Model              | Credits | Cost  |
| ---------- | ------------------ | ------- | ----- |
| Image      | nano-banana-pro    | 5       | $0.05 |
| Image      | flux-pro           | 25      | $0.25 |
| Video (5s) | kling-v3           | 150     | $1.50 |
| Video (5s) | seedance-2-preview | 250     | $2.50 |
| Speech     | eleven_v3          | 25      | $0.25 |
| Music      | music_v1           | 25      | $0.25 |
| Cache hit  | any                | 0       | $0.00 |

A typical 3-clip video costs $2-5. Cache hits are always free.
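As a sanity check on that estimate, the credit math can be sketched in a few lines. The prices below are copied from the table above (1 credit = $0.01); the `estimateCost` helper is illustrative only, not part of the vargai API:

```typescript
// Illustrative cost estimator — not part of the vargai API.
// Credit prices per generation, taken from the pricing table above.
const CREDITS: Record<string, number> = {
  "nano-banana-pro": 5,        // image
  "flux-pro": 25,              // image
  "kling-v3": 150,             // 5s video
  "seedance-2-preview": 250,   // 5s video
  "eleven_v3": 25,             // speech
  "music_v1": 25,              // music
};

function estimateCost(models: string[]): { credits: number; usd: number } {
  const credits = models.reduce((sum, m) => sum + (CREDITS[m] ?? 0), 0);
  return { credits, usd: credits / 100 }; // 1 credit = 1 cent
}

// A typical 3-clip video: one character image, three 5s kling-v3 clips, music.
const example = estimateCost([
  "nano-banana-pro", "kling-v3", "kling-v3", "kling-v3", "music_v1",
]);
console.log(example); // { credits: 480, usd: 4.8 }
```

That lands at $4.80, inside the quoted $2-5 range; swapping kling-v3 for seedance-2-preview roughly doubles it, and every cache hit contributes 0.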


## Alternative: Cloud Render (curl only)

No installation needed. Submit TSX code to the Cloud Render API via curl:

```bash
# Submit a render job
JOB=$(curl -s -X POST https://render.varg.ai/api/render \
  -H "Authorization: Bearer $VARG_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "code": "const img = Image({ model: fal.imageModel(\"nano-banana-pro\"), prompt: \"a cabin in mountains at sunset\", aspectRatio: \"16:9\" });\nexport default (<Render width={1920} height={1080}><Clip duration={3}>{img}</Clip></Render>);"
  }')

JOB_ID=$(echo "$JOB" | jq -r '.job_id')

# Poll for the result
curl -s "https://render.varg.ai/api/render/jobs/$JOB_ID" \
  -H "Authorization: Bearer $VARG_API_KEY"
```
Cloud render uses `fal.*Model()` syntax — provider globals are auto-injected server-side. See the [Cloud Render API](/render-api) reference for full details.
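If you would rather submit and poll from a script than from the shell, the same flow can be sketched in TypeScript. The endpoint URLs and Bearer header come from the curl example above; the `status` field values and the backoff schedule are assumptions for illustration, so check the Cloud Render API reference for the real response shape:

```typescript
// Sketch of submit-and-poll against the Cloud Render API shown above.
// Assumption: the job object exposes a `status` field such as "pending"/"running".
const BASE = "https://render.varg.ai/api/render";

// Capped exponential backoff between polls: 1s, 2s, 4s, ... up to 30s.
function backoffMs(attempt: number, capMs = 30_000): number {
  return Math.min(1000 * 2 ** attempt, capMs);
}

async function renderAndWait(code: string, apiKey: string): Promise<unknown> {
  const headers = {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
  // Submit the job; the response's `job_id` field matches the curl example.
  const res = await fetch(BASE, {
    method: "POST",
    headers,
    body: JSON.stringify({ code }),
  });
  const { job_id } = (await res.json()) as { job_id: string };

  // Poll with backoff until the job leaves its in-progress states.
  for (let attempt = 0; ; attempt++) {
    await new Promise((r) => setTimeout(r, backoffMs(attempt)));
    const poll = await fetch(`${BASE}/jobs/${job_id}`, { headers });
    const job = (await poll.json()) as { status?: string };
    if (job.status !== "pending" && job.status !== "running") return job;
  }
}

// Example (uncomment to run; costs credits):
// renderAndWait('export default (<Render width={1920} height={1080}/>);',
//   process.env.VARG_API_KEY!).then(console.log);
```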

## Alternative: Local Render (bun + ffmpeg)

For full control, render videos locally. Requires Bun and ffmpeg.

### Installation

```bash
bun install vargai ai
```

Or with npm:

```bash
npm install vargai ai
```

### Set your API key

Create a `.env` file in your project root:

```bash
VARG_API_KEY=varg_xxx
```
Bun auto-loads `.env` files — no `dotenv` needed.

### Your first video

Create `hello.tsx`:

```tsx
/** @jsxImportSource vargai */
import { Render, Clip, Image, Video } from "vargai/react"
import { createVarg } from "vargai/ai"

const varg = createVarg({ apiKey: process.env.VARG_API_KEY! })

const character = Image({
  model: varg.imageModel("nano-banana-pro"),
  prompt: "cute kawaii fluffy orange cat character, round body, big eyes, Pixar style",
  aspectRatio: "9:16",
})

export default (
  <Render width={1080} height={1920}>
    <Clip duration={5}>
      <Video
        prompt={{
          text: "character waves hello enthusiastically, bounces up and down",
          images: [character],
        }}
        model={varg.videoModel("kling-v3")}
      />
    </Clip>
  </Render>
)
```

Render it:

```bash
bunx vargai render hello.tsx --preview   # free preview (placeholders)
bunx vargai render hello.tsx --verbose   # full render (costs credits)
```

### Add music

```tsx
/** @jsxImportSource vargai */
import { Render, Clip, Image, Video, Music } from "vargai/react"
import { createVarg } from "vargai/ai"

const varg = createVarg({ apiKey: process.env.VARG_API_KEY! })

const character = Image({
  model: varg.imageModel("nano-banana-pro"),
  prompt: "cute kawaii cat character, Pixar style",
  aspectRatio: "9:16",
})

export default (
  <Render width={1080} height={1920}>
    <Music
      prompt="upbeat happy electronic music, cheerful vibes"
      model={varg.musicModel()}
      volume={0.3}
      duration={5}
    />
    <Clip duration={5}>
      <Video
        prompt={{ text: "cat dancing happily", images: [character] }}
        model={varg.videoModel("kling-v3")}
      />
    </Clip>
  </Render>
)
```

### Add voiceover with captions

```tsx
/** @jsxImportSource vargai */
import { Render, Clip, Image, Video, Speech, Captions } from "vargai/react"
import { createVarg } from "vargai/ai"

const varg = createVarg({ apiKey: process.env.VARG_API_KEY! })

const character = Image({
  model: varg.imageModel("nano-banana-pro"),
  prompt: "friendly robot character, blue metallic, expressive eyes",
  aspectRatio: "9:16",
})

const voiceover = Speech({
  model: varg.speechModel("eleven_v3"),
  voice: "adam",
  children: "Hello! I'm your AI assistant. Let me show you something cool!",
})

export default (
  <Render width={1080} height={1920}>
    <Clip duration={5}>
      <Video
        prompt={{ text: "robot talking, subtle head movements", images: [character] }}
        model={varg.videoModel("kling-v3")}
      />
    </Clip>
    <Captions src={voiceover} style="tiktok" color="#ffffff" withAudio />
  </Render>
)
```

## Project structure

Recommended layout:

```
my-video-project/
├── .env              # VARG_API_KEY=varg_xxx
├── package.json
├── output/           # Generated videos
├── .cache/ai/        # Cached AI generations
└── videos/
    ├── intro.tsx
    ├── episode-1.tsx
    └── outro.tsx
```

## Next steps

- All JSX components and their props
- All supported models with pricing
- Copy-paste examples for common video types
- Submit TSX via API — zero local dependencies