
Livepeer’s video infrastructure handles transcoding, packaging, and delivery for both live and recorded video. The Studio API exposes this as a REST interface with official SDKs in TypeScript, Python, Go, and React. You call the API; the network provides the compute. Video workloads on Livepeer use LPT stake-weighted routing and round-based rewards — this is the original Livepeer use case and the most mature part of the network.

Capabilities

The Studio API covers four workload types:

- Livestreams: Create a stream, get an RTMP ingest URL, and Livepeer handles transcoding to adaptive-bitrate HLS for delivery. Sub-second WebRTC playback is available for latency-sensitive applications.
- VOD assets: Upload a video file and Livepeer transcodes it to multiple renditions. The asset is playable immediately via HLS or MP4.
- Webhooks: Subscribe to events (stream started, stream ended, asset ready, asset failed) via HTTP callbacks, so event-driven architectures do not need to poll the API.
- Access control: Gate content behind JWTs or webhook-based authorisation. With the JWT playback policy, you sign a token server-side and the player presents it at playback time.
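Webhook deliveries should be authenticated before processing. A minimal sketch, assuming the signature header carries `t=<timestamp>,v1=<hex HMAC-SHA256 of the raw body>` signed with your endpoint's shared secret — check the exact header name and signing scheme against the webhooks guide before relying on this:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Hypothetical verifier for signed webhook deliveries. Assumes the header
// format `t=<unix timestamp>,v1=<hex HMAC-SHA256 of the raw body>` -- verify
// the actual scheme in the webhooks documentation.
export function verifyWebhook(rawBody: string, sigHeader: string, secret: string): boolean {
  const parts = new Map(
    sigHeader.split(',').map((kv) => kv.split('=', 2) as [string, string]),
  );
  const received = parts.get('v1');
  if (!received) return false;
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  // Constant-time comparison to avoid timing side channels.
  return (
    received.length === expected.length &&
    timingSafeEqual(Buffer.from(received), Buffer.from(expected))
  );
}
```

The timestamp field is parsed but unused here; a production handler would also reject stale timestamps to prevent replay.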

API surface

All video operations go through the Studio REST API at https://livepeer.studio/api. Authentication uses a Bearer token from the Studio dashboard.
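As a sketch of a raw REST call without an SDK — assuming a Node 18+ runtime where `fetch` and `Request` are global, and using the `/stream` listing endpoint for illustration:

```typescript
// Build an authenticated request against the Studio API. Every endpoint
// takes the same Bearer header; `/stream` (list streams) is shown here.
export function studioRequest(path: string, apiKey: string): Request {
  return new Request(`https://livepeer.studio/api${path}`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
}

// Usage (requires a real API key):
//   const res = await fetch(studioRequest('/stream', process.env.LIVEPEER_API_KEY!));
//   const streams = await res.json();
```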

SDKs

Four officially supported SDKs cover the full Studio API surface, including video resources and AI inference:

```shell
# TypeScript / JavaScript (npm v3.5.0)
npm install livepeer

# Python
pip install livepeer

# Go
go get github.com/livepeer/livepeer-go

# React UI components (Player, Broadcast)
npm install @livepeer/react
```
All SDKs share the same resource structure. Initialise with your API key:

```typescript
import { Livepeer } from 'livepeer';

const client = new Livepeer({ apiKey: process.env.LIVEPEER_API_KEY });

const { stream } = await client.stream.create({
  name: 'my-stream',
});

console.log(stream?.streamKey);  // push to the RTMP ingest URL with this key (OBS / ffmpeg)
console.log(stream?.playbackId); // use in the Player component
```
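The VOD flow works the same way. A minimal sketch over plain REST, assuming the `/asset/request-upload` endpoint and its `url`/`tusEndpoint`/`asset` response fields — treat those shapes as assumptions to confirm against the API reference:

```typescript
// Sketch of the two-step VOD upload: request an upload URL, then PUT the
// file bytes to it. The endpoint path and response field names are
// assumptions to verify against the Studio API reference.
export function requestUpload(apiKey: string, name: string): Request {
  return new Request('https://livepeer.studio/api/asset/request-upload', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ name }),
  });
}

// Usage (requires a real API key and file):
//   const { url, asset } = await (await fetch(requestUpload(KEY, 'clip.mp4'))).json();
//   await fetch(url, { method: 'PUT', body: await readFile('clip.mp4') });
//   // then wait for the `asset.ready` webhook, or poll the asset by id
```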

Playback protocols

Livepeer delivers content over two protocols:

- HLS: Standard adaptive-bitrate streaming. Works with any HLS player: the @livepeer/react Player, HLS.js, Video.js, or native iOS/Android players. Latency is typically 3-8 seconds.
- WebRTC: Sub-second latency for live interactive applications. Supported via the @livepeer/react Player and Broadcast components. No additional configuration is required; the player selects WebRTC automatically when available.

The @livepeer/react Player handles both protocols, fallback logic, and access-controlled playback via the jwt prop:
```typescript
import * as Player from '@livepeer/react/player';

export const VideoPlayer = ({ playbackId }: { playbackId: string }) => (
  <Player.Root src={`https://livepeercdn.studio/hls/${playbackId}/index.m3u8`}>
    <Player.Container>
      <Player.Video />
      <Player.Controls />
    </Player.Container>
  </Player.Root>
);
```

Transcoding Quickstart

Create your first stream and test end-to-end in 15 minutes.

Developer Stack

Understand all three access layers and when to use Studio vs a self-hosted gateway.

Access Control

Gate content with JWTs or webhook-based authorisation.

Webhooks

Subscribe to stream and asset events for event-driven applications.
Last modified on April 7, 2026