Streaming Film Efficiency Analysis (2024–2026)

Overview

This project analyzes how efficiently major streaming platforms convert production budgets into audience reach for feature films released over the last 18 months. Instead of measuring success by absolute popularity alone, the analysis introduces an Efficiency Score—a normalized measure of estimated audience reach per dollar spent.

Streaming platforms disclose performance using incompatible metrics (e.g., Netflix view counts versus opaque Top-10 rankings on Apple TV+, Paramount+, Hulu, and others), which made the core challenge of this project metric normalization. The resulting framework enables meaningful cross-platform comparison while explicitly accounting for uncertainty and proxy limitations.

Primary insight:
Low- to mid-budget films ($20M–$50M), particularly acquired and genre-focused titles, consistently outperform high-budget originals on a reach-per-dollar basis, even when tentpole releases dominate total audience volume.


Research Questions

  • How does production budget relate to audience efficiency in the streaming era?
  • Do high-budget originals reliably outperform lower-budget acquisitions when normalized for spend?
  • Where do diminishing returns emerge across platforms with different subscriber scales?
  • Which genres and acquisition strategies generate the highest efficiency?

Dataset & Scope

  • Time Window: Mid-2024 through early-2026 (18 months)
  • Sample Size: 150 feature films
  • Platforms Included:
    Netflix, Amazon Prime Video, Apple TV+, Disney+, Hulu, Paramount+, Peacock
  • Content Filters:
    • Feature films only (no series, shorts, or documentaries)
    • Streaming-first or streaming-dominant releases
    • Runtimes > 60 minutes

Budget Data

  • Standardized to Millions USD
  • Labeled as:
    • Reported: Trade publications or official disclosures
    • Estimated: Modeled based on production scale and comparable releases

The Performance Proxy Problem

Streaming platforms do not report performance uniformly:

Platform Type    Available Signal
Netflix          View counts / total hours
Others           Top-10 ranking positions

Direct comparison of these signals is not possible without normalization. Treating rankings as equivalent to view counts would introduce false precision and distort results.


Proxy Normalization Methodology

1. Proxy Classification

Each film is assigned a proxy_type:

  • Numeric: Raw view counts (Netflix)
  • Ordinal: Ranking-based signals (non-Netflix platforms)
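
This classification step can be sketched as a one-line rule, shown here in Python (the function name and string labels are illustrative, not part of the published methodology):

```python
def proxy_type(platform: str) -> str:
    """Classify a film's performance signal by platform.

    Netflix publishes numeric view counts; the other platforms in the
    sample expose only ordinal Top-10 ranking positions.
    """
    return "numeric" if platform == "Netflix" else "ordinal"
```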

2. Rank → Audience Percentage Mapping

Ordinal rankings are converted into conservative estimates of audience reach as a percentage of platform subscribers:

Rank Tier    Estimated % of Subscribers
Top 1        35%
Top 3        25%
Top 5        20%
Top 10       12%

These values are directional approximations, designed to preserve relative scale without overstating measurement accuracy.
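
As a sketch, the tier mapping above can be expressed as a small lookup function (Python here; the function name and the zero value assigned to films outside the Top 10 are illustrative assumptions):

```python
def rank_to_pct(rank: int) -> float:
    """Map a Top-10 chart position to a conservative estimated share
    of platform subscribers, per the tier table above."""
    if rank == 1:
        return 0.35
    if rank <= 3:
        return 0.25
    if rank <= 5:
        return 0.20
    if rank <= 10:
        return 0.12
    return 0.0  # outside the Top 10: no reliable audience signal
```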

3. Platform Subscriber Scaling

Ordinal proxies are scaled by estimated platform subscriber counts to approximate audience reach:

Platform              Estimated Subscribers (2026)
Netflix               ~301M
Amazon Prime Video    ~200M
Disney+               ~132M
Paramount+            ~79M
Hulu                  ~64M
Apple TV+             ~45M
Peacock               ~41M

This prevents smaller platforms from being artificially inflated relative to Netflix.
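
Using the subscriber estimates above, the scaling step might look like the following sketch (the dictionary and function names are hypothetical):

```python
# Estimated 2026 subscriber counts in millions, from the table above.
SUBSCRIBERS_M = {
    "Netflix": 301, "Amazon Prime Video": 200, "Disney+": 132,
    "Paramount+": 79, "Hulu": 64, "Apple TV+": 45, "Peacock": 41,
}

def estimated_reach_m(platform: str, rank_pct: float) -> float:
    """Scale a rank-derived audience percentage by platform size.

    Returns estimated reach in millions of viewers, so a Top-3 title
    on a 45M-subscriber platform counts far less than the same rank
    on a 301M-subscriber platform.
    """
    return SUBSCRIBERS_M[platform] * rank_pct
```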


Efficiency Score Definition

Efficiency Score = Estimated Audience Reach / Production Budget (M USD)

  • Numeric proxies use raw view counts
  • Ordinal proxies use subscriber-weighted estimated reach
  • The score reflects audience reach per million dollars spent

The intent is comparative insight, not exact audience measurement.
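
A minimal sketch of the score, assuming reach is expressed in millions of viewers and budget in millions of USD (parameter and function names are illustrative):

```python
def efficiency_score(reach_m: float, budget_m: float) -> float:
    """Estimated audience reach (millions) per million dollars of
    production budget. Comparative, not an exact audience measure."""
    if budget_m <= 0:
        raise ValueError("budget must be positive")
    return reach_m / budget_m
```

For example, a $30M film with an estimated reach of 15M viewers scores 0.5, while a $150M film would need 75M viewers to match it.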


SQL Architecture

  • Base table: Cleaned and standardized film dataset
  • Derived fields:
    • proxy_type
    • rank_pct
    • estimated_reach
    • efficiency_score
    • budget_band (Low / Mid / High)
  • Views and checks:
    • Platform-level efficiency summaries
    • Budget-band comparisons
    • Top and bottom efficiency extremes
    • Distribution sanity checks (nulls, ranges, skew)

All transformations are deterministic and documented.
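
As one example of these deterministic transformations, the budget_band field could be derived as follows (shown in Python for illustration; the $20M/$50M cutoffs are assumed from the mid-budget range cited in the findings, and the project's actual SQL CASE expression may use different thresholds):

```python
def budget_band(budget_m: float) -> str:
    """Classify a film's production budget (millions USD) into the
    Low / Mid / High bands used in the budget-band comparisons."""
    if budget_m < 20:
        return "Low"
    if budget_m <= 50:
        return "Mid"
    return "High"
```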


Tableau Dashboard

The final Tableau dashboard presents:

  • Efficiency Frontier: Budget vs. Efficiency scatter
  • Budget Band Comparison: Efficiency by spend category
  • Platform Efficiency Distribution
  • Genre × Platform ROI Heatmap
  • Top & Bottom Performers Table

The dashboard is designed to support strategic decision-making, not exploratory noise.


Key Findings

  • The Low-to-Mid-Budget Sweet Spot:
    Films in the $20M–$50M range consistently achieve the highest efficiency scores across platforms.

  • The Acquisition Advantage:
    Acquired and indie-studio titles show greater upside variance and frequently outperform in-house originals on a per-dollar basis.

  • Platform Ceilings:
    Smaller platforms exhibit diminishing returns beyond mid-budget spend levels, suggesting audience saturation effects.

  • Strategic Genre Yield:
    Horror, Comedy, and Romance demonstrate concentrated efficiency on specific platforms, while Action and Sci-Fi show high variance despite large budgets.


Tools Used

  • SQL — data modeling, normalization, and efficiency computation
  • Tableau — interactive visualization and narrative analysis
  • GitHub — reproducible project structure and documentation
