PODCASTS

The Download Mirage: What a Podcast 'Download' Actually Counts in 2026

A million-download podcast sounds enormous — until you ask which million, on what platform, by which standard. We unpick what the industry's biggest numbers actually count, and what they quietly leave out.

Open any podcast network's annual report and you'll find a sentence that goes something like: "The Rest Is History reached one million downloads per episode this year." It is the kind of figure that hangs in the air. It looks like a unit of measurement — a sound recording played a million times — and we read it the way we used to read newspaper circulation. We probably shouldn't.

A podcast download in 2026 is a strange, fragile metric. It is not a recording of a listening event; it is a server's record of a file request, filtered through a set of rules the industry agreed upon in 2017, updated in 2021, and quietly disagrees about now. Each major platform counts something slightly different and reports something slightly more flattering. The figure that ends up on a press release tells you remarkably little about how many people heard the show.

This piece is an attempt to untangle that. What does a "download" actually mean in 2026, who gets to decide, and which numbers should engaged listeners — and the producers reading our economics coverage — actually pay attention to?

What the IAB rulebook says (and what it doesn't)

The closest the industry has to a shared definition is the IAB Tech Lab's Podcast Measurement Technical Guidelines, currently sitting at version 2.2 — issued in late 2024 and adopted across most major hosting platforms through 2025. It defines a "download" with reasonable precision:

  1. A single user, on a single device, in a single 24-hour window. Repeat requests inside the day collapse into one. This kills the boost from podcast apps that retry on flaky Wi-Fi.
  2. At least one full minute of media transferred. Not played — transferred. The user need never have listened. The minute simply has to leave the server.
  3. Bot traffic excluded. Crawlers, link-preview fetchers, security scanners and the long tail of headless agents are filtered against a maintained list.
  4. Range requests stitched together. Modern apps fetch audio in chunks; the standard requires those chunks to be combined into one count, not several.
  5. IP and User-Agent fingerprinting, not cookies. Podcast feeds are anonymous by design; counts rely on imperfect signals.

That list is the floor. Anything beneath it isn't a "certified" download under IAB rules. It is — crucially — not a definition of listening. A user can download an episode, never tap play, and still appear in the number. A user can stream the same episode three times and count only once. The conflation of downloads with audience is the first illusion to clear.
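The five rules above amount to a log-filtering pass, and they can be sketched in a few lines of code. Everything here is illustrative: the field names, the bot list, and the bytes-per-minute constant (which assumes 128 kbps audio) are our own assumptions, and for simplicity we collapse requests by calendar day where the guidelines use a rolling 24-hour window.

```python
from collections import defaultdict

# Illustrative stand-ins — the real filter list and threshold are maintained by the IAB.
KNOWN_BOT_AGENTS = {"AhrefsBot", "facebookexternalhit", "HeadlessChrome"}
MIN_BYTES_FOR_ONE_MINUTE = 960_000  # ~1 minute of 128 kbps audio

def count_certified_downloads(requests):
    """Collapse raw server-log entries into IAB-style download counts.

    `requests` is an iterable of dicts with keys:
      ip, user_agent, episode, timestamp (datetime), bytes_sent
    """
    # Rule 3: drop known bot traffic.
    human = [r for r in requests
             if not any(bot in r["user_agent"] for bot in KNOWN_BOT_AGENTS)]

    # Rules 1, 4, 5: group by fingerprint (IP + User-Agent), episode and day,
    # so repeat requests inside the window collapse into one, and sum the
    # bytes so range requests are stitched back together.
    grouped = defaultdict(int)
    for r in human:
        key = (r["ip"], r["user_agent"], r["episode"], r["timestamp"].date())
        grouped[key] += r["bytes_sent"]

    # Rule 2: count only groups where at least one minute of media left the server.
    return sum(1 for total in grouped.values() if total >= MIN_BYTES_FOR_ONE_MINUTE)
```

Note what never appears in that function: a play event. The count is built entirely from what the server sent, which is the point the rest of this piece turns on.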

How the three big platforms quietly diverge

Once you understand the IAB floor, the real divergence is what each platform layers on top — and which slice of it they share publicly. Here is how the major sources of podcast numbers actually count, based on each platform's published methodology as of Q1 2026:

| Platform | Headline number reported | Counting threshold | What it measures | Audited against IAB v2.x? |
|---|---|---|---|---|
| Apple Podcasts Connect | Plays + Followers | 30 seconds of audio played | Engaged play, on-device | Partial — methodology aligned, not certified |
| Spotify for Podcasters | Starts + Streams | A tap (Starts) and ≥60s played (Streams) | Tap-throughs and on-platform listens | No — proprietary |
| Acast / Megaphone / Buzzsprout (hosts) | IAB-certified downloads | ≥60s of media delivered | File requests, IAB-filtered | Yes — annual recertification |
| YouTube (video podcasts) | Views + Watch time | 30s of video played | Views, retention curves | No — YouTube methodology |
| Goalhanger / paid networks | "Engaged listeners" (per show) | Show-defined | Composite (downloads + listens) | Internal |

The same episode, in the same week, can produce four wildly different totals. A flagship Goalhanger title might report 1.2 million IAB downloads via Acast, 480,000 Spotify streams, 310,000 Apple plays and 92,000 YouTube views — all real, all measuring different things, none of them strictly comparable. When a press release picks one number, it has in effect chosen its preferred reality.

The sixty-second loophole and why long episodes look healthier than they are

The most consequential quirk of the IAB definition is the one-minute threshold. It made sense in 2017 when typical episodes ran twenty-five to thirty-five minutes; a minute felt like a meaningful commitment. In a podcast economy now dominated by ninety-minute conversations, the bar has barely moved while the content around it has tripled in length.

Run the arithmetic on a 90-minute interview podcast and a 6-minute news brief side by side. To "count" as a download under IAB rules, the long episode needs the listener to get 1.1% of the way through it. The short one needs 16.7%. The result is a quiet structural advantage for long-form: every drift-into-the-pre-roll, every accidental open in CarPlay, every tap-and-bail produces a counted download. We have written before, in The 90-Minute Problem, about why episodes keep ballooning. This is part of the answer the economics rarely admits to: long shows count more cheaply.
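The arithmetic above is a one-liner: the share of an episode that must leave the server before it counts is simply one minute over the runtime.

```python
def fraction_required(episode_minutes: float) -> float:
    """Share of an episode that must be transferred to count as an IAB download."""
    return 1 / episode_minutes  # one minute out of the total runtime

print(f"{fraction_required(90):.1%}")  # 90-minute interview
print(f"{fraction_required(6):.1%}")   # 6-minute news brief
```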

The honest counter-metric, watched obsessively inside the bigger networks, is completion rate — the share of listeners who get past 80% of an episode. Few networks publish it. Goalhanger and Wondery share aggregate ranges with sponsors; the rest treat it as commercial-in-confidence. If a show won't tell you its completion rate, the download figure is doing a lot of work.

What sponsors actually buy — and what they are starting to ask for

The interesting recent shift is on the buy side. Through 2024 and 2025, the largest podcast ad-buying agencies — Oxford Road, Veritone One, A Million Ads — began moving away from CPM-on-IAB-downloads and toward what they are calling listened-impressions: ads served and heard past the 75% mark of the spot itself. Acast's mid-2025 pricing card now offers both, and the gap is telling. A typical pre-roll on a tier-one UK chat show carries a CPM of:

  • £22–28 on IAB downloads
  • £36–48 on listened-impressions at 75% completion
  • £60+ on listened-impressions with brand-suitability filtering layered on top

Sponsors are paying nearly double for the metric closest to "a human heard this." That tells you what they actually trust. It also explains why dynamic ad insertion — the subject of our Mid-Roll Problem piece — has accelerated. If you are being paid for ears, you want the ad inserted as close to those ears as possible, with the ability to retarget if the first attempt fell silent.
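To see what that pricing gap means in revenue terms, here is a back-of-envelope comparison. The CPMs are the midpoints of the ranges above; the show size and the 55% heard-rate are invented assumptions for illustration, not published figures.

```python
def cpm_revenue(impressions: int, cpm_gbp: float) -> float:
    """Revenue in pounds: CPM is the price per thousand impressions."""
    return impressions / 1000 * cpm_gbp

downloads = 1_000_000            # IAB-certified downloads (hypothetical show)
heard_rate = 0.55                # assumed share of ads played past the 75% mark
listened = int(downloads * heard_rate)

print(cpm_revenue(downloads, 25))  # £25 CPM on IAB downloads
print(cpm_revenue(listened, 42))   # £42 CPM on listened-impressions
```

Under these invented numbers the two models pay out within about ten per cent of each other: the near-doubled CPM roughly compensates for the ears that never materialise, which is why the buy side can afford to insist on the stricter metric.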

Which numbers should an engaged listener actually trust?

For listeners — and for anyone reading the trade coverage and trying to work out which shows are genuinely big versus which have the best PR — here is a working hierarchy, in order of how meaningful we think each number is:

  1. First-72-hour completion rate, if published. Almost nobody publishes it. If a show does, take it seriously; it is the only figure on this list that measures attention rather than delivery.
  2. Average listens per episode at 30 days, ideally from a host platform's own dashboard. This is the closest thing to "regular audience size" most shows will share.
  3. Streaming + IAB downloads combined, when both are disclosed. A composite is more honest than either alone, because the two counting methods miss opposite halves of the audience.
  4. YouTube watch-time hours, for video-first shows. Views are noisy; watch-time is harder to inflate and tracks meaningfully with sponsor outcomes.
  5. IAB downloads alone. The default headline number. Better than nothing, but the weakest of the five — it is a measure of distribution, not consumption.
  6. Spotify Wrapped placements or Apple chart positions. Vibes, not measurement. Useful as cultural signal, not as audience size.

Press releases will keep leading with download counts because the number is large, simple and instinctively legible. That's fine; we will keep reading them. But the next time you see "five million downloads per episode," ask three small questions: across which platforms, over what window, and how many of those listeners made it past the first ad break? The honest answer to all three is usually narrower, slower and stranger than the headline.

The download was a useful fiction for a decade — a shared unit that let an emerging medium be priced and bought. It is worth understanding what we've inherited from that compromise, what it leaves out, and where the more truthful numbers are quietly starting to take its place.