apple-mail · mpp · open-rates · deliverability · metrics

Apple Mail Privacy Protection broke open rates. Here is what still tells the truth.

Since iOS 15, Apple pre-fetches every tracking pixel through a proxy. Your open rate is now half Apple bots, half humans, and you cannot tell which is which. Here is how to read what is left and the metrics that still mean something.

April 9, 2026·9 min read·Draftship

Apple shipped Mail Privacy Protection (MPP) with iOS 15 in September 2021. Mail prompts each user once, and the feature turns on for anyone who taps "Protect Mail Activity" the first time they open the app. Roughly 95% of Apple Mail users do.

When MPP is on, Apple does two things:

1. Routes Mail through a proxy that pre-fetches every remote resource (including tracking pixels) the moment a message arrives, before the user opens it.
2. Strips the user's IP from any image request and replaces it with a generic Apple proxy IP.

The first kills open rate as a meaningful metric: Apple opens every email, regardless of whether the user does. The second kills any geo-targeting or device attribution from open events.

Nearly five years in, most marketers' dashboards still report opens as if they meant something. They do not. Here is what does, and how to rebuild your reporting on the metrics that survived.

How big is the impact, really?

If your audience is consumer-skewing, plan for 45% to 65% of your opens being MPP-driven, not human. The exact number depends on how Apple-heavy your list is. If you sell to creatives, designers, or Mac-first audiences, you are at the upper end. If you sell to Windows-first IT professionals, expect 25% to 40%.
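You can back into your own expected number with rough arithmetic. A sketch, with purely illustrative parameter names and a simplified model (one proxy fetch per MPP-enabled recipient, plus human opens across the whole list):

```python
# Back-of-envelope estimate of the MPP-driven share of recorded opens.
# An assumption-laden sketch, not a measurement; tune the inputs to your list.

def mpp_open_share(apple_share: float, human_open_rate: float,
                   mpp_adoption: float = 0.95) -> float:
    """Fraction of recorded opens that are Apple proxy pre-fetches.

    apple_share: fraction of the list reading in Apple Mail.
    human_open_rate: the true human open rate per delivered message.
    mpp_adoption: fraction of Apple Mail users with MPP enabled (~95%).
    """
    mpp_opens = apple_share * mpp_adoption  # one bot open per MPP user
    human_opens = human_open_rate           # human opens per delivered message
    if mpp_opens + human_opens == 0:
        return 0.0
    return mpp_opens / (mpp_opens + human_opens)

# A consumer list that is 55% Apple Mail with a true 25% human open rate:
share = mpp_open_share(apple_share=0.55, human_open_rate=0.25)
print(f"{share:.0%}")  # roughly 68% of recorded opens are the proxy
```

Even at a conservative 40% Apple share, the proxy ends up the majority of your open column once the human rate dips below it.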

Numbers that have been published:

  • Litmus, post-MPP analysis: roughly 50% of opens are now Apple proxies for typical lists.
  • Mailmodo, 2024 audit: 47% to 62% across consumer SaaS clients.
  • Anyone who times the tracking pixel against the delivery timestamp sees it fire within seconds of arrival, before any human could plausibly have opened the message.

The shape of your "open rate" curve also changed. Pre-MPP, opens trickled in over hours and days as people checked their inbox. Post-MPP, you see a vertical spike of opens within 30 minutes of send (Apple's pre-fetch), then a long, flat tail of actual human opens that is hard to separate from the bot traffic.
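That curve shape suggests a crude timing split. A heuristic sketch, not an ESP feature, with an illustrative 30-minute threshold:

```python
# Partition open timestamps into presumed proxy pre-fetches (inside an
# early window after delivery) and likely-human opens (the long tail).
# The window is an assumption to tune, not a published Apple behavior.
from datetime import datetime, timedelta

PREFETCH_WINDOW = timedelta(minutes=30)  # the "vertical spike" window

def split_opens(delivered_at, open_times):
    """Return (presumed_bot, likely_human) lists of timestamps."""
    bot, human = [], []
    for t in open_times:
        (bot if t - delivered_at <= PREFETCH_WINDOW else human).append(t)
    return bot, human

delivered = datetime(2026, 4, 9, 10, 0)
opens = [delivered + timedelta(minutes=m) for m in (1, 2, 3, 95, 240)]
bot, human = split_opens(delivered, opens)
print(len(bot), len(human))  # 3 2
```

The split is lossy in both directions (eager humans get counted as bots, and delayed proxy fetches leak into the tail), which is why it is only good for trend lines.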

What metrics still tell you the truth

These survived MPP intact:

  • Click-through rate. Clicks happen on the recipient's device when a human taps. They are not pre-fetched.
  • Click-to-delivered ratio. Useful as your engagement metric. Better than click-to-open because the open denominator is mostly garbage.
  • Conversion and revenue per email sent. End-to-end attribution. The most honest metric.
  • Reply rate. Bots do not reply. Reply rate is undermeasured because few ESPs report it well, but it is a clean signal of human engagement.
  • Unsubscribe and spam-complaint rate. These are recipient actions on the device, not bot artifacts.
  • List growth and churn. Net subscriber count tells you whether the audience is growing or shrinking.
  • Forward rate. Underutilized, hard to track without a "Forward to a friend" link, but a real signal of value.
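All of the surviving rates above share one honest denominator: delivered messages. A sketch of the computation, with illustrative field names (swap in whatever your ESP export actually calls them):

```python
# Compute the MPP-proof metrics from raw send counts.
# Field names are assumptions for illustration, not a real ESP schema.
from dataclasses import dataclass

@dataclass
class SendStats:
    delivered: int
    clicks: int          # unique human clickers
    conversions: int
    revenue: float
    replies: int
    unsubscribes: int
    spam_complaints: int

def report(s: SendStats) -> dict:
    d = s.delivered or 1  # guard against division by zero on an empty send
    return {
        "click_to_delivered": s.clicks / d,
        "conversion_rate": s.conversions / d,
        "revenue_per_email": s.revenue / d,
        "reply_rate": s.replies / d,
        "unsub_rate": s.unsubscribes / d,
        "complaint_rate": s.spam_complaints / d,
    }

stats = SendStats(delivered=10_000, clicks=320, conversions=41,
                  revenue=2050.0, replies=12, unsubscribes=18,
                  spam_complaints=2)
print(report(stats)["click_to_delivered"])  # 0.032
```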

These got worse but are still useful with caveats:

  • Open rate as a coarse health check. If your open rate suddenly drops by 50%, something is broken (usually deliverability), regardless of MPP. As an A/B test signal between two near-identical sends to the same audience, it is noisy but directional.
  • Send-time optimization based on opens. Use this with skepticism. Apple opens within minutes of receipt. The "best send time" your ESP suggests using opens is the time Apple's proxy was active, not the time your audience checks email. Use clicks for send-time optimization, or stop optimizing send time entirely (it is overrated).

These are dead:

  • Geo-targeting from open IP. Apple's proxy IP is a generic anycast endpoint. The "this user is in Berlin" inference from opens no longer works.
  • Device detection from opens. Apple's proxy User-Agent is generic. iPhone/iPad/Mac client share inferred from opens is wrong.
  • "Opened but did not click" segments. This segment is now mostly Apple bots that opened with no human involved.

How to rebuild a reporting dashboard

Stop putting open rate at the top. Put click-through rate, conversion rate, and revenue per email at the top. Move open rate to a "system health" widget that lights up red when it deviates more than 30% from baseline.
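The health-widget logic is one comparison. A minimal sketch, assuming you maintain a rolling baseline open rate somewhere (the threshold and names are illustrative):

```python
# "System health" check: flag open rate only when it deviates more than
# 30% from baseline. The baseline source and threshold are assumptions.
def open_rate_health(current: float, baseline: float,
                     threshold: float = 0.30) -> str:
    """Return 'red' when |current - baseline| exceeds threshold * baseline."""
    if baseline <= 0:
        return "red"  # no baseline yet: investigate manually
    deviation = abs(current - baseline) / baseline
    return "red" if deviation > threshold else "ok"

print(open_rate_health(0.28, 0.55))  # red (halved: likely deliverability)
print(open_rate_health(0.50, 0.55))  # ok
```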

If your CRM or BI tool still bases "engaged subscriber" on opens, fix the definition. A reasonable post-MPP "engaged" definition for a weekly newsletter:

  • Clicked any email in the last 60 days, OR
  • Opened any email AND took an action on the site (logged in, made a purchase, scrolled a doc) in the last 60 days, OR
  • Replied to any email in the last 90 days.

The pure-open path is gone. Do not use opens as the only proof of engagement.
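The definition above translates directly into a predicate. The day windows mirror the rules in the list; the field names are illustrative assumptions:

```python
# Post-MPP "engaged subscriber" check, following the three OR-ed rules:
# clicked in 60 days, or opened + site action in 60 days, or replied in 90.
from datetime import datetime, timedelta
from typing import Optional

def is_engaged(now: datetime,
               last_click: Optional[datetime],
               last_open: Optional[datetime],
               last_site_action: Optional[datetime],
               last_reply: Optional[datetime]) -> bool:
    def within(ts: Optional[datetime], days: int) -> bool:
        return ts is not None and now - ts <= timedelta(days=days)
    return (
        within(last_click, 60)
        or (within(last_open, 60) and within(last_site_action, 60))
        or within(last_reply, 90)
    )

now = datetime(2026, 4, 9)
# Opened recently but no click, no site action, no reply: not engaged.
print(is_engaged(now, None, now - timedelta(days=5), None, None))  # False
```

Note that a recent open alone never satisfies the predicate, which is the whole point: the open has to be corroborated by something a bot cannot do.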

For send-time optimization, the question to answer is "when does this audience click", not "when does this audience open". Most ESPs still surface "best send time" using opens. Discount it. Run your own analysis on click timestamps if you care, or accept that a Tuesday at 10am send time is fine for most audiences.
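If you do run your own analysis, the click-timestamp version is short. A sketch that buckets clicks by hour of day; a real analysis should also control for when you actually sent:

```python
# Send-time signal from clicks, not opens: busiest click hour of day.
# Illustrative only; weights nothing for send-hour bias or timezone.
from collections import Counter
from datetime import datetime

def best_click_hour(click_times: list) -> int:
    """Return the hour of day (0-23) with the most clicks."""
    hours = Counter(t.hour for t in click_times)
    return hours.most_common(1)[0][0]

clicks = [datetime(2026, 4, d, h) for d, h in
          [(1, 10), (1, 10), (2, 10), (2, 14), (3, 19)]]
print(best_click_hour(clicks))  # 10
```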

Behavioral signals on the site

If you have a recipient ID that follows the user from email to site, you can sidestep email-side metrics entirely. The pattern:

1. Embed the recipient ID in the click URL (?rid=abc123).
2. On the site, set a session cookie with the ID.
3. Track behavior on site (pages, time, scroll depth, conversion).
4. Attribute back to the email send.

This works because the site is the recipient's device, not Apple's proxy. The fidelity is high. The only thing you lose is anyone who reads but does not click, which is a smaller cohort than the marketing world likes to admit.
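The URL-tagging ends of that pattern can be sketched with the standard library. The `rid` parameter name comes from the text; everything else here is illustrative:

```python
# Step 1: embed the recipient ID in the click URL.
# Steps 2-4 start at read_rid(): extract the ID on the site, then your
# app sets the session cookie and attributes on-site behavior to the send.
from urllib.parse import urlencode, urlparse, parse_qs
from typing import Optional

def tag_link(url: str, recipient_id: str) -> str:
    """Append ?rid=... (or &rid=... if a query string already exists)."""
    sep = "&" if urlparse(url).query else "?"
    return f"{url}{sep}{urlencode({'rid': recipient_id})}"

def read_rid(url: str) -> Optional[str]:
    """Extract the recipient ID from an incoming landing-page URL."""
    return parse_qs(urlparse(url).query).get("rid", [None])[0]

link = tag_link("https://example.com/pricing", "abc123")
print(link)            # https://example.com/pricing?rid=abc123
print(read_rid(link))  # abc123
```

In production you would sign or hash the ID rather than expose it raw, since anything in a URL gets forwarded, logged, and shared.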

ESPs that claim "MPP-aware" opens

Several ESPs (Mailchimp, Klaviyo, ActiveCampaign) added "filtered open" reporting that tries to subtract Apple proxy opens from the total. The methods vary:

  • Some look at the User-Agent of the open request.
  • Some look at the IP range (Apple publishes the proxy ranges).
  • Some use timing heuristics (an open in the first 60 seconds after delivery is presumed bot).

The accuracy varies. The IP-range approach is the most reliable; Apple's ranges are public and stable. The User-Agent approach is unreliable; Apple's proxy uses a generic UA that not all ESPs identify correctly.

Even with filtering, expect 5% to 15% over- or under-counting. Use filtered opens for trend analysis, not absolute numbers.
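The IP-range approach reduces to a CIDR membership check. A sketch using documentation-only placeholder ranges, not Apple's actual published list (load the real ranges yourself):

```python
# Classify an open as an Apple proxy fetch by source IP range.
# The CIDRs below are RFC 5737 placeholder ranges for illustration only.
import ipaddress

APPLE_PROXY_RANGES = [ipaddress.ip_network(c) for c in
                      ("203.0.113.0/24", "198.51.100.0/24")]

def is_proxy_open(source_ip: str) -> bool:
    """True when the open's source IP falls inside a listed proxy range."""
    ip = ipaddress.ip_address(source_ip)
    return any(ip in net for net in APPLE_PROXY_RANGES)

print(is_proxy_open("203.0.113.42"))  # True  -> subtracted from opens
print(is_proxy_open("192.0.2.7"))     # False -> counted as a human open
```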

The argument against measuring opens at all

Some teams have just stopped reporting opens. The argument: a metric that is half noise is worse than no metric, because people make decisions on it.

The counter: opens are still a free, low-effort dashboard signal that catches catastrophic failures (a broken sending IP, a misconfigured DKIM, a list import gone wrong). Removing them entirely loses that.

The middle path most well-run teams settle on: report opens as a system-health metric with clear caveats next to it, and never use it as a primary decision input.

TL;DR

  • Apple MPP pre-fetches every tracking pixel for ~95% of Apple Mail users. Your "opens" are mostly bots.
  • Plan for 45% to 65% of opens being Apple proxies on consumer-skewing lists.
  • Click-through rate, conversion rate, reply rate, and revenue per send are unaffected.
  • Send-time optimization based on opens is wrong. Use clicks if you want signal.
  • Rebuild "engaged subscriber" definitions to include site behavior, not just opens.
  • Geo and device inference from opens is dead.
  • Keep opens as a health-check metric. Do not make decisions on them.

Try it yourself

Open the editor and ship an email that doesn't break.