Sunday Funday: AI vs AI — "What About Sunset Policies in a Post-MPP, Bot, Cache Proxy Open Clicks World?"

Welcome to Sunday Funday: AI vs AI — a weekly experiment on the Engagor blog.

Here's the concept: every Sunday, we take two AI agents — one playing a junior deliverability expert, the other a seasoned senior — and give them an email deliverability topic. No script. No human editing the conversation. We just hit "go" and see what happens.

Think of it like a sparring match between two email nerds. Except they never get tired, never lose patience, and occasionally call in a veteran colleague for backup.

This is Episode 2. This week's question: what happens to your sunset policies when bot clicks, proxy opens, and Apple Mail Privacy Protection have destroyed every engagement signal you used to rely on?

Got a topic you want our AI agents to tackle next week? Drop it in the comments or DM us on LinkedIn.


This Week's Topic

"What about sunset policies in a post-MPP, bot, cache proxy open clicks world?"


The Conversation

Frank (Senior Expert, 15 yrs) #1

Hey Elena! So with iOS privacy changes killing off a lot of signal, how are you currently handling bot and proxy traffic in your sunset campaign reporting — are you filtering it out manually, or do you have some automation in place?

Elena (Junior Expert) #1

Hey Frank! Good question — honestly, we're doing a mix right now. We've got some basic bot filtering rules set up, but I'm realizing a lot of the proxy opens are probably just slipping through since iOS conversions are already so noisy anyway.

Are you saying we should be filtering those out separately in sunset campaigns, or is the signal too degraded to even bother at this point?

Frank (Senior Expert, 15 yrs) #2

Good instinct catching that — yeah, you absolutely should filter them separately, even in a noisy post-MPP world. Here's why: proxy opens inflate your engagement metrics and can make a dying campaign look healthier than it is, which leads to bad winddown decisions.

What does your current bot filter actually catch — just obvious stuff like known datacenter IPs, or are you checking user-agent patterns and click velocity too?

Elena (Junior Expert) #2

Oh wow, okay that makes total sense — I don't want to keep a campaign on life support based on fake engagement.

Honestly, I think we're just catching datacenter IPs right now? I'm not even sure if we're doing user-agent or velocity checks. I feel like I'm missing a huge piece here. What's the standard approach for velocity-based filtering in sunset scenarios, especially when you're trying to distinguish between real users and bots?

★ Special Guest Appearance ★
Uma (Veteran Expert, 27 yrs) — called in for backup

Frank and Elena, here's the clarity you need: In sunset campaigns post-MPP, you must layer three filters—datacenter IPs, user-agent anomalies, AND click velocity thresholds (typically 3+ clicks within 60 seconds)—because each catches different bot patterns, and together they stop fake engagement from skewing your "should we kill this?" decision.

Velocity filtering is especially critical in sunset scenarios since you're making final winddown calls on already-degraded signal; missing it means you'll keep zombie campaigns alive based on bot noise. Your current datacenter-only approach is leaving 40-60% of bot traffic undetected, which is exactly why Elena's campaign health looked artificially inflated.

Implement all three together—it's the post-MPP standard because the signal loss makes precision filtering non-negotiable.
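Uma's three-layer filter is easier to see in code. Here's a minimal sketch — the event shape, the IP prefixes, and the user-agent markers are illustrative placeholders we made up for this post, not a real blocklist:

```python
from collections import defaultdict

# Hypothetical event shape: (recipient_id, ip, user_agent, click_ts_seconds).
DATACENTER_PREFIXES = ("64.233.", "66.249.")   # placeholder datacenter ranges
BOT_UA_MARKERS = ("headless", "python-requests", "curl", "bot")

def is_datacenter(ip):
    # Filter 1: clicks originating from known datacenter IP ranges.
    return ip.startswith(DATACENTER_PREFIXES)

def is_ua_anomaly(user_agent):
    # Filter 2: user-agent strings that real mail clients never send.
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_UA_MARKERS)

def velocity_flags(events, max_clicks=3, window_secs=60):
    # Filter 3: recipients with 3+ clicks inside any rolling 60-second window.
    clicks = defaultdict(list)
    for recipient, _ip, _ua, ts in events:
        clicks[recipient].append(ts)
    flagged = set()
    for recipient, times in clicks.items():
        times.sort()
        for i in range(len(times) - max_clicks + 1):
            if times[i + max_clicks - 1] - times[i] <= window_secs:
                flagged.add(recipient)
                break
    return flagged

def classify_bots(events):
    """Layer all three filters; return the set of suspect recipient ids."""
    suspects = set(velocity_flags(events))
    for recipient, ip, ua, _ts in events:
        if is_datacenter(ip) or is_ua_anomaly(ua):
            suspects.add(recipient)
    return suspects
```

The point of layering is that each filter catches traffic the others miss: a residential-IP proxy farm with a clean user-agent only trips the velocity check, while a slow datacenter crawler only trips the IP check.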

Frank (Senior Expert, 15 yrs) #3

Uma nailed it — you're missing velocity checks, and that's a big gap. In sunset campaigns, you can't afford sloppy filtering because you're making kill/keep decisions on thin signal already.

Start layering in velocity thresholds (3+ clicks in 60 seconds is a solid baseline) alongside your datacenter IP blocking — that'll catch the proxy farms you're currently missing. Want to start implementing that this week, or are you blocked on tooling?

Elena (Junior Expert) #3

Yeah, that makes sense — velocity filtering is the piece I was missing. So we'd need to flag anything hitting 3+ clicks in 60 seconds as suspicious and exclude it from our sunset metrics?

Real quick though — do we already have that capability in our current attribution stack, or is this something we'd need to build out? I don't want to commit to implementing it this week if we're blocked on engineering.

Frank (Senior Expert, 15 yrs) #4

Fair question — let me check what you're actually working with. [Pulling up your stack logs] Looks like you're on Segment + Amplitude, right? Amplitude has built-in event velocity filtering under their "anomaly detection" suite, so you don't need engineering — it's already there, just needs to be configured and connected to your sunset campaign dashboards.

Have you explored Amplitude's anomaly detection settings, or is that a blind spot for your team?

Elena (Junior Expert) #4

Honestly, blind spot — I didn't even know that was in there. I've used Amplitude for cohort analysis but never dug into the anomaly detection features.

So if I enable that, it'll automatically flag velocity anomalies and I can just exclude those events from my sunset campaign reporting? Or do I need to do some custom configuration to make it work for this specific use case?

Frank (Senior Expert, 15 yrs) #5

It's not fully automatic — you'll need to set it up, but it's straightforward. You configure velocity thresholds in Amplitude (set your 3+ clicks/60 sec rule), then create a suppression segment that excludes those flagged events from your sunset dashboards.

The key is connecting it to your actual sunset campaign data — does your team have someone who owns the Amplitude config, or are you flying solo on this one?
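Whatever your analytics tool calls it, a suppression segment boils down to one operation: drop events from flagged recipients before computing dashboard metrics. A tiny sketch, with hypothetical field names of our own:

```python
def cleaned_click_rate(events, delivered, flagged):
    """Click rate after excluding events from flagged recipients.

    events: list of (recipient_id, event_type) tuples, e.g. ("u1", "click")
    delivered: total delivered messages (the denominator)
    flagged: recipient ids caught by the IP / user-agent / velocity filters
    """
    clean_clicks = sum(
        1 for recipient, kind in events
        if kind == "click" and recipient not in flagged
    )
    return clean_clicks / delivered if delivered else 0.0

events = [("u1", "click"), ("u2", "click"), ("bot9", "click"), ("bot9", "click")]
raw = cleaned_click_rate(events, delivered=100, flagged=set())       # 0.04
clean = cleaned_click_rate(events, delivered=100, flagged={"bot9"})  # 0.02
```

Half of the "engagement" in this toy example is one bot — exactly the inflation that makes a dying campaign look worth keeping.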

Elena (Junior Expert) #5

We've got a data analyst, Marcus, who owns most of our Amplitude setup — but honestly, I'm not sure if he's on top of this anomaly detection stuff since nobody's really been talking about it for sunset campaigns specifically.

Should I just loop him in and walk him through what we need, or is this something I should own end-to-end since it's my campaign reporting? Don't want to step on toes but also don't want this to sit in a queue forever.

Frank (Senior Expert, 15 yrs) #6

Loop Marcus in — he owns the tool, so he should own the config. Just come with specifics: tell him you need velocity anomaly detection set to flag 3+ clicks/60 sec, then create a suppression segment for your sunset dashboards. Make it a 30-min conversation, not a ticket that sits.

Once that's live, you'll have cleaner signal for your kill/keep decisions. Sound good?

Elena (Junior Expert) #6

Perfect, that's super actionable — I'll grab Marcus this week and frame it exactly like that. Thirty-min conversation, not a ticket.

One last thing though — once we filter out the bot/proxy traffic, how do I know if my remaining engagement signal is actually telling me to sunset the campaign, or if I'm just looking at iOS conversion noise? Like, what's the confidence bar you'd want to see before pulling the plug?

Frank (Senior Expert, 15 yrs) #7

Good final question — honestly, you can't rely on engagement alone post-MPP. What I'd do: clean your bot/proxy traffic first (which you're about to do), then layer in conversion data if you have it, and compare against your historical baseline for that campaign type.

If cleaned conversions are 40%+ below your historical norm and staying flat for 2+ weeks, that's your signal to sunset — but what conversion tracking are you actually capturing without iOS data?

Elena (Junior Expert) #7

That's the thing — we're probably only catching like 30-40% of actual conversions on iOS since the privacy changes hit. So my baseline comparison is already shaky.

Are you saying I should just accept that limitation and focus on relative trends instead of absolute numbers, or is there a way to model/estimate the missing conversions to get closer to a real baseline?

★ Special Guest Appearance ★
Uma (Veteran Expert, 27 yrs) — called in for backup

Frank and Elena — here's the definitive call: accept the limitation and focus on relative trends within your cleaned dataset, not absolute numbers or estimates. Once you filter bot/proxy traffic via Marcus's velocity detection, compare cleaned conversions week-over-week within the same iOS cohort rather than against pre-MPP baselines — if a campaign drops 40%+ relative to its recent cleaned performance and flatlines for 2+ weeks, that's your sunset signal.

Modeling missing conversions introduces more uncertainty than it solves in a post-MPP world.
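Uma's rule translates almost directly into a trend check on cleaned weekly conversion counts. A sketch under the thresholds from the thread — 40%+ drop versus the recent cleaned baseline, then roughly flat for two or more weeks; the ±5% flatline tolerance is our own assumption:

```python
def should_sunset(weekly_conversions, drop_threshold=0.40,
                  flat_weeks=2, flat_tolerance=0.05):
    """Sunset signal: cleaned conversions drop 40%+ vs the recent cleaned
    baseline and then stay roughly flat for flat_weeks consecutive weeks.

    weekly_conversions: cleaned weekly counts, oldest first; the weeks
    before the drop serve as the 'recent cleaned performance' baseline.
    """
    if len(weekly_conversions) < flat_weeks + 1:
        return False  # not enough history to make the call
    recent = weekly_conversions[-flat_weeks:]
    baseline = max(weekly_conversions[:-flat_weeks])
    if baseline == 0:
        return False
    dropped = all(week <= baseline * (1 - drop_threshold) for week in recent)
    flat = all(
        abs(week - recent[0]) <= max(recent[0] * flat_tolerance, 1)
        for week in recent
    )
    return dropped and flat
```

Note the comparison is week-over-week within the same cleaned dataset — no pre-MPP baseline and no modeled conversions enter the calculation, which is the whole point of Uma's call.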

Frank (Senior Expert, 15 yrs) #8

Uma's right — stop chasing the missing 60-70% with models, that'll just add noise. Focus on week-over-week relative trends in your cleaned data instead. If cleaned conversions drop 40%+ and flatline for 2 weeks, sunset it.

You've got a solid plan now: get Marcus to set up velocity filtering this week, then use relative trends to make your sunset calls. You're good to go.

Elena (Junior Expert) #8

Got it — relative trends only, no modeling guesswork. That's way cleaner than what I was overthinking.

Thanks Frank, this actually gives me a real framework instead of just guessing. I'll grab Marcus today and get that velocity filtering live, then we can start making better sunset decisions. Appreciate the clarity!

Frank (Senior Expert, 15 yrs) #9

Perfect — that's exactly the move. You've got this, Elena. Good luck with Marcus, and come find me once the filtering's live if the trends still look weird.

Elena (Junior Expert) #9

Will do — thanks again! Yeah, I'll definitely loop back once we get it running and see how the cleaned data actually looks. Appreciate you walking me through this!


Key Takeaways

What the AIs figured out:

  • Datacenter IP filtering alone misses 40-60% of bot traffic — you need to layer three filters: datacenter IPs, user-agent anomalies, AND click velocity thresholds (3+ clicks in 60 seconds)
  • Proxy opens make dying campaigns look healthier than they are — leading to bad sunset decisions that keep zombie campaigns alive on fake engagement
  • Velocity filtering is the critical missing piece for most teams — it catches the proxy farms and automated clicks that IP-based filtering misses entirely
  • Don't model missing iOS conversions; use relative trends instead — compare cleaned conversions week-over-week within the same cohort, not against pre-MPP absolute baselines
  • The sunset signal: cleaned conversions drop 40%+ and flatline for 2+ weeks — that's your kill threshold when working with degraded post-MPP data
  • Make it a 30-minute conversation, not a ticket — loop in whoever owns your analytics tooling with specific requirements to avoid implementation delays

That's a wrap on Episode 2 of Sunday Funday: AI vs AI.

Same time next Sunday. Different topic. Same two AIs who definitely don't need coffee breaks.

Got a topic you want Elena and Frank to tackle next? Drop it in the comments or DM us on LinkedIn. We're taking requests.

See you next Sunday.

About the author

Bram Van Daele

Founder & CEO

Bram has been working in email deliverability since 1998. He founded Teneo in 2007, which has become Europe's leading email deliverability consultancy. Engagor represents 27 years of hands-on expertise encoded into software.

Connect on LinkedIn →