Campaign craft · 9 min read · April 29, 2026 · Patrick Schenken · Updated May 3, 2026

AI Max for ecommerce: the configuration most agencies miss

AI Max went GA April 15, 2026 and changes how Search and Shopping interact with each other. The brands that flipped the switch without doing the structure work first lost blended ROAS in week one. Here's the configuration that actually compounds.

AI Max is the campaign type Google launched in April that most accounts shouldn't have turned on. Not because AI Max is wrong, but because turning it on without the structure work first makes the existing PMax cannibalise the existing Search, the new AI Max cannibalise both, and the dashboard tell you a story the underlying P&L disagrees with.

+8 to +15%

Incremental conversion volume at day 60, for activations configured correctly, across roughly 40 ecom AI Max launches Ad-Lab has shipped since April 15, 2026.

We've now run AI Max activations across about 40 ecom accounts since the April 15 GA. The configuration work that actually compounds is small and specific. Here it is.

Read this before turning it on

AI Max activated without brand exclusions, without populated custom labels, and without a defined success metric loses blended ROAS for two reasons: branded-search bleed inside the new campaign, and uncoordinated cannibalisation against existing PMax. Both are configuration problems, not platform problems. Both are fixable in the same afternoon.

Step 1: lock the brand exclusion

Before AI Max sees any spend, build a brand-exclusion list at the campaign level for AI Max and at the account level for any campaigns you don't want eating brand traffic. Every common misspelling of your brand. Every variant. Every product line name that overlaps with brand search.

If you skip this, AI Max will route brand search through itself because it's the campaign with the loosest match-type constraints. Your reported ROAS will look great. Your incremental revenue will be flat. The branded-search bleed pattern repeats inside one campaign instead of across many.

The seven exclusion patterns that catch the bulk of brand cannibalisation. Add all seven to the AI Max brand-exclusions list before launch.

| Pattern | Example | Where it leaks if missed |
| --- | --- | --- |
| Exact brand term | yourbrand | Direct brand search routes through AI Max instead of the branded campaign |
| Brand variants | yourbrand.com, your brand, yourbrandshop | Variant search captured under non-brand metrics |
| Brand misspellings | yourbradn, yourbrnad | Pulled from Search Console top queries |
| Brand plus product | yourbrand serum, yourbrand flagship-sku | High-intent brand search with a non-brand label |
| Brand plus location | yourbrand uae, yourbrand london | Geo-modified brand routed through prospecting |
| Branded competitor terms | competitor + alternative, competitor vs yourbrand | Should run in branded Search, not in AI Max |
| Brand plus review intent | yourbrand reviews, yourbrand worth it | High-intent late-funnel brand mistaken for top-of-funnel non-brand |
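As a sketch, the patterns above can be generated from a handful of seed inputs. Everything here is illustrative Python, not a Google Ads API call; the seed values are placeholders, and the output is a plain keyword list to paste into the AI Max brand-exclusions list.

```python
# Illustrative sketch: expand seed inputs into the seven exclusion
# patterns from the table. Seed values are placeholders; misspellings
# should come from your own Search Console top queries.

def build_brand_exclusions(brand, products, locations, competitors, misspellings):
    patterns = {brand}                                      # exact brand term
    patterns |= {f"{brand}.com", f"{brand} shop", f"{brand}shop"}  # variants
    patterns |= set(misspellings)                           # misspellings
    patterns |= {f"{brand} {p}" for p in products}          # brand + product
    patterns |= {f"{brand} {g}" for g in locations}         # brand + location
    patterns |= {f"{c} alternative" for c in competitors}   # branded competitor terms
    patterns |= {f"{c} vs {brand}" for c in competitors}
    patterns |= {f"{brand} reviews", f"{brand} worth it"}   # review intent
    return sorted(patterns)

exclusions = build_brand_exclusions(
    brand="yourbrand",
    products=["serum", "flagship-sku"],
    locations=["uae", "london"],
    competitors=["competitor"],
    misspellings=["yourbradn", "yourbrnad"],
)
```

The point of scripting it is repeatability: when a new product line or geo launches, rerun the function instead of hand-editing the list.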

Step 2: feed-level labelling

AI Max routes Shopping inventory based on signals it reads from the feed. The accounts winning with AI Max use custom_label_0 through custom_label_4 to mark which products belong in which AI Max-eligible bucket, typically by margin tier (high/medium/low), seasonality, hero-SKU flag, and new-arrival status.

The accounts losing with AI Max have the same five custom labels empty or stuffed with the same string for every product. AI Max can't see a difference between products and routes spend toward whichever inventory converts fastest, which is usually the cheapest SKU. Margin collapses in week three.

A working custom-labels convention for AI Max routing. Paste into the feed engineer's brief and adjust the values to match your catalog tiers.

```text
custom_label_0  =  brand isolation          (brand | non-brand)
custom_label_1  =  margin tier              (high | mid | low)
custom_label_2  =  seasonality              (always-on | seasonal | clearance)
custom_label_3  =  performance tier         (winner | mid | new)
custom_label_4  =  reserved                 (test groups, geo splits, supplier promos)

# AI Max asset-group selectors then filter on these labels:
#   - High-margin always-on group     :  margin=high   AND seasonality=always-on
#   - New arrivals (margin-protected) :  margin=high   AND performance=new
#   - Clearance (capped daily budget) :  seasonality=clearance
#   - Volume movers (low-margin       :  margin=low    AND performance=winner
#     winners on tight POAS targets)
```
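A minimal sketch of the convention applied to a feed row. The thresholds and field names (margin_pct, units_30d, days_live) are assumptions for illustration; adjust them to your own feed schema and catalog tiers.

```python
# Illustrative sketch: assign the five custom labels to one feed row.
# Thresholds (40% / 20% margin, 30 days live, 50 units) are assumed
# example tiers, not a standard -- tune to your catalog.

def label_product(row):
    labels = {}
    labels["custom_label_0"] = "brand" if row.get("is_brand_sku") else "non-brand"

    margin = row["margin_pct"]  # contribution margin, 0-100
    labels["custom_label_1"] = "high" if margin >= 40 else "mid" if margin >= 20 else "low"

    # always-on | seasonal | clearance
    labels["custom_label_2"] = row.get("season", "always-on")

    if row.get("days_live", 999) <= 30:
        labels["custom_label_3"] = "new"
    elif row.get("units_30d", 0) >= 50:
        labels["custom_label_3"] = "winner"
    else:
        labels["custom_label_3"] = "mid"

    labels["custom_label_4"] = row.get("test_group", "")  # reserved
    return labels

labels = label_product({"margin_pct": 45, "units_30d": 120, "days_live": 400})
```

Run it over the whole catalog in the feed pipeline so the labels refresh whenever margin or sales velocity changes.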

Step 3: audience signal seeding

AI Max accepts audience signals the same way PMax does. Customer match lists go in as positive signals. Recent purchasers go in as exclusions on prospecting-focused AI Max campaigns so you stop paying to acquire customers you already have. Custom segments built from competitor URLs and competitor search queries seed the algorithm's targeting model.

This step is the difference between AI Max learning in 30 days and AI Max never learning. Most accounts skip it and complain about CPA in month two.

  • Customer match: records no older than 90 days, at least 1,000 hashed records, refreshed monthly. Stale or under-threshold lists won't activate.
  • Custom segments: include your own brand site, the top three competitor sites in your category, and the top three review sites. Google expands these seeds with related search-term variants on its own.
  • Recent purchasers exclusion: load your last 30 days of purchasers as a negative signal on any prospecting-led AI Max group so the algorithm spends on net-new acquisition.
  • In-market segment: pick the closest match to your product category. Avoid parent categories; the granularity is what gives the algorithm a starting point.
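For the customer-match upload itself, a minimal sketch: Customer Match expects SHA-256 hashes of lowercased, whitespace-trimmed emails, and the function below also gates on the 1,000-record minimum mentioned above. The threshold constant and function name are this guide's conventions, not Google constants.

```python
# Illustrative sketch: normalise and hash emails for a Customer Match
# upload. SHA-256 over lowercased, trimmed emails is the format Customer
# Match expects; MIN_RECORDS mirrors the activation threshold above.
import hashlib

MIN_RECORDS = 1_000

def hash_emails(emails):
    hashed = {
        hashlib.sha256(e.strip().lower().encode("utf-8")).hexdigest()
        for e in emails
        if "@" in e  # drop malformed rows before hashing
    }
    if len(hashed) < MIN_RECORDS:
        raise ValueError(f"Only {len(hashed)} records; list may not activate.")
    return sorted(hashed)
```

The same normalise-then-hash step applies to the recent-purchasers exclusion list, since it's uploaded through the same mechanism.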

Step 4: define the success metric before launch

Pick the metric AI Max is being judged against and write it down before the campaign goes live. Three options work.

Three success metrics, in order of cleanliness

POAS lift on net new customers

The cleanest answer. Requires the contribution-margin signal in the conversion value plus the new-vs-repeat split. If you've already done the POAS work, this is a continuation, not new effort.

  • Cleanest read of incremental impact
  • Requires margin signal in conversion value
  • Requires new-vs-repeat tag in CRM data

Incremental revenue via geo-holdout

The most defensible answer. Pause AI Max in a matched geo for 4-6 weeks, run elsewhere, read the revenue delta. The number you can defend to a CFO. Slower to run.

  • Statistically defensible
  • Insulated from attribution-model bias
  • Takes 4-6 weeks to read

The third option, blended MER trend during the AI Max window, is the easiest to set up but the least defensible. Use it as a directional read for week-one sanity checks, then graduate to one of the two cleaner metrics for the real decision.
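The two cleaner metrics reduce to simple arithmetic once the data is tagged. A sketch, with illustrative field names: POAS here is contribution margin over ad spend, restricted to net-new customers, and the geo-holdout read normalises the test geo against the control using a pre-launch revenue ratio.

```python
# Illustrative sketch of the two cleaner success metrics.
# Field names (contribution_margin, is_new_customer) are assumptions
# standing in for the CRM tags described above.

def poas_net_new(orders, ad_spend):
    margin = sum(o["contribution_margin"] for o in orders if o["is_new_customer"])
    return margin / ad_spend

def geo_holdout_lift(test_revenue, control_revenue, baseline_ratio):
    # baseline_ratio: test/control revenue ratio measured pre-launch,
    # normalising for geo size differences.
    expected_test = control_revenue * baseline_ratio
    return (test_revenue - expected_test) / expected_test

orders = [
    {"contribution_margin": 30.0, "is_new_customer": True},
    {"contribution_margin": 45.0, "is_new_customer": False},
    {"contribution_margin": 25.0, "is_new_customer": True},
]
poas = poas_net_new(orders, ad_spend=50.0)            # 55 / 50 = 1.1
lift = geo_holdout_lift(118_000, 100_000, 1.0)        # 0.18
```

Writing the formula down before launch is the point: the inputs (margin signal, new-vs-repeat tag, matched geo) have to exist on day one, not be reconstructed in week six.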

If you don't define the metric, you'll judge AI Max against whatever number happens to look best at the end of the month, which is exactly how Google Ads optimisation goes wrong on a structural level.

What good looks like at day 60

Reference benchmarks at day 60 across the AI Max activations done right. Use these as sanity checks, not strict targets.

| Metric | Direction | Range |
| --- | --- | --- |
| Incremental conversions vs baseline | Up | +8% to +15% |
| POAS on AI Max-attributed conversions | Flat or slightly up | Within ±5% of pre-launch |
| Blended MER | Flat or up | Within ±3% of pre-launch |
| Branded search volume captured by AI Max | Near zero | Below 5% of AI Max conversions |
| Existing PMax performance | Steady or slightly down | Within -5% of pre-launch (acceptable shift) |

If your numbers don't look like that at day 60, the problem is almost always one of: missed brand exclusions, empty custom labels, no audience signals, or no success metric defined. The fix is going back to step one.
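The day-60 benchmarks can be turned into a mechanical sanity check. A sketch: the ranges mirror the table above, and the metric names are illustrative conventions, not platform fields.

```python
# Illustrative sketch: day-60 benchmark ranges as a sanity check.
# Ranges mirror the benchmark table; names are this guide's conventions.

BENCHMARKS = {
    "incremental_conversions_pct": (8.0, 15.0),         # vs baseline, up
    "poas_delta_pct":              (-5.0, 5.0),         # vs pre-launch
    "blended_mer_delta_pct":       (-3.0, float("inf")),
    "brand_share_of_conv_pct":     (0.0, 5.0),          # near zero
    "pmax_delta_pct":              (-5.0, float("inf")),
}

def day_60_check(metrics):
    """Return a list of out-of-range flags; empty means healthy."""
    flags = []
    for name, (lo, hi) in BENCHMARKS.items():
        value = metrics[name]
        if not (lo <= value <= hi):
            flags.append(f"{name}={value} outside [{lo}, {hi}]")
    return flags
```

An empty list means the activation is tracking; any flag maps back to one of the four configuration steps.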

The shorter version

AI Max compounds when the structure work is done first: brand exclusions populated, custom labels populated, audience signals seeded, success metric written down. Skip any one of the four and the campaign cannibalises rather than compounds. The configuration is small and the impact is large.

Apply this to your account

The fastest way to use this is on a free audit call against your real account.

Patrick walks the framework through your live data on the call. Same playbook, real numbers.