How FanDuel Used Geo Testing to Prove Incremental Value
FanDuel ran geo experiments to measure the real incremental value of their TV and digital ad spend. Here's what they found and how you can apply it.
When billions are at stake, you measure incrementality
FanDuel spends hundreds of millions annually on advertising across TV, digital, out-of-home, and sponsorships. In the hyper-competitive sports betting market, every dollar must drive actual new depositors -- not just brand impressions that feel productive.
FanDuel's marketing team faced a challenge familiar to every CMO: platforms claimed fantastic returns, but the overlap between channels made it impossible to know which dollars actually mattered. Were the TV ads driving new signups, or just reinforcing awareness among people who would have downloaded the app anyway?
To answer that question, FanDuel turned to geo testing -- one of the most rigorous and public examples of incrementality measurement in recent marketing history.
The challenge: separating signal from noise
Sports betting marketing is uniquely complex:
Massive multi-channel investment. FanDuel advertises on national TV, local TV, digital (Meta, Google, YouTube), podcasts, radio, out-of-home, and sponsorships. Every channel claims credit for every new user.
Seasonal concentration. NFL season drives the majority of new signups. Marketing effectiveness changes dramatically between September and March.
Promotional overlap. Sign-up bonuses, risk-free bets, and referral programs run simultaneously. Separating the impact of advertising from promotions is difficult.
Competitive pressure. DraftKings, BetMGM, and dozens of competitors run similar campaigns in the same markets. Share of voice shifts constantly.
Platform-reported attribution was essentially useless in this environment. When every channel claims credit and promotions muddy the signal, only controlled experimentation can reveal the truth.
FanDuel's geo testing methodology
FanDuel's approach followed the classic geo experiment framework, adapted for their specific market.
Market selection and matching
FanDuel divided legal sports betting markets into matched pairs based on:
- Historical sign-up rates per capita
- Market maturity (how long sports betting had been legal)
- Competitive intensity (number and spend level of competitors)
- Demographics (age, income, sports engagement)
- Seasonal patterns (NFL/NBA/MLB market differences)
Each pair consisted of two markets with similar characteristics -- one designated as test (ads continue) and one as control (ads paused or reduced).
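The pairing step can be sketched as a greedy nearest-neighbor match on standardized market features. A minimal illustration with hypothetical market data (FanDuel's actual matching method and figures are not public):

```python
import numpy as np

# Hypothetical market features: sign-ups per capita, months since
# legalization, competitor spend index. All values are made up.
markets = ["NJ", "PA", "MI", "CO", "AZ", "VA"]
features = np.array([
    [4.2, 36, 0.9],
    [4.0, 30, 0.8],
    [3.1, 24, 0.6],
    [3.0, 22, 0.7],
    [1.8, 10, 0.4],
    [1.7,  8, 0.5],
])

# Standardize so each feature contributes comparably to distance.
z = (features - features.mean(axis=0)) / features.std(axis=0)

def greedy_pairs(names, z):
    """Repeatedly pair the two closest remaining markets."""
    remaining = list(range(len(names)))
    pairs = []
    while len(remaining) >= 2:
        best = None
        for i in remaining:
            for j in remaining:
                if i < j:
                    d = np.linalg.norm(z[i] - z[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        _, i, j = best
        pairs.append((names[i], names[j]))
        remaining.remove(i)
        remaining.remove(j)
    return pairs

print(greedy_pairs(markets, z))
```

Within each resulting pair, one market is then randomly assigned to test and the other to control.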
Testing approach
Rather than testing everything at once, FanDuel ran sequential experiments:
Experiment 1: TV holdout. They paused or significantly reduced TV advertising in control markets while maintaining digital spend. This isolated TV's incremental contribution to new depositor acquisition.
Experiment 2: Digital holdout. In a separate set of markets, they paused digital advertising while maintaining TV. This isolated digital's contribution.
Experiment 3: Full dark market. In a small number of markets, they paused all paid advertising. This measured the baseline organic acquisition rate -- how many people would sign up without any advertising.
Measurement
FanDuel measured results using their own backend data:
- New depositing users (first-time deposits)
- Cost per acquisition by market
- Retention rates by acquisition source
- Lifetime value of acquired users
They deliberately did not use platform-reported conversions. The measurement was entirely based on first-party data from their registration and deposit systems.
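Once backend counts are in hand, the core lift calculation is simple. A sketch with hypothetical numbers (not FanDuel's actual figures), assuming test and control groups of comparable size:

```python
# Hypothetical counts of new depositing users over one test window.
test_depositors = 12_400      # matched markets where the channel stayed on
control_depositors = 9_300    # matched markets where the channel was paused
channel_spend = 2_500_000     # spend in test markets over the window

# Incremental lift: extra depositors attributable to the held-out channel.
incremental = test_depositors - control_depositors
lift = incremental / control_depositors
incremental_cpa = channel_spend / incremental

print(f"lift: {lift:.1%}")                         # lift: 33.3%
print(f"incremental CPA: ${incremental_cpa:,.0f}")  # incremental CPA: $806
```

In practice you would normalize against each market's pre-period baseline (or population) rather than comparing raw counts, since matched markets are similar but never identical in size.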
Key findings
TV drove significant incremental volume
The TV holdout experiment showed that markets with TV advertising generated 25-40% more new depositors than matched markets without TV. This was genuine incremental lift -- these users wouldn't have signed up without the TV exposure.
This finding was significant because digital attribution models typically gave TV zero or near-zero credit. From a platform-reported perspective, TV looked like a branding expense with no measurable return. The geo test revealed it was one of the highest-impact channels.
Digital and TV had strong interaction effects
When FanDuel compared the full dark market results against the single-channel holdouts, they discovered that the incremental impact of TV and digital together was greater than the sum of each channel tested independently.
TV created awareness and interest. Digital converted that interest into app downloads and deposits. Neither channel was as effective alone as they were in combination. This interaction effect was invisible to any single-channel attribution model.
Some digital spend was capturing, not creating, demand
The digital holdout experiment revealed that a meaningful portion of digital conversions were demand capture, not demand creation. In markets where digital ads were paused but TV continued, many users found FanDuel through organic search, app store searches, or direct navigation.
This meant digital's platform-reported ROAS was inflated because it was taking credit for users TV had already motivated. The actual incremental digital ROAS was lower than reported -- still positive, but less impressive than platform data suggested.
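The gap between reported and incremental ROAS can be expressed directly. Hypothetical numbers for illustration only:

```python
# Hypothetical digital channel figures for one quarter.
digital_spend = 1_000_000
platform_reported_revenue = 5_000_000  # what the ad platform claims
incremental_revenue = 2_200_000        # revenue lost when ads paused (geo test)

reported_roas = platform_reported_revenue / digital_spend  # 5.0x
incremental_roas = incremental_revenue / digital_spend     # 2.2x

# Share of reported revenue that was demand capture, not creation.
capture_share = 1 - incremental_revenue / platform_reported_revenue

print(f"reported ROAS {reported_roas:.1f}x, incremental ROAS {incremental_roas:.1f}x")
print(f"{capture_share:.0%} of reported revenue would have arrived anyway")
```

The channel is still positive in this sketch; the point is that budget decisions based on the 5.0x figure would overweight it.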
Geographic and seasonal variation was substantial
The experiments showed that marketing effectiveness varied dramatically by market maturity and season. New markets where sports betting had recently launched showed 3-5x higher incremental response to advertising than mature markets. NFL season showed 2-3x higher response than off-season.
This finding led FanDuel to shift budget allocation dynamically -- heavier spend in new markets and peak season, reduced spend in mature markets and off-season.
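One naive way to express that reallocation is to weight budget by the measured incremental-response multipliers. A sketch with hypothetical multipliers (a real allocator would also respect diminishing returns and minimum-spend floors):

```python
# Hypothetical incremental-response multipliers from geo tests,
# keyed by (market maturity, season).
response = {
    ("new_market", "nfl_season"):    3.0,
    ("new_market", "off_season"):    1.5,
    ("mature_market", "nfl_season"): 1.0,
    ("mature_market", "off_season"): 0.5,
}

budget = 1_000_000
total = sum(response.values())

# Allocate proportionally to measured incremental response.
allocation = {k: budget * v / total for k, v in response.items()}

for segment, dollars in allocation.items():
    print(segment, f"${dollars:,.0f}")
```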
What this means for your brand
You're not FanDuel. You probably don't have a nine-figure ad budget or the resources to run parallel experiments across dozens of markets. But the principles apply directly to ecommerce and lead generation brands.
Lesson 1: Your highest-attributed channel may not be your most incremental
FanDuel found that digital had the best platform-reported numbers but lower incrementality than expected, while TV had no platform-reported attribution but strong incremental impact. The same pattern applies to smaller brands: your Google Search campaigns likely have great reported ROAS because they capture existing demand, while your Meta awareness campaigns may drive more incremental growth than your attribution model shows.
Lesson 2: Channels work together, not independently
The interaction between TV and digital at FanDuel mirrors the interaction between Meta awareness and Google Search for ecommerce brands. Cutting Meta might reduce Google Search volume. Your attribution model probably doesn't account for this interaction. Geo testing reveals it.
Lesson 3: Effectiveness changes over time
FanDuel's seasonal and market-maturity findings parallel what ecommerce brands experience: campaigns that work in Q4 don't work the same in Q2. New audience segments respond differently than retargeted ones. Regular incrementality testing catches these shifts.
Lesson 4: Backend data is the only trustworthy measurement
FanDuel measured everything using their own registration and deposit data, not platform-reported conversions. For ecommerce brands, this means measuring attribution against Shopify orders and bank deposits, not against what Meta or Google reports.
How to apply FanDuel's approach at smaller scale
You don't need a nine-figure budget to run geo experiments. Here's the scaled-down version:
Start with one channel. Test Meta first (it's typically the largest spend and the most overcounted). Pause Meta in 5-10 DMAs while keeping it active in 5-10 matched DMAs.
Run for 3-4 weeks. At typical ecommerce volumes that's usually long enough to detect a meaningful lift, and short enough to limit the revenue you sacrifice in holdout markets.
Measure from your backend. Use Shopify revenue, not platform data, as your measurement source.
Calculate calibration factors. Use the results to adjust your attribution model's channel weights.
Repeat quarterly. Test a different channel each quarter to build a complete picture of incrementality across your media mix.
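Steps 3 and 4 above reduce to a small calculation. A sketch assuming hypothetical Shopify revenue figures and a platform-reported number to calibrate against:

```python
# Hypothetical 4-week Meta geo test, measured from Shopify orders.
holdout_revenue = 410_000        # matched DMAs with Meta paused
active_revenue = 505_000         # matched DMAs with Meta running
meta_reported_revenue = 180_000  # conversion value Meta claimed in active DMAs

# Geo-measured incremental revenue from Meta.
incremental_revenue = active_revenue - holdout_revenue

# Calibration factor: scale platform-reported conversion value
# so it sums to what the geo test says Meta actually added.
calibration = incremental_revenue / meta_reported_revenue

print(f"calibration factor: {calibration:.2f}")  # calibration factor: 0.53
```

Between tests, multiply Meta-reported conversion value by this factor in your attribution model; a quarterly retest refreshes the factor as the mix shifts.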
FAQ
How much did FanDuel's experiments cost in lost revenue?
The direct cost was the new depositors lost in control markets during the test periods. For a company acquiring millions of users, even a 4-week test in 20% of markets represents significant foregone revenue. They judged the cost worthwhile because the insights saved them from misallocating hundreds of millions in annual ad spend.
Can smaller brands replicate FanDuel's sequential testing approach?
Yes, at a proportional scale. Run one geo test per quarter on your largest or most questioned channel. Over a year, you'll have incrementality data on 3-4 channels. The methodology is the same -- only the scale differs.
Did FanDuel stop using attribution models after geo testing?
No. They used geo test results to calibrate their attribution models, making the models more accurate. Geo tests provide periodic ground truth; calibrated attribution models provide continuous guidance between tests. The two approaches complement each other.
Go Funnel uses server-side tracking and multi-touch attribution to show you which ads actually drive revenue. Book a call to see your real numbers.
Want to see your real ROAS?
Connect your ad accounts in 15 minutes and get attribution data you can actually trust.
Book a Call