Getting Started
In the hyper-competitive mobile app ecosystem, measurement isn’t just important—it’s existential. But with dozens of metrics to track and a fragmented attribution landscape, how do you know what actually matters? Let’s break down the essential KPIs, explore the role of Mobile Measurement Partners (MMPs) like AppsFlyer and Adjust, and give you a framework to actually use this data.
Part 1: The KPI Framework That Actually Works
Acquisition & Activation Metrics
Customer Acquisition Cost (CAC)
Your CAC tells you how much you’re spending to acquire each user. But here’s the thing: not all CACs are created equal. A $5 CAC from Facebook might deliver users who churn in 3 days, while a $15 CAC from Google Search might bring users who stick around for years. Context is everything.
Calculate it: Total marketing spend / Number of new users acquired
Industry benchmarks vary wildly by vertical:
- Gaming: $1-$3 (highly competitive)
- E-commerce: $10-$40
- Finance/Banking: $50-$200
- B2B SaaS: $100-$500+
[Sources: https://liftoff.io/resources/mobile-app-engagement-index-2024/, https://www.appsflyer.com/resources/reports/performance-index/]
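To make the formula concrete, here's a minimal Python sketch that computes per-channel and blended CAC. The spend and install figures are hypothetical, not benchmarks; in practice you'd pull them from your MMP's cost reporting.

```python
# Minimal sketch: per-channel vs. blended CAC (all numbers are made up).
channels = {
    "facebook":   {"spend": 25_000.0, "new_users": 5_000},
    "google_uac": {"spend": 30_000.0, "new_users": 2_000},
    "tiktok":     {"spend": 10_000.0, "new_users": 2_500},
}

def cac(spend: float, new_users: int) -> float:
    """CAC = total marketing spend / number of new users acquired."""
    return spend / new_users if new_users else float("inf")

blended_spend = sum(c["spend"] for c in channels.values())
blended_users = sum(c["new_users"] for c in channels.values())

for name, c in channels.items():
    print(f"{name:>10}: CAC = ${cac(c['spend'], c['new_users']):.2f}")
print(f"   blended: CAC = ${cac(blended_spend, blended_users):.2f}")
```

Notice how a healthy-looking blended CAC can hide one channel paying three times what another does, which is exactly why the context point above matters.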
Install-to-First-Open Rate
This metric reveals friction in your onboarding flow. Industry benchmark? Around 70-80% is solid. If you’re below 60%, you’ve got a problem—maybe your app store listing is misleading, or technical issues are preventing successful installations.
[Sources: https://www.adjust.com/resources/ebooks/mobile-app-trends-2024/, https://www.mobileaction.co/app-store-optimization-guide]
Time to First Key Action (TTFKA)
This is where the magic happens (or doesn’t). How long does it take users to complete their first meaningful action? For Uber, it’s requesting a ride. For Spotify, it’s playing a song. For Instagram, it’s viewing the feed. The faster users reach this moment, the better your retention will be.
Best-in-class apps get users to their “aha moment” within:
- 30 seconds for social/content apps
- 2 minutes for marketplace/e-commerce
- 5 minutes for productivity/utility apps
[Sources: https://amplitude.com/blog/product-analytics, https://www.reforge.com/growth-series]
Activation Rate
What percentage of users who install your app actually complete the key actions that define an “activated” user? Industry data shows that apps with activation rates above 40% tend to have strong retention. Below 25%? Your onboarding needs serious work.
[Sources: https://mixpanel.com/blog/product-benchmarks-report/, https://www.reforge.com/blog/retention-engagement-growth-framework]
Engagement & Retention Metrics
Daily Active Users (DAU) and Monthly Active Users (MAU)
The raw numbers matter, but the DAU/MAU ratio matters more. This “stickiness” metric tells you how habitually users engage with your app.
- 20%+ = Excellent (users open your app 6+ days per month)
- 10-20% = Good
- <10% = Warning sign
Social apps like Instagram or TikTok can hit 50-60% DAU/MAU. Utility apps might be perfectly healthy at 15-20%. Context matters: a meditation app used daily is different from a travel booking app used quarterly.
[Sources: https://a16z.com/measuring-product-health/, https://www.facebook.com/notes/facebook-engineering/facebook-f8-developer-conference/]
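The stickiness calculation itself is trivial once you have a daily active-user log. A minimal sketch, assuming you can pull a date-to-active-users mapping from your analytics store (the sample data below is synthetic):

```python
from datetime import date

# Hypothetical: date -> set of user IDs active that day (normally queried from your event store).
daily_active = {
    date(2024, 6, 1): {"a", "b", "c"},
    date(2024, 6, 2): {"a", "c"},
    date(2024, 6, 3): {"a", "d"},
    # ... one entry per day of the month
}

avg_dau = sum(len(users) for users in daily_active.values()) / len(daily_active)
mau = len(set().union(*daily_active.values()))   # distinct users seen across the month
stickiness = avg_dau / mau

print(f"avg DAU = {avg_dau:.1f}, MAU = {mau}, stickiness = {stickiness:.0%}")
```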
Session Length and Session Frequency
How long do users spend in your app per session? How many sessions per day/week? These metrics need context by category:
- Gaming apps: Longer sessions (15-30 min) but maybe 1-2x daily
- Social media: Shorter sessions (5-10 min) but 5-10x daily
- Banking apps: Very short sessions (2-3 min) but high conversion rates
- E-commerce: Medium sessions (8-12 min) with spiky frequency around paydays
[Sources: https://www.data.ai/en/insights/market-data/state-of-mobile-2024/, https://sensortower.com/blog/mobile-app-usage-intelligence]
Retention Curves: D1, D7, D30, D90
This is THE single most important metric for predicting your app’s long-term success. Here’s what good looks like across categories:
Day 1 retention:
- Social/Communication: 40-50%
- Gaming (casual): 25-35%
- Gaming (mid-core): 35-45%
- E-commerce: 30-40%
- Fintech: 40-50%
Day 7 retention:
- Social/Communication: 20-30%
- Gaming (casual): 10-15%
- Gaming (mid-core): 15-25%
- E-commerce: 15-20%
- Fintech: 20-30%
Day 30 retention:
- Social/Communication: 10-20%
- Gaming (casual): 4-8%
- Gaming (mid-core): 8-15%
- E-commerce: 8-12%
- Fintech: 12-18%
If your D1 retention is below 20% across any category, you have a fundamental product problem. No amount of marketing spend will fix it—you need to fix the product first.
[Sources: https://www.lennysnewsletter.com/p/what-is-good-retention, https://www.appsflyer.com/resources/reports/retention-benchmarks/, https://www.localytics.com/resources/blog/behavioral-analytics-mobile-app-retention/]
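If you want to sanity-check your analytics vendor's numbers, classic Day-N retention is easy to compute yourself. A minimal sketch, assuming you have each user's install date and the dates they opened the app (the sample data is invented):

```python
from datetime import date, timedelta

installs = {  # user_id -> install date (hypothetical)
    "u1": date(2024, 6, 1), "u2": date(2024, 6, 1), "u3": date(2024, 6, 2),
}
activity = {  # user_id -> set of dates the user opened the app (hypothetical)
    "u1": {date(2024, 6, 2), date(2024, 6, 8)},
    "u2": {date(2024, 6, 3)},
    "u3": {date(2024, 6, 3), date(2024, 6, 9), date(2024, 7, 2)},
}

def day_n_retention(n: int) -> float:
    """Share of installers who were active exactly N days after install."""
    eligible = retained = 0
    for user, installed in installs.items():
        eligible += 1
        if installed + timedelta(days=n) in activity.get(user, set()):
            retained += 1
    return retained / eligible if eligible else 0.0

for n in (1, 7, 30):
    print(f"D{n} retention: {day_n_retention(n):.0%}")
```

This uses "exact day" retention; some teams prefer rolling retention (active on day N or later), so make sure you know which definition your dashboards use before comparing against benchmarks.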
Cohort Analysis
Don’t just look at aggregate retention—analyze by cohort. Critical questions:
- Are users acquired in January performing better than February users?
- Is iOS retention different from Android?
- Do users from paid channels retain better than organic?
- How do different geographies compare?
Strong cohort analysis can reveal that your $5 CPI Facebook campaign actually delivers negative-LTV users, while your $50 CPI Google UAC campaign delivers profitable users. This is where real optimization happens.
[Sources: https://www.reforge.com/blog/retention-engagement, https://www.gopractice.io/retention-curves-analysis/]
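A minimal pandas sketch of a weekly cohort retention matrix, assuming a flat export of install and activity timestamps (the rows below are synthetic). Add a channel, geography, or platform column and group by it to answer the questions above.

```python
import pandas as pd

# Hypothetical flat export: one row per (user, activity day).
events = pd.DataFrame({
    "user_id":       ["u1", "u1", "u2", "u2", "u3", "u3"],
    "install_date":  pd.to_datetime(["2024-06-03"] * 4 + ["2024-06-10"] * 2),
    "activity_date": pd.to_datetime(["2024-06-03", "2024-06-12", "2024-06-03",
                                     "2024-06-20", "2024-06-10", "2024-06-11"]),
})

# Weeks elapsed between install and each activity event.
events["weeks_since_install"] = (events["activity_date"] - events["install_date"]).dt.days // 7

# Rows = install-week cohorts, columns = weeks since install, cells = distinct active users.
cohorts = events.pivot_table(
    index=events["install_date"].dt.to_period("W"),
    columns="weeks_since_install",
    values="user_id",
    aggfunc="nunique",
)

# Divide every row by its week-0 size to get retention rates per cohort.
retention = cohorts.div(cohorts[0], axis=0)
print(retention.round(2))
```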
Churn Rate
The flip side of retention. Monthly churn rate = (Users who churned this month / Total users at start of month) × 100
Acceptable churn varies dramatically:
- Consumer subscription apps: 5-7% monthly churn is good
- Enterprise SaaS: 1-2% monthly churn
- Gaming: 40-60% monthly churn can be normal (!)
The key is understanding if your churn is accelerating or decelerating over time.
[Sources: https://www.profitwell.com/recur/all/subscription-benchmarks, https://chartmogul.com/resources/saas-metrics/]
Monetization Metrics
Average Revenue Per User (ARPU) and ARPPU
ARPU = Total revenue / Total users
ARPPU = Total revenue / Paying users only
Why both matter: An app with $2 ARPU might have 10% paying users at $20 ARPPU, or 50% paying users at $4 ARPPU. These are completely different businesses requiring different strategies.
Industry ARPU benchmarks (monthly):
- Dating apps: $1-$3
- Gaming (casual): $0.50-$2
- Gaming (mid-core/hardcore): $5-$25
- Streaming media: $8-$15
- Fitness/Health: $5-$10
[Sources: https://www.businessofapps.com/data/app-revenues/, https://www.revenuecat.com/blog/subscription-benchmarks/]
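A quick sketch of both calculations over a hypothetical revenue-per-user mapping; it also surfaces the payer rate that distinguishes the two businesses described above.

```python
# Hypothetical monthly revenue per user.
revenue_by_user = {"u1": 0.0, "u2": 4.99, "u3": 0.0, "u4": 19.99, "u5": 0.0}

total_revenue = sum(revenue_by_user.values())
total_users = len(revenue_by_user)
paying_users = sum(1 for r in revenue_by_user.values() if r > 0)

arpu = total_revenue / total_users     # revenue spread across everyone
arppu = total_revenue / paying_users   # revenue spread across payers only

print(f"ARPU = ${arpu:.2f}, ARPPU = ${arppu:.2f}, payer rate = {paying_users / total_users:.0%}")
```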
Lifetime Value (LTV)
The holy grail metric. How much revenue will a user generate over their entire relationship with your app?
Simple LTV calculation: ARPU × Average User Lifespan
Better LTV calculation: (ARPU × Gross Margin) / Churn Rate
For subscription apps: Monthly subscription price × Average subscription length
For ad-monetized apps: Daily ad revenue per user, summed across the retention curve
LTV is notoriously hard to calculate accurately for new apps since you don’t have enough historical data. Many teams use predicted LTV at D7, D30, or D60 as proxies.
[Sources: https://andrewchen.com/how-to-calculate-lifetime-value/, https://clevertap.com/blog/customer-lifetime-value/]
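Here is a minimal sketch of the simple and margin-adjusted formulas above, using hypothetical monthly inputs; note that 1 / monthly churn is what implies the average lifespan.

```python
# Hypothetical monthly figures.
monthly_arpu = 2.50       # average revenue per user per month
gross_margin = 0.70       # after store fees, COGS, etc.
monthly_churn = 0.08      # 8% of users churn each month

avg_lifespan_months = 1 / monthly_churn            # ~12.5 months implied by that churn rate

simple_ltv = monthly_arpu * avg_lifespan_months    # ignores margin
better_ltv = (monthly_arpu * gross_margin) / monthly_churn  # what you actually keep

print(f"simple LTV = ${simple_ltv:.2f}")
print(f"margin LTV = ${better_ltv:.2f}")
```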
LTV:CAC Ratio
The ultimate profitability metric. Generally:
- <1 = You’re losing money on every user (death spiral)
- 1-2 = Breaking even to marginally profitable (risky)
- 3-5 = Healthy, sustainable growth
- >5 = Excellent, but maybe you’re under-investing in growth
However, timeframe matters. If your LTV is $100 but takes 24 months to realize, while your CAC is $30 paid upfront, you might have a cash flow problem even with a 3.3x LTV:CAC ratio.
[Sources: https://www.forentrepreneurs.com/saas-metrics-2/, https://www.reforge.com/blog/ltv-cac-playbook]
Payback Period
How long does it take to recover your CAC?
Payback Period (months) = CAC / (Monthly ARPU × Gross Margin)
Benchmarks:
- Consumer apps: 3-6 months is good
- SaaS: 12-18 months is acceptable
- Enterprise: 24-36 months can work (but requires deep pockets)
Investors love short payback periods because they enable faster capital recycling and growth compounding.
[Sources: https://openviewpartners.com/blog/saas-benchmarks/, https://a16z.com/enterprise-metrics/]
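A small sketch tying the two metrics together, reusing the hypothetical $100 LTV / $30 CAC example from the ratio section plus an assumed monthly ARPU and margin; it shows how a healthy ratio can still mean a long wait for cash.

```python
# Hypothetical inputs.
cac = 30.0               # paid upfront to acquire the user
ltv = 100.0              # realized over roughly 24 months
monthly_arpu = 4.20
gross_margin = 0.70

ltv_cac_ratio = ltv / cac
payback_months = cac / (monthly_arpu * gross_margin)

print(f"LTV:CAC = {ltv_cac_ratio:.1f}x")          # ~3.3x: healthy on paper
print(f"payback = {payback_months:.1f} months")   # but cash is tied up this long
```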
Conversion Rate (Free to Paid)
What percentage of your users convert to paying customers?
- Freemium SaaS: 2-5% is typical
- Gaming (IAP): 1-3% is standard, 5%+ is exceptional
- Dating apps: 10-20% premium conversion
- Fitness apps: 5-8%
The “2% rule” in mobile gaming (only 2% of users will ever pay) has been standard wisdom for years, but top apps now achieve 4-6% through better monetization design.
[Sources: https://www.ecgr.games/monetization-study, https://mobiledevmemo.com/iap-conversion-analysis/]
Part 2: Mobile Measurement Partners (MMPs) – Why They’re Essential
The Attribution Problem
Here’s the challenge: You’re running ads on Facebook, Google, TikTok, Snapchat, Twitter, Apple Search Ads, and 10 other networks. A user sees your ad on Instagram, clicks it, then sees another ad on Google two days later, clicks that, and installs your app. Who gets credit for the install?
Without an MMP, each ad network runs its own attribution and claims credit for that install, leading to massive over-reporting. You think you acquired 10,000 users last month, but you actually acquired 6,000—the rest were double- or triple-counted.
MMPs solve this through deterministic and probabilistic attribution methods, acting as a neutral third party that all networks agree to trust.
[Sources: https://www.appsflyer.com/resources/guides/mobile-attribution-101/, https://www.adjust.com/glossary/mobile-attribution/]
What MMPs Actually Do
1. Install Attribution
When a user clicks an ad and installs your app, the MMP tracks this journey and attributes the install to the correct source. This works through:
- Click IDs: Unique identifiers passed from the ad network to the MMP
- Device fingerprinting: Matching device characteristics (IP address, device model, OS version, etc.)
- Store redirect: Apple and Google Play’s official attribution methods
- SKAdNetwork: Apple’s privacy-preserving attribution framework for iOS 14+
The MMP then sends postbacks to ad networks confirming which installs they should count.
[Sources: https://www.branch.io/resources/guide/mobile-attribution/, https://www.singular.net/glossary/mobile-attribution/]
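To see why a neutral referee matters, here's a heavily simplified last-touch attribution sketch. Real MMPs layer click IDs, SKAdNetwork postbacks, and probabilistic matching on top of this, and the click/install data below is invented.

```python
from datetime import datetime, timedelta

CLICK_LOOKBACK = timedelta(days=7)   # illustrative lookback window

clicks = [  # hypothetical click log: (device_id, network, click_time)
    ("dev-1", "instagram", datetime(2024, 6, 1, 10, 0)),
    ("dev-1", "google",    datetime(2024, 6, 3, 9, 30)),
    ("dev-2", "tiktok",    datetime(2024, 5, 1, 8, 0)),   # too old to qualify
]
installs = [  # hypothetical install log: (device_id, install_time)
    ("dev-1", datetime(2024, 6, 3, 9, 45)),
    ("dev-2", datetime(2024, 6, 2, 12, 0)),
]

def attribute(device_id, install_time):
    """Credit the most recent qualifying click for this device, else organic."""
    candidates = [
        (t, network) for d, network, t in clicks
        if d == device_id and timedelta(0) <= install_time - t <= CLICK_LOOKBACK
    ]
    return max(candidates)[1] if candidates else "organic"

for device_id, install_time in installs:
    print(device_id, "->", attribute(device_id, install_time))
# dev-1 -> google (last touch wins), dev-2 -> organic (only click is outside the window)
```

Run the same logic independently inside every ad network's own dashboard and both Instagram and Google would claim dev-1; a single shared referee is what prevents the double count.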
2. Deep Linking & Deferred Deep Linking
Deep linking takes users directly to specific content within your app after clicking an ad, rather than dumping them on the home screen like a caveman.
Example: You click an Instagram ad for a specific pair of shoes. Deep linking opens your app directly to that product page, not the app homepage. This can improve conversion rates by 2-3x.
Deferred deep linking works even if the user doesn’t have your app installed yet—after they install, they’re taken to the intended content on first open.
[Sources: https://www.branch.io/resources/guide/deep-linking/, https://www.appsflyer.com/resources/guides/onelink-deep-linking/]
3. Fraud Prevention
Mobile ad fraud costs the industry $5-7 billion annually. Common fraud types:
- Click flooding: Fraudsters send massive volumes of fake clicks, hoping to steal credit for organic installs
- Click injection: Malware triggers a fake click just before an install completes
- Install farms: Real devices, fake users—humans paid pennies to install and briefly use apps
- SDK spoofing: Fraudsters fake install postbacks without any real user
Top-tier MMPs use machine learning to detect these patterns and reject fraudulent installs in real-time, saving advertisers 15-30% of wasted budget.
[Sources: https://www.appsflyer.com/resources/reports/fraud-report/, https://www.adjust.com/resources/ebooks/fraud-prevention/, https://www.pixalate.com/mobile-fraud-statistics/]
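One of the simpler signals behind that machine learning is click-to-install time (CTIT). A minimal sketch of a CTIT-based rule, with purely illustrative thresholds rather than any vendor's defaults:

```python
from datetime import datetime, timedelta

# Illustrative thresholds: very short CTIT suggests click injection,
# very long CTIT suggests click flooding stealing organic credit.
INJECTION_THRESHOLD = timedelta(seconds=10)
FLOODING_THRESHOLD = timedelta(days=1)

def classify_ctit(click_time: datetime, install_time: datetime) -> str:
    ctit = install_time - click_time
    if ctit < INJECTION_THRESHOLD:
        return "suspect: possible click injection"
    if ctit > FLOODING_THRESHOLD:
        return "suspect: possible click flooding"
    return "ok"

print(classify_ctit(datetime(2024, 6, 1, 12, 0, 0), datetime(2024, 6, 1, 12, 0, 4)))
# -> "suspect: possible click injection"
```

Production systems look at the full CTIT distribution per source rather than single installs, but the intuition is the same: legitimate traffic has a characteristic shape, fraud doesn't.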
4. Incrementality Testing
Just because users installed your app after clicking an ad doesn’t mean the ad caused the install. Maybe they would have installed anyway (organic intent).
MMPs enable incrementality testing through:
- Ghost ads (PSA/placebo ads) to measure true lift
- Geo-holdout tests (run ads in some regions, not others)
- Audience splitting to compare exposed vs. non-exposed groups
Studies consistently show that 20-40% of attributed installs aren’t actually incremental—they would have happened anyway. Incrementality testing helps you stop wasting money on these.
[Sources: https://support.google.com/google-ads/answer/7665882, https://www.facebook.com/business/measurement/conversion-lift, https://www.appsflyer.com/resources/guides/incrementality-measurement/]
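The arithmetic behind a holdout test is straightforward once the groups are split cleanly. A minimal sketch with hypothetical group sizes and conversion counts:

```python
# Hypothetical holdout test: equal-size exposed and control groups.
exposed = {"users": 100_000, "installs": 2_400}   # served real ads
control = {"users": 100_000, "installs": 900}     # served ghost/PSA ads or held out by geo

cvr_exposed = exposed["installs"] / exposed["users"]
cvr_control = control["installs"] / control["users"]

incremental_installs = (cvr_exposed - cvr_control) * exposed["users"]
lift = (cvr_exposed - cvr_control) / cvr_control

print(f"lift = {lift:.0%}, incremental installs = {incremental_installs:.0f}")
print(f"~{cvr_control / cvr_exposed:.0%} of the exposed group's installs "
      f"would likely have happened anyway")
```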
5. Re-engagement & Retargeting Attribution
Acquiring a user is just the beginning. MMPs also track re-engagement campaigns that bring back dormant users. This is particularly valuable for apps with longer consideration cycles (travel, e-commerce) or seasonal usage patterns.
Re-attribution windows (typically 30-90 days) determine when a returning user is considered “re-engaged” versus just naturally returning.
[Sources: https://liftoff.io/resources/re-engagement-benchmarks/, https://www.adjust.com/resources/guides/retargeting/]
6. Cost & ROI Aggregation
MMPs pull cost data directly from ad networks (via API integrations), match it with performance data, and give you unified reporting. Instead of logging into 15 different dashboards, you see everything in one place:
- Spend by network, campaign, ad set, creative
- CPI (Cost Per Install), CPM, CTR
- ROI and ROAS at any level of granularity
- Cohorted LTV vs. CAC by channel
This is where the “measurement partner” part really shines—they become your single source of truth.
[Sources: https://www.appsflyer.com/product/roi360/, https://www.adjust.com/product/automate/]
AppsFlyer vs. Adjust vs. Branch vs. Singular
Let’s break down the major players:
AppsFlyer
- Market leader with ~40% market share
- Best-in-class fraud prevention (Protect360)
- Strongest incrementality measurement suite
- Most extensive network integrations (10,000+)
- Pricing: Generally the most expensive, especially at scale
- Best for: Large enterprises, gaming companies, anyone serious about fraud prevention
[Sources: https://www.appsflyer.com/product/, https://www.businessofapps.com/data/mmp-market-share/]
Adjust
- #2 player, owned by AppLovin (potential conflict of interest if you’re not in their ecosystem)
- Cleanest, most intuitive UI
- Strong automation features (Automate)
- Excellent for beginners and mid-market
- Tight integration with AppLovin’s monetization platform
- Pricing: Mid-range, good value
- Best for: Mid-market apps, anyone using AppLovin for monetization
[Sources: https://www.adjust.com/product/, https://investors.applovin.com/]
Branch
- Started as a deep linking company, expanded into full MMP
- Absolute best deep linking and cross-platform identity
- Excellent for apps with heavy web ↔ app flows
- Strong in e-commerce and marketplace apps
- Weaker fraud prevention compared to AppsFlyer
- Pricing: Competitive, especially for startups
- Best for: E-commerce, marketplaces, apps with significant web traffic
[Sources: https://www.branch.io/product/, https://www.mobilegrowthstack.com/mmp-comparison/]
Singular
- Great for power users who want extreme granularity
- Excellent data export and BI tool integrations
- Strong creative analytics (which ad creative performs best)
- Smaller market share = fewer pre-built integrations
- Pricing: Competitive
- Best for: Data-driven teams, performance marketers who live in dashboards
[Sources: https://www.singular.net/platform/, https://www.businessofapps.com/guide/mobile-attribution-platforms/]
Kochava, Tenjin, and Others
Several other players serve specific niches (gaming, privacy-first, etc.) but lack the scale and network integrations of the top 4.
[Sources: https://www.kochava.com/products/, https://www.tenjin.com/product/]
The iOS 14+ Privacy Apocalypse & SKAdNetwork
In April 2021, Apple released iOS 14.5 with App Tracking Transparency (ATT), requiring apps to ask users for permission to track them across apps and websites. Opt-in rates? Around 15-25% globally (though higher in some verticals).
For users who opt out, traditional attribution methods (IDFA-based) don’t work. Enter SKAdNetwork (SKAN), Apple’s privacy-preserving attribution framework.
How SKAN Works:
- Attribution happens on-device, not server-side
- Extremely limited data: Just campaign ID, conversion value (0-63), and rough timing
- No user-level data, no cohort analysis, no lifetime tracking
- 24-48 hour delay in receiving attribution data
SKAN’s Impact:
- Massive hit to campaign optimization (you’re flying blind for 1-2 days)
- Can’t measure retention or LTV accurately
- Creative testing becomes harder
- Small campaigns don’t meet Apple’s privacy thresholds, so you get zero data
MMPs have built SKAN management layers to help, but iOS attribution is fundamentally degraded vs. the pre-ATT era. Many advertisers saw iOS CPMs drop 30-40% as targeting efficacy declined.
[Sources: https://www.appsflyer.com/resources/guides/skadnetwork/, https://mobiledevmemo.com/att-impact-analysis/, https://developer.apple.com/documentation/storekit/skadnetwork]
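Since the conversion value is only six bits, most teams design a packing scheme up front. Below is a minimal sketch of one common approach (low bits for a revenue bucket, high bits for funnel milestones). The bucket edges and milestone names are hypothetical, and the real value is set on-device through Apple's StoreKit API, not from Python; this only illustrates the encoding you might plan.

```python
# Hypothetical scheme: 3 low bits = revenue bucket, 3 high bits = funnel milestones.
REVENUE_BUCKETS = [0.0, 0.99, 4.99, 9.99, 19.99, 49.99, 99.99]   # 8th bucket = above the last edge
MILESTONES = {"signup": 0, "tutorial_done": 1, "first_purchase": 2}  # bit positions

def encode(revenue: float, milestones: set[str]) -> int:
    bucket = sum(1 for edge in REVENUE_BUCKETS if revenue > edge)   # 0..7
    flags = sum(1 << MILESTONES[m] for m in milestones)             # 0..7
    return (flags << 3) | bucket                                    # 0..63

def decode(value: int) -> tuple[int, set[str]]:
    bucket, flags = value & 0b111, value >> 3
    return bucket, {m for m, bit in MILESTONES.items() if flags & (1 << bit)}

cv = encode(7.49, {"signup", "tutorial_done"})
print(cv, decode(cv))   # 27 -> (3, {'signup', 'tutorial_done'})
```

Because you only get one value per install (and often a null when privacy thresholds aren't met), every bit has to earn its place; that design exercise is a big part of what MMP "SKAN management layers" help with.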
Part 3: Building Your Measurement Stack
The Essential Stack
1. MMP (AppsFlyer, Adjust, etc.)
Your source of truth for attribution, fraud prevention, and unified cost/performance reporting.
[Sources: https://www.appsflyer.com/, https://www.adjust.com/]
2. Product Analytics (Amplitude, Mixpanel, Heap)
Deep behavioral analysis, funnel optimization, cohort analysis. Your MMP tells you who installed; your product analytics platform tells you what they did after installing.
[Sources: https://amplitude.com/, https://mixpanel.com/, https://heap.io/]
3. A/B Testing Platform (Optimizely, Firebase, VWO)
Rigorous experimentation to optimize onboarding, features, and monetization. Never trust opinions—ship variants and let data decide.
[Sources: https://www.optimizely.com/, https://firebase.google.com/products/ab-testing, https://vwo.com/]
4. Business Intelligence / Data Warehouse (BigQuery, Snowflake, Redshift)
For teams that want to own their data and build custom analyses, a data warehouse is essential. MMPs and analytics platforms export data here for advanced modeling.
[Sources: https://cloud.google.com/bigquery, https://www.snowflake.com/, https://aws.amazon.com/redshift/]
5. Customer Data Platform (Segment, mParticle, RudderStack)
Acts as a central hub, collecting data from all sources and routing it to all destinations. Reduces engineering burden and ensures data consistency.
[Sources: https://segment.com/, https://www.mparticle.com/, https://rudderstack.com/]
Avoid These Common Mistakes
1. Over-reliance on Vanity Metrics
“We hit 1 million downloads!” Cool—how many are still using the app? Downloads mean nothing. Focus on retention and monetization.
[Sources: https://www.reforge.com/blog/vanity-metrics]
2. Not Segmenting Your Data
Aggregate metrics hide the truth. Segment by:
- Acquisition channel (organic vs. paid, by network)
- Geography (US vs. India might be completely different)
- Platform (iOS vs. Android)
- Device tier (premium vs. budget phones)
- User cohort (install date)
[Sources: https://www.gopractice.io/cohort-analysis/]
3. Ignoring Statistical Significance
“Variant B has 3.2% conversion vs. Variant A’s 3.1%—ship it!” Not so fast. With small sample sizes, that difference could be pure noise. Use proper A/B testing tools that calculate confidence intervals.
[Sources: https://amplitude.com/blog/ab-testing-best-practices]
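For the example above, a two-proportion z-test shows how far from significant that gap is at a plausible sample size; 10,000 users per variant is assumed here purely for illustration.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided p-value
    return z, p_value

# 3.1% vs. 3.2% conversion on 10,000 users per variant (hypothetical).
z, p = two_proportion_z_test(conv_a=310, n_a=10_000, conv_b=320, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.2f}")   # p ≈ 0.69: nowhere near significant
```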
4. Attribution Window Misconfigurations
Your MMP’s default attribution window might be 7 days for clicks and 1 day for impressions. But if your user journey is longer (travel booking, car purchase), you need longer windows. Conversely, shorter windows reduce fraud vulnerability.
[Sources: https://www.appsflyer.com/resources/guides/attribution-windows/]
5. Not Reconciling Data Sources
Your MMP says 10,000 installs. Google Analytics says 9,200. Your internal database says 8,800. Which is right? Discrepancies are normal (5-10%), but if they’re larger, you have a tracking implementation problem. Fix it before making decisions.
[Sources: https://www.gopractice.io/data-reconciliation/]
Part 4: Advanced Tactics & Future Trends
Predictive LTV Modeling
Don’t wait 365 days to know if a user will be valuable. Use machine learning models trained on D7 or D30 behavior to predict lifetime value. This enables real-time bidding optimization and faster iteration cycles.
Tools like AppsFlyer Predict, Optimove, and custom models built on your data warehouse can achieve 70-85% accuracy in LTV prediction after just 7 days.
[Sources: https://www.appsflyer.com/product/predict/, https://medium.com/zynga-engineering/ml-for-ltv-prediction]
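A minimal scikit-learn sketch of the idea, training on synthetic D7 features against a synthetic long-run LTV label; production models are typically gradient-boosted trees with far richer features, and the feature names and coefficients below are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical D7 features: sessions, purchases, and revenue in the first week.
d7_sessions = rng.poisson(5, n)
d7_purchases = rng.poisson(0.3, n)
d7_revenue = d7_purchases * rng.gamma(2.0, 3.0, n)

X = np.column_stack([d7_sessions, d7_purchases, d7_revenue])
# Hypothetical "true" long-run LTV used as the training label.
y = 0.4 * d7_sessions + 8.0 * d7_purchases + 2.5 * d7_revenue + rng.normal(0, 2, n)

model = LinearRegression().fit(X, y)

new_user = np.array([[6, 1, 4.99]])   # D7 snapshot for a fresh cohort member
print(f"predicted LTV ≈ ${model.predict(new_user)[0]:.2f}")
```

The payoff is speed: once the model is trusted, predicted LTV at D7 can feed bid adjustments and channel decisions months before actual LTV is observable.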
Privacy-First Measurement
With ATT on iOS and Google’s Privacy Sandbox coming to Android, the industry is shifting toward:
- Aggregated measurement (no user-level data)
- Incrementality > last-click attribution
- Media Mix Modeling (MMM) making a comeback
- First-party data strategies (owned channels, email, push)
Smart marketers are building owned audiences and reducing dependence on third-party attribution.
[Sources: https://developer.android.com/design-for-safety/privacy-sandbox, https://www.facebook.com/business/help/331612538028890]
Creative Analytics
The biggest lever in mobile UA is often creative, not targeting. MMPs now offer creative-level reporting, showing which ad creatives drive the best LTV users (not just the most installs).
Pair this with creative intelligence platforms (AppLovin Creative Labs, TikTok Creative Center) to systematically test and iterate.
[Sources: https://www.singular.net/blog/creative-analytics/, https://mobiledevmemo.com/creative-intelligence-mobile-growth/]
Web-to-App Attribution
Many apps (especially e-commerce, travel, food delivery) have significant web traffic. Tracking the user journey from web discovery → app install → in-app purchase requires specialized measurement (Branch excels here).
Platform standards like App Links (Android) and Universal Links (iOS) help, but implementation is tricky.
[Sources: https://www.branch.io/resources/guide/web-to-app-measurement/, https://www.airship.com/resources/explainer/cross-channel-attribution/]
Conclusion: Measurement Is Strategy
Great mobile apps aren’t built on gut feel—they’re built on rigorous measurement, experimentation, and iteration. Your KPI framework determines what you optimize for. Your MMP determines how accurately you can measure and improve.
Start with the fundamentals:
- Nail your North Star Metric (the one metric that best captures value delivery)
- Implement bulletproof attribution (choose an MMP and configure it correctly)
- Obsess over retention (everything else is secondary)
- Build a culture of experimentation (ship, measure, learn, repeat)
The apps that win aren’t necessarily those with the best ideas—they’re the ones that learn fastest. Measurement is your competitive advantage.