
Product Usage Analytics Interpreter

Interpret raw product analytics data — feature adoption, session patterns, power user behaviors — and translate findings into actionable product decisions with prioritized recommendations.

Prompt Template

You are a product analytics expert. Interpret the following product usage data and provide actionable insights:

**Product type:** [SaaS / mobile app / marketplace / platform]
**Key metrics to analyze:**
- DAU/WAU/MAU: [numbers]
- Average session duration: [minutes]
- Sessions per user per week: [number]
- Feature adoption rates: [list features with % of users who've used them]
- Power user definition: [how you define a power user]
- Power user percentage: [% of total users]

**Additional data (paste what you have):**
[e.g., top 10 features by usage frequency, drop-off points in key flows, time-to-first-action distribution, user segment breakdowns]

**Business context:**
- Current company goal: [growth / retention / monetization / engagement]
- Recent product changes: [any new features or UI changes]
- North star metric: [the one metric that matters most]

Provide:
1. **Health dashboard** — is the product healthy? Score across stickiness (DAU/MAU), depth (features used per session), and breadth (% of features adopted)
2. **Feature adoption analysis** — which features are underperforming expectations and why (hypothesis)
3. **Power user profile** — what do power users do differently? Which behaviors predict long-term retention?
4. **Engagement segments** — group users into 3-4 segments based on usage patterns with descriptions and sizes
5. **"Aha moment" hypothesis** — which specific action or feature usage correlates most strongly with retention?
6. **Product recommendations** — 5 prioritized actions ranked by expected impact on the north star metric
7. **Experiments to run** — 3 A/B tests or feature experiments based on the data
8. **Tracking gaps** — what data points are missing that would unlock deeper insights?

Example Output

Product Usage Analysis — Flowdesk (B2B SaaS)

Health Dashboard

| Metric | Value | Benchmark | Status |
|--------|-------|-----------|--------|
| Stickiness (DAU/MAU) | 24% | 15-25% for B2B SaaS | ✅ Healthy |
| Avg features used per session | 2.1 | 3+ for sticky products | ⚠️ Below average |
| Feature breadth (% of features tried by avg user) | 31% | 40-60% | 🔴 Low discovery |
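
As a sanity check on numbers like these, the stickiness ratio can be recomputed directly from raw activity events. A minimal sketch in Python, assuming events arrive as `(user_id, date)` pairs (the event format and toy data are illustrative, not Flowdesk's):

```python
from collections import defaultdict
from datetime import date

def stickiness(events):
    """events: list of (user_id, activity_date) records.
    Returns DAU/MAU: average daily active users divided by
    distinct monthly active users over the span of the data."""
    daily = defaultdict(set)   # active users per day
    monthly = set()            # distinct users over the whole period
    for user, day in events:
        daily[day].add(user)
        monthly.add(user)
    avg_dau = sum(len(users) for users in daily.values()) / len(daily)
    return avg_dau / len(monthly)

# toy data: 3 users across 3 days
events = [
    ("a", date(2024, 1, 1)), ("b", date(2024, 1, 1)),
    ("a", date(2024, 1, 2)),
    ("c", date(2024, 1, 3)), ("a", date(2024, 1, 3)),
]
print(round(stickiness(events), 2))
```

Deduplicating users per day (sets, not event counts) matters: counting raw events would inflate DAU for chatty users.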

Feature Adoption Analysis

| Feature | Adoption | Expected | Gap | Hypothesis |
|---------|----------|----------|-----|------------|
| Email builder | 89% | 90% | ✅ | Core feature, well-adopted |
| Automation builder | 34% | 60% | 🔴 -26% | Hidden in navigation; onboarding doesn't surface it |
| Analytics dashboard | 28% | 50% | 🔴 -22% | Users don't know it exists; no in-app prompt |
| Template library | 72% | 70% | ✅ | Well-positioned on home screen |
| A/B testing | 11% | 25% | 🔴 -14% | Requires Pro plan; free users can't discover the value |
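
Adoption percentages like the ones above are straightforward to derive from feature-usage events. A small sketch, assuming `(user_id, feature)` records and a known total user count (both are illustrative assumptions):

```python
from collections import defaultdict

def adoption_rates(usage, total_users):
    """usage: list of (user_id, feature) events.
    Returns the fraction of ALL users who have ever used each feature,
    deduplicating repeat usage by the same user."""
    users_by_feature = defaultdict(set)
    for user, feature in usage:
        users_by_feature[feature].add(user)
    return {f: len(users) / total_users
            for f, users in users_by_feature.items()}

# toy data: 4 users total, only some generate events
usage = [
    ("u1", "email_builder"), ("u2", "email_builder"),
    ("u1", "automation_builder"), ("u1", "email_builder"),
]
rates = adoption_rates(usage, total_users=4)
print(rates)
```

Note the denominator is all users, not just users who appear in the event log; dividing by active users instead would silently overstate adoption.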

Power User Profile

**Definition:** 5+ sessions/week AND uses 4+ features per session (12% of users)

What they do differently:

- 94% use automations (vs. 34% overall) — this is the clearest differentiator

- Keep an average of 3.2 automations active vs. 0.4 for regular users

- Connected 2+ integrations in their first week

- Check the analytics dashboard 3x/week
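
A definition like this is easy to operationalize once per-user session stats exist. A sketch, assuming a precomputed map of `user_id` to `(sessions per week, average features per session)` (the data shape is an assumption for illustration):

```python
def power_users(stats, min_sessions=5, min_features=4):
    """stats: {user_id: (sessions_per_week, avg_features_per_session)}.
    Applies the power-user definition from the analysis:
    5+ sessions/week AND 4+ features used per session."""
    return {user for user, (sessions, features) in stats.items()
            if sessions >= min_sessions and features >= min_features}

stats = {
    "u1": (7, 4.5),  # frequent AND deep -> power user
    "u2": (6, 2.0),  # frequent but shallow
    "u3": (2, 5.0),  # deep but infrequent
}
print(sorted(power_users(stats)))
```

Requiring both conditions (AND, not OR) is what separates genuine power users from users who are merely frequent or merely deep.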

"Aha Moment" Hypothesis

Users who create their first automation within 7 days of signup retain at 68% (Month 3) vs. 29% for users who don't. **First automation = likely aha moment.**
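
A cohort comparison like this reduces to splitting users on the candidate first-week action and comparing retention rates between the two groups. A minimal sketch with toy data (the cohort below is illustrative, not the Flowdesk numbers):

```python
def retention_by_first_week_action(users):
    """users: list of (did_action_in_week1: bool, retained_m3: bool).
    Returns (retention rate with the action, retention rate without)."""
    with_action = [retained for acted, retained in users if acted]
    without = [retained for acted, retained in users if not acted]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(with_action), rate(without)

# toy cohort: 4 users created an automation in week 1, 4 did not
cohort = [(True, True), (True, True), (True, True), (True, False),
          (False, True), (False, False), (False, False), (False, False)]
print(retention_by_first_week_action(cohort))
```

A large gap between the two rates flags a candidate "aha moment", but it is correlational; the experiments below are what test causation.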

Product Recommendations (Prioritized)

1. 🔴 **Add automation builder to onboarding flow** — guided setup of first automation within signup. Expected: +15-20% automation adoption, strong retention impact.

2. 🟡 **In-app discovery prompts for analytics** — tooltip or banner after first email send: "See how your email performed →". Expected: +10-15% analytics adoption.

3. 🟡 **Integration prompt at Day 3** — email nudge to connect first integration if not done. Power users do this early.

4. 🟢 **Free tier A/B testing preview** — let free users see A/B test results (read-only) to build desire for Pro upgrade.

5. 🟢 **Feature usage streak/badge** — gamify feature exploration for the first 14 days.

Experiments to Run

1. **Onboarding automation wizard vs. current flow** — measure Day-7 automation creation rate

2. **Analytics prompt after first email send vs. no prompt** — measure analytics dashboard visits

3. **Progressive feature unlock messaging vs. no messaging** — measure feature breadth per user
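
When reading out experiments like these, a two-proportion z-test is a common way to check whether the difference between variants is statistically meaningful. A stdlib-only sketch (the conversion counts are made up for illustration):

```python
from math import sqrt, erf

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for a difference in conversion rates.
    x = number of conversions, n = sample size per variant.
    Returns (z statistic, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # normal CDF via erf: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. Day-7 automation creation: 180/1000 with the wizard vs 120/1000 without
z, p = two_prop_ztest(180, 1000, 120, 1000)
print(p < 0.05)
```

Decide the sample size and success metric before launch; peeking at p-values mid-experiment inflates false positives.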

Tips for Best Results

- 💡 Always compare feature adoption against your expectations, not just absolute numbers — 34% adoption might be great for a niche feature but terrible for a core one.
- 💡 Look for the "aha moment" by comparing retained vs. churned users' first-week behaviors — the differences reveal what drives stickiness.
- 💡 Include your business context (growth vs. retention focus) — the same data leads to different recommendations depending on your current priority.