Product Usage Signals 101: Five Essential Metrics

Many teams I work with don’t have a solid grasp of the tools that support product management. The challenge is that there are plenty of great tools available, but given this week’s focus, I chose to stay within a two-tool boundary. Just know that there isn’t a single right way to do all of this; experimenting and finding what works best for you is ideal. If you need ideas or want to explore the tools we’re discussing today, these posts can help you get started.

Today, we establish the analytical foundation for the rest of the week. Below, you will find what each metric means, how to instrument it, and the exact steps to access it in both Mixpanel and Amplitude.

You can bookmark this guide and share it with every new product team you work with. 

Adoption Depth. 

This metric shows how many unique users encounter a feature after it ships. Think of it as the “front door” indicator: if adoption depth is low, you’re fighting a discoverability battle well before usability even comes into play. Healthy core features typically reach 60–80% of daily active users; anything below 40% indicates that placement, onboarding, or marketing needs improvement. Because adoption depth is at the top of the funnel, increasing it often has multiplying downstream effects on frequency, task success, and stickiness. 

What it answers: Who finds the feature? 

Instrument it: Fire the Feature Viewed event once the UI element is fully rendered. Pass feature_name, platform, and plan_tier as properties so you can segment later.
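As a sketch, the event payload could look like the dictionary below. The property names follow the guidance above; the values (and the feature name "export") are made-up examples, not values from any real product.

```python
# Hypothetical Feature Viewed payload; values are illustrative only.
feature_viewed = {
    "event": "Feature Viewed",
    "properties": {
        "feature_name": "export",  # assumed example feature
        "platform": "web",
        "plan_tier": "pro",
    },
}
```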

Mixpanel setup
  1. Go to Reports → Insights.
  2. Select the Feature Viewed event.
  3. Change the aggregation to Uniques to count distinct users.
Amplitude setup
  1. Create a Feature Adoption chart from the template gallery.
  2. Pick Feature Viewed and set the metric to Unique Users.
Readout tips
  • <40% of daily active users (DAU) means discoverability, not functionality, is the bottleneck.
  • Track adoption depth per cohort (new vs returning) to assess the effectiveness of onboarding.
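To make the arithmetic concrete, here is a minimal sketch of the same calculation over raw events. The event list and DAU count are made-up examples; in practice both tools compute this for you.

```python
# Adoption depth = unique feature viewers / daily active users.
events = [
    {"user": "u1", "event": "Feature Viewed", "feature_name": "export"},
    {"user": "u1", "event": "Feature Viewed", "feature_name": "export"},  # repeat view, one unique
    {"user": "u2", "event": "Feature Viewed", "feature_name": "export"},
    {"user": "u3", "event": "Login"},  # active, but never found the feature
]

def adoption_depth(events, feature_name, dau):
    """Share of DAU who viewed the feature at least once."""
    adopters = {
        e["user"] for e in events
        if e["event"] == "Feature Viewed" and e.get("feature_name") == feature_name
    }
    return len(adopters) / dau

print(adoption_depth(events, "export", dau=4))  # 2 adopters / 4 DAU = 0.5
```

At 50%, this toy feature clears the 40% discoverability floor but still falls short of the 60–80% range expected of a healthy core feature.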

Frequency.

Frequency measures how often those adopters return to the feature within a set window, turning a one-time glance into a habit. While adoption depth answers who, frequency answers how much and how routinely. Rising frequency indicates the feature has become part of a user’s workflow; a plateau suggests unmet needs or workflow friction. Track frequency by cohort (e.g., “week 1 adopters vs. week 4”) to identify early drop-offs before they lead to churn.

What it answers: How often do adopters use it?
Instrument it: Re-use Feature Viewed plus a Feature Used event that fires on completion (e.g., filter applied, export clicked).

Mixpanel setup
  1. Stay in Insights.
  2. Select Total Events for Feature Used.
  3. Add a Uniques series alongside it, then use a formula (A/B, total divided by uniques) for average uses per user.
Amplitude setup
  1. Open an Event Segmentation chart.
  2. Choose Feature Used; set Measured As → Count per User.
Readout tips
  • Rising frequency indicates the feature is embedded into the daily workflow.
  • Compare average uses in the first 7 days vs days 8-30 to catch drop-offs early.
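The total/uniques ratio above can be sketched as follows, again over a made-up event list:

```python
# Average uses per adopter = total Feature Used events / unique users,
# i.e. the equivalent of Mixpanel's total/uniques or Amplitude's Count per User.
events = [
    {"user": "u1", "event": "Feature Used"},
    {"user": "u1", "event": "Feature Used"},
    {"user": "u1", "event": "Feature Used"},
    {"user": "u2", "event": "Feature Used"},
]

def avg_uses_per_user(events, event_name="Feature Used"):
    uses = [e for e in events if e["event"] == event_name]
    users = {e["user"] for e in uses}
    return len(uses) / len(users) if users else 0.0

print(avg_uses_per_user(events))  # 4 events / 2 users = 2.0
```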

Task Success.

Task success measures the completion rate of a user’s intended task: did they finish what they set out to do? It highlights friction points like confusing UI copy, missing validation, or performance issues. Any critical flow with a completion rate below 80% should prompt an investigation. Since the metric is binary, success or failure, it works well with qualitative research to understand the reasons behind the numbers.

What it answers: Do users finish the job they started?
Instrument it: Emit paired events such as Task Started and Task Completed, both carrying task_name.

Mixpanel setup
  1. Navigate to Reports → Funnels.
  2. Add Task Started → Task Completed.
  3. Set a 15-minute conversion window for short tasks.
Amplitude setup
  1. Build a Funnel Analysis or use Event Segmentation with a formula completed/started.
Readout tips
  • Anything under 80% success on a core flow screams usability friction.
  • Drill by platform or plan_tier to isolate segments that struggle. 
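Here is a minimal sketch of the funnel logic with the 15-minute conversion window applied. The events are made-up tuples of (user, event name, minutes since midnight); real funnels in either tool also handle ordering and re-entry rules that this sketch ignores.

```python
# Task success = completions within the window / starts.
events = [
    ("u1", "Task Started", 0),
    ("u1", "Task Completed", 5),   # 5 min later -> inside the window, converts
    ("u2", "Task Started", 10),
    ("u2", "Task Completed", 40),  # 30 min later -> outside the window
    ("u3", "Task Started", 20),    # never completes
]

def task_success_rate(events, window_minutes=15):
    open_starts, started, completed = {}, 0, 0
    for user, name, t in events:
        if name == "Task Started":
            open_starts[user] = t
            started += 1
        elif name == "Task Completed" and user in open_starts:
            if t - open_starts.pop(user) <= window_minutes:
                completed += 1
    return completed / started if started else 0.0

print(task_success_rate(events))  # 1 of 3 starts converts
```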

Time-in-Feature.

Time-in-feature measures how long users stay from entry to exit within a feature. A moderate duration usually indicates healthy engagement, but a long right tail (high 95th percentile) may uncover hidden complexity that frustrates power users. Conversely, very short times could suggest users bounce before gaining value. Monitoring median and P95 durations side-by-side helps you determine whether to simplify advanced workflows or provide more guidance early on.

What it answers: How long does value delivery take?
Instrument it: Let the analytics layer handle it. Mixpanel and Amplitude both auto-compute session duration; you only need Session Start and Session End.

Mixpanel setup
  1. In Insights, choose the virtual event Session End.
  2. Select Median Duration or the reserved $duration_s property for P50 and P95.
Amplitude setup
  1. In Event Segmentation, pick the Session End event.
  2. Use Show → Average of Property → Duration to surface time in seconds. Tip: use a custom formula for P95.
Readout tips
  • Long right-tail (high P95) often means advanced users fight the UI.
  • Slice by user segment to see where veterans vs novices diverge.
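The median-versus-P95 comparison can be sketched like this, using made-up durations in seconds. Note the nearest-rank percentile here is a simplification; analytics tools may interpolate instead.

```python
import math
import statistics

# Made-up time-in-feature durations, in seconds.
durations = [12, 15, 18, 20, 22, 25, 30, 45, 60, 300]

def p95(values):
    """Nearest-rank 95th percentile."""
    s = sorted(values)
    return s[max(1, math.ceil(0.95 * len(s))) - 1]

print(statistics.median(durations))  # 23.5
print(p95(durations))                # 300
```

The median looks healthy at roughly 24 seconds, but the P95 of 300 seconds is exactly the long right tail the readout tips warn about.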

Stickiness.

Stickiness measures the percentage of users who return and activate the feature on multiple distinct days, your early indicator of retention. In subscription or "freemium" models, increasing stickiness directly relates to lower churn and higher lifetime value. A 7-day stickiness below 25% for a key feature is a warning sign; consistent improvement in this area generally leads to significant revenue gains. Since it combines adoption, frequency, and task success into a single behavior pattern, stickiness becomes your guiding metric for long-term product engagement.

What it answers: Do users come back on a predictable cadence?
Instrument it: No new events needed; re-use Feature Used.

Mixpanel setup
  1. Open Reports → Retention.
  2. Pick Feature Used; set Returning cohort to N-day frequency.
Amplitude setup
  1. Go to Charts → Stickiness.
  2. Choose Feature Used and a 7-day window to see the percentage of users firing the event on multiple days.
Readout tips
  • <25% 7-day stickiness on a core feature is an early churn warning.
  • Pair stickiness with ARPU to show revenue upside.
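A minimal sketch of the 7-day stickiness calculation, assuming made-up events where "day" is the day index within the window:

```python
from collections import defaultdict

# Stickiness = share of active users firing Feature Used
# on two or more distinct days within the window.
events = [
    {"user": "u1", "event": "Feature Used", "day": 0},
    {"user": "u1", "event": "Feature Used", "day": 3},  # two distinct days
    {"user": "u2", "event": "Feature Used", "day": 1},  # one day only
]

def stickiness(events, window_days=7):
    days_by_user = defaultdict(set)
    for e in events:
        if e["event"] == "Feature Used" and 0 <= e["day"] < window_days:
            days_by_user[e["user"]].add(e["day"])
    users = len(days_by_user)
    multi_day = sum(1 for days in days_by_user.values() if len(days) >= 2)
    return multi_day / users if users else 0.0

print(stickiness(events))  # 1 of 2 users returns -> 0.5
```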

Pulling it All Together.

  1. Create a shared dashboard with all five charts; set the time range to last 28 days for stability.
  2. Annotate major releases to correlate code drops with metric spikes.
  3. Export a CSV of today’s baseline; you will compare Tuesday’s backlog hypotheses against these numbers. 

Capturing the numbers is just the first step; tomorrow, we turn them into action. In Tuesday’s post, “Trace Signals to Outcomes,” we’ll take each of the five metrics you just set up, link it with a user-problem hypothesis and a clear product goal, then walk through the whole process of metric → insight → backlog item using sample data from Mixpanel and Amplitude. 

Bring today’s dashboard and be ready to convert raw signals into stories your Scrum team can deliver.