User Versus Session Funnels

Learn User Versus Session Funnels for free with explanations, exercises, and a quick test (for Product Analysts).

Published: December 22, 2025 | Updated: December 22, 2025

Why this matters

As a Product Analyst, you routinely diagnose drop-offs and conversion changes. Choosing the wrong unit of analysis (user vs session) can reverse conclusions, mislead prioritization, and waste engineering time.

  • Weekly growth review: diagnose a signup dip. Is it fewer users converting or more low-intent sessions?
  • Experiment readout: an onboarding change affects first-session behavior. Should the funnel be session-based?
  • Retention deep-dive: are returning users progressing over multiple sessions or stuck per session?

Concept explained simply

User-based funnels count unique people. Each person can appear at most once per step. Use when your question is about people progressing over any time window.

Session-based funnels count attempts. A person can have multiple sessions and multiple attempts. Use when your question is about within-session behavior or experiences that reset each visit.
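
To see the difference concretely, here is a minimal sketch assuming a pandas DataFrame of raw events with user_id, session_id, and event columns (illustrative names; step ordering is ignored for brevity). The same events produce different step counts depending on whether you deduplicate by user or by session.

    import pandas as pd

    # One person (u1) abandons in session s1 and completes in session s2;
    # another person (u2) only visits. Column names are assumptions.
    events = pd.DataFrame({
        "user_id":    ["u1", "u1", "u1", "u1", "u1", "u2"],
        "session_id": ["s1", "s1", "s2", "s2", "s2", "s3"],
        "event":      ["visit", "start", "visit", "start", "complete", "visit"],
    })
    steps = ["visit", "start", "complete"]

    # User funnel: each person counted at most once per step.
    user_counts = {s: events.loc[events["event"] == s, "user_id"].nunique() for s in steps}
    # Session funnel: each visit is a separate attempt.
    session_counts = {s: events.loc[events["event"] == s, "session_id"].nunique() for s in steps}

    print(user_counts)     # {'visit': 2, 'start': 1, 'complete': 1}
    print(session_counts)  # {'visit': 3, 'start': 2, 'complete': 1}

Here u1 abandons signup in the first session and completes it in the second, so the user funnel shows 1 of 1 starters completing while the session funnel shows 1 of 2 starting sessions completing.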

Mental model

Think of a supermarket:

  • User funnel = how many distinct shoppers eventually buy after entering the store this week (even if they come twice).
  • Session funnel = how many store visits result in a purchase (each visit is an attempt, regardless of shopper identity).

Key trade-off: User funnels show reach and long-term progression; session funnels show friction within a single attempt.

Definitions & when to use each

  • User funnel
    • Denominator: unique users who hit Step 1 (in the analysis window).
    • Each user counted once per step, even across multiple sessions.
    • Best for: onboarding completion across days, long journeys (discovery → purchase), activation/retention.
  • Session funnel
    • Denominator: total sessions that hit Step 1.
    • Each new session is a new attempt.
    • Best for: first-session onboarding, checkout flow within a visit, support flow per incident, kiosk or POS usage.

Quick chooser: user or session?

  • Is behavior expected to finish in one visit? Choose session.
  • Can users return to continue later? Choose user.
  • Experiment changes first-touch or UI friction? Start with session.
  • Business question is "how many people succeed overall?" Choose user.

Worked examples

Example 1: Signup funnel

Window: last 7 days

  • Unique users who visited landing: 800
  • Unique users who started signup: 400
  • Unique users who completed signup: 320
  • Total sessions with landing: 1,200
  • Sessions with start signup: 500
  • Sessions with complete signup: 360

User-based conversion

  • Start rate: 400 / 800 = 50%
  • Complete (overall): 320 / 800 = 40%
  • Step-to-step: 320 / 400 = 80%

Session-based conversion

  • Start rate: 500 / 1,200 ≈ 41.7%
  • Complete (overall): 360 / 1,200 = 30%
  • Step-to-step: 360 / 500 = 72%

Interpretation: Session conversion trails user conversion. Many people complete eventually, just not in their first attempt, so focus on first-visit friction.
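
If you want to sanity-check the arithmetic programmatically, a small sketch like the following reproduces both sets of rates from the counts above (the variable and function names are purely illustrative).

    # Reproducing Example 1 from the counts above.
    users    = {"landing": 800,  "start": 400, "complete": 320}
    sessions = {"landing": 1200, "start": 500, "complete": 360}

    def rates(c):
        return {
            "start rate":   c["start"] / c["landing"],
            "overall":      c["complete"] / c["landing"],
            "step-to-step": c["complete"] / c["start"],
        }

    print(rates(users))     # {'start rate': 0.5, 'overall': 0.4, 'step-to-step': 0.8}
    print(rates(sessions))  # {'start rate': 0.4166..., 'overall': 0.3, 'step-to-step': 0.72}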

Example 2: Product → Cart → Purchase

Window: last 14 days

  • Users who viewed product: 600 (users)
  • Users who added to cart: 300
  • Users who purchased: 240
  • Sessions with product view: 900 (sessions)
  • Sessions with add to cart: 320
  • Sessions with purchase: 260

User-based: Add rate 300/600 = 50%; Purchase overall 240/600 = 40%; Step-to-step 240/300 = 80%

Session-based: Add rate 320/900 ≈ 35.6%; Purchase overall 260/900 ≈ 28.9%; Step-to-step 260/320 ≈ 81.3%

Interpretation: Strong eventual conversion per person, but many low-intent views inflate the session denominator. Consider traffic quality and pre-qualification.

Example 3: In-app support flow (incident-based)

Events: Crash → Open Help → Submit Report

  • Users with any crash: 1,000
  • Users who submitted a report: 500
  • Sessions with crash: 1,600
  • Sessions with report: 720

User-based: 500/1,000 = 50% of affected users eventually report.

Session-based: 720/1,600 = 45% of crash sessions get a report in that session.

Interpretation: If the goal is to improve response per incident, session funnel is correct. If you want the proportion of users who ever report, use user funnel.

How to build funnels correctly

  1. Choose the entity: user or session. Write it at the top of your query or notebook to avoid confusion.
  2. Define the window: Within-session funnels often use minutes; user funnels can use days/weeks between steps. Decide max gap between steps.
  3. Counting rules
    • Repeated steps: do you take first, last, or best attempt?
    • Out-of-order events: do you require strict step order, or allow other events between steps?
    • Cross-device identity: ensure user deduplication logic is sound.
  4. Attribution: If multiple sessions touch the funnel, decide where to attribute completion (first session, last session, or any).
  5. Quality filters: Exclude bots, test users, and clearly incomplete sessions (e.g., < 3s duration).

Implementation tips

  • For user funnels, aggregate to distinct user_id per step before joining steps.
  • For session funnels, aggregate to distinct session_id per step; avoid mixing counts (user at step 1 vs session at step 2).
  • Keep consistent denominators across steps (see the sketch after these tips).
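
Below is a minimal pandas sketch that applies these rules: deduplicate per entity per step, require steps in order, and cap the gap between steps. It is an illustration rather than a production query; the column names (user_id, session_id, event, ts), the first-attempt counting rule, and the window values are all assumptions to adapt to your schema.

    import pandas as pd

    def funnel_counts(events, steps, entity, max_gap=None):
        """Count distinct `entity` values (user_id or session_id) reaching each step, in order."""
        # First timestamp per entity per step (deduplicates repeated events).
        firsts = (events[events["event"].isin(steps)]
                  .groupby([entity, "event"])["ts"].min()
                  .unstack("event"))

        counts, prev_ts = {}, None
        for step in steps:
            # Steps with no events at all become all-NaT so the loop still works.
            ts = firsts.get(step, pd.Series(pd.NaT, index=firsts.index))
            if prev_ts is not None:
                ok = ts >= prev_ts                   # require steps in order
                if max_gap is not None:
                    ok &= (ts - prev_ts) <= max_gap  # cap the gap between steps
                ts = ts.where(ok)                    # drop entities that fail the rule
            counts[step] = int(ts.notna().sum())
            prev_ts = ts
        return pd.Series(counts)

    # Using the same steps for both entities keeps the two funnels comparable:
    # funnel_counts(events, ["visit", "start_signup", "complete_signup"], "user_id",    max_gap=pd.Timedelta(days=7))
    # funnel_counts(events, ["visit", "start_signup", "complete_signup"], "session_id", max_gap=pd.Timedelta(minutes=30))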

Common mistakes (and self-check)

  • Mismatched denominators: Step 1 counts users while Step 2 counts sessions. Self-check: print the count type next to each step (a small automated check is sketched after this list).
  • Infinite windows: users who complete months later make the funnel look better than it is. Self-check: cap the max days between steps and report both capped and uncapped rates.
  • Double counting repeats (session funnel): One session fires Add to Cart twice. Self-check: deduplicate by session_id and step.
  • Identity drift: Cross-device users counted as multiple people. Self-check: compare results using stable identifiers (e.g., account_id).
  • Attributing to wrong attempt: Purchase attributed to first session instead of last. Self-check: list sample paths and verify attribution rule.
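
One cheap automated check, sketched below under the assumption that funnel counts arrive as an ordered pandas Series: a correctly built funnel never grows from one step to the next, so a rise almost always signals mixed entities or missing deduplication.

    import pandas as pd

    # Sketch of a self-check: step counts must be non-increasing.
    def check_monotone(counts, label):
        steps = list(counts.index)
        for prev, cur in zip(steps, steps[1:]):
            assert counts[cur] <= counts[prev], (
                f"{label}: '{cur}' ({counts[cur]}) exceeds '{prev}' ({counts[prev]}); "
                "check the entity and dedup rules for these steps"
            )

    check_monotone(pd.Series({"visit": 1200, "start": 500, "complete": 360}), "session funnel")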

Exercises

Do these before the quick test. A short dataset is included for manual calculation.

Exercise 1: Compute both funnels from raw events

Dataset (user_id, session_id, event, ts):

u1, s1, visit, 00:01
u1, s1, start_signup, 00:02
u1, s1, close, 00:03
u1, s2, visit, 10:00
u1, s2, start_signup, 10:01
u1, s2, complete_signup, 10:03
u2, s3, visit, 02:00
u2, s3, start_signup, 02:02
u3, s4, visit, 05:00
u3, s4, start_signup, 05:01
u3, s4, close, 05:03
u3, s5, visit, 07:00
u4, s6, visit, 08:00
u4, s6, start_signup, 08:02
u4, s6, complete_signup, 08:05

  • Steps: Visit → Start Signup → Complete Signup
  • Calculate user-based and session-based overall conversion and step-to-step rates.

Exercise 2: Choose the right unit + rules

Scenario: A checkout redesign aims to reduce abandonment within a visit. You also care that people eventually buy this week. Define:

  • Which entity for the primary analysis and why.
  • Which entity for a secondary analysis and why.
  • Time window between steps for each.
  • How to handle repeated Add to Cart events in one session.
  • Checklist
    • Entity is declared for each analysis.
    • Denominators are consistent across steps.
    • Time windows are realistic (minutes for session; days for user).
    • Deduplication rule per step is stated.

Mini challenge

Marketing launched a campaign that increases top-of-funnel traffic. Conversion rate dropped on a session-based funnel, but user-based conversion stayed flat. In one paragraph, explain what likely happened and one action you would take next.

One possible angle

New visitors increased low-intent sessions, reducing per-session conversion. However, the proportion of unique people who eventually convert is stable. Next: segment by traffic source and compare session depth; consider pre-qualification or landing page improvements.
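
To make the "segment by traffic source" step concrete, here is a tiny sketch assuming a session-level table with source, converted, and events_per_session columns (all hypothetical names); low-intent sources typically show both lower conversion and lower depth.

    import pandas as pd

    # Hypothetical session-level table: one row per session.
    sessions = pd.DataFrame({
        "source":             ["ads", "ads", "ads", "organic", "organic"],
        "converted":          [False, False, True, True, False],
        "events_per_session": [2, 1, 6, 7, 3],
    })

    summary = (sessions.groupby("source")
               .agg(sessions=("converted", "size"),
                    conversion=("converted", "mean"),
                    depth=("events_per_session", "mean")))
    print(summary.sort_values("conversion"))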

Practical projects

  • Build a dual-entity signup dashboard: toggle between user and session funnels with the same steps; include overall and step-to-step rates.
  • Run an A/B test readout template: show first-session conversion (primary) and user conversion within 7 days (guardrail).
  • Create a data quality validator: checks entity consistency, dedup rules, and window caps for any funnel query.

Who this is for

  • Product Analysts and Data Analysts diagnosing conversion and onboarding.
  • PMs who need to interpret funnel metrics correctly.
  • Growth marketers evaluating campaign impact on conversion.

Prerequisites

  • Basic event data concepts: user_id, session_id, timestamp.
  • Comfort with percentages, ratios, and time windows.
  • Ability to group and deduplicate records.

Learning path

  1. Review event tracking and identity basics.
  2. Learn funnel step definitions and ordering.
  3. Master user vs session funnels (this lesson).
  4. Practice with mixed funnels (hybrid windows, attribution variations).
  5. Apply to experiments and retention studies.

Next steps

  • Complete the exercises below and check your answers.
  • Take the quick test to validate your understanding. The test is available to everyone; only logged-in users will have their progress saved.
  • Apply this to a real product flow you know; produce both user- and session-based results and compare insights.

Progress note

Your quick test progress is saved only if you are logged in. You can still take the test for free.

Practice Exercises

2 exercises to complete

Instructions

Using the dataset in the lesson, compute:

  • User-based: overall conversion from Visit to Complete; step-to-step rates.
  • Session-based: overall conversion; step-to-step rates.

Assumptions:

  • Deduplicate by entity (user or session) per step.
  • Order is Visit → Start → Complete.
  • A user/session counts at a step if the event is present at any time after Visit (no cap for this exercise).

Expected Output

Numerical rates for user-based and session-based funnels (overall and step-to-step), and the counts used for denominators and numerators.

User Versus Session Funnels — Quick Test

Test your knowledge with 8 questions. Pass with 70% or higher.
