BI Tools

Learn BI Tools for Product Analysts for free: roadmap, examples, subskills, and a skill exam.

Published: December 22, 2025 | Updated: December 22, 2025

BI Tools for Product Analysts: what this skill covers

BI tools turn raw data into decisions. As a Product Analyst, you will use BI to build consistent, interactive dashboards that track product health, funnels, retention, and experiments, and to enable self-serve analytics for stakeholders without sacrificing data quality.

What you will be able to do after this skill
  • Model clean, governed metrics that are reused across dashboards.
  • Build product metrics, funnel, and retention dashboards with filters and drilldowns.
  • Set alerts on key KPIs, reduce dashboard noise, and improve performance.
  • Design self-serve experiences and onboard stakeholders effectively.

Who this is for

  • Product Analysts shipping dashboards and supporting PMs, designers, and engineers.
  • Data-savvy PMs or analysts moving from ad-hoc SQL to governed BI.
  • Anyone responsible for product health reporting and insights.

Prerequisites

  • SQL basics: SELECT, JOIN, GROUP BY, WHERE, COUNT DISTINCT, window functions (intro level).
  • Understanding of core product metrics (activation, retention, engagement).
  • Comfort with date/time handling and filters.
  • Access to a BI tool (e.g., Looker, Tableau, Power BI, or similar). Vendor specifics vary; principles here are tool-agnostic.
Quick readiness checklist
  • You can write a query for daily active users by date.
  • You can explain the difference between events and users.
  • You know what a funnel step and a cohort are.
  • You can map business questions to metrics and segments.

Learning path

  1. Connect and profile data — Identify source tables, grains, keys, and freshness (a profiling sketch follows this list).
  2. Model core entities — Users, sessions, events, orders; define ID and date keys.
  3. Govern metrics — Centralize definitions (e.g., DAU, Activation) so everyone reuses the same logic.
  4. Build core dashboards — Product metrics, funnel, retention with clear visuals and notes.
  5. Add interactivity — Global filters, drilldowns, cross-highlighting, and segments.
  6. Alerts and QA — Threshold/anomaly alerts; test, document, and monitor performance.
  7. Enable self-serve — Landing pages, discoverable datasets, docs, and training sessions.
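
A minimal profiling sketch for step 1, assuming the analytics.events table used in the worked examples below; event_id is an illustrative column for the grain check and may be named differently in your source.

-- Profile a source table: volume, grain, date range, and freshness
SELECT
  COUNT(*) AS row_count,
  COUNT(DISTINCT user_id) AS distinct_users,
  MIN(event_time) AS earliest_event,
  MAX(event_time) AS latest_event,                            -- freshness: how recent is the latest load?
  COUNT(*) - COUNT(DISTINCT event_id) AS duplicate_event_ids  -- grain check: 0 if one row per event
FROM analytics.events;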

Worked examples

These examples show tool-agnostic logic with SQL and common BI expressions.

Example 1 — Daily Active Users (DAU) card with definition

Goal: build a KPI card for DAU with a 7-day trend and a definition note.

-- SQL (generic): daily active users by date
SELECT
  event_time::date AS d,
  COUNT(DISTINCT user_id) AS dau
FROM analytics.events
WHERE event_time >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY 1
ORDER BY 1;

-- Optional: 7D rolling active users (DAX-like measure)
-- Active Users 7D = CALCULATE(DISTINCTCOUNT(Events[user_id]), DATESINPERIOD(Calendar[Date], MAX(Calendar[Date]), -7, DAY))
  • Visualization: small multiples or line chart with yesterday vs 7-day average.
  • Definition note (add to chart description): “DAU = distinct users with any qualifying product event in a day.”
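
If you want the rolling view in SQL instead of a DAX-like measure, the sketch below computes a 7-day moving average of the daily DAU series. Note that this averages daily counts; it is not the same as a rolling 7-day distinct-user count.

-- 7-day moving average of daily DAU (averages daily counts, not a rolling distinct count)
WITH daily AS (
  SELECT event_time::date AS d, COUNT(DISTINCT user_id) AS dau
  FROM analytics.events
  GROUP BY 1
)
SELECT
  d,
  dau,
  AVG(dau) OVER (ORDER BY d ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS dau_7d_avg
FROM daily
ORDER BY d;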

Example 2 — Funnel dashboard with conversion and drop-offs

Goal: show step conversions, drop-offs, and segment filters (e.g., device, country).

-- Each user’s max completed step in a 7-day window after first touch
WITH first_touch AS (
  SELECT user_id, MIN(event_time) AS first_time
  FROM analytics.events
  WHERE event_name IN ('visit', 'signup', 'activate', 'purchase')
  GROUP BY 1
),
steps AS (
  SELECT e.user_id,
         CASE e.event_name
           WHEN 'visit' THEN 1
           WHEN 'signup' THEN 2
           WHEN 'activate' THEN 3
           WHEN 'purchase' THEN 4
         END AS step_num,
         e.event_time
  FROM analytics.events e
  WHERE e.event_name IN ('visit','signup','activate','purchase')
),
windowed AS (
  SELECT s.user_id, MAX(step_num) AS max_step
  FROM steps s
  JOIN first_touch f USING (user_id)
  WHERE s.event_time BETWEEN f.first_time AND f.first_time + INTERVAL '7 days'
  GROUP BY 1
)
SELECT
  s.step,
  COUNT(CASE WHEN w.max_step >= s.step THEN 1 END) AS reached,
  1.0 * COUNT(CASE WHEN w.max_step >= s.step + 1 THEN 1 END)
    / NULLIF(COUNT(CASE WHEN w.max_step >= s.step THEN 1 END), 0) AS step_conversion
FROM (VALUES (1), (2), (3), (4)) AS s(step)  -- one row per funnel step
CROSS JOIN windowed w
GROUP BY s.step
ORDER BY s.step;
  • Visualization: bar for “reached” and label for conversion %. Add slicers for segments.
  • Tip: filter cohorts to the same acquisition period to avoid bias.
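
One way to apply the cohort-alignment tip is to restrict the first_touch CTE to a single acquisition window so every user had the same 7 days to convert; the date range below is illustrative.

-- Variant of first_touch limited to one acquisition window (dates are illustrative)
first_touch AS (
  SELECT user_id, MIN(event_time) AS first_time
  FROM analytics.events
  WHERE event_name IN ('visit', 'signup', 'activate', 'purchase')
  GROUP BY 1
  HAVING MIN(event_time) >= DATE '2025-01-01'
     AND MIN(event_time) <  DATE '2025-02-01'
)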

Example 3 — Retention cohorts (user-based D1/D7)

Goal: cohort users by first activation date, measure return at D1, D7.

WITH first_activation AS (
  SELECT user_id, MIN(event_date) AS cohort_date
  FROM analytics.events
  WHERE event_name = 'activate'
  GROUP BY 1
),
user_activity AS (
  SELECT DISTINCT user_id, event_date
  FROM analytics.events
),
retention AS (
  SELECT fa.cohort_date,
         ua.event_date - fa.cohort_date AS day_num,
         ua.user_id
  FROM user_activity ua
  JOIN first_activation fa USING (user_id)
)
SELECT
  cohort_date,
  COUNT(DISTINCT CASE WHEN day_num = 0 THEN user_id END) AS cohort_size,
  ROUND(100.0 * COUNT(DISTINCT CASE WHEN day_num = 1 THEN user_id END)
        / NULLIF(COUNT(DISTINCT CASE WHEN day_num = 0 THEN user_id END), 0), 1) AS d1_retention_pct,
  ROUND(100.0 * COUNT(DISTINCT CASE WHEN day_num = 7 THEN user_id END)
        / NULLIF(COUNT(DISTINCT CASE WHEN day_num = 0 THEN user_id END), 0), 1) AS d7_retention_pct
FROM retention
GROUP BY 1
ORDER BY 1;
  • Visualization: heatmap matrix (cohort by week vs days since), plus D1 and D7 headline stats.
  • Note: choose user-based or event-based retention and document which one you use.
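
For an event-based variant, a small change to the user_activity CTE restricts which events count as a return; the qualifying event names below reuse those from Example 4 and are assumptions about your tracking plan.

-- Event-based variant: only qualifying engagement events count as a return
user_activity AS (
  SELECT DISTINCT user_id, event_date
  FROM analytics.events
  WHERE event_name IN ('open', 'view', 'click')
)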

Example 4 — Metric consistency via a reusable definition

Goal: define metrics once and reuse everywhere.

-- Example: Active Users metric in a semantic layer (pseudo-Looker style)
measure: active_users {
  type: count_distinct
  sql: ${user_id} ;;
  filters: [event_type: "qualifying"]
}

-- Or expose a standardized SQL view
CREATE OR REPLACE VIEW analytics.metrics_active_users AS
SELECT event_date AS d, COUNT(DISTINCT user_id) AS dau
FROM analytics.events
WHERE event_type IN ('open', 'view', 'click')
GROUP BY 1;
  • All dashboards pull from the semantic measure or standardized view to prevent drift.
  • Document metric name, logic, acceptable filters, and edge cases.
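
A usage sketch: dashboard tiles select from the shared view rather than re-deriving DAU, so a definition change in one place propagates everywhere.

-- Dashboard tile query: reuse the standardized view instead of redefining DAU
SELECT d, dau
FROM analytics.metrics_active_users
WHERE d >= CURRENT_DATE - INTERVAL '28 days'
ORDER BY d;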

Example 5 — Alerts on key metrics without alert fatigue

Goal: alert when DAU or conversion meaningfully drops, while avoiding noise.

# Pseudo alert configuration
alert: dau_drop
  metric: dau
  condition: pct_change_7d < -20%
  min_samples: 1000
  cooldown: 24h
  channels: ["slack:#product-analytics", "email:pm@company.com"]
  include_snapshot: true
  owner: product-analytics
  • Use percentage change vs 7-day average, minimum sample size, and cooldown.
  • Include a link to a triage view and an on-call owner in the alert description.
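
A sketch of the underlying check, assuming the alerting tool can evaluate a scheduled SQL query: compare the latest complete day against the trailing 7-day average and apply the sample-size floor from the config above.

-- Latest complete day vs trailing 7-day average, with a minimum sample-size floor
WITH daily AS (
  SELECT event_time::date AS d, COUNT(DISTINCT user_id) AS dau
  FROM analytics.events
  GROUP BY 1
),
scored AS (
  SELECT
    d,
    dau,
    AVG(dau) OVER (ORDER BY d ROWS BETWEEN 7 PRECEDING AND 1 PRECEDING) AS dau_prev_7d_avg
  FROM daily
)
SELECT d, dau, dau_prev_7d_avg,
       (dau - dau_prev_7d_avg) / NULLIF(dau_prev_7d_avg, 0) AS pct_change_7d
FROM scored
WHERE d = CURRENT_DATE - 1                                          -- latest complete day
  AND dau >= 1000                                                   -- min_samples guard
  AND (dau - dau_prev_7d_avg) / NULLIF(dau_prev_7d_avg, 0) < -0.20; -- fire on >20% drop vs 7-day average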

Drills and exercises

  • Write a query to compute WAU and a 4-week rolling average (a starter sketch follows this list). Validate against a small sampled table.
  • Build a 4-step acquisition funnel and add a device filter. Compare desktop vs mobile conversion.
  • Create a retention heatmap by signup week. Add a toggle for user-based vs event-based.
  • Add a “Product Area” drilldown from the main KPI to a detail page. Verify filter propagation.
  • Set an alert for conversion rate drop > 15% WoW with a 12-hour cooldown. Simulate a test alert.
  • Document the definition of “Activation” and reuse it across two dashboards. Spot-check for consistency.
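
For the first drill, a starter sketch; DATE_TRUNC week boundaries are illustrative, so adjust to your warehouse's week convention.

-- WAU per week plus a 4-week rolling average
WITH weekly AS (
  SELECT DATE_TRUNC('week', event_time) AS week_start,
         COUNT(DISTINCT user_id) AS wau
  FROM analytics.events
  GROUP BY 1
)
SELECT
  week_start,
  wau,
  AVG(wau) OVER (ORDER BY week_start ROWS BETWEEN 3 PRECEDING AND CURRENT ROW) AS wau_4wk_avg
FROM weekly
ORDER BY week_start;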

Common mistakes and debugging tips

  • Mixing grains: combining event-level and user-level metrics without proper aggregation.
  • Double counting users across multiple events in the same period.
  • Inconsistent metric definitions across dashboards (DAU logic differs).
  • Unclear date filters: comparing different time windows (e.g., last 7 vs calendar week).
  • No segmentation: reporting aggregate metrics hides product issues.
  • Slow dashboards: unindexed filters, heavy joins, no incremental refresh or extracts.
  • Alert fatigue: thresholds too tight; no cooldown or sample-size check.
  • Timezone confusion: users appear to churn/return due to day boundary mismatches.
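
For the timezone pitfall above, one sketch of a fix in Postgres-style SQL: convert timestamps to a single reporting timezone before truncating to a date, so the same instant never lands on two different days across charts. The UTC-storage assumption and the reporting zone are illustrative.

-- Normalize to one reporting timezone before deriving the event date
SELECT
  (event_time AT TIME ZONE 'UTC' AT TIME ZONE 'America/New_York')::date AS event_date_local,
  COUNT(DISTINCT user_id) AS dau
FROM analytics.events
GROUP BY 1
ORDER BY 1;
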
Debugging playbook
  • Row-level check: pull 20 sample rows and manually verify calculations.
  • Recompute with an alternative method (e.g., event-based vs user-based) to triangulate; see the comparison sketch after this list.
  • Trace filters: confirm each filter’s logic and default values in the dashboard.
  • Check freshness: display last refresh timestamp on every dashboard.
  • Performance: reduce fields, add extracts/aggregations, and avoid SELECT *.
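
For the triangulation step, a sketch that recomputes daily actives two ways (any event vs qualifying engagement events) so grain or tracking issues surface as large gaps; the qualifying event names are assumptions.

-- Recompute daily actives two ways and compare the results
WITH any_event AS (
  SELECT event_time::date AS d, COUNT(DISTINCT user_id) AS dau_any_event
  FROM analytics.events
  GROUP BY 1
),
qualifying AS (
  SELECT event_time::date AS d, COUNT(DISTINCT user_id) AS dau_qualifying
  FROM analytics.events
  WHERE event_name IN ('open', 'view', 'click')
  GROUP BY 1
)
SELECT a.d, a.dau_any_event, q.dau_qualifying,
       a.dau_any_event - q.dau_qualifying AS diff
FROM any_event a
JOIN qualifying q USING (d)
ORDER BY a.d DESC
LIMIT 14;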

Practical projects

  • Product Health Dashboard: DAU/WAU/MAU, engagement rate, and uptime with notes and alerting.
  • Activation Funnel: 4–5 steps with segment filters (country, device) and drop-off analysis.
  • Retention Explorer: cohort heatmap with cohort size and D1/D7 metrics; drilldown to user segments.

Mini project: Product Health BI suite

  1. Define metrics: DAU, Activation, Conversion, D1 retention. Write 1–2 sentence definitions.
  2. Standardize logic: create a shared view or semantic metric for each.
  3. Build dashboards:
    • Metrics Dashboard: headline KPIs + 7/28-day trends.
    • Funnel Dashboard: step conversion with segment filters.
    • Retention Dashboard: weekly cohorts with D1/D7.
  4. Add interactivity: global date filter; drilldown from KPI to segment breakdown.
  5. Alerts: set DAU drop and funnel conversion drop alerts with cooldowns.
  6. Docs and enablement: add a “How to read this” section; run a 30-minute stakeholder walkthrough.
  7. QA: validate numbers on a sampled set; test performance; confirm filter propagation.
Acceptance criteria checklist
  • All dashboards show last refresh time and metric definitions.
  • Filters apply consistently across tiles and drilldowns.
  • Metrics match SQL spot checks within 1%.
  • Alerts tested and routed to the right channel.

Subskills

  • Product Metrics Dashboards — Build clear KPI views for DAU/WAU/MAU, activation, and engagement with definitions.
  • Funnel Dashboards — Visualize step conversions, drop-offs, and segments; support cohort-aligned comparisons.
  • Retention Dashboards — Cohort matrices and D1/D7/28 retention; support user/event-based methods.
  • Self Serve Analytics Design — Create discoverable datasets, guided navigation, and safe defaults.
  • Metric Consistency Across Dashboards — Reuse centralized metric definitions to prevent drift.
  • Filters And Drilldowns — Global filters, hierarchical drill paths, and cross-highlighting.
  • Alerts On Key Metrics — Threshold and anomaly alerts with cooldowns and routing.
  • Stakeholder Enablement — Docs, training, and office hours that drive BI adoption.

Next steps

  • Deepen your semantic layer or metric store so teams reuse definitions automatically.
  • Expand testing: add data quality checks for freshness, duplicates, and outliers.
  • Pair BI with experimentation analysis to explain metric movements, not just display them.
Glossary
  • Grain: the level of detail of a table or chart (e.g., daily, user-level).
  • Cohort: a group of users sharing a start event/time (e.g., signup week).
  • Cross-highlighting: selecting a chart element updates related charts.

BI Tools — Skill Exam

This exam checks your understanding of BI Tools for Product Analysts: metrics, dashboards, filters, drilldowns, retention, funnels, alerts, and enablement. You can take it for free. Anyone can attempt the exam; only logged-in users will have their progress saved and can resume later. Scoring: pass at 70% or higher. You can retake the exam at any time.

12 questions, 70% to pass
