Why this matters
As a BI Developer, you don’t just build dashboards—you make sure people actually use them to make decisions. Monitoring usage and adoption tells you which assets create value, which need improvements, and where to focus enablement. Typical tasks you’ll do:
- Track active users and popular content by team to prioritize support and training.
- Detect abandoned dashboards to clean up clutter and improve trust.
- Measure the impact of changes (new dashboards, refresh schedules, tutorials).
- Report adoption KPIs to stakeholders to justify BI investments.
Concept explained simply
Usage monitoring = collecting and analyzing how people interact with BI content (views, filters, exports). Adoption = the sustained, habitual use of BI content to make decisions. You don’t need complex tooling to start—use your BI platform’s audit logs and a few clear metrics.
Mental model
Think of adoption as a loop:
- Discover: Users find a dashboard.
- Try: They view, filter, or export.
- Adopt: They return regularly.
- Advocate: They share and recommend.
Your job is to instrument the loop, monitor the numbers, and make small changes that increase return visits and decision impact.
Key metrics to track
- Active users (DAU/WAU/MAU): Distinct viewers by day/week/month. Stickiness = DAU/MAU (closer to 1.0 means more habitual).
- Active content: Dashboards with ≥ N unique viewers in the last 28 days.
- Time to first insight (TTFI): Time from a user’s first login to viewing at least one relevant dashboard (proxy via first view event after account creation; see the sketch after this list).
- Retention: Percent of new users who return in weeks 2, 4, 8.
- Engagement depth: Median views per active user, filter interactions per session, exports (for operational usage).
- Content health: Last refresh time, error rates, slow loads.
- Coverage: % of target audience with at least one session in last 28 days.
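TTFI is the least obvious of these to compute. A minimal sketch in Postgres-style SQL, assuming the events and users tables described under "Data sources and instrumentation" below, and using the first view event after account creation as the proxy (relevance filtering is left out for brevity):
-- Sketch: time to first insight (TTFI) per user.
-- Proxy: interval from account creation to the user's first view event.
SELECT u.user_id,
  MIN(e.ts) - u.created_ts AS ttfi
FROM users u
JOIN events e
  ON e.user_id = u.user_id
  AND e.action = 'view'
  AND e.ts >= u.created_ts
GROUP BY u.user_id, u.created_ts;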
Tip: choosing thresholds
Start simple. Example thresholds:
- Active content: ≥ 5 unique viewers / 28 days.
- Healthy dashboard: median load time < 5s, refresh failures < 1% (see the sketch after this list).
- Adoption improvement target: +10% WAU over 4 weeks.
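One way to check the "healthy dashboard" thresholds directly, assuming view events carry duration_ms and refresh runs carry a success flag (both assumptions about your logs; the events table is described in the next section):
-- Sketch: flag dashboards against the example health thresholds.
WITH stats AS (
  SELECT content_id,
    PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY duration_ms)
      FILTER (WHERE action = 'view') AS median_load_ms,
    (COUNT(*) FILTER (WHERE action = 'refresh' AND NOT success))::float
      / NULLIF(COUNT(*) FILTER (WHERE action = 'refresh'), 0) AS refresh_fail_rate
  FROM events
  WHERE ts > CURRENT_DATE - INTERVAL '28 day'
  GROUP BY content_id
)
SELECT content_id, median_load_ms, refresh_fail_rate,
  (median_load_ms < 5000 AND COALESCE(refresh_fail_rate, 0) < 0.01) AS healthy
FROM stats;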
Data sources and instrumentation
- Platform audit logs: events like view, filter, export, schedule run, error.
- Content catalog: owners, last refresh, certified status.
- User directory: team/department, role, join date.
Combine these into a simple usage mart (tables you can query):
- events (ts, user_id, content_id, action, duration_ms, success)
- content (content_id, name, owner, workspace, certified, last_refresh_ts)
- users (user_id, department, role, created_ts)
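If it helps to have something concrete to start from, here is a minimal DDL sketch of the three tables; the column types are assumptions, so adapt them to your warehouse:
-- Usage mart sketch (Postgres-style types; adjust as needed)
CREATE TABLE events (
  ts          TIMESTAMP NOT NULL,  -- when the event happened
  user_id     TEXT NOT NULL,
  content_id  TEXT NOT NULL,
  action      TEXT NOT NULL,       -- e.g., 'view', 'filter', 'export', 'refresh'
  duration_ms INTEGER,             -- load or run time, if logged
  success     BOOLEAN
);
CREATE TABLE content (
  content_id      TEXT PRIMARY KEY,
  name            TEXT,
  owner           TEXT,
  workspace       TEXT,
  certified       BOOLEAN,
  last_refresh_ts TIMESTAMP
);
CREATE TABLE users (
  user_id    TEXT PRIMARY KEY,
  department TEXT,
  role       TEXT,
  created_ts TIMESTAMP
);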
Worked examples
Example 1: Weekly active users and stickiness
- Aggregate distinct users by day and by month.
- Compute DAU/MAU stickiness for the latest month.
-- Postgres-style SQL (adapt to your warehouse)
WITH daily AS (
SELECT DATE_TRUNC('day', ts) d, COUNT(DISTINCT user_id) dau
FROM events WHERE action = 'view'
GROUP BY 1
), monthly AS (
SELECT DATE_TRUNC('month', ts) m, COUNT(DISTINCT user_id) mau
FROM events WHERE action = 'view'
GROUP BY 1
)
SELECT m.m, AVG(d.dau) AS avg_dau, m.mau, AVG(d.dau)/NULLIF(m.mau,0) AS stickiness
FROM monthly m
JOIN daily d ON DATE_TRUNC('month', d.d) = m.m
GROUP BY 1, 3;
Interpretation: if stickiness is 0.25, roughly 25% of monthly users are daily users. OK for executive dashboards; ops dashboards might aim higher.
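WAU follows the same pattern as the daily and monthly counts; a sketch, again assuming the events table above:
-- Weekly active users: distinct viewers per week
SELECT DATE_TRUNC('week', ts) AS week_start,
  COUNT(DISTINCT user_id) AS wau
FROM events
WHERE action = 'view'
GROUP BY 1
ORDER BY 1;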
Example 2: Identify top dashboards to support
- Count unique viewers per content in the last 28 days.
- Join content metadata to find owners and certification status.
- Flag candidates for enablement (training, docs).
SELECT c.content_id, c.name, c.owner, c.certified,
COUNT(DISTINCT e.user_id) AS viewers_28d,
COUNT(*) AS views_28d
FROM content c
LEFT JOIN events e ON e.content_id = c.content_id
AND e.action = 'view' AND e.ts > CURRENT_DATE - INTERVAL '28 day'
GROUP BY 1,2,3,4
ORDER BY viewers_28d DESC
LIMIT 10;
Use the result to schedule office hours or add quick-start notes to the top 5 assets.
Example 3: Prune or fix unused dashboards
- Classify dashboards with < 3 viewers in 28 days as "Archive candidates".
- If certified but low usage, move to "Fix": check naming, filters, refresh.
- Notify owners with a 14-day window to keep or archive.
SELECT c.content_id, c.name,
CASE
WHEN COUNT(DISTINCT e.user_id) FILTER (WHERE e.ts > CURRENT_DATE - INTERVAL '28 day') < 3
THEN CASE WHEN c.certified THEN 'Fix' ELSE 'Archive' END
ELSE 'Keep'
END AS action
FROM content c
LEFT JOIN events e ON e.content_id = c.content_id AND e.action='view'
GROUP BY 1, 2, c.certified;
Outcome: a tidy workspace improves discoverability and trust.
How to set baselines and thresholds
- Measure current state for 4–8 weeks (DAU/WAU/MAU, top content, retention).
- Pick one north-star goal (e.g., WAU +15%).
- Choose 2–3 leading indicators (TTFI, engagement depth, % certified views).
- Set alert thresholds (e.g., refresh fail > 1% for 3 days; see the query sketch after this list).
- Review monthly; adjust targets by team context.
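For the refresh-failure alert, a rough sketch that flags content exceeding the example threshold on each of the last three days (it assumes refreshes are logged as events with action = 'refresh' and a success flag):
-- Sketch: content with daily refresh failure rate > 1% on 3 consecutive days
WITH daily AS (
  SELECT content_id,
    DATE_TRUNC('day', ts) AS d,
    (COUNT(*) FILTER (WHERE NOT success))::float / COUNT(*) AS fail_rate
  FROM events
  WHERE action = 'refresh'
    AND ts >= CURRENT_DATE - INTERVAL '3 day'
  GROUP BY 1, 2
)
SELECT content_id
FROM daily
GROUP BY content_id
HAVING COUNT(*) FILTER (WHERE fail_rate > 0.01) >= 3;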
Practical workflow
- Instrument: confirm events cover views, filters, exports, refreshes.
- Build usage mart tables (events, users, content).
- Create a usage dashboard for stakeholders (KPIs + trends).
- Run a monthly "Keep/Fix/Archive" review.
- Share a short adoption report with owners and data champions.
Owner notification template
Subject: Dashboard usage review
Hi, your dashboard "{name}" had {viewers_28d} unique viewers in the last 28 days. Status: {Keep/Fix/Archive}. If Fix/Archive, please update or confirm archival within 14 days.
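To fill the template at scale, a query sketch that produces one row per notification with the placeholder values, reusing the Keep/Fix/Archive logic from Example 3:
-- Sketch: notification feed, one row per dashboard with owner, name, usage, status
SELECT c.owner,
  c.name,
  COUNT(DISTINCT e.user_id) AS viewers_28d,
  CASE
    WHEN COUNT(DISTINCT e.user_id) >= 3 THEN 'Keep'
    WHEN c.certified THEN 'Fix'
    ELSE 'Archive'
  END AS status
FROM content c
LEFT JOIN events e
  ON e.content_id = c.content_id
  AND e.action = 'view'
  AND e.ts > CURRENT_DATE - INTERVAL '28 day'
GROUP BY c.owner, c.name, c.certified
ORDER BY c.owner, status;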
Exercises
Complete the tasks below.
- Exercise 1 — Calculate Weekly Adoption Metrics from Logs: use the mini dataset to compute WAU, top content, and stickiness. See the Exercises section for full instructions and solution.
- Exercise 2 — Define KPI Thresholds and Alerts: create simple, realistic thresholds for adoption and operational health. See the Exercises section for full instructions and solution.
Self-check checklist
- I can compute DAU/WAU/MAU and stickiness from logs.
- I can list top content by unique viewers and views.
- I can classify dashboards into Keep/Fix/Archive.
- I can set and explain thresholds for adoption and system health.
Common mistakes (and how to self-check)
- Mistake: Optimizing for total views only. Fix: Track unique viewers and retention; views can be inflated by refresh or automation.
- Mistake: Comparing teams without context. Fix: Normalize targets by team size and workflow (ops vs exec).
- Mistake: Archiving too quickly. Fix: Check seasonality and business cycles before removing content.
- Mistake: Ignoring performance. Fix: Include load time and refresh failure rates—slow dashboards kill adoption.
- Mistake: No feedback loop. Fix: Notify owners and collect comments before decisions.
Practical projects
- Build a "BI Usage Overview" dashboard with DAU/WAU/MAU, stickiness, top 10 content, and trend lines.
- Create an automated monthly report that outputs Keep/Fix/Archive with owner notifications.
- Design a simple cohort retention analysis for new users over 8 weeks.
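For the cohort retention project, a sketch of the core query (cohort = week of signup, tracked for eight weeks), assuming the usage mart tables above:
-- Sketch: weekly retention for new-user cohorts over 8 weeks
WITH cohorts AS (
  SELECT user_id, DATE_TRUNC('week', created_ts) AS cohort_week
  FROM users
), sizes AS (
  SELECT cohort_week, COUNT(*) AS cohort_size
  FROM cohorts
  GROUP BY 1
), activity AS (
  SELECT DISTINCT c.cohort_week, c.user_id,
    FLOOR(EXTRACT(EPOCH FROM DATE_TRUNC('week', e.ts) - c.cohort_week) / 604800)::int AS week_n  -- 604800 s = 1 week
  FROM cohorts c
  JOIN events e ON e.user_id = c.user_id AND e.action = 'view'
)
SELECT a.cohort_week, a.week_n,
  COUNT(DISTINCT a.user_id)::float / s.cohort_size AS retention
FROM activity a
JOIN sizes s USING (cohort_week)
WHERE a.week_n BETWEEN 0 AND 8
GROUP BY a.cohort_week, a.week_n, s.cohort_size
ORDER BY 1, 2;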
Who this is for
- BI Developers and Analysts who publish dashboards and need to prove impact.
- Data Stewards and Product Owners responsible for content quality and discoverability.
Prerequisites
- Basic SQL (GROUP BY, DISTINCT, JOIN, date truncation).
- Access to your BI platform’s audit or usage logs.
- Familiarity with your content catalog (owners, certification, refresh).
Learning path
- Instrument events and validate logging coverage.
- Build a small usage mart (events, users, content).
- Create KPIs and a simple usage dashboard.
- Run a Keep/Fix/Archive review and communicate outcomes.
- Iterate monthly and add retention analysis.
Next steps
- Operationalize alerts for refresh failures and load-time regressions.
- Add qualitative feedback (owner notes, user comments) to complement metrics.
- Partner with team leads to align dashboards with decisions and cadences.
Mini challenge
Pick one business unit. In 2 weeks, raise their WAU by 10% by doing exactly two actions: (1) improve one dashboard’s performance or clarity; (2) run a 15-minute demo. Re-measure WAU and stickiness the next week to confirm impact.