Why this matters
Great dashboards start before any chart is built. Clear requirements prevent scope creep, mismatched expectations, rework, and adoption issues. As a Data Analyst, you will:
- Turn vague asks like "we need visibility" into measurable outcomes and concrete KPIs.
- Align stakeholders on definitions (e.g., revenue, churn) and refresh cadence.
- Check data feasibility early to avoid blockers after development starts.
- Set acceptance criteria so everyone knows when the dashboard is "done".
Who this is for
- Data Analysts and BI Developers creating or improving dashboards.
- Product/Data PMs coordinating analytics deliverables.
- Anyone responsible for stakeholder alignment on metrics and visuals.
Prerequisites
- Basic understanding of data models (tables, joins, dimensions, facts).
- Familiarity with common BI tools (e.g., Power BI, Tableau, Looker) at a beginner level.
- Comfort discussing business goals with non-technical stakeholders.
Concept explained simply
Requirements gathering is a short discovery phase where you clarify the why, who, what, and how of a dashboard before building it. The output is a one-page brief that everyone agrees to.
Mental model: PASTED
- Purpose: What decision or outcome will this dashboard enable?
- Audience: Who will use it? Their roles and needs.
- Success metrics: KPIs, targets, and acceptance criteria.
- Timing: Refresh cadence, data freshness, and required latency.
- Evidence: Data sources, metric definitions, dimensions, filters.
- Delivery: Access, layout, devices, and next steps.
Discovery question bank (copy-paste ready)
- Purpose: What decisions will this dashboard directly inform? What will you do differently if it works?
- Audience: Who is the primary user? Secondary users? How often will they look at it?
- KPIs: What are the 3–5 must-have KPIs? What are their exact definitions and targets?
- Grain: What is the smallest unit needed (e.g., daily by product by channel)?
- Dimensions: Which cuts/filters are essential (date, product, region, channel, owner)?
- Data: Which systems store this data? Any known gaps or delays?
- Timing: How fresh must the data be (real-time, hourly, daily, weekly)?
- Actions: What actions should users take when a metric is off target?
- Constraints: Must-haves to launch vs nice-to-haves for later?
- Acceptance: What must be true for you to sign off as "done"?
Worked examples
Example 1 — E-commerce Sales Performance dashboard
Purpose: Help the e-commerce manager identify sales trends and react to underperforming products.
- Audience: E-commerce manager (daily), merchandiser (weekly)
- KPIs: Revenue (net of returns), Orders, AOV (average order value), Conversion rate
- Grain: Daily; drill to product and channel
- Dimensions: Date, Product, Category, Channel, Device
- Data: Orders fact table; returns table; product dims; web analytics
- Timing: Daily refresh by 7am local time
- Acceptance: Revenue ties to the finance report within ±1% for the last full day
Note: Revenue is defined as gross sales minus discounts and refunds, excluding tax and shipping (see the sketch below).
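These definitions are concrete enough to prototype. Below is a minimal pandas sketch of how they might translate into code, using toy data and hypothetical table and column names (`orders`, `refunds`, `sessions`, and the finance figure are placeholders to confirm with the source-system owners); it illustrates the agreed formulas and the ±1% tie-out rule, not the final model.

```python
import pandas as pd

# Toy inputs with hypothetical column names; real fields depend on the source systems.
orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-05-01", "2024-05-01", "2024-05-02"]),
    "order_id": [1, 2, 3],
    "gross_sales": [120.0, 80.0, 150.0],
    "discounts": [10.0, 0.0, 15.0],
})
refunds = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-05-01"]),
    "refund_amount": [20.0],
})
sessions = pd.DataFrame({
    "date": pd.to_datetime(["2024-05-01", "2024-05-02"]),
    "sessions": [400, 500],
})

# Daily grain; revenue = gross sales - discounts - refunds (tax and shipping excluded upstream).
daily = (
    orders.groupby("order_date")
    .agg(gross_sales=("gross_sales", "sum"),
         discounts=("discounts", "sum"),
         orders=("order_id", "nunique"))
    .join(refunds.groupby("order_date")["refund_amount"].sum())
    .fillna({"refund_amount": 0.0})
    .join(sessions.set_index("date"))
)
daily["revenue"] = daily["gross_sales"] - daily["discounts"] - daily["refund_amount"]
daily["aov"] = daily["revenue"] / daily["orders"]
daily["conversion_rate"] = daily["orders"] / daily["sessions"]

# Acceptance tie-out: last full day's revenue within ±1% of the finance report figure.
check_date = pd.Timestamp("2024-05-01")  # last full day in this toy data
finance_revenue = 171.0                  # placeholder for the finance report value
variance = abs(daily.loc[check_date, "revenue"] - finance_revenue) / finance_revenue
print(daily[["revenue", "orders", "aov", "conversion_rate"]])
print(f"Tie-out variance on {check_date.date()}: {variance:.2%} (acceptance: <= 1%)")
```

With this toy data the script prints the daily KPI table and a tie-out variance just under 0.6%, inside the ±1% acceptance threshold.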
Example 2 — Support Queue Health dashboard
Purpose: Allow Support Leads to rebalance agent workload and meet SLAs.
- KPIs: First response time (median), % within SLA, Backlog volume, CSAT (customer satisfaction); calculations sketched below
- Grain: Hourly; drill by queue and agent
- Dimensions: Queue, Priority, Agent, Channel
- Timing: Near real-time (≤15 min delay)
- Acceptance: SLA compliance matches the source system's report within ±0.5%
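A minimal pandas sketch of the queue-health KPIs, assuming a hypothetical ticket extract and a 60-minute SLA threshold (both placeholders to confirm with the Support Lead); CSAT would come from a separate survey source and is omitted here.

```python
import pandas as pd

# Hypothetical ticket extract; real fields come from the helpdesk system.
tickets = pd.DataFrame({
    "ticket_id": [101, 102, 103, 104],
    "queue": ["billing", "billing", "tech", "tech"],
    "created_at": pd.to_datetime([
        "2024-05-01 09:05", "2024-05-01 09:40",
        "2024-05-01 09:10", "2024-05-01 10:20",
    ]),
    "first_response_at": pd.to_datetime([
        "2024-05-01 09:25", "2024-05-01 11:00",
        "2024-05-01 09:50", None,  # None = no response yet (backlog)
    ]),
})

SLA_MINUTES = 60  # assumed SLA threshold; confirm with the Support Lead

# First response time in minutes; tickets with no response yet become NaN.
tickets["frt_minutes"] = (
    (tickets["first_response_at"] - tickets["created_at"]).dt.total_seconds() / 60
)
# NaN compares as False, so unanswered tickets count as outside SLA.
tickets["within_sla"] = tickets["frt_minutes"] <= SLA_MINUTES
tickets["hour"] = tickets["created_at"].dt.floor("h")

# Hourly grain, drillable by queue: median FRT, % within SLA, open backlog.
hourly = tickets.groupby(["hour", "queue"]).agg(
    median_frt_minutes=("frt_minutes", "median"),
    pct_within_sla=("within_sla", "mean"),
    backlog=("first_response_at", lambda s: s.isna().sum()),
)
print(hourly)
```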
Example 3 — Executive Weekly Growth snapshot
Purpose: Give leadership a single-view weekly trend of growth, retention, and cost.
- KPIs: New customers, Active customers, Revenue, CAC (customer acquisition cost), Churn rate; definitions sketched below
- Grain: Weekly; no drill-down for v1
- Timing: Updated every Monday by 9am
- Acceptance: Consistent with the Finance monthly report; variance notes included
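Churn rate and CAC are the KPIs here most likely to be defined differently across teams, so pin the formulas down in the brief. A minimal sketch under assumed definitions (churn rate = customers lost during the week ÷ customers at the start of the week; CAC = marketing spend ÷ new customers acquired); all numbers are illustrative placeholders to confirm with Finance.

```python
# Weekly snapshot inputs; every figure here is an illustrative placeholder.
customers_start_of_week = 5000
customers_lost = 75
new_customers = 220
marketing_spend = 33000.0  # total acquisition spend for the week

churn_rate = customers_lost / customers_start_of_week  # 75 / 5000 = 1.5%
cac = marketing_spend / new_customers                  # 33000 / 220 = 150 per new customer
active_customers = customers_start_of_week - customers_lost + new_customers

print(f"Churn rate: {churn_rate:.1%}  CAC: {cac:.0f}  Active customers: {active_customers}")
```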
Mini templates and worksheets
1-page dashboard requirements brief
Title: [Name]
- Purpose & Decisions: [1–2 sentences]
- Audience & Frequency: [Roles; daily/weekly/monthly]
- KPIs & Targets: [3–5 items with definitions]
- Grain & Dimensions: [Smallest unit; essential filters]
- Data Sources & Refresh: [Systems; latency]
- Mock sketch (text ok): [e.g., top KPIs + trend + breakdown]
- Acceptance Criteria: [e.g., tie-out rules; performance; access]
- V1 vs Later: [Must-haves / Nice-to-haves]
- Risks & Owners: [Assumptions; RACI]
Metric definition card
- Name: [e.g., Revenue]
- Business meaning: [Short sentence]
- Formula: [Explicit calc + inclusions/exclusions]
- Grain: [Daily by product]
- Source tables/fields: [System.table.field]
- Edge cases: [Refund timing, currency]
- Owner: [Who decides]
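If you want metric cards to live alongside the code, or to feed a metrics dictionary later, the same card can be captured as a structured record. A minimal sketch using a Python dataclass; the fields mirror the card above, and the Revenue example values (sources, edge cases, owner) are hypothetical placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class MetricCard:
    """Structured version of the metric definition card above."""
    name: str
    business_meaning: str
    formula: str
    grain: str
    source_fields: list[str] = field(default_factory=list)
    edge_cases: list[str] = field(default_factory=list)
    owner: str = ""

# Example card; definitions, sources, and owner are placeholders to agree with stakeholders.
revenue_card = MetricCard(
    name="Revenue",
    business_meaning="Net sales value recognised from customer orders.",
    formula="gross_sales - discounts - refunds (excludes tax and shipping)",
    grain="Daily by product",
    source_fields=["erp.orders.gross_sales", "erp.orders.discounts", "erp.refunds.amount"],
    edge_cases=["Refund timing vs order date", "Multi-currency conversion"],
    owner="Finance BI lead",
)
print(revenue_card.name, "->", revenue_card.formula)
```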
Stakeholder interview script (15–30 min)
- Open: What success looks like (2 min)
- Decisions & actions (5 min)
- KPIs & definitions (8 min)
- Cadence & latency needs (4 min)
- Scope & trade-offs (3 min)
- Acceptance criteria (3 min)
- Recap and confirm next steps (2 min)
Simple 6-step process
- Kickoff: Clarify purpose, audience, and decisions.
- List KPIs: 3–5 must-haves with definitions and targets.
- Map data: Sources, grain, dimensions, refresh expectations.
- Sketch layout: KPI strip, trend, breakdown, filters.
- Confirm acceptance: Tie-out rules, performance, security.
- Sign-off: Share 1-page brief for approval before build.
Exercises
Do these short tasks to practice. Anyone can complete them; sign in to save your progress.
- Exercise 1: Turn a vague request into a crisp brief (see details in the exercise card below).
- Exercise 2: Define metrics and feasibility for a churn dashboard (see details in the exercise card below).
Pre-flight checklist (before build)
- Purpose and primary decision are written in one sentence.
- A focused set of 3–5 KPIs, each defined with a formula and target.
- Grain and essential dimensions listed.
- Refresh cadence and latency expectations agreed.
- Data sources and known gaps documented.
- Acceptance criteria include tie-out rules and performance.
- V1 scope vs later enhancements split.
- Stakeholders acknowledged the brief (written confirmation).
Common mistakes and how to self-check
- Too many KPIs: Limit to 3–5 must-haves for v1.
- Undefined terms: Write formulas and edge cases (e.g., refunds, cancellations).
- Ignoring latency: Confirm how fresh the data must be and what delay is acceptable.
- Skipping tie-out: Define what external report you will match and acceptable variance.
- Design before purpose: Start with decisions, then sketch visuals.
- No v1 vs later: Separate essentials from enhancements to avoid scope creep.
Self-check prompts
- If a metric spikes tomorrow, who acts and how?
- Can a new teammate understand each KPI from the card alone?
- Can you explain refresh latency in one sentence?
- What would make this dashboard a failure? Is that risk mitigated?
Practical projects
- Build a 1-page brief for a Marketing Acquisition dashboard using mock data. Include metric cards and a sketch.
- Interview a colleague (or simulate the interview) for an Operations utilization dashboard. Capture each PASTED element and write acceptance criteria.
- Create a metrics dictionary of 5 core KPIs for your team with formulas, owners, and edge cases.
Learning path
- 1) Requirements Gathering (this page) →
- 2) Data modeling basics for BI (facts, dimensions, grain) →
- 3) Prototyping and low-fidelity dashboard sketches →
- 4) Build v1 in your BI tool →
- 5) Validation, tie-outs, and user feedback →
- 6) Iteration and documentation (metrics dictionary)
Next steps
- Use the 1-page brief template on your next dashboard request.
- Run one 20-minute stakeholder interview this week.
- Take the quick test to reinforce the concepts.
Mini challenge
Your VP says: "I want a customer health dashboard ASAP." In 10 minutes, write a one-paragraph purpose, list 4 KPIs with definitions, choose a grain and 5 dimensions, and propose two acceptance criteria.
Quick Test
Take the short test below to check your understanding. Everyone can take it for free; sign in to save your progress.