Why this matters
Clear acceptance criteria turn fuzzy requests into buildable, testable BI reports. They reduce rework, accelerate sign-off, and protect data quality. As a BI Developer, you will:
- Translate stakeholder goals into specific, testable report behaviors.
- Align metric definitions across teams to avoid conflicting numbers.
- Define data constraints, performance thresholds, and refresh expectations for reliable dashboards.
- Guide UAT (user acceptance testing) with unambiguous pass/fail checks.
Concept explained simply
Acceptance criteria describe what "good" looks like for a report so everyone can agree it’s done. They are specific, measurable conditions covering data, logic, visuals, and operations.
Helpful template (use selectively)
Given a context (filters, user role, date range), when the user performs an action (opens report, changes filter), then the outcome matches a known rule or number (with an exact or tolerance-based expectation).
Example: Given Region = "North America" and Date = last full month, when the dashboard loads, then Total Revenue equals the finance ledger for that period within 0.1% variance.
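A tolerance-based criterion like this can be checked mechanically rather than by eye. A minimal sketch of such a check; the variable names and figures are hypothetical, not a real report API:

```python
def within_tolerance(actual: float, expected: float, pct: float) -> bool:
    """Pass if actual is within pct percent of expected (the reference value)."""
    if expected == 0:
        return actual == 0
    return abs(actual - expected) / abs(expected) * 100 <= pct

# Given Region = "North America" and Date = last full month,
# when the dashboard loads, then Total Revenue matches the
# finance ledger within 0.1% variance.
dashboard_total_revenue = 1_203_450.10   # hypothetical report value
ledger_total = 1_204_000.00              # hypothetical ledger value
assert within_tolerance(dashboard_total_revenue, ledger_total, 0.1)
```

Writing the check this way makes the pass/fail rule explicit: two testers running it will always reach the same verdict.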
Mental model: 4 layers to cover
- Data: Sources, freshness, joins, filters, currency/time zone, identity rules, exclusions.
- Logic: Metric formulas, aggregation levels, handling of nulls/outliers, rounding/tolerances.
- Presentation: Layout, sorting defaults, labels/units, drill-through behavior, accessibility.
- Operations: Refresh schedule, performance SLAs, access/roles, auditability, error states.
Core acceptance criteria checklist (adapt as needed)
- Data source(s) named and versioned.
- Data recency: last refresh timestamp and schedule (e.g., hourly at :15).
- Record scope: included/excluded entities (e.g., test users, refunds, internal orders).
- Identity keys and deduplication rules.
- Time handling: timezone, calendar type (fiscal/ISO week), last full period rules.
- Currency handling: base currency, conversion timing, rounding rules.
- Metric definitions with formulas and level of aggregation.
- Null and outlier handling (e.g., treat a null price as 0, or exclude the row?).
- Default filters and slicers; allowed filter values.
- Sorting defaults for tables and charts.
- Drill-through/drill-down rules and expected fields.
- Labels, units, number formats, and thresholds/conditional formatting.
- Accessibility: contrast, keyboard navigation, alt text/aria labels when applicable.
- Performance: load under X seconds with dataset size Y.
- Security: roles/row-level security expectations.
- Validation: reference queries or files and acceptable tolerance.
- Error states: what users see if data is stale or missing.
- Sign-off: who approves and how success is verified.
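Several checklist items above (data recency, error states) reduce to a single rule a tester can run. A minimal sketch, assuming a UTC refresh timestamp; the function name and 24-hour SLA are illustrative:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def staleness_banner(last_refresh: datetime, max_age_hours: int = 24) -> Optional[str]:
    """Return the banner text if data is older than the agreed SLA, else None."""
    age = datetime.now(timezone.utc) - last_refresh
    if age > timedelta(hours=max_age_hours):
        return f"Data may be stale. Last successful refresh: {last_refresh.isoformat()}"
    return None
```

The acceptance criterion then becomes: given a refresh older than the SLA, when the report loads, then the banner text is shown; otherwise no banner appears.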
Worked examples
Example 1 — Monthly Sales Overview
- Data: Source is dw.fact_sales joined to dim_date, dim_customer, dim_product. Exclude test orders flagged is_test = true.
- Logic: Total Sales = sum(net_amount) after discounts; Units Sold = sum(quantity); AOV = Total Sales / count(distinct order_id).
- Time: Default period = last full month; calendar = Gregorian; timezone = UTC.
- Currency: Base USD; conversions use last business day FX rate of the reported month; round to 2 decimals.
- Presentation: Show KPI tiles (Sales, Units, AOV), a trend line (12 months), and Top 10 products table, sorted by Sales desc.
- Filters: Region, Channel, Product Category; defaults: All.
- Performance: Initial load under 4 seconds for 12 months of data.
- Validation: For last full month, Sales equals finance ledger within 0.2%.
- Error state: If refresh fails, display a banner: "Data may be stale. Last successful refresh: {timestamp}."
- Sign-off: Sales Ops and Finance Controller approve.
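The metric formulas in this example can be pinned down as executable reference logic for validation. A sketch over made-up order lines; the rows and column names mirror the criteria, not a real `dw.fact_sales` schema:

```python
# Hypothetical order lines after excluding is_test = true.
orders = [
    {"order_id": "A1", "net_amount": 120.00, "quantity": 2},
    {"order_id": "A1", "net_amount": 30.00,  "quantity": 1},
    {"order_id": "B7", "net_amount": 50.00,  "quantity": 1},
]

total_sales = sum(o["net_amount"] for o in orders)       # Total Sales = sum(net_amount)
units_sold = sum(o["quantity"] for o in orders)          # Units Sold = sum(quantity)
distinct_orders = len({o["order_id"] for o in orders})   # count(distinct order_id)
aov = round(total_sales / distinct_orders, 2)            # AOV, rounded to 2 decimals
```

Note that AOV divides by distinct orders, not order lines; encoding the formula removes exactly the kind of ambiguity that produces conflicting numbers across teams.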
Example 2 — Customer Churn Report
- Data: Primary = dw.fact_subscriptions; churn event when status changes from active to canceled, excluding involuntary churn reversed within 7 days.
- Logic: Churn Rate (monthly) = churned_customers / starting_active_customers; Cohort by activation month; treat reactivations as new activations.
- Time: Cohort analysis uses customer activation month (UTC). Report default = last 6 cohorts.
- Filters: Plan, Country, Acquisition Channel.
- Presentation: Heatmap for cohort retention; table for churn counts with % and absolute numbers.
- Validation: For a sample month, churn counts match the SQL reference query exactly (zero tolerance).
- Performance: Drill-through to customer list under 3 seconds for cohorts up to 24 months.
- Security: Show customer PII only to role = Support_Lead; else mask email.
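The churn formula above is easy to encode as a reference calculation that a tester can compare against the dashboard. A sketch with hypothetical monthly numbers:

```python
def monthly_churn_rate(churned: int, starting_active: int) -> float:
    """Churn Rate (monthly) = churned_customers / starting_active_customers."""
    if starting_active == 0:
        raise ValueError("starting_active_customers must be > 0")
    return churned / starting_active

# Hypothetical month: 40 churn events against 2,000 active customers at month start.
rate = monthly_churn_rate(churned=40, starting_active=2_000)   # 0.02, i.e., 2.0%
```

The explicit zero-denominator guard is itself an acceptance decision: the criteria should state what the report displays when a cohort starts with no active customers.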
Example 3 — Finance Revenue Recognition
- Data: Source = rev_rec_journal; exclude adjustments tagged "non-operational".
- Logic: Recognized Revenue computed per GAAP rules; rounding at the monthly ledger level; differences vs. bookings shown as a separate KPI.
- Time: Fiscal calendar; default = current fiscal quarter to date.
- Validation: Reconcile to general ledger export within 0.05% and exact sign (no negative/positive flips).
- Presentation: Stacked bar by product line; variance waterfall MoM.
- Operations: Refresh daily at 02:00 UTC; load under 5 seconds; show last refresh timestamp.
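The reconciliation rule above combines two conditions: a tight tolerance and an exact sign match. A minimal sketch of that compound check; the function name and figures are hypothetical:

```python
def reconciles(report_value: float, ledger_value: float, pct: float = 0.05) -> bool:
    """Pass only if the sign matches and the variance is within pct percent of the ledger."""
    same_sign = (report_value >= 0) == (ledger_value >= 0)
    within = abs(report_value - ledger_value) <= abs(ledger_value) * pct / 100
    return same_sign and within

# Hypothetical quarter-to-date figures.
assert reconciles(1_000_200.0, 1_000_000.0)   # 0.02% variance, same sign: passes
assert not reconciles(-1_000.0, 1_000.0)      # sign flip fails regardless of magnitude
```

Stating both conditions separately matters: a value can sit within tolerance yet still fail acceptance because the sign flipped.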
Steps to write strong acceptance criteria
- Clarify the goal: What decision or action will this report support?
- Fix definitions: Write metric formulas, keys, time and currency rules.
- Design behavior: Defaults, filters, drill paths, sorting, and error states.
- Set operations: Refresh cadence, performance, access, validation method.
- Make it testable: Add exact references, tolerances, and who signs off.
Quick starter template
- Goal:
- Data sources & scope:
- Metric definitions:
- Time & currency rules:
- Filters & defaults:
- Presentation rules:
- Performance & refresh:
- Validation & tolerance:
- Security & access:
- Sign-off:
Practice exercises
Exercise 1 — Turn a vague request into acceptance criteria
"We need a monthly sales performance report for the US that helps managers track trends and top products." Create acceptance criteria covering data, logic, presentation, and operations. Include at least one criterion using Given/When/Then.
- Deliverable: 10–14 bullet points of criteria.
- Timebox: 15 minutes.
Solution
Sample acceptance criteria:
- Data: Use fact_sales joined with dim_date, dim_product, dim_region; Region filter default = United States.
- Exclude: Test orders is_test = true; internal channel = "Employee".
- Metrics: Total Sales = sum(net_amount); Units = sum(quantity); AOV = Total Sales / count(distinct order_id).
- Time: Default = last full month; trend range = last 12 full months; timezone = UTC.
- Currency: USD only; round to 2 decimals.
- Presentation: KPI tiles (Sales, Units, AOV), monthly trend line, Top 10 products table sorted by Sales desc.
- Filters: Category, Channel; default = All.
- Performance: Load under 4 seconds.
- Validation: Last full month Sales matches finance ledger within 0.2%.
- Given/When/Then: Given Category = "Electronics" and Date = last full month, when the report loads, then Top 10 products show correct ranking by Sales and totals equal the detail table sum.
- Error state: If refresh older than 24h, display "Data may be stale" banner.
- Sign-off: Sales Ops Manager approves.
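The Given/When/Then bullet in this solution doubles as an automatable UAT check. A sketch, assuming the report exposes its Top 10 table and detail table as plain rows; all names and numbers are hypothetical:

```python
# Given: Category = "Electronics", Date = last full month (filters applied upstream).
top10 = [("Phone X", 500.0), ("Laptop Y", 300.0), ("Tablet Z", 200.0)]
detail_rows = [("Phone X", 500.0), ("Tablet Z", 200.0), ("Laptop Y", 300.0)]

# Then: the table is sorted by Sales descending...
assert top10 == sorted(top10, key=lambda r: r[1], reverse=True)
# ...and its total equals the sum of the detail table.
assert sum(s for _, s in top10) == sum(s for _, s in detail_rows)
```

Both assertions give a tester an unambiguous pass/fail, which is exactly what the criterion asks for.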
Exercise 2 — Find gaps and improve criteria
Review this draft for a churn report and improve it:
- "Show churn rate and churned customers. Update every day. Allow filter by plan."
Identify gaps and rewrite the criteria to be testable.
Solution
Issues: Missing churn definition, time window, cohort logic, exclusions, validation, performance, access, and presentation specifics.
Improved criteria:
- Definition: Churn event = status from active to canceled; exclude involuntary churn reversed within 7 days.
- Metric: Monthly Churn Rate = churned_customers / starting_active_customers.
- Time: Default = last 6 months; timezone = UTC.
- Filters: Plan (required), Country, Channel; defaults = All.
- Presentation: KPI (Churn Rate), line chart by month, table with counts and %; sort by month asc.
- Validation: For July, churn count equals SQL reference query exactly.
- Performance: Initial load under 3 seconds.
- Security: Mask email unless role = Support_Lead.
- Error state: If today’s refresh fails, banner shows last success timestamp.
- Sign-off: Lifecycle PM approves.
Self-check for your exercises
- Are metric formulas and time rules explicit?
- Can each criterion be tested with a clear pass/fail?
- Do you cover data, logic, presentation, and operations?
- Is there a validation step with a reference and tolerance?
- Did you specify defaults, filters, and handling of nulls/outliers?
Common mistakes and how to self-check
- Vague metrics: "show active users" without definition. Fix by writing the formula and scope.
- No time rules: Missing timezone, last full period rules. Fix by stating both.
- Ignoring edge cases: Refunds, test data, duplicate keys. Fix by listing exclusions and dedup rules.
- Untestable criteria: "looks clean" is subjective. Fix by adding measurable thresholds.
- Missing validation: No reference query/file. Fix by naming the source and tolerance.
- Overlooking performance or refresh. Fix by adding SLA and schedule.
- Security gaps: PII exposed without roles. Fix by masking and defining access.
Quick self-audit
- Can a tester with no context run checks and conclude pass/fail?
- Would two testers reach the same result independently?
- Are there any "TBD" or ambiguous words (some, often, usually)? Replace them.
Mini challenge
Write 8–12 acceptance criteria for a Marketing Funnel dashboard (Impressions → Clicks → Sign-ups → Paying Users) for the last 90 days, with daily refresh and drill-down to campaign.
One possible answer
- Data: fact_campaign_events with dim_date, dim_campaign; exclude internal campaigns tagged internal = true.
- Metrics: CTR = clicks/impressions; CVR = signups/clicks; Paid CVR = paying_users/signups.
- Time: Default = last 90 days, UTC, daily grain.
- Filters: Campaign, Channel, Country; defaults = All.
- Presentation: Funnel chart + daily trend lines; campaign table sorted by Paying Users desc.
- Drill: Clicking a campaign opens a detail view with daily metrics.
- Validation: For previous day, totals match source export within 0.5%.
- Performance: Load under 3 seconds; drill under 2 seconds.
- Error: If data stale > 36h, show banner.
- Sign-off: Growth Lead approves.
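The funnel metrics in this answer can be captured as a small reference calculation, including the zero-denominator rule the criteria should decide on. A sketch with made-up 90-day totals:

```python
def ratio(numerator: int, denominator: int) -> float:
    """Funnel conversion ratio; 0.0 when the upstream stage is empty."""
    return numerator / denominator if denominator else 0.0

# Hypothetical 90-day totals for one campaign.
impressions, clicks, signups, paying = 100_000, 4_000, 400, 50
ctr = ratio(clicks, impressions)   # CTR = clicks / impressions      -> 0.04
cvr = ratio(signups, clicks)       # CVR = signups / clicks          -> 0.10
paid_cvr = ratio(paying, signups)  # Paid CVR = paying / signups     -> 0.125
```

Whether an empty stage shows 0%, a dash, or "n/a" is a presentation decision worth writing into the criteria explicitly.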
Who this is for
- BI Developers and Analytics Engineers creating or maintaining reports.
- Analysts and PMs defining report requirements and UAT.
- Data leaders seeking consistent, testable report standards.
Prerequisites
- Basic SQL and familiarity with your warehouse tables.
- Understanding of core business metrics in your domain.
- Comfort with your BI tool’s filters, visuals, and roles/security.
Learning path
- Gather stakeholder goals and key decisions.
- Draft metric definitions and data scope.
- Write acceptance criteria across data, logic, presentation, and operations.
- Create validation queries and sample datasets.
- Run UAT with the criteria; iterate and finalize sign-off.
Practical projects
- Retrofit acceptance criteria to an existing dashboard and run a UAT dry run.
- Create a one-page acceptance criteria sheet for a new KPI tile set.
- Build a validation notebook (SQL + notes) linked to your criteria.
- Define performance SLAs and measure current dashboards against them.
- Draft RLS/PII masking rules as acceptance criteria for a support report.
Next steps
- Do the exercises above and compare with the sample solutions.
- Take the quick test below to check your understanding.
- Apply this to your next stakeholder request—start by writing the criteria first.