
Information Architecture For Dashboards

Learn Information Architecture For Dashboards for free with explanations, exercises, and a quick test (for Data Visualization Engineers).

Published: December 28, 2025 | Updated: December 28, 2025

Who this is for

  • Data Visualization Engineers designing dashboards for executives, operations, and product teams.
  • Anyone turning raw reports into a clean, navigable dashboard experience.
  • Analysts who want reliable structure before styling charts.

Prerequisites

  • Basic understanding of KPIs vs diagnostics (what to track vs how to explain).
  • Familiarity with filters, drilldowns, and common chart types.
  • Access to sample metrics or a product/operations context.

Why this matters

In real projects, the first question from stakeholders is rarely “what chart style?” It’s “where do I find the answer fast?” Good information architecture (IA) makes that instant. You’ll design clear entry points, logical sections, consistent filters, and drilldowns that answer follow-up questions without letting users get lost.

  • Executive KPI decks: a single page that answers “Are we on track?” in seconds.
  • Operations monitoring: fast triage from summary to root cause.
  • Product analytics: structured exploration from outcomes to user behaviors.

Concept explained simply

Information architecture is the blueprint for your dashboard: what’s on top, what comes next, and how users move between parts. Think of it as arranging a store so customers find items quickly without asking for help.

Mental model

Use the 3L Mental Model: Level 1 (Decision), Level 2 (Diagnosis), Level 3 (Discovery).

  • Level 1 — Decision: top KPIs, status, targets, alerts.
  • Level 2 — Diagnosis: breakdowns (by segment, time, region, funnel step).
  • Level 3 — Discovery: flexible exploration (ad hoc slices, drill-throughs).

When to add a Level 4

Add Level 4 (Detail records) only when users must inspect raw rows (e.g., ticket, order, session). Keep it optional and scoped by filters from Levels 2–3.
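
If it helps to see the model concretely, here is a minimal sketch of the 3L structure (plus the optional Level 4) as plain Python data. The level names come from the model above; the example purposes, metrics, and field names are placeholders, not a required schema.

    # Minimal sketch: the 3L model (plus optional Level 4) as plain Python data.
    # Example purposes and metrics are placeholders, not a prescribed schema.
    from dataclasses import dataclass, field

    @dataclass
    class Level:
        name: str                    # Decision, Diagnosis, Discovery, Detail
        answers: str                 # the question this level exists to answer
        content: list[str] = field(default_factory=list)

    ia_levels = [
        Level("Decision", "Are we on track?",
              ["Top KPIs vs targets", "Status", "Alerts"]),
        Level("Diagnosis", "Why are we off track?",
              ["Breakdowns by segment, time, region, funnel step"]),
        Level("Discovery", "What else should we look at?",
              ["Ad hoc slices", "Drill-throughs"]),
        # Optional Level 4: raw records, always scoped by filters from Levels 2-3.
        Level("Detail", "Which exact records explain this?",
              ["Ticket / order / session lists"]),
    ]

    for level in ia_levels:
        print(f"{level.name}: {level.answers} -> {'; '.join(level.content)}")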

Design steps (do these before you build charts)

  1. Define the core user and their trigger question (e.g., “VP Sales checks Monday mornings: Are we pacing to target?”).
  2. List decisions the dashboard must support. Map each decision to 1–3 KPIs.
  3. Group metrics into the 3L structure (Decision, Diagnosis, Discovery).
  4. Choose navigation: single page with sections, or multi-page tabs (by audience or workflow).
  5. Define filters: global (time, product) vs local (chart-specific).
  6. Specify drilldowns: click paths from KPI to breakdown to detail.
  7. Write acceptance criteria: time-to-answer under 10 seconds for the top 3 questions; two clicks from KPI to root-cause slice. One way to capture steps 1–7 as a written spec is sketched after this list.
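
One lightweight way to record the output of steps 1–7 before opening any BI tool is a short spec. The sketch below is one possible shape for that spec; every field name and value here is an assumption for illustration, not something your tooling requires.

    # Illustrative IA spec: one possible way to record the outputs of steps 1-7.
    ia_spec = {
        "core_user": "VP Sales",
        "trigger_question": "Are we pacing to target?",
        "decisions_to_kpis": {
            "Adjust regional forecast": ["Revenue vs Target", "Pipeline Coverage"],
            "Escalate churn risk": ["Net Revenue Retention"],
        },
        "levels": {
            "decision": ["Revenue vs Target", "Pipeline Coverage", "Net Revenue Retention"],
            "diagnosis": ["Revenue by Region", "Revenue by Product Line"],
            "discovery": ["Account explorer"],
        },
        "navigation": "single page with sections",      # or "multi-page tabs"
        "filters": {
            "global": ["Time", "Product"],
            "local": {"Regional charts": ["Region"]},
        },
        "drilldowns": [
            ["Revenue vs Target", "Region trend", "Region x Product", "Account detail"],
        ],
        "acceptance_criteria": {
            "time_to_answer_seconds": 10,                # for the top 3 questions
            "max_clicks_to_root_cause": 2,
        },
    }

    print(ia_spec["acceptance_criteria"])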

Worked examples

Example 1: Executive KPI Dashboard

User: CEO/VP

Top questions: Are we hitting targets? Any urgent risks?

IA:

  • Level 1 (Decision): North-star KPI, Revenue vs Target, Gross Margin, Active Users; traffic-light status; brief commentary space.
  • Level 2 (Diagnosis): Breakdowns by region, product line, customer segment; MoM/YoY trend tiles.
  • Level 3 (Discovery): Ad hoc slicer area with segment matrix; optional drill-through to account list.

Filters: Global: Time, Region; Local: Product on product charts only.

Drilldown: KPI tile → Region trend → Region x Product matrix → Account detail (optional).
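
The traffic-light status on the Level 1 tiles is usually nothing more than a threshold rule comparing actuals to targets. A minimal sketch, with an assumed 95% amber cutoff (tune it per KPI):

    # Traffic-light status for a KPI tile: threshold rule on actual vs target.
    # The 0.95 amber cutoff is an assumption; tune it per KPI.
    def kpi_status(actual: float, target: float, amber_ratio: float = 0.95) -> str:
        if target == 0:
            return "grey"                  # no target set: show a neutral status
        ratio = actual / target
        if ratio >= 1.0:
            return "green"
        if ratio >= amber_ratio:
            return "amber"
        return "red"

    print(kpi_status(actual=9_800_000, target=10_000_000))   # amber (98% of target)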

Example 2: Operations Monitoring (Support)

User: Support lead

Top questions: Are SLAs at risk? Where are backlogs forming?

IA:

  • Level 1 (Decision): SLA Breach Rate, Median First Response, Backlog Count; alert banner for thresholds.
  • Level 2 (Diagnosis): Queue by channel (email/chat/phone), Priority bucket, Agent workload heatmap.
  • Level 3 (Discovery): Filterable ticket explorer (issue type, language, tier).

Filters: Global: Time (last 24h/7d), Channel; Local: Agent for workload section.

Drilldown: SLA Breach tile → Channel breakdown → Priority x Channel → Ticket list (Level 4).
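
The alert banner in Level 1 is typically a single threshold check on the breach rate. A rough sketch with made-up tickets and an assumed 10% threshold:

    # Sketch of the Level 1 alert: SLA breach rate vs an assumed threshold.
    tickets = [
        {"id": 101, "channel": "email", "breached_sla": False},
        {"id": 102, "channel": "chat",  "breached_sla": True},
        {"id": 103, "channel": "phone", "breached_sla": False},
        {"id": 104, "channel": "chat",  "breached_sla": True},
    ]

    breach_rate = sum(t["breached_sla"] for t in tickets) / len(tickets)
    ALERT_THRESHOLD = 0.10    # assumed: raise the banner above a 10% breach rate

    if breach_rate > ALERT_THRESHOLD:
        print(f"ALERT: SLA breach rate {breach_rate:.0%} exceeds {ALERT_THRESHOLD:.0%}")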

Example 3: Product Growth (Acquisition Funnel)

User: Growth PM

Top questions: Which step leaks? Which cohorts respond to changes?

IA:

  • Level 1 (Decision): Conversion Rate, CAC, Activation Rate; sparkline trends.
  • Level 2 (Diagnosis): Funnel steps by traffic source/device; A/B experiment segment charts.
  • Level 3 (Discovery): Cohort explorer (start week x country), UTM slicing.

Filters: Global: Date range, Platform; Local: Experiment ID on experiment cards.

Drilldown: Conversion tile → Step drop-off chart → Source x Device → Session sample (optional).
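
Answering “Which step leaks?” at Level 2 is a step-to-step conversion calculation. A small sketch on assumed funnel counts:

    # Step-to-step conversion and drop-off on assumed funnel counts.
    funnel = [("Visit", 50_000), ("Signup", 8_000), ("Activate", 3_200), ("Purchase", 900)]

    for (prev_step, prev_n), (step, n) in zip(funnel, funnel[1:]):
        conversion = n / prev_n
        print(f"{prev_step} -> {step}: {conversion:.1%} convert, {1 - conversion:.1%} drop off")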

Reusable patterns

  • Hero KPI strip at top with minimal interaction; keep it scannable.
  • Diagnosis zone grouped by common slice keys (time, region, product, channel).
  • Sticky global filter bar; local filters visually attached to their section.
  • Consistent drilldown affordances (e.g., right-chevron icons, “View details” labels) across all tiles.
  • Commentary box near KPIs for context (targets, anomalies, data caveats).

When to split into multiple pages

  • Different audiences (Exec vs Ops) with conflicting priorities.
  • Heavy content causing slow load; split to maintain snappy KPI load.
  • Distinct workflows (Plan vs Monitor vs Investigate).

Layouts that scale

  • Single-page with sections: Good for exec summaries; Level 1 at top, Level 2 in panels, Level 3 as collapsible details.
  • Tabbed pages by workflow: Overview, Diagnose, Explore. Keep the same filter bar across tabs.
  • Left-nav categories: Audience-based (Leadership, Sales, Support) with shared visual language.

Filter and navigation strategy

  • Global filters: apply to all content unless explicitly excluded. Keep to 2–4 critical fields (time, region, product).
  • Local filters: charts or sections that need specialized slicing (e.g., Experiment ID, Agent).
  • Drilldown: click from a summary tile to a pre-filtered diagnosis view. Preserve the user’s global filters (see the sketch after this list).
  • Breadcrumb labels: show the path within the dashboard UI (e.g., KPI → Region → Product), even if not using page breadcrumbs.
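
Mechanically, a drilldown that preserves global filters is just a merge of the user’s global selections with the clicked slice, and the breadcrumb label is that path joined into the title. A minimal sketch (filter names and values are assumptions):

    # Drilldown that preserves global filters: merge them with the clicked slice.
    global_filters = {"time": "Last 7d", "region": "Europe"}

    def drill(global_filters: dict, selection: dict) -> dict:
        # Global filters carry over; the clicked selection only narrows further.
        return {**global_filters, **selection}

    target_filters = drill(global_filters, {"product": "Pro Plan"})
    breadcrumb = " → ".join(["Revenue", global_filters["region"], "Pro Plan"])

    print(target_filters)   # {'time': 'Last 7d', 'region': 'Europe', 'product': 'Pro Plan'}
    print(breadcrumb)       # Revenue → Europe → Pro Plan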

Common mistakes and how to self-check

  • Putting everything on one page: causes decision paralysis. Fix: enforce 3L structure and hide Level 3 by default.
  • Global filters that silently don’t apply: users misread data. Fix: show filter scope labels near sections.
  • Too many KPIs: dilutes focus. Fix: cap Level 1 at 4–6 tiles that map to decisions.
  • Unlabeled drilldowns: users get lost. Fix: show the active selection in titles (e.g., “Revenue by Region — Europe”).
  • Mixed cadence: real-time data mixed with weekly data. Fix: label data freshness on each section and split onto separate pages if needed.

Self-check checklist

  • [ ] Can a new user answer the top question in under 10 seconds?
  • [ ] Are there at most 6 KPIs in Level 1 with clear targets/status?
  • [ ] Do global filters apply consistently and visibly?
  • [ ] Is every KPI two clicks or fewer from a root-cause breakdown?
  • [ ] Are discovery/record views hidden until requested?
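
Two of these checks (the KPI cap and the click-depth target) are easy to automate against a written IA spec. A tiny sketch of such a linter, assuming the spec records Level 1 KPIs and the drilldown paths to root-cause views:

    # Tiny self-check linter: thresholds come from the checklist above;
    # the input shape (KPI list, root-cause paths) is an assumption.
    def lint_ia(level1_kpis: list[str], root_cause_paths: list[list[str]]) -> list[str]:
        issues = []
        if len(level1_kpis) > 6:
            issues.append(f"Level 1 has {len(level1_kpis)} KPIs; cap it at 6.")
        for path in root_cause_paths:
            clicks = len(path) - 1            # each hop from tile to view is one click
            if clicks > 2:
                issues.append(f"{path[0]}: {clicks} clicks to a root-cause slice (target: 2 or fewer).")
        return issues

    issues = lint_ia(
        level1_kpis=["Revenue vs Target", "Gross Margin", "Active Users", "SLA Breach Rate"],
        root_cause_paths=[["Revenue vs Target", "Region trend", "Region x Product"]],
    )
    print(issues or "Self-check passed")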

Exercises

Exercise 1 — Card sort your metrics into 3 levels

Pick a team (sales, support, or product). List 12 metrics. Sort them into Level 1 (Decision), Level 2 (Diagnosis), Level 3 (Discovery). Write 2–3 sentences explaining why each Level 1 metric deserves the top strip.

  • [ ] 12 metrics listed
  • [ ] 3L grouping completed
  • [ ] Rationale for each Level 1 metric
  • [ ] Draft acceptance criteria: time-to-answer, click-depth

Exercise 2 — Define filter scope and drilldown paths

Using your Exercise 1 dashboard, define:

  • Global filters (max 4) and where they do NOT apply.
  • Local filters per section (if any).
  • Two drilldown paths from a KPI to a record/detail view.

  • [ ] Global filter list with scope notes
  • [ ] Local filter list by section
  • [ ] Two end-to-end drilldown paths documented

Practical projects

  • Executive Overview: 1-page dashboard with 5 KPIs, 3 diagnosis panels, and one discovery area hidden by default.
  • Operations Triage: multi-page (Overview, Diagnose, Explore) with sticky global filters and alert statuses.
  • Experiment Review: tabbed design with global time/platform and local experiment selector.

Learning path

  1. Master the 3L structure with a small dataset (10–20 metrics).
  2. Design filters and drilldowns for two distinct audiences.
  3. Scale to multi-page navigation with consistent patterns.
  4. Add performance criteria (load time, click depth) and validate with a user pilot.

Next steps

  • Finish Exercises 1–2 and review against the self-check checklist.
  • Take the Quick Test below to confirm you can structure and scope a dashboard IA.
  • Note: The test is available to everyone; only logged-in users will have results and progress saved.

Quick Test

Answer short questions to validate your understanding of dashboard information architecture. Your target is 70% or higher.

Mini challenge

You’re given a dashboard with 18 KPIs, mixed filters, and no drilldowns. In 10 minutes, outline a fix:

  • Pick 6 KPIs for Level 1 and justify each in one sentence.
  • Propose 3 diagnosis panels (what, by what slice, why these).
  • Define 3 global filters and 1 drilldown path.

Sample outline

  • Level 1: Revenue vs Target, SLA Breach Rate, Activation Rate, CAC, Active Users, Gross Margin.
  • Diagnosis: Region x Product, Funnel Step drop-off, Queue by Channel.
  • Global filters: Time, Region, Product.
  • Drilldown: Revenue tile → Region trend → Product matrix → Account records.

Practice Exercises

2 exercises to complete

Instructions

Pick a team context (sales, support, or product). List 12 relevant metrics. Assign each to Level 1 (Decision), Level 2 (Diagnosis), or Level 3 (Discovery). Write 2–3 sentences explaining why each Level 1 metric deserves the top strip and what decision it supports. Finally, set acceptance criteria: time-to-answer for top questions and max click-depth from KPI to root cause.

Expected Output

A document or slide with 12 metrics grouped into Level 1/2/3, a brief rationale for each Level 1 metric, and acceptance criteria (e.g., answer in <10s; ≤2 clicks to root cause).

Information Architecture For Dashboards — Quick Test

Test your knowledge with 8 questions. Pass with 70% or higher.

