Why this matters
As a Business Analyst, you turn messy stakeholder input into clear, buildable requirements. Mislabeling an assumption as a constraint (or missing a real constraint) leads to blown budgets, missed deadlines, rework, and compliance risk. Getting this right helps you:
- Set realistic scope and timelines.
- Prevent compliance and security issues.
- Choose feasible solutions early (build vs. buy).
- Forecast resources accurately.
- Align stakeholders on what is fixed vs. flexible.
Concept explained simply
Constraint: a non-negotiable boundary you must work within. Think of it as a wall (law, budget cap, deadline, tech limit).
Assumption: a belief you treat as true for now, but it could be wrong. Think of it as a ladder you can climb only if it holds—so you test it.
Quick examples
- Constraint: “Must comply with GDPR.”
- Assumption: “Users will accept SSO.”
- Constraint: “Delivery by Oct 31 due to contract.”
- Assumption: “Legacy system can handle 2x load.”
A dependable method: C-TRaD-S framework
Use this to find constraints first, then assumptions.
- Compliance: laws, standards, internal policies.
- Time: fixed dates, release windows, service-level targets.
- Resources: budget caps, people, skills, tools.
- Architecture/Tech: legacy limits, vendor contracts, performance ceilings.
- Data: retention, residency, privacy, quality.
- Scope: must-have features, out-of-scope items already decided.
Then articulate assumptions in four buckets (SLED):
- Stakeholder: availability, engagement, decision speed.
- Logistics: environments, access, dependencies.
- Experiential (user behavior/UX): adoption, usage patterns.
- Delivery: vendor cooperation, change approvals, risk tolerance.
Mini task: turn inputs into findings
- List every “must/shall/cannot” you heard. These are likely constraints.
- List every “we think/probably/should” you heard. These are likely assumptions.
- For each assumption, add a validation step (interview, POC, test data, pilot).
Step-by-step workflow
- Collect signals: scan meeting notes, emails, RFPs, contracts, policies.
- Extract candidates: highlight “must/cannot/by date” (constraints) and “we think/plan to/should” (assumptions).
- Classify: tag each as C or A. If unsure, treat as assumption until proven.
- Quantify impact: rate impact (High/Med/Low) and certainty (High/Med/Low).
- Act: constraints go into scope/requirements; high-impact low-certainty assumptions get validation tasks or risk entries.
- Trace: link each item to a stakeholder, document, and requirement or risk ID.
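The extract-and-classify steps above can be sketched as a small keyword heuristic. This is a minimal illustration, not a substitute for analyst judgment: the cue lists and the sample notes are assumptions, and the default mirrors the rule "treat as assumption until proven."

```python
# Hypothetical keyword heuristic for the extract/classify workflow steps.
# Modal language ("must/shall/cannot/by <date>") suggests a constraint;
# hedged language ("we think/probably/should/plan") suggests an assumption.
CONSTRAINT_CUES = ["must", "shall", "cannot", "required", "by "]
ASSUMPTION_CUES = ["we think", "probably", "should", "we plan", "we assume"]

def classify(statement: str) -> str:
    s = statement.lower()
    # Check hedged language first: "we plan to deliver by Oct" is still a belief.
    if any(cue in s for cue in ASSUMPTION_CUES):
        return "A"  # assumption: needs an owner and a validation step
    if any(cue in s for cue in CONSTRAINT_CUES):
        return "C"  # constraint: needs a source and acceptance criteria
    return "A"      # unsure? treat as an assumption until proven

notes = [
    "The system must comply with GDPR.",
    "We think partners will upload CSVs weekly.",
    "Delivery by Oct 31 due to contract.",
]
for note in notes:
    print(classify(note), note)
```

A real pass over meeting notes still needs human review; the heuristic only surfaces candidates for the C/A tagging step.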
Checklist: good practice
- Each constraint has a source (law, contract, policy, signed decision).
- Each assumption has an owner and a validation due date.
- High-impact assumptions appear in the risk register.
- Constraints are testable (clear wording and acceptance criteria).
Worked examples
Example 1: Healthcare appointment app
Stakeholders say: “We must store data in-country. We think patients will reschedule within the app. Go-live is 12 weeks due to a partnership launch.”
Analysis:
- Constraints: data residency (policy/law), go-live in 12 weeks (external commitment).
- Assumptions: in-app rescheduling adoption.
- Action: add a data residency requirement; add a usability test or pilot to validate adoption.
Example 2: Finance reporting automation
Notes: “We have a $100k cap. Legacy ERP exports CSV monthly. We plan to switch to weekly feeds.”
Analysis:
- Constraints: $100k budget cap; monthly CSV export (current tech limit).
- Assumptions: weekly feeds are feasible (requires ERP or process change).
- Action: add cost guardrails to scope; run a spike to test weekly export feasibility or introduce a staging layer.
Example 3: Retail click-and-collect
Notes: “Stores open 9–21h; we assume staff can scan orders with existing tablets; payment provider requires 3DS.”
Analysis:
- Constraints: store hours; 3DS requirement (payment provider policy).
- Assumptions: tablets suitable for scanning (performance/OS support unknown).
- Action: confirm tablet capability via device test; add 3DS acceptance criteria.
Templates you can reuse
Constraint statement template
“The solution must [verb phrase] because [source: law/policy/contract/decision], measured by [acceptance criteria].”
Example: “The system must store EU customer PII within EU data centers because of GDPR and internal policy, verified by deployment region and audit logs.”
Assumption statement template
“We assume [belief] for [scope/decision], and we will validate by [method] no later than [date]. If false, then [contingency].”
Example: “We assume 60% of users prefer SSO for login; validate via an A/B test in Sprint 3. If false, prioritize the email login flow.”
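The assumption template above is a fill-in-the-blanks structure, so it can be rendered programmatically when you maintain many assumptions. A minimal helper, with all argument values hypothetical:

```python
# Render the assumption statement template from its five slots.
def assumption_statement(belief: str, scope: str, method: str,
                         date: str, contingency: str) -> str:
    return (f"We assume {belief} for {scope}, and we will validate by "
            f"{method} no later than {date}. If false, then {contingency}.")

print(assumption_statement(
    "60% of users prefer SSO",      # belief
    "the login redesign",           # scope/decision
    "an A/B test in Sprint 3",      # validation method
    "2024-06-30",                   # hypothetical due date
    "prioritize the email login flow"))  # contingency
```

Generating statements this way keeps wording consistent across a backlog of assumptions, which makes them easier to review in bulk.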
Exercises
Try the scenario below, then compare with the sample solution. The test is available to everyone; only logged-in users will see saved progress and history.
Exercise 1: Classify items from a stakeholder brief
Scenario: For a B2B portal, stakeholders say: “ISO 27001 must be met. We plan to let partners self-register. Contract goes live on Sept 30. Current SSO supports SAML only. We think partners will upload CSVs weekly.”
- List constraints vs assumptions.
- For each assumption, add a validation method and owner.
- For each constraint, add an acceptance criterion.
Hints
- Look for must/required/dates/vendor limits.
- Unproven behaviors (self-register, weekly uploads) are assumptions.
Sample solution
- Constraints: ISO 27001 compliance; Sept 30 go-live; SSO supports SAML only.
- Assumptions: partners will self-register; partners upload weekly CSVs.
- Validation: quick partner pilot; analytics on CSV upload frequency.
- Acceptance: security audit pass; production live by Sept 30; SAML SSO login success rate ≥ 99%.
Self-review checklist
- I wrote each constraint with a clear source.
- I assigned an owner and a due date to each assumption.
- I defined acceptance criteria for each constraint.
- High-impact assumptions are in the risk register.
Common mistakes and self-check
- Mistake: Treating stakeholder preferences as constraints. Fix: Ask, “What happens if we don’t?” If the answer is inconvenience (not violation), it’s not a constraint.
- Mistake: Leaving assumptions implicit. Fix: Write them down with validation and contingency.
- Mistake: Vague constraints. Fix: Make them testable with measurable criteria.
- Mistake: Never revisiting. Fix: Review constraints/assumptions at each major decision or change request.
Self-check prompts
- Can I point to a document or authority for each constraint?
- Do our top 3 assumptions have validation steps in the current sprint/phase?
- Are any items mislabeled because of politics or convenience?
Practical projects
- Project 1: Take an old project charter and extract constraints/assumptions. Add sources, owners, and tests. Share with a mentor/peer for review.
- Project 2: Build a spreadsheet or board with columns: Item, Type (C/A), Source, Impact, Certainty, Owner, Validation/Criteria, Due, Status.
- Project 3: Run a 30-minute workshop with stakeholders using the C-TRaD-S and SLED prompts. Capture and classify items live.
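The tracker in Project 2 can be sketched as a simple row type plus CSV export. A hypothetical shape matching the listed columns; the sample rows and dates are illustrative only:

```python
# Sketch of the Project 2 tracker: one row per constraint/assumption.
from dataclasses import dataclass, asdict, fields
import csv
import io

@dataclass
class TrackerItem:
    item: str
    type: str        # "C" (constraint) or "A" (assumption)
    source: str      # law, contract, policy, stakeholder, document
    impact: str      # High / Med / Low
    certainty: str   # High / Med / Low
    owner: str
    validation: str  # validation method (A) or acceptance criteria (C)
    due: str
    status: str

rows = [
    TrackerItem("Store EU PII in EU data centers", "C", "GDPR + policy",
                "High", "High", "Security lead",
                "Deployment region + audit logs", "2024-10-31", "Open"),
    TrackerItem("Partners will upload CSVs weekly", "A", "Kickoff notes",
                "High", "Low", "BA",
                "Pilot with two partners", "2024-09-15", "Validating"),
]

# Export to CSV so the board can be shared or imported into a spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=[f.name for f in fields(TrackerItem)])
writer.writeheader()
writer.writerows(asdict(r) for r in rows)
print(buf.getvalue())
```

Keeping Impact and Certainty as columns makes it easy to filter for the high-impact, low-certainty assumptions that need validation tasks first.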
Who this is for
- Business Analysts and Product Analysts working on new or evolving solutions.
- Project Managers needing clear guardrails for planning.
- Anyone preparing for BA interviews and case studies.
Prerequisites
- Basic understanding of stakeholder interviews and requirement types.
- Ability to scan policies/contracts for key clauses.
- Comfort organizing notes into lists and matrices.
Learning path
- Elicit inputs (interviews, documents, demos).
- Classify constraints vs assumptions (C-TRaD-S, SLED).
- Validate assumptions (tests, pilots, data checks).
- Document constraints with acceptance criteria.
- Trace to requirements, scope, and risks.
- Review at milestones and change requests.
Next steps
- Do Exercise 1 and capture your findings.
- Take the Quick Test below to confirm understanding.
Mini challenge
Pick any current initiative. In 15 minutes, list 5 constraints (with sources) and 5 assumptions (with owners and validation dates). Convert one high-impact assumption into a small validation task you can do this week.
Quick Test
Answer the questions to check your understanding. Passing score is 70%.