IAM Roles And Least Privilege

Learn IAM Roles And Least Privilege for free with explanations, exercises, and a quick test, written for Data Platform Engineers.

Published: January 11, 2026 | Updated: January 11, 2026

Why this matters

As a Data Platform Engineer, you design and operate lakes, warehouses, and pipelines that touch sensitive data. Least privilege reduces blast radius, simplifies audits, and keeps you compliant. Real tasks you will face include:

  • Creating a pipeline role that can write to a single bucket/prefix, but nowhere else.
  • Giving analysts read-only access to curated datasets, while blocking raw PII.
  • Issuing temporary, time-bound access for backfills or incident response.
  • Implementing environment isolation: dev/test/prod do not cross-access.
  • Proving access controls to auditors with clean, reviewable policies.

Quick win: 15-minute checklist to reduce risk today
  • Rotate long-lived user keys into short-lived, role-based credentials (a sketch of what this looks like follows this checklist).
  • Replace wildcards in policies ("*") with specific actions and resources.
  • Add a break-glass role with MFA and explicit logging, restricted to on-call.
  • Enable access logs for your storage and warehouse.
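
The first item deserves a concrete shape. Below is a minimal sketch of a role-based, short-lived session in the same pseudo-notation used in the worked examples later on; the role name, trusted principal, and session fields are illustrative assumptions, not any specific cloud's API:

{
  "Role": "pipeline_runner",
  "TrustedPrincipals": ["service://orchestrator"],
  "Session": {"MaxDuration": "1h", "RequireMFAForHumans": true},
  "Allow": [
    {"Action": ["storage:PutObject"], "Resource": "storage://datalake/bronze/sales/*"}
  ]
}

The workload assumes the role, receives credentials that expire after MaxDuration, and never holds a permanent key that could leak.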

Concept explained simply

Identity and Access Management (IAM) decides who (principal) can do what (action) on which data (resource), under which conditions (time, IP, tags), with deny-by-default.

  • Principal: user, group, service account, or role assumed by a workload.
  • Action: verbs like read, write, list, create, delete, admin.
  • Resource: the exact objects, tables, or paths that can be accessed.
  • Policy/Role: a reusable permission set you attach or assume.
  • Conditions: constraints such as time-bound access, tags, network, encryption keys.

Evaluation mental model:

  • Implicit deny unless explicitly allowed.
  • Explicit deny overrides any allow.
  • Least privilege: only grant what is needed, at the narrowest scope, for the shortest time.
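
To make these evaluation rules concrete, here is a small pseudo-policy in the same notation as the worked examples below; the paths and actions are illustrative:

{
  "Role": "example_curated_reader",
  "Allow": [
    {"Action": ["storage:GetObject"], "Resource": "storage://datalake/curated/*"}
  ],
  "Deny": [
    {"Action": ["storage:*"], "Resource": "storage://datalake/curated/pii/*"}
  ]
}

Reading curated/reports/q1.parquet succeeds (explicitly allowed). Reading curated/pii/customers.parquet fails, because the explicit deny wins over the broader allow. Writing anywhere fails, because no allow covers it (implicit deny).
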
Mental model: keys, rooms, and time-limited passes

Think of roles as keyrings. Each key opens one room (resource) for certain actions. You issue temporary guest passes (short-lived credentials) to visitors (workloads and people), and the passes expire automatically after a short time. A bright red "STOP" sign (explicit deny) blocks entry even if someone holds a key.

Core principles checklist

  • Grant roles to workloads, not to individuals wherever possible.
  • Scope to specific resources: paths, schemas, tables, topics.
  • Restrict actions to the minimum set required.
  • Use temporary credentials with short durations for human access.
  • Separate duties: build vs deploy, pipeline vs analyst vs admin.
  • Isolate environments: no dev principal can touch prod.
  • Use conditions: time windows, tags/labels, IP/VPC, encryption keys.
  • Create a monitored break-glass role with MFA and high-friction approval (sketched after this checklist).
  • Log all access and review regularly.
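
A sketch of the break-glass role from the checklist, again in the lesson's pseudo-notation; the condition and audit fields are illustrative assumptions rather than any specific product's syntax:

{
  "Role": "break_glass_admin",
  "TrustedPrincipals": ["group://on-call"],
  "Allow": [
    {"Action": ["storage:*", "warehouse:*"], "Resource": ["storage://datalake/*", "warehouse://*"]}
  ],
  "Condition": {"RequireMFA": true, "MaxSessionDuration": "1h", "RequireApprovedTicket": true},
  "Audit": {"LogAllActions": true, "NotifyChannel": "security-alerts"}
}

Broad permissions are tolerable here only because assuming the role is rare, MFA-gated, time-boxed, and loudly logged.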

Worked examples

Example 1: Pipeline can write only to bronze/sales prefix

Goal: A batch job writes Parquet to a single prefix. It must be able to list that prefix and put objects there, but it must not read objects back, delete anything, or touch any other path.

{
  "Role": "pipeline_bronze_sales_writer",
  "Allow": [
    {"Action": ["storage:List"], "Resource": "storage://datalake/bronze/sales/"},
    {"Action": ["storage:PutObject"], "Resource": "storage://datalake/bronze/sales/*"}
  ],
  "Deny": [
    {"Action": ["storage:DeleteObject", "storage:GetObject"], "Resource": "storage://datalake/bronze/sales/*"},
    {"Action": ["storage:*"], "Resource": "storage://datalake/*", "Condition": {"StringNotLike": {"path": "bronze/sales/*"}}}
  ]
}

Notes: precise prefix, no wildcard actions, explicit denies for read and delete on the prefix, and a catch-all deny for everything outside it.

Example 2: Analyst read-only on curated schema with row-level filter

Goal: Analysts can SELECT from curated.sales through a safe view, with no write access and no access to raw.

{
  "Role": "analyst_curated_reader",
  "Allow": [
    {"Action": ["warehouse:Select"], "Resource": ["warehouse://curated.sales/*", "warehouse://views/curated_sales_safe"]}
  ],
  "Deny": [
    {"Action": ["warehouse:Insert", "warehouse:Update", "warehouse:Delete"], "Resource": "warehouse://curated.sales/*"},
    {"Action": ["warehouse:Select"], "Resource": "warehouse://raw/*"}
  ],
  "RowLevelPolicy": {
    "View": "views/curated_sales_safe",
    "Predicate": "region = CURRENT_USER_REGION()"
  }
}

Notes: route all access through the safe view that implements row-level security; the base table is never granted directly.

Example 3: 2-hour just-in-time access for backfill

Goal: Engineer performs a backfill in staging for 2 hours only.

{
  "Role": "jit_staging_backfill",
  "Allow": [
    {"Action": ["orchestrator:Run", "warehouse:Merge"], "Resource": "staging://jobs/backfill/*"}
  ],
  "Condition": {"TimeBound": {"NotAfter": "+2h"}, "Env": "staging"}
}

Notes: short duration, environment-scoped, defined actions only.

Example 4: Environment isolation via deny guardrail

Goal: Prevent any dev principal from accessing prod.

{
  "Policy": "guardrail_no_dev_to_prod",
  "Deny": [
    {"Action": ["*"], "Resource": "prod://*", "Condition": {"PrincipalTag": {"env": "dev"}}}
  ]
}

Notes: a top-level explicit deny guardrail is simple and effective.

Designing least-privilege roles (step-by-step)

  1. Inventory the workload: what actions does it truly need?
  2. Map exact resources: bucket prefixes, schemas, tables, topics.
  3. Create a dedicated role per workload persona (pipeline, analyst, admin).
  4. Grant only necessary actions; avoid wildcards.
  5. Add conditions: environment, time, network, tags, KMS/key constraints.
  6. Add explicit denies for dangerous actions (e.g., delete, admin) where helpful.
  7. Test with dry-run: simulate access and verify logs before enabling in prod. A template combining these steps is sketched below.
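
A template that pulls these steps together for a hypothetical pipeline role; every name, path, and condition key below is a placeholder to adapt:

{
  "Role": "pipeline_<domain>_writer",
  "TrustedPrincipals": ["service://orchestrator"],
  "Allow": [
    {"Action": ["storage:List", "storage:PutObject"], "Resource": "storage://datalake/bronze/<domain>/*"}
  ],
  "Deny": [
    {"Action": ["storage:DeleteObject"], "Resource": "storage://datalake/*"}
  ],
  "Condition": {"Env": "prod", "RequireEncryptionKey": "kms://datalake-bronze"}
}

Before enabling it, simulate the workload's real calls against the role (step 7) and confirm in the access logs that nothing outside the intended prefix was attempted or allowed.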

Exercises

Do these in a notebook or your preferred editor, then check yourself with the quick test at the end.

Exercise 1: Tighten an over-permissive storage policy

You have this current policy for a batch job:

{
  "Allow": [{"Action": "storage:*", "Resource": "storage://datalake/*"}]
}

Goal: The job should only list and write to storage://datalake/bronze/sales/. It must not delete anything and must not access other paths.

  • Write a least-privilege replacement.
  • Add an explicit deny for delete actions.
  • Keep it readable and auditable.

Expected output: a policy that allows only List and PutObject on storage://datalake/bronze/sales/ (and its children), with an explicit deny for DeleteObject and for any access outside that prefix.

Exercise 2: Backfill + Analyst design

Constraints:

  • Backfill job in staging can read raw and write bronze for 24 hours only.
  • Backfill must never touch prod.
  • Data analyst needs read-only access to curated.sales via a view, with row-level filter on region = EMEA.
  • Use roles, conditions, and time limits. No long-lived user keys.

Deliverables:

  • Role and policy outline for the backfill (with time-bound condition and environment constraint).
  • Role and policy outline for the analyst (read-only via view with row-level filter).
  • Self-check when done:
    • No wildcard resources where a specific path exists.
    • No write actions for the analyst.
    • Backfill role cannot run after its end time or outside staging.

Common mistakes and self-check

  • Using "*" for actions or resources. Fix: enumerate exact actions and resources.
  • Mixing dev and prod access. Fix: enforce environment tags and explicit denies.
  • Granting user keys instead of role-based, short-lived credentials. Fix: use temporary sessions with MFA.
  • No row-level or column-level controls. Fix: expose safe views and data-masking.
  • Forgetting explicit deny guardrails. Fix: add top-level denies for cross-env or destructive actions.
  • One mega-role for all. Fix: split by persona and workload.
  • No logging or reviews. Fix: enable access logs and review last-used/unused permissions.

Quick self-audit checklist
  • Can I state each role’s purpose in one sentence?
  • Does each permission tie to a real action the workload performs?
  • Are all resources scoped to paths/schemas, not just accounts/projects?
  • Do destructive actions require extra conditions or are explicitly denied?
  • Are human accesses temporary and MFA-protected?

Practical projects

  • Lock down a lakehouse: implement three roles — pipeline_bronze_writer, analyst_curated_reader (via safe view), and break_glass_admin with MFA — and validate using dry-run tests and logs.
  • Environment guardrails: add an explicit deny so dev and test principals cannot access prod resources, then verify by attempting cross-env operations.
  • Row-level security: build a view over curated.sales that filters by region and grant analysts access only to that view. Confirm EMEA analysts see only EMEA rows.

Who this is for, prerequisites, learning path

Who this is for
  • Data Platform Engineers designing secure data lakes/warehouses.
  • Data Engineers creating pipelines that must access storage safely.
  • Analytics Engineers/DBAs implementing fine-grained access.

Prerequisites
  • Basic understanding of data storage (object storage, tables/schemas).
  • Familiarity with roles/policies concepts in a major cloud or warehouse.
  • Ability to read simple JSON/YAML-style policy docs.

Suggested learning path
  • Start: This lesson — principles, examples, exercises.
  • Next: Data encryption and key management to pair IAM with strong cryptography.
  • Then: Monitoring and incident response to detect misuse quickly.

Next steps

  • Complete the exercises and run the quick test below.
  • Refactor one real policy in your environment using the checklist.
  • Schedule a 30-day review of access logs and unused permissions.

Mini challenge

Audit this scenario and propose a fix in three bullet points: An analyst has storage:* on storage://datalake/ and warehouse:Select on warehouse://raw/* and curated/*, using long-lived user keys. What would you do in the next 24 hours to reduce risk without blocking work?


IAM Roles And Least Privilege — Quick Test

Test your knowledge with 10 questions. Pass with 70% or higher.

