
Schema And Contract Versioning

Published: January 20, 2026 | Updated: January 20, 2026

Why this matters

As a Backend Engineer, you will evolve APIs and event schemas while real users and services depend on them. Good schema and contract versioning prevents outages, unblocks teams, and enables safe iteration.

  • Add new fields to REST/GraphQL without breaking mobile apps that update slowly.
  • Publish new Kafka event versions that old consumers can still read.
  • Deprecate and remove fields with a clear timeline and telemetry.
  • Roll out breaking changes using parallel versions and controlled traffic shifts.
Real tasks you might face
  • Introduce pagination fields to a popular endpoint with millions of requests/day.
  • Add an enum value to payment status while keeping existing analytics pipelines running.
  • Split a large event into smaller ones and migrate consumers gradually.

Concept explained simply

A contract is the agreed structure and behavior between services or between a service and its clients. A schema is one formal expression of that contract (OpenAPI, GraphQL SDL, Avro/Proto/JSON Schema). Versioning is the strategy to change the contract safely over time.

Mental model

Think of your service as a radio station and clients as radios. If you change frequency suddenly, old radios lose the station. Versioning is broadcasting old and new frequencies for a while, announcing the change, and giving listeners time to retune.

Compatibility basics

  • Backward compatible: the new version can handle data or requests produced under the old one (new reader, old writer). Common example: adding an optional field with a default.
  • Forward compatible: old clients/consumers can handle data from newer producers (old reader, new writer). This relies on tolerant-reader behavior: ignore unknown fields and unexpected enum values.
  • Breaking change: renaming or removing fields, changing types, or changing semantics that clients rely on.
Typical safe vs. unsafe changes
  • Safe (usually): add optional field; add endpoint; add GraphQL field; add Avro field with default; add enum value if clients tolerate unknowns.
  • Risky/breaking: remove/rename field; change type; change meaning; reorder positional parameters; tighten validation without defaults.
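The safe-vs-risky distinction above can be encoded as an automated check. Here is a minimal sketch in Python; the flat schema shape and the three rules are illustrative, not a complete JSON Schema or Avro compatibility checker:

```python
# Minimal, illustrative compatibility check between two flat schemas.
# Each schema maps field name -> {"type": ..., "required": bool, "default": optional}.

def diff_schemas(old: dict, new: dict) -> list[str]:
    """Return a list of breaking-change descriptions (empty = compatible)."""
    breaking = []
    for name, spec in old.items():
        if name not in new:
            breaking.append(f"removed field: {name}")   # removal breaks old readers
        elif new[name]["type"] != spec["type"]:
            breaking.append(f"type change: {name}")     # type changes break both sides
    for name, spec in new.items():
        if name not in old and spec.get("required") and "default" not in spec:
            breaking.append(f"new required field without default: {name}")
    return breaking

old = {"id": {"type": "string", "required": True},
       "total": {"type": "double", "required": True}}
new_ok = dict(old, currency={"type": "string", "required": False, "default": "USD"})
new_bad = {"id": {"type": "string", "required": True},
           "total": {"type": "string", "required": True}}  # double -> string

print(diff_schemas(old, new_ok))   # []
print(diff_schemas(old, new_bad))  # ['type change: total']
```

A check like this can run in CI against the previous released schema to block accidental breaking changes.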

Version identifiers

  • REST: version in path (/v1), in header (X-API-Version), or via media types (Accept: application/vnd.app.v2+json).
  • GraphQL: prefer additive changes + @deprecated; reserve server versioning for major overhauls.
  • Events (Kafka, etc.): schema registry with compatibility rules (BACKWARD, FORWARD, FULL), version tracked by schema ID, not topic name.
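The three REST signals above can coexist; a server typically resolves them in a fixed precedence order. A sketch, assuming the header and media-type names from this article (they are conventions, not a standard):

```python
# Resolve an API version from path, header, or Accept media type, in that order.
import re

def resolve_version(path: str, headers: dict) -> int:
    m = re.match(r"/v(\d+)/", path)                      # 1) path: /v2/users
    if m:
        return int(m.group(1))
    if "X-API-Version" in headers:                       # 2) explicit header
        return int(headers["X-API-Version"])
    m = re.search(r"vnd\.app\.v(\d+)\+json", headers.get("Accept", ""))  # 3) media type
    if m:
        return int(m.group(1))
    return 1                                             # default: oldest stable version

print(resolve_version("/v2/users", {}))                                       # 2
print(resolve_version("/users", {"X-API-Version": "3"}))                      # 3
print(resolve_version("/users", {"Accept": "application/vnd.app.v2+json"}))   # 2
```

Defaulting to the oldest stable version keeps clients that send no signal on the behavior they already depend on.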

Worked examples

Example 1: REST — adding an optional field

Current response:

{
  "id": "u_1",
  "name": "Ada"
}

New need: add "nickname" (optional). Plan:

  • Add field as nullable/omitted. Do not require clients to send it on write.
  • Update OpenAPI with description and example.
  • Monitor error rates; keep old clients working.
Client impact check

If clients are tolerant readers, unknown fields are ignored. No breaking change.
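The tolerant-reader behavior can be made explicit on the client side. A minimal sketch (the `User` model and field list are illustrative):

```python
# Tolerant reader: keep only the fields the client knows and ignore the rest,
# so a server adding "nickname" cannot break deserialization.
import json
from dataclasses import dataclass

@dataclass
class User:
    id: str
    name: str

KNOWN = {"id", "name"}

def parse_user(payload: str) -> User:
    data = json.loads(payload)
    return User(**{k: v for k, v in data.items() if k in KNOWN})

old = parse_user('{"id": "u_1", "name": "Ada"}')
new = parse_user('{"id": "u_1", "name": "Ada", "nickname": "ada_l"}')
assert old == new   # the unknown field is ignored; no breaking change
```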

Example 2: Kafka/Avro — safe addition with defaults

Old schema:

{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "total", "type": "double"}
  ]
}

New schema (add currency with default):

{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "total", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"}
  ]
}
  • Set registry to BACKWARD or FULL compatibility.
  • New consumers (reader schema = new) can read old messages (writer = old) due to default.
Why BACKWARD works here

Avro reader with an extra field uses its default when that field is missing in older messages.
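That resolution rule can be mimicked in a few lines. This is a plain-Python simulation of what a real Avro library (e.g., fastavro) does during decoding, not the library itself:

```python
# Simulates Avro schema resolution: a reader on the NEW schema fills missing
# fields from its defaults when decoding a message written with the OLD schema.

new_schema_fields = [
    {"name": "id", "type": "string"},
    {"name": "total", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"},
]

def resolve(reader_fields: list[dict], writer_record: dict) -> dict:
    out = {}
    for f in reader_fields:
        if f["name"] in writer_record:
            out[f["name"]] = writer_record[f["name"]]
        elif "default" in f:
            out[f["name"]] = f["default"]    # BACKWARD compatibility hinges on this
        else:
            raise ValueError(f"no value or default for {f['name']}")
    return out

old_message = {"id": "o_42", "total": 12.99}   # written with the old schema
print(resolve(new_schema_fields, old_message))
# {'id': 'o_42', 'total': 12.99, 'currency': 'USD'}
```

Note the failure mode the sketch makes visible: add a field without a default and old messages become unreadable by the new consumer.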

Example 3: Breaking change — renaming a REST field

Old: "name". New: "full_name". Safe plan:

  • Phase 1 (additive): Add "full_name" while keeping "name" as alias; server writes both; document "name" as deprecated.
  • Phase 2 (migration): Telemetry to see remaining clients on "name". Communicate removal date.
  • Phase 3 (removal): Release v2 (path or media type). Remove "name" only in v2. Keep v1 for a sunset window.
Server-side mapping idea

On writes: if client sends name, map to full_name. On reads: include both for v1, only full_name for v2.
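The mapping idea above can be sketched in a few lines of Python (function and field names are illustrative):

```python
# v1/v2 aliasing: writes accept either name; reads include both fields on v1
# and only full_name on v2.

def normalize_write(payload: dict) -> dict:
    """Map the deprecated 'name' onto 'full_name' on the way in."""
    data = dict(payload)
    if "name" in data and "full_name" not in data:
        data["full_name"] = data.pop("name")
    return data

def render_read(record: dict, version: int) -> dict:
    out = {"id": record["id"], "full_name": record["full_name"]}
    if version == 1:
        out["name"] = record["full_name"]    # keep the alias alive for v1 clients
    return out

stored = normalize_write({"id": "u_1", "name": "Ada Lovelace"})
print(render_read(stored, version=1))   # includes both name and full_name
print(render_read(stored, version=2))   # full_name only
```

Because storage only ever holds `full_name`, removing the alias later is a pure serialization change, not a data migration.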

Example 4: GraphQL — deprecation flow

  • Add new field fullName.
  • Mark name with @deprecated(reason: "Use fullName. Removal on 2026-06-30").
  • Keep resolver until after the sunset date; track field usage.
Client experience

Tooling surfaces deprecation warnings in IDEs and CI without breaking queries immediately.

Safe change process (works broadly)

  1. Design: classify change (additive vs. breaking). Choose versioning approach (v2 path, header, media type; schema registry mode).
  2. Announce: document change, mark deprecated with a removal date.
  3. Implement: ship additive changes first. Add defaults for messages. Keep old behavior alongside new.
  4. Test: contract tests (consumer-driven where possible), compatibility checks (schema registry), and replay tests on sample traffic.
  5. Rollout: canary or blue-green; monitor errors, latency, and consumer lag.
  6. Sunset: remove after usage drops and deadline passes; keep a rollback plan.
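Step 4's consumer-driven contract tests can be sketched without any framework: each consumer declares the fields it needs, and CI fails if the provider's current response stops satisfying any of them. A Pact-style idea in miniature (consumer names and contracts are illustrative):

```python
# Minimal consumer-driven contract check: compare each consumer's required
# fields against the provider's current response shape.

provider_response = {"id": "u_1", "name": "Ada", "full_name": "Ada Lovelace"}

consumer_contracts = {
    "mobile-app": {"id", "name"},         # old client, still on "name"
    "web-app":    {"id", "full_name"},    # migrated client
}

def check_contracts(response: dict, contracts: dict) -> list[str]:
    failures = []
    for consumer, required in contracts.items():
        gap = required - response.keys()
        if gap:
            failures.append(f"{consumer} missing {sorted(gap)}")
    return failures

assert check_contracts(provider_response, consumer_contracts) == []
# Removing "name" now would fail the mobile-app contract and block the release.
```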
Rollback tip

For events, preserve the old schema ID and producers until consumers confirm readiness. For APIs, keep v1 and v2 behind routing flags.

Exercises you can try

These mirror the interactive exercises below. Write your answers, then compare with solutions.

  • Exercise 1: Plan a non-breaking REST evolution for adding fields and deprecating an old one, including a deprecation timeline.
  • Exercise 2: Configure schema compatibility for a Kafka topic and choose the right mode for adding and removing fields.
  • Self-check checklist:
    • Did you avoid breaking changes where possible?
    • Did you include telemetry and a deprecation date?
    • Did you specify compatibility mode and defaults for events?

Common mistakes and how to self-check

  • Assuming clients update instantly. Mitigation: measure client versions and usage; keep sunset windows.
  • Renaming without aliasing. Mitigation: support both names temporarily; document mapping.
  • Adding fields to Avro without defaults. Mitigation: always provide defaults when readers might be newer.
  • Forgetting tolerant-reader behavior. Mitigation: tests ensuring unknown fields are ignored by clients.
  • Silent breaking validations. Mitigation: version or feature-flag stricter validation; collect error metrics.
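Several of these mitigations come down to knowing who still reads what. A sketch of per-field usage telemetry, using an in-process counter as a stand-in for a real metrics client:

```python
# Count reads of each field so you know when the deprecated one is unused
# and safe to remove. In production this would emit to a metrics system.
from collections import Counter

field_reads = Counter()

def serve_profile(record: dict, requested_fields: set[str]) -> dict:
    for f in requested_fields:
        field_reads[f] += 1                 # stand-in for a metrics client call
    return {f: record[f] for f in requested_fields if f in record}

record = {"id": "u_1", "name": "Ada", "full_name": "Ada Lovelace"}
serve_profile(record, {"id", "name"})        # legacy client
serve_profile(record, {"id", "full_name"})   # migrated client
print(field_reads["name"])   # 1 -> still one caller on the deprecated field
```

When the deprecated field's counter stays at zero past the sunset date, removal is evidence-based rather than a guess.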
Self-check routine
  • Can an old client call your service after the change without code changes?
  • Can a new consumer read old events published months ago?
  • Is there a clear removal date and a way to know when it is safe to remove?

Practical projects

  • Dual-version REST demo: Expose /v1/users and /v2/users. Add a new field in v2, keep v1 stable, add a simple dashboard showing v1 vs v2 traffic.
  • Event evolution lab: Create a Kafka-like simulation or local queue; publish events with schema v1 then v2 (with defaults). Write consumers pinned to different schema versions; verify who can read what.
  • Contract tests: Implement a small provider and two consumers with Pact-like tests (conceptually). Break the provider and watch tests fail, then fix with a backward-compatible change.

Who this is for

  • Backend Engineers evolving APIs/events with multiple clients.
  • Platform/Integration Engineers managing schema registries and compatibility policies.
  • Developers working on mobile/IoT backends where updates roll out slowly.

Prerequisites

  • Comfort with REST or GraphQL fundamentals.
  • Basic understanding of message brokers (Kafka, RabbitMQ) and serialization (JSON, Avro, Proto) concepts.
  • Familiarity with semantic versioning (MAJOR.MINOR.PATCH).

Learning path

  1. Review compatibility types and tolerant-reader principles.
  2. Practice additive API changes and deprecations.
  3. Learn schema registry modes and defaults (Avro/Proto/JSON Schema concepts).
  4. Implement contract tests and rollout strategies (canary/blue-green).
  5. Plan and execute one breaking change with a parallel version and sunset.

Next steps

  • Instrument usage metrics per version.
  • Automate schema checks in CI to block incompatible changes.
  • Adopt a deprecation policy template used by your team.

Quick Test note: The quick test is available to everyone. If you are logged in, your progress and scores will be saved.

Mini challenge

Your product team wants to change price from number to string to include currency symbols (e.g., "12.99 USD"). Design a plan that avoids breaking current consumers that expect a number. Include data shape, versioning choice, rollout, and sunset steps. Keep it to five bullet points.

Practice Exercises

2 exercises to complete

Instructions

Your /v1/profile endpoint returns:

{
  "id": "u_1",
  "name": "Ada"
}

Business asks to add nickname (optional) and later replace name with full_name. Propose a safe plan covering:

  • Response shapes in each phase
  • Versioning approach (path, header, or media type)
  • Deprecation message and timeline
  • Telemetry to track readiness

Write your plan, then compare with the solution.

Expected Output
A phased rollout where v1 remains backward compatible while v2 introduces the removal. Clear deprecation notice, telemetry, and a sunset date.

Schema And Contract Versioning — Quick Test

Test your knowledge with 10 questions. Pass with 70% or higher.
