
Camera And Optics Basics

Learn Camera And Optics Basics for free with explanations, exercises, and a quick test (for Computer Vision Engineers).

Published: January 5, 2026 | Updated: January 5, 2026

Why this matters

Real-world computer vision performance starts with photons hitting a sensor. Understanding lenses, exposure, and sensors helps you design reliable pipelines, reduce noise and blur, size datasets correctly, and debug issues faster.

  • Product CV: Choose lenses and exposure that keep barcodes or labels sharp on a conveyor.
  • Robotics: Balance shutter speed against light to avoid motion blur during navigation.
  • Mobile CV: Handle rolling shutter and wide-angle distortion in AR and SLAM.
  • Research/ML: Collect cleaner data to improve training signal and reduce domain shift.

Who this is for

  • Computer Vision Engineers and ML practitioners building vision systems.
  • Data scientists labeling or curating image/video datasets.
  • Developers integrating cameras into apps, robots, or embedded devices.

Prerequisites

  • Basic linear algebra and trigonometry (angles, tangent, arctangent).
  • Familiarity with coordinate systems and pixels.
  • Optional but helpful: basic Python and NumPy for calibration tasks.

Concept explained simply

A camera turns light into numbers. The lens guides light; the aperture and shutter decide how much and how long; the sensor collects and converts it to pixel values.

  • Focal length (f): how strongly the lens converges light. Short f = wide view; long f = narrow view.
  • Field of View (FOV): how much of the scene is captured. Roughly: FOV ≈ 2 × arctan(sensor_size / (2f)).
  • Aperture (f-number, e.g., f/2.8): affects brightness and depth of field. Smaller f-number = more light, shallower focus.
  • Shutter speed (e.g., 1/250 s): exposure time. Faster shutter reduces motion blur but needs more light.
  • ISO: sensor gain. Higher ISO brightens but adds noise.
  • Sensor size and pixel pitch: larger pixels gather more light and have better signal-to-noise in low light.
  • Distortion: wide-angle lenses bow straight lines outward near the edges (barrel); telephoto lenses can pinch them inward (pincushion). Both are modeled and corrected during calibration.
  • Rolling vs global shutter: rolling reads line-by-line (can skew fast motion); global captures at once.
  • Color filter array (Bayer): most sensors see through R/G/B filters; requires demosaicing and white balance.

Mental model

Imagine a pinhole camera. Shrink the hole: sharper but darker. Add glass (a lens) to get both sharpness and light. Now control three levers—aperture, shutter, ISO—to hit a target brightness while keeping blur and noise low. Finally, fix the quirks: distortion, color casts, and rolling-shutter artifacts.

Key formulas and quick checks
  • Horizontal FOV ≈ 2 × arctan(sensor_width / (2f))
  • Exposure change in stops: doubling exposure time = +1 stop; doubling ISO = +1 stop; f-number ×√2 = −1 stop of light
  • Radial distortion (one form): r_d = r × (1 + k1 r^2 + k2 r^4 + ...), where r is normalized radius
  • Motion blur distance ≈ scene_speed × shutter_time; in pixels ≈ blur_distance / scene_scale_per_pixel
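
These checks are easy to script. A minimal Python sketch of the formulas above (the function names are illustrative, not from any particular library):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal FOV in degrees from the pinhole relation FOV = 2*atan(w / (2f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def exposure_stops(shutter_ratio=1.0, iso_ratio=1.0, fnumber_ratio=1.0):
    """Net exposure change in stops; each ratio is new/old for that lever.
    Doubling shutter time or ISO adds 1 stop; multiplying the f-number
    by sqrt(2) removes 1 stop."""
    return (math.log2(shutter_ratio) + math.log2(iso_ratio)
            - 2 * math.log2(fnumber_ratio))

def motion_blur_px(speed_m_s, shutter_s, meters_per_pixel):
    """Approximate motion blur length in pixels."""
    return speed_m_s * shutter_s / meters_per_pixel

def radial_distort(r, k1, k2=0.0):
    """Distorted normalized radius: r_d = r * (1 + k1*r^2 + k2*r^4)."""
    return r * (1 + k1 * r**2 + k2 * r**4)
```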

Worked examples

Example 1: Compute horizontal FOV

Given sensor width = 6.3 mm and focal length f = 4.0 mm.

FOV ≈ 2 × arctan(6.3 / (2 × 4.0)) = 2 × arctan(0.7875) ≈ 2 × 38.2° ≈ 76.4°.

Result: about 76° horizontal FOV (wide).
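
The same arithmetic as a quick Python check:

```python
import math

fov_deg = math.degrees(2 * math.atan(6.3 / (2 * 4.0)))
print(round(fov_deg, 1))  # 76.4
```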

Example 2: Exposure change in stops

Start from f/4, 1/200 s, ISO 100. You want the same brightness but a shutter of 1/800 s (2 stops faster).

  • Need +2 stops from aperture/ISO: options include f/2 (open 2 stops) or ISO 400 (+2 stops), or mix: f/2.8 (+1), ISO 200 (+1).

Result: f/2.8, 1/800 s, ISO 200 is one valid combination.
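
A small sketch to verify the stop bookkeeping (the `stops` helper here is illustrative):

```python
import math

def stops(old, new, lever):
    """Stop change when moving one exposure lever from `old` to `new`."""
    if lever == "shutter":               # longer exposure time = more light
        return math.log2(new / old)
    if lever == "iso":                   # higher gain = brighter image
        return math.log2(new / old)
    if lever == "fnumber":               # larger f-number = less light
        return -2 * math.log2(new / old)
    raise ValueError(lever)

total = (stops(1 / 200, 1 / 800, "shutter")   # -2 stops: 4x shorter exposure
         + stops(4.0, 2.8, "fnumber")         # ~+1 stop: f/4 -> f/2.8
         + stops(100, 200, "iso"))            # +1 stop: double the gain
print(round(total, 2))  # ~0.03, i.e. essentially unchanged brightness
                        # (f/2.8 is the rounded marking for f/4 / sqrt(2) ≈ f/2.83)
```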

Example 3: Estimate motion blur in pixels

Object speed = 0.5 m/s, shutter = 1/250 s → blur distance ≈ 0.5 × 0.004 = 0.002 m = 2 mm. If each pixel covers 0.05 mm, blur ≈ 2 / 0.05 = 40 pixels. Too high—use a faster shutter or reduce speed.
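
The same estimate in a few lines of Python:

```python
speed_m_s = 0.5
shutter_s = 1 / 250
m_per_px = 0.05e-3                       # each pixel covers 0.05 mm of the scene

blur_m = speed_m_s * shutter_s           # ~0.002 m (2 mm)
blur_px = blur_m / m_per_px              # ~40 pixels
print(round(blur_m, 4), round(blur_px, 1))
```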

Example 4: Distortion quick correction value

Normalized radius r = 0.5, k1 = -0.2, k2 = 0. r_d = 0.5 × (1 + (-0.2) × 0.25) = 0.5 × (1 - 0.05) = 0.475. With this barrel model (negative k1), the distorted point lies slightly closer to the image center than the ideal point; undistortion moves it back outward.
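
And the same check in Python:

```python
r, k1 = 0.5, -0.2
r_d = r * (1 + k1 * r**2)
print(round(r_d, 3))  # 0.475
```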

Exercises

These mirror the tasks in the Exercises section below. Try them here first, then check the detailed solutions.

Exercise 1: Lens selection and FOV planning

You have a sensor with 6.4 mm horizontal size and want about 70° horizontal FOV. What focal length should you pick (approximate)? Use f ≈ sensor_width / (2 × tan(FOV/2)).

  • Target: round to nearest 0.1 mm.
  • Self-check: does a shorter focal length give wider FOV?

Exercise 2: Keep brightness, cut motion blur

Current settings: f/2.8, 1/200 s, ISO 100. You see blur and want 1/800 s (2 stops faster) but keep the same brightness. Propose two valid setting combinations.

  • Constraint: noise should be moderate; prefer changing aperture before maxing ISO.

Exercise checklist

  • I computed FOV with the correct sensor dimension (horizontal for horizontal FOV).
  • I adjusted exposure by exact stops (no guesswork).
  • I confirmed trade-offs: a faster shutter needs a wider aperture or higher ISO (more noise) to keep the same brightness.

Common mistakes and self-check

  • Mixing sensor diagonal with width/height for FOV. Self-check: confirm which FOV you compute (H/V/Diagonal).
  • Changing multiple exposure levers without tracking stops. Self-check: write each stop change explicitly.
  • Ignoring rolling shutter when objects move fast. Self-check: inspect vertical edges for tilt or skew.
  • Assuming higher resolution always helps. Self-check: compare noise and motion blur; sometimes faster shutter at lower resolution wins.
  • Skipping lens distortion calibration for wide lenses. Self-check: do straight lines bow near edges? Run a checkerboard calibration if yes.
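
If straight lines do bow near the edges, a standard checkerboard calibration recovers the intrinsics and distortion coefficients. A minimal sketch using OpenCV, assuming `opencv-python` is installed, the board has 9×6 inner corners, and the images sit in a hypothetical `calib/` folder:

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners per row/column (assumption)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)  # board plane, unit squares

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib/*.png"):              # hypothetical folder of checkerboard shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]                  # (width, height)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix (fx, fy, cx, cy):\n", K)
print("Distortion (k1, k2, p1, p2, k3):", dist.ravel())
```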

Practical projects

  • Build a FOV calculator: input sensor size and focal length; output H/V/Diagonal FOV and suggested working distance for a target object width.
  • Exposure triangle notebook: given desired blur limit (pixels), compute minimum shutter, then solve for aperture/ISO to keep brightness.
  • Distortion demo: render a grid, apply simple barrel distortion with a k1 term, then undistort it and measure residual error.
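
As a starting point for the exposure-triangle notebook, here is a rough sketch; the helper names, the stop tolerance, and the example numbers are all assumptions:

```python
import math

def max_shutter_for_blur(blur_limit_px, speed_m_s, m_per_px):
    """Longest exposure time (s) that keeps motion blur under blur_limit_px."""
    return blur_limit_px * m_per_px / speed_m_s

def compensate_stops(stops_needed, base_fnum=2.8, base_iso=100,
                     f_numbers=(1.4, 2, 2.8, 4, 5.6, 8),
                     isos=(100, 200, 400, 800, 1600)):
    """(f-number, ISO) pairs that add roughly `stops_needed` stops of light
    relative to the base settings, lowest-ISO (least noisy) options first."""
    options = []
    for fnum in f_numbers:
        for iso in isos:
            gained = -2 * math.log2(fnum / base_fnum) + math.log2(iso / base_iso)
            if abs(gained - stops_needed) < 0.2:       # within ~1/5 stop
                options.append((fnum, iso))
    return sorted(options, key=lambda o: o[1])

# Example: allow at most 10 px of blur for a 0.4 m/s object at 0.05 mm per pixel,
# starting from f/2.8, 1/200 s, ISO 100.
t_max = max_shutter_for_blur(10, 0.4, 0.05e-3)         # 0.00125 s = 1/800 s
stops_needed = math.log2((1 / 200) / t_max)            # 2 stops of light to recover
print(round(t_max, 5), round(stops_needed, 2))         # 0.00125 2.0
print(compensate_stops(stops_needed))                  # e.g. (1.4, 100), (2, 200), ...
```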

Learning path

  • Start: Camera model (pinhole), FOV, exposure triangle.
  • Next: Distortion models and camera calibration (intrinsics and extrinsics).
  • Then: Photometric effects (white balance, gamma, tone mapping) and RAW vs JPEG.
  • Finally: Sensor timing (rolling vs global shutter) and synchronization for multi-camera setups.

Next steps

  • Calibrate a real or synthetic camera to estimate focal length, principal point, and distortion.
  • Design a data capture plan: choose lens, FOV, and exposure to meet blur and noise limits for your task.
  • Document your assumptions (sensor width, pixel size, shutter type) alongside datasets.

Mini challenge

You must read 10 mm-wide text from 1 m away. Your sensor width is 7.2 mm and you want the text to occupy at least 400 pixels horizontally on a 1920 px-wide image. Choose an approximate focal length that satisfies both FOV and sampling. State your assumptions and trade-offs (depth of field vs light). There is no single right answer—justify yours.


Practice Exercises

2 exercises to complete

Instructions

You have a sensor with 6.4 mm horizontal size and want about 70° horizontal FOV. What focal length should you pick (approximate)? Use f ≈ sensor_width / (2 × tan(FOV/2)). Round to nearest 0.1 mm.

Expected Output
Approximately 4.6–4.7 mm focal length (about 4.6 mm).
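
A quick check of this expected output (not part of the exercise itself):

```python
import math

f_mm = 6.4 / (2 * math.tan(math.radians(70 / 2)))
print(round(f_mm, 1))  # 4.6
```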

Camera And Optics Basics — Quick Test

Test your knowledge with 10 questions. Pass with 70% or higher.
