Overview#
The camera is the fundamental measurement instrument in photogrammetry. Unlike a consumer camera optimized for visual appeal, a photogrammetric camera is an optical-mechanical system engineered for geometric precision. Every three-dimensional coordinate extracted from imagery depends on knowing -- with high accuracy -- the internal geometry of the camera that captured the photographs. Without that knowledge, the mathematical relationship between image coordinates and ground coordinates breaks down.
Photogrammetric cameras have evolved dramatically over the past century: from glass-plate film cameras in the early 1900s, through large-format film aerial cameras that dominated mapping from the 1940s through the 1990s, to the digital frame and linear array sensors that define modern practice. Throughout this evolution, the core requirement has remained constant -- the camera's internal geometry must be known, stable, and reproducible.
"The camera is the basic instrument used in photogrammetry. In a broad sense, the camera exposes a sensitized medium (film or digital sensor) to light reflected from objects, thereby creating an image that is a central perspective projection of the objects." -- Wolf, Dewitt & Wilkinson, Elements of Photogrammetry with Applications in GIS (4th Ed.), Ch. 4, p. 63
For exam preparation, understanding camera systems means understanding the difference between metric and non-metric cameras, how interior orientation parameters define the camera's geometry, what camera calibration accomplishes, and how modern digital sensors have transformed data acquisition. These concepts underpin everything that follows in photogrammetric processing.
Metric vs. Non-Metric Cameras#
The distinction between metric and non-metric cameras is foundational. A metric camera is one that has been designed and calibrated for photogrammetric measurement. Its internal geometry is stable, known, and documented in a calibration certificate. A non-metric camera is any camera not specifically designed for measurement -- consumer DSLRs, smartphone cameras, and general-purpose industrial cameras all fall into this category.
Key characteristics of metric cameras:
| Feature | Metric Camera | Non-Metric Camera |
|---|---|---|
| Calibrated focal length | Known to micrometers | Nominal only |
| Principal point location | Precisely determined | Unknown or approximate |
| Lens distortion | Characterized and documented | Unknown |
| Fiducial marks | Present (film cameras) | Absent |
| Geometric stability | Designed for stability | Variable |
| Calibration certificate | Issued by manufacturer or lab | None |
Fiducial marks are reference markers built into film-based metric cameras, typically located at the four corners or four midpoints of the image frame. When the film is exposed, these marks are recorded on every photograph. During photogrammetric processing, the fiducial marks define the internal coordinate system of the image, so that measured photo coordinates can be related to the principal point and the other interior orientation parameters. Digital cameras do not have fiducial marks because the sensor array itself defines the image coordinate system through its pixel geometry.
"Fiducial marks define the coordinate axes of the image space. Their positions, together with the calibrated focal length and principal point offset, constitute the inner orientation of the camera." -- Ghilani & Wolf, Elementary Surveying: An Introduction to Geomatics (15th Ed.), Ch. 27, p. 803
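In softcopy workflows, this idea is applied by fitting a transformation from scanner (pixel) coordinates to the calibrated fiducial coordinates on the certificate. A minimal sketch using a 2D affine fit by least squares; all coordinate values below are invented for illustration:

```python
import numpy as np

# Measured fiducial positions from a film scan (pixels) and the calibrated
# fiducial coordinates from the camera certificate (mm). All values invented.
measured = np.array([[120.0, 130.0], [11880.0, 125.0],
                     [11885.0, 11890.0], [118.0, 11895.0]])   # pixels
calibrated = np.array([[-110.0, 110.0], [110.0, 110.0],
                       [110.0, -110.0], [-110.0, -110.0]])    # mm

# Fit x' = a*x + b*y + c and y' = d*x + e*y + f by least squares.
n = len(measured)
A = np.hstack([measured, np.ones((n, 1))])
coeff_x, *_ = np.linalg.lstsq(A, calibrated[:, 0], rcond=None)
coeff_y, *_ = np.linalg.lstsq(A, calibrated[:, 1], rcond=None)

def to_image_mm(px, py):
    """Map a scanned pixel position into the calibrated image system (mm)."""
    v = np.array([px, py, 1.0])
    return float(coeff_x @ v), float(coeff_y @ v)

print(to_image_mm(6000.0, 6000.0))   # a point near the frame center
```

With four fiducials and six affine unknowns, the redundancy gives a first check on scan quality: large residuals at the fiducials indicate film deformation or scanner error.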
Non-metric cameras can still be used for photogrammetric work -- modern Structure from Motion (SfM) software routinely processes imagery from consumer cameras -- but the interior orientation parameters must be solved during processing through self-calibration. This adds unknowns to the adjustment and generally reduces the geometric rigor compared to a pre-calibrated metric camera.
Film-Based Aerial Cameras#
Film-based aerial cameras dominated photogrammetric mapping from the mid-twentieth century through the early 2000s. These instruments were precision-engineered, large-format cameras designed for vertical aerial photography from manned aircraft.
Notable Systems
| Camera | Manufacturer | Format | Focal Length Options |
|---|---|---|---|
| RC-30 | Wild/Leica | 23 cm x 23 cm | 88 mm, 153 mm, 210 mm, 305 mm |
| RMK Top | Zeiss | 23 cm x 23 cm | 88 mm, 153 mm, 210 mm, 305 mm |
| LMK | Linhof/Jenoptik | 23 cm x 23 cm | 153 mm, 305 mm |
The standard aerial film format was 23 cm x 23 cm (approximately 9 inches x 9 inches), using 240 mm wide roll film. This large format provided high ground resolution and geometric redundancy. A typical 153 mm (6-inch) focal length lens combined with the 23 cm format produced a field of view well suited to topographic mapping at standard flight altitudes.
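As a quick check on these numbers, photo scale and ground coverage follow directly from focal length, flying height, and format size. A minimal sketch (the 1,530 m flying height is an assumed example value):

```python
# Photo scale: 1/S = f / H', where H' is flying height above ground.
# Ground coverage of one format side = format size * S.
# The 1,530 m flying height below is an assumed example value.

def photo_scale(focal_length_m, flying_height_m):
    """Return the scale number S, for a photo scale of 1:S."""
    return flying_height_m / focal_length_m

def ground_coverage_m(format_size_m, scale_number):
    """Ground distance (m) covered by one side of the format."""
    return format_size_m * scale_number

S = photo_scale(0.153, 1530.0)         # 153 mm lens at 1,530 m above ground
side = ground_coverage_m(0.23, S)      # 23 cm format
print(f"scale 1:{S:.0f}, covering {side:.0f} m x {side:.0f} m on the ground")
```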
Film-based cameras used forward motion compensation (FMC) -- a mechanism that moved the film platen during exposure to compensate for aircraft motion, preventing image blur at slow shutter speeds. The Leica RC-30, for example, included FMC and was considered the gold standard for aerial survey photography through the 1990s.
"The aerial mapping camera is a precision instrument designed to make photographs suitable for use in preparing maps and for other photogrammetric applications. It must produce sharp images over a large format area, and its inner orientation must remain stable." -- Wolf, Dewitt & Wilkinson, Elements of Photogrammetry with Applications in GIS (4th Ed.), Ch. 4, p. 65
Film cameras are now largely obsolete in production environments, replaced by digital sensors offering immediate data availability, no film processing, and broader dynamic range. However, vast archives of film-based aerial photography remain important for historical analysis, change detection, and retracement surveys. The USGS Earth Resources Observation and Science (EROS) Center maintains an archive of millions of historical aerial photographs dating back to the 1930s, many of which are scanned and available for photogrammetric processing using modern softcopy systems.
Understanding film-based systems remains relevant for exam preparation because the principles of interior orientation, fiducial marks, and calibration that were developed for these cameras form the conceptual foundation for all photogrammetric geometry.
Digital Frame Cameras#
Digital frame cameras capture the entire image simultaneously on a two-dimensional sensor array, directly analogous to how a film camera exposes the entire frame at once. These systems have replaced film cameras for virtually all production aerial mapping.
Major Systems
| System | Manufacturer | Sensor Type | Ground Pixel Size (typical) |
|---|---|---|---|
| DMC III | Leica Geosystems | Multi-head CMOS | 3--5 cm (low altitude) |
| UltraCam Eagle M3 | Vexcel Imaging | Multi-head CCD/CMOS | 3--5 cm |
| UltraCam Osprey 4.1 | Vexcel Imaging | Multi-head, oblique + nadir | 3--5 cm |
| PhaseOne iXM-RS150F | Phase One | Single-head CMOS (150 MP) | 2--5 cm |
Large-format digital cameras achieve their sensor size through multi-head configurations. The DMC (Digital Mapping Camera), for instance, uses multiple camera heads whose imagery is stitched together in post-processing to create a single, seamless virtual frame. This approach overcomes the physical size limitations of individual CCD or CMOS sensor chips while maintaining the geometric properties of a frame camera.
Key advantages of digital frame cameras over film:
- No film processing -- images are immediately available as digital files
- Higher radiometric resolution -- 12-bit or 14-bit pixel depth vs. ~8-bit effective for film
- Broader dynamic range -- better performance in shadow and highlight areas
- Multispectral capability -- simultaneous capture of red, green, blue, and near-infrared bands
- Forward motion compensation via electronic TDI (Time Delay Integration) rather than mechanical systems
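The radiometric gain is easy to quantify: each additional bit of pixel depth doubles the number of distinguishable gray levels. A quick illustration:

```python
# Gray levels available at each pixel depth: levels = 2 ** bits.
for label, bits in [("film (effective)", 8), ("digital frame", 12), ("high-end digital", 14)]:
    print(f"{label}: {bits}-bit -> {2 ** bits} gray levels")
```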
"Digital cameras offer significant advantages over film-based systems, including immediate availability of imagery, superior radiometric resolution, and the capability for multispectral data acquisition." -- Mikhail, Bethel & McGlone, Introduction to Modern Photogrammetry, Ch. 3, p. 95
Linear Array (Pushbroom) Sensors#
Linear array sensors -- also called pushbroom sensors -- capture imagery one line at a time as the platform moves forward. Each line of pixels is exposed sequentially, building up a two-dimensional image strip from thousands of individual scan lines. This is fundamentally different from a frame camera, which captures the entire image in a single exposure.
The most prominent airborne pushbroom system is the Leica ADS (Airborne Digital Sensor) series. The ADS captures forward-looking, nadir, and backward-looking image strips simultaneously using three linear CCD arrays, providing along-track stereo coverage in a single pass.
Frame vs. Pushbroom Comparison
| Characteristic | Frame Camera | Pushbroom (Linear Array) |
|---|---|---|
| Exposure | Entire frame at once | Line by line |
| Stereo method | Overlap between frames | Multi-line (forward/nadir/backward) |
| Exterior orientation | One set per frame | Varies continuously per line |
| Geometric model | Central perspective | Each line has own perspective center |
| IMU/GPS dependency | Moderate | Critical (position for every line) |
| Swath width | Fixed by sensor format | Fixed by array length |
| Along-track coverage | Discrete frames with overlap | Continuous strip |
The principal challenge of pushbroom sensors is that every scan line has its own exterior orientation. The aircraft's position and attitude change continuously during acquisition, so the position $(X_L, Y_L, Z_L)$ and rotation $(\omega, \varphi, \kappa)$ must be known -- or solved -- for each line. This makes pushbroom sensors heavily dependent on high-quality IMU (Inertial Measurement Unit) and GNSS data for direct georeferencing.
"Because each line of a linear array image is acquired at a different instant in time, the sensor position and orientation must be known for every line. This requires precise integration of GNSS and inertial navigation data." -- ASPRS, Manual of Photogrammetry (6th Ed.), Ch. 7, p. 357
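The per-line requirement is typically met by interpolating the navigation solution to each line's exposure epoch. A minimal sketch; the trajectory samples, scan-line rate, and attitude values are all invented for illustration:

```python
import numpy as np

# Sparse GNSS/IMU trajectory samples (epochs in s, position in m, roll in rad).
# All values here are invented for illustration.
traj_t    = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
traj_X    = np.array([1000.0, 1030.0, 1060.0, 1090.0, 1120.0])
traj_roll = np.array([0.001, 0.003, 0.002, 0.000, -0.001])

line_rate_hz = 800.0                    # assumed scan-line rate
line_t = np.arange(1600) / line_rate_hz # exposure epoch of every scan line

# Each scan line gets its own exterior orientation by interpolating the
# navigation solution to the line's exposure epoch (linear here; production
# systems use higher-order or quaternion interpolation for attitude).
line_X    = np.interp(line_t, traj_t, traj_X)
line_roll = np.interp(line_t, traj_t, traj_roll)

print(len(line_t), "scan lines; X at t = 0.25 s:", line_X[200])
```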
Interior Orientation Parameters#
Interior orientation (IO) defines the internal geometry of the camera at the moment of exposure. It establishes the relationship between the physical sensor (or film plane) and the perspective center of the lens. Three parameters are fundamental:
- Calibrated focal length ($f$, also denoted $c$): The perpendicular distance from the perspective center to the image plane. Not the same as the nominal lens focal length -- it is determined through calibration and accounts for the actual optical geometry.
- Principal point offset ($x_0$, $y_0$): The point where a line from the perspective center, perpendicular to the image plane, intersects the image plane. In a perfect camera, this would coincide with the geometric center of the sensor; in practice, it is offset by small amounts determined during calibration.
- Lens distortion parameters: Systematic deviations from the ideal central perspective caused by imperfections in the lens system.

The relationship between measured image coordinates $(x, y)$ and the ideal undistorted coordinates $(x', y')$ is:

$$x' = x - x_0 - \Delta x_r - \Delta x_d$$
$$y' = y - y_0 - \Delta y_r - \Delta y_d$$

where $(x_0, y_0)$ is the principal point offset, $(\Delta x_r, \Delta y_r)$ are radial distortion corrections, and $(\Delta x_d, \Delta y_d)$ are decentering distortion corrections.
Camera Calibration#
Camera calibration is the process of determining the interior orientation parameters. For a metric camera, calibration is performed under controlled laboratory conditions or through analytical self-calibration during a photogrammetric adjustment. The result is a camera calibration certificate that documents the measured parameters.
Calibration Methods
| Method | Description | Application |
|---|---|---|
| Laboratory (stellar/collimator) | Camera images targets of precisely known direction (e.g., a collimator bank or goniometer setup) | Metric cameras; gold standard |
| Test-field calibration | Camera photographs a precisely surveyed ground target field | Metric and non-metric cameras |
| Self-calibration (in-situ) | IO parameters solved as additional unknowns in bundle adjustment | Non-metric cameras, UAS workflows |
A calibration certificate typically reports:
- Calibrated focal length ($f$)
- Principal point coordinates ($x_0$, $y_0$)
- Radial distortion coefficients ($k_1$, $k_2$, $k_3$)
- Decentering distortion coefficients ($p_1$, $p_2$)
- Fiducial mark coordinates (film cameras)
- Date of calibration and serial number
"The camera calibration report provides the calibrated focal length, the coordinates of the principal point of autocollimation, and the radial and decentering lens distortion parameters. These values define the inner orientation of the camera and are essential for rigorous photogrammetric solutions." -- Wolf, Dewitt & Wilkinson, Elements of Photogrammetry with Applications in GIS (4th Ed.), Ch. 4, p. 78
For non-metric cameras used in SfM workflows, self-calibration is the norm. The software solves for IO parameters simultaneously with exterior orientation during bundle adjustment. While convenient, self-calibration requires strong image network geometry -- convergent imagery, multiple image scales, and sufficient ground control -- to produce reliable results. A weak network can yield IO parameters that absorb systematic errors rather than correctly modeling the camera geometry.
Lens Distortions#
Lens distortions are systematic departures from the ideal central perspective projection caused by imperfections in lens manufacturing and assembly. Two primary types are characterized in photogrammetric calibration.
Radial Symmetric Distortion
The dominant distortion in most lenses. It causes points to be displaced radially inward or outward from the principal point. The magnitude depends only on the radial distance $r$ from the principal point:

$$\Delta r = k_1 r^3 + k_2 r^5 + k_3 r^7$$

where $k_1$, $k_2$, $k_3$ are the radial distortion coefficients determined during calibration. The component corrections in $x$ and $y$ are:

$$\Delta x_r = \bar{x}\,(k_1 r^2 + k_2 r^4 + k_3 r^6)$$
$$\Delta y_r = \bar{y}\,(k_1 r^2 + k_2 r^4 + k_3 r^6)$$

where $\bar{x} = x - x_0$ and $\bar{y} = y - y_0$ are the image coordinates relative to the principal point, and $r = \sqrt{\bar{x}^2 + \bar{y}^2}$.
Decentering Distortion
Caused by the imperfect alignment of lens elements along the optical axis. Unlike radial distortion, decentering distortion has both radial and tangential components:

$$\Delta x_d = p_1 (r^2 + 2\bar{x}^2) + 2 p_2 \bar{x}\bar{y}$$
$$\Delta y_d = p_2 (r^2 + 2\bar{y}^2) + 2 p_1 \bar{x}\bar{y}$$

where $p_1$ and $p_2$ are the decentering distortion coefficients.
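In practice the radial and decentering corrections are applied together. A minimal sketch of correcting a single measured image point; the coefficient values are invented stand-ins for what a calibration certificate would supply:

```python
def undistort(x, y, x0, y0, k1, k2, k3, p1, p2):
    """Correct a measured image point for radial and decentering distortion,
    returning ideal coordinates relative to the principal point."""
    xb, yb = x - x0, y - y0                  # coordinates relative to principal point
    r2 = xb * xb + yb * yb                   # r squared
    radial = k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3        # k1 r^2 + k2 r^4 + k3 r^6
    dx_r, dy_r = xb * radial, yb * radial                 # radial corrections
    dx_d = p1 * (r2 + 2 * xb * xb) + 2 * p2 * xb * yb     # decentering corrections
    dy_d = p2 * (r2 + 2 * yb * yb) + 2 * p1 * xb * yb
    return xb - dx_r - dx_d, yb - dy_r - dy_d

# Invented example: a point at (80, -60) mm with made-up coefficients.
xc, yc = undistort(80.0, -60.0, 0.01, -0.02, 1.2e-8, -3.0e-13, 0.0, 5.0e-7, -2.0e-7)
print(round(xc, 4), round(yc, 4))
```

Note that the corrections here amount to a few hundredths of a millimetre (tens of micrometres) at the edge of a large format, which is why they matter for mapping-grade work.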
"Radial lens distortion is the most significant systematic error in photogrammetric imagery. For wide-angle lenses, radial distortion at the corners of the format can exceed 100 micrometers if uncorrected." -- Ghilani & Wolf, Elementary Surveying: An Introduction to Geomatics (15th Ed.), Ch. 27, p. 806
In modern digital cameras with well-designed lenses, radial distortion is typically small near the image center and increases toward the edges. For high-accuracy mapping, even small residual distortions must be modeled and removed.
Integration: LiDAR + Camera Systems#
Modern airborne mapping platforms routinely combine cameras with LiDAR (Light Detection and Ranging) sensors on a single aircraft. This integration leverages the complementary strengths of both technologies:
| Capability | Camera | LiDAR |
|---|---|---|
| Spectral/visual information | Excellent | None (intensity only) |
| Vegetation penetration | Poor | Excellent (multiple returns) |
| Feature interpretation | Rich texture and color | Geometric only |
| Dense 3D point generation | Via dense matching | Direct measurement |
| Breakline definition | Good (stereo compilation) | Requires processing |
| Accuracy (vertical) | Dependent on GSD and GCP | Direct, typically 5--15 cm |
Integrated LiDAR-camera systems share a common IMU/GNSS navigation solution, ensuring that the LiDAR point cloud and the imagery are registered in the same coordinate system. The camera imagery is used for orthophoto production and visual interpretation, while LiDAR provides bare-earth elevation models (penetrating vegetation canopy) and high-accuracy 3D coordinates.
Multispectral cameras -- capturing near-infrared (NIR) in addition to RGB -- add vegetation analysis capability. The Normalized Difference Vegetation Index (NDVI) computed from NIR and red bands enables land cover classification, crop health assessment, and environmental monitoring directly from the same data acquisition flight.
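A minimal sketch of the NDVI computation on co-registered band arrays; the reflectance values are made up for illustration:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); values near +1 indicate healthy
# vegetation, values near 0 or below indicate soil, water, or pavement.
# The 2x2 reflectance arrays below are invented for illustration.
nir = np.array([[0.60, 0.55], [0.20, 0.50]])
red = np.array([[0.10, 0.12], [0.18, 0.40]])

ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 3))
```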
"The combination of LiDAR and digital camera imagery acquired from a single platform offers significant advantages. LiDAR provides precise elevation data including ground points under vegetation canopy, while the camera provides the spectral information and visual context needed for interpretation and orthophoto production." -- ASPRS, Manual of Photogrammetry (6th Ed.), Ch. 13, p. 611
For surveying professionals, understanding this integration matters because deliverables increasingly combine both data sources. A topographic survey might use LiDAR-derived bare-earth elevations for contours while using camera-derived orthophotos for planimetric features -- and the accuracy of the combined product depends on understanding the capabilities and limitations of each sensor.
Key Takeaways#
- Metric cameras are designed and calibrated for photogrammetric measurement; non-metric cameras can be used but require self-calibration during processing
- Interior orientation (focal length, principal point, lens distortion) defines the camera's internal geometry and must be known for rigorous photogrammetric solutions
- Film-based aerial cameras (23 cm x 23 cm format) dominated mapping for decades; fiducial marks defined the image coordinate system
- Digital frame cameras (DMC, UltraCam) capture the full frame simultaneously and offer superior radiometric performance, multispectral capability, and immediate data availability
- Linear array (pushbroom) sensors capture imagery line by line, requiring continuous exterior orientation from IMU/GNSS -- they offer seamless strip coverage but greater processing complexity
- Camera calibration determines IO parameters through laboratory, test-field, or self-calibration methods; the calibration certificate is a critical document
- Radial and decentering lens distortions are modeled mathematically (radial coefficients $k_1$, $k_2$, $k_3$ and decentering coefficients $p_1$, $p_2$) and must be corrected for accurate coordinate extraction
- LiDAR-camera integration combines direct 3D measurement with rich spectral imagery, producing complementary deliverables from a single acquisition
References#
- Wolf, P.R., Dewitt, B.A. & Wilkinson, B.E. Elements of Photogrammetry with Applications in GIS (4th Ed.). McGraw-Hill, 2014.
- Ghilani, C.D. & Wolf, P.R. Elementary Surveying: An Introduction to Geomatics (15th Ed.). Pearson, 2018.
- Ghilani, C.D. & Wolf, P.R. Elementary Surveying: An Introduction to Geomatics (13th Ed.). Pearson, 2012.
- ASPRS. Manual of Photogrammetry (6th Ed.). American Society for Photogrammetry and Remote Sensing, 2013.
- Mikhail, E.M., Bethel, J.S. & McGlone, J.C. Introduction to Modern Photogrammetry. John Wiley & Sons, 2001.
- ASPRS. Manual of Photogrammetry (5th Ed.). American Society for Photogrammetry and Remote Sensing, 2004.