Photogrammetry is the science and technology of obtaining reliable information about physical objects and the environment through the process of recording, measuring and interpreting photographic images and patterns of electromagnetic radiant imagery and other phenomena. While the invention of the method is attributed to Aimé Laussedat, the term "photogrammetry" was coined by the German architect Albrecht Meydenbauer in his 1867 article "Die Photometrographie" (Wochenblatt des Architektenvereins zu Berlin, Jg. 1, 1867, Nr. 14, S. 125–126; Nr. 15, S. 139–140; Nr. 16, S. 149–150). There are many variants of photogrammetry. One example is the extraction of three-dimensional measurements from two-dimensional data (i.e. images); for example, the distance between two points that lie on a plane parallel to the photographic image plane can be determined by measuring their distance on the image, provided the scale of the image is known. Another is the extraction of accurate color ranges and values representing such quantities as albedo, specular reflection, metallicity, or ambient occlusion from photographs of materials for the purposes of physically based rendering.
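The scale relationship mentioned above can be illustrated with a short numeric sketch (all values here are assumed for illustration, not taken from a real survey):

```python
# If the image scale is known (e.g. 1:5000), a distance measured on the image
# converts directly to a ground distance for two points lying on a plane
# parallel to the image plane.
scale_denominator = 5000        # image scale 1:5000 (assumed)
image_distance_mm = 24.0        # distance measured between two image points
ground_distance_m = image_distance_mm * scale_denominator / 1000.0
print(ground_distance_m)        # → 120.0 (metres)
```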
Close-range photogrammetry refers to the collection of photography from a lesser distance than traditional aerial (or orbital) photogrammetry. Photogrammetric analysis may be applied to one photograph, or may use high-speed photography and remote sensing to detect, measure and record complex 2D and 3D motion fields by feeding measurements and imagery analysis into computational models, in an attempt to successively estimate, with increasing accuracy, the actual 3D relative motions.
From its beginnings with the stereoplotters used to plot contour lines on topographic maps, it now has a very wide range of uses and draws on related sensing techniques such as sonar, radar, and lidar.
The 3D coordinates define the locations of object points in the 3D space. The image coordinates define the locations of the object points' images on the film or an electronic imaging device. The exterior orientation of a camera defines its location in space and its view direction. The inner orientation defines the geometric parameters of the imaging process. This is primarily the focal length of the lens, but can also include the description of lens distortions. Further additional observations play an important role: with scale bars (essentially a known distance between two points in space) or known fixed points, the connection to the basic measuring units is created.
Each of the four main variables can be an input or an output of a photogrammetric method.
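A minimal sketch of how these variables combine in a simple pinhole model (no lens distortion, and all numeric values assumed for illustration): the exterior orientation (rotation R and translation t) takes an object point into the camera frame, and the inner orientation (focal length f, principal point c) maps it onto the image.

```python
import numpy as np

def project_point(X, R, t, f, c=(0.0, 0.0)):
    """Map a 3D object point X to image coordinates: exterior orientation
    (R, t) moves the point into the camera frame; interior orientation
    (focal length f and principal point c) maps it onto the image plane.
    Lens distortion is omitted for brevity."""
    Xc = R @ (X - t)                  # camera-frame coordinates
    u = f * Xc[0] / Xc[2] + c[0]      # perspective division
    v = f * Xc[1] / Xc[2] + c[1]
    return np.array([u, v])

# Camera at the origin looking down +Z (identity exterior orientation),
# focal length expressed in pixel units (assumed 10000 px).
X = np.array([1.0, 2.0, 100.0])       # object point 100 units ahead
print(project_point(X, np.eye(3), np.zeros(3), f=10000))  # → [100. 200.]
```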
Algorithms for photogrammetry typically attempt to minimize the sum of squared errors over the coordinates and relative displacements of the reference points. This minimization is known as bundle adjustment and is often performed using the Levenberg–Marquardt algorithm.
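A toy illustration of this minimization, under strong simplifying assumptions (a pinhole camera with fixed rotation and known interior orientation, so only the camera translation is unknown; real bundle adjustment jointly refines many cameras and points):

```python
import numpy as np

def project(points, t, f=1000.0):
    """Pinhole projection of 3D points seen from a camera translated by t
    (rotation fixed to identity for this toy example)."""
    p = points - t
    return f * p[:, :2] / p[:, 2:3]

def residuals(t, points, observed):
    # Reprojection errors: predicted minus observed image coordinates.
    return (project(points, t) - observed).ravel()

def levenberg_marquardt(t0, points, observed, iters=50, lam=1e-3):
    t = t0.astype(float)
    for _ in range(iters):
        r = residuals(t, points, observed)
        # Numeric Jacobian of the residuals w.r.t. the 3 translation params.
        J = np.empty((r.size, 3))
        eps = 1e-6
        for k in range(3):
            dt = np.zeros(3); dt[k] = eps
            J[:, k] = (residuals(t + dt, points, observed) - r) / eps
        # LM step: (J^T J + lam * diag(J^T J)) delta = -J^T r
        A = J.T @ J
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -J.T @ r)
        if np.sum(residuals(t + step, points, observed) ** 2) < np.sum(r ** 2):
            t, lam = t + step, lam * 0.5   # accept: move toward Gauss-Newton
        else:
            lam *= 2.0                     # reject: damp toward gradient descent
    return t

rng = np.random.default_rng(0)
points = rng.uniform([-1, -1, 5], [1, 1, 10], size=(20, 3))  # points in front of camera
t_true = np.array([0.2, -0.1, 0.3])
observed = project(points, t_true)         # noiseless synthetic observations
t_est = levenberg_marquardt(np.zeros(3), points, observed)
print(np.allclose(t_est, t_true, atol=1e-4))  # → True
```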
A 3D visualization can be created by georeferencing the aerial photos (A. Sechin, "Digital Photogrammetric Systems: Trends and Developments", GeoInformatics, No. 4, 2014, pp. 32–34) and LiDAR data in the same reference frame, orthorectifying the aerial photos, and then draping the orthorectified images on top of the LiDAR grid. It is also possible to create digital terrain models, and thus 3D visualisations, using pairs (or multiples) of aerial photographs or satellite imagery (e.g. SPOT satellite imagery). Techniques such as adaptive least squares stereo matching are then used to produce a dense array of correspondences, which are transformed through a camera model to produce a dense array of x, y, z data that can be used to produce digital terrain model and orthophoto products. Systems which use these techniques, e.g. the ITG system, were developed in the 1980s and 1990s but have since been supplanted by LiDAR and radar-based approaches, although these techniques may still be useful in deriving elevation models from old aerial photographs or satellite images.
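The step from correspondences to x, y, z data can be sketched for the simplest case (this is not the adaptive least-squares matcher itself, and all values are assumed): for a rectified stereo pair, once matching yields a disparity d in pixels, distance follows from the standard relation Z = f · B / d.

```python
import numpy as np

f = 10000.0                    # focal length, pixels (assumed)
B = 600.0                      # baseline between the two exposures, metres (assumed)
disparity = np.array([1000.0, 1500.0, 2000.0])   # matched disparities, pixels

# Depth from disparity for a rectified stereo pair: Z = f * B / d.
Z = f * B / disparity
print(Z)  # → [6000. 4000. 3000.] (metres)
```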
It is also used to combine live action with computer-generated imagery in movie post-production; The Matrix is a good example of the use of photogrammetry in film (details are given in the DVD extras). Photogrammetry was used extensively to create photorealistic environmental assets for video games including The Vanishing of Ethan Carter as well as EA DICE's Star Wars Battlefront. The main character of Hellblade: Senua's Sacrifice was derived from photogrammetric motion-capture models taken of actress Melina Juergens.
Photogrammetry is also commonly employed in collision engineering, especially with automobiles. When litigation for a collision occurs and engineers need to determine the exact deformation present in the vehicle, it is common for several years to have passed and the only evidence that remains is crash scene photographs taken by the police. Photogrammetry is used to determine how much the car in question was deformed, which relates to the amount of energy required to produce that deformation. The energy can then be used to determine important information about the crash (such as the velocity at time of impact).
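The last step can be illustrated with a hedged numeric sketch: assuming the photogrammetric crush measurements have already been converted into an absorbed deformation energy E (via vehicle stiffness coefficients, which are not shown here), an equivalent impact speed follows from the kinetic energy relation E = ½mv². The figures below are assumed, not from a real case.

```python
import math

E = 180e3      # joules absorbed in deformation (assumed value)
m = 1500.0     # vehicle mass, kg (assumed value)

# Equivalent impact speed from E = 0.5 * m * v**2.
v = math.sqrt(2 * E / m)          # m/s
print(round(v * 3.6, 1))          # → 55.8 (km/h)
```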
Rectification of imagery is generally achieved by "fitting the projected images of each photograph to a set of four control points whose positions have been derived from an existing map or from ground measurements. When these rectified, scaled photographs are positioned on a grid of control points, a good correspondence can be achieved between them through skillful trimming and fitting and the use of the areas around the principal point where the relief displacements (which cannot be removed) are at a minimum" (Petrie 1977: 50).
"It is quite reasonable to conclude that some form of photomap will become the standard general map of the future" (Robinson et al. 1977: 10). They go on to suggest that "photomapping would appear to be the only way to take reasonable advantage" of future data sources like high altitude aircraft and satellite imagery.
Overhead photography has been widely applied for mapping surface remains and excavation exposures at archaeological sites. Suggested platforms for capturing these photographs have included: war balloons from World War I (Capper 1907); rubber meteorological balloons (Guy 1932); kites (Guy 1932; Bascom 1941); wooden platforms and metal frameworks constructed over an excavation exposure (Guy 1932); ladders, both alone and held together with poles or planks; three-legged ladders; single- and multi-section poles (Schwartz 1964; Wiltshire 1967); bipods (Kriegler 1928; Hampl 1957; Whittlesey 1966; Fant and Loy 1972); tripods (Straffin 1971); tetrapods (Simpson and Cooke 1967; Hume 1969); and aerial bucket trucks ("cherry pickers").
Handheld, near-nadir, overhead digital photographs have been used with geographic information systems (GIS) to record excavation exposures (Craig 2000; Craig 2002; Craig and Aldenderfer 2003; Craig 2005; Craig et al. 2006).
Photogrammetry is increasingly being used in maritime archaeology because of the relative ease of mapping sites compared to traditional methods, allowing the creation of 3D maps which can be rendered in virtual reality.
Google Earth uses photogrammetry to create 3D imagery (Gopal Shah, "Google Earth's Incredible 3D Imagery, Explained", 2017-04-18).
There is also a project called Rekrei that uses photogrammetry to make 3D models of lost, stolen, or broken artifacts, which are then posted online.
Apple introduced a photogrammetry API called Object Capture for macOS Monterey at the 2021 Apple Worldwide Developers Conference. In order to use the API, a MacBook running macOS Monterey and a set of captured digital images are required.