Image quality is a characteristic of an image that measures the perceived image degradation (typically compared to an ideal or perfect image). Imaging systems may introduce some amount of distortion or artifacts into the signal, so quality assessment is an important problem.
In photographic imaging
An image is formed on the image plane of the camera and then measured electronically or chemically to produce the photograph. The image formation process may be described by the ideal pinhole camera model, where only light rays from the depicted scene that pass through the camera aperture can fall on the image plane. In reality, this ideal model is only an approximation of the image formation process, and image quality may be described in terms of how well the camera approximates the pinhole model.
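As a minimal sketch of the ideal pinhole model (not part of the original text; the function name and focal length below are illustrative assumptions), a scene point at camera coordinates (X, Y, Z) projects onto the image plane at (f·X/Z, f·Y/Z), where f is the focal length:

```python
# Minimal sketch of the ideal pinhole projection (illustrative only).
# A 3-D scene point (X, Y, Z), expressed in camera coordinates with Z > 0,
# maps to image-plane coordinates (x, y) = (f*X/Z, f*Y/Z).

def pinhole_project(X, Y, Z, f=0.035):
    """Project a scene point onto the image plane of an ideal pinhole camera.

    f is the focal length in metres (0.035 m = 35 mm, an assumed value).
    """
    if Z <= 0:
        raise ValueError("Point must lie in front of the camera (Z > 0).")
    return f * X / Z, f * Y / Z

# Example: a point 2 m in front of the camera, 0.5 m to the right.
print(pinhole_project(0.5, 0.0, 2.0))  # -> (0.00875, 0.0)
```

A real lens deviates from this mapping through blur, distortion, and other aberrations, and those deviations are one way of quantifying image quality.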
An ideal model of how a camera measures light is that the resulting photograph should represent the amount of light that falls on each point of the image plane during a certain period of time (the exposure). This model is only an approximate description of the light measurement process of a camera, and image quality is also related to the deviation from this model.
In some cases, the image for which quality should be determined is primarily not the result of a photographic process in a camera, but the result of storing or transmitting the image. A typical example is a digital image that has been compressed, stored or transmitted, and then decompressed again. Unless a lossless compression method has been used, the resulting image is normally not identical to the original image, and the deviation from the (ideal) original image is then a measure of quality. By determining a quality measure for each image in a large set, statistical methods can be used to derive an overall quality measure of the compression method.
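As an illustrative sketch of this procedure (not from the original article; the PSNR formula is standard, but the function and variable names below are assumptions), a full-reference score can be computed for each original/decompressed pair and then averaged over the whole set:

```python
# Illustrative sketch: average a full-reference quality score (PSNR, in dB)
# over a set of (original, decompressed) image pairs held as NumPy arrays.
import numpy as np

def psnr(original, decoded, peak=255.0):
    """Peak signal-to-noise ratio between two images of equal shape."""
    mse = np.mean((original.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images (e.g. lossless compression)
    return 10.0 * np.log10(peak ** 2 / mse)

def mean_psnr(image_pairs):
    """Overall quality measure: mean PSNR over a list of (original, decoded) pairs."""
    return float(np.mean([psnr(o, d) for o, d in image_pairs]))
```

PSNR is used here only because it is simple to state; any full-reference metric (such as SSIM, discussed below) could be averaged over a set of images in the same way.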
In a typical digital camera, the resulting image quality depends on all three factors mentioned above: how much the image formation process of the camera deviates from the pinhole model, the quality of the image measurement process, and the coding artifacts that are introduced in the image produced by the camera, typically by the JPEG coding method.
By defining image quality in terms of a deviation from the ideal situation, quality measures become technical in the sense that they can be objectively determined as deviations from the ideal models. Image quality can, however, also be related to the subjective perception of an image, e.g., a human looking at a photograph. Examples include how colors are rendered in black-and-white and color images, or the fact that the loss of image quality caused by noise depends on how the noise correlates with the information the viewer seeks in the image rather than on its overall strength. Another example of this type of quality measure is Johnson's criteria for determining the necessary quality of an image in order to detect targets in night vision systems.
Subjective measures of quality also relate to the fact that, although a camera's deviations from the ideal models of image formation and measurement are generally undesirable and correspond to reduced objective image quality, these deviations can also be used for artistic effect in image production, corresponding to high subjective quality.
Image quality assessment categories
There are several techniques and metrics that can be measured objectively and evaluated automatically by a computer program. Depending on whether a reference image is available, they can be classified as:
- Full-reference (FR) methods - FR metrics try to assess the quality of a test image by comparing it with a reference image that is assumed to have perfect quality.
- No-reference (NR) methods - NR metrics try to assess the quality of a test image without any reference to the original one.
For example, comparing an original image to the output of JPEG compression of that image is full-reference - it uses the original as reference. Most of the leading picture quality prediction models are variants of the Emmy-winning structural similarity (SSIM) model developed by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin.
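As a hedged sketch of such a full-reference comparison (assuming NumPy, Pillow, and scikit-image are available; the file names are placeholders, not from the original text), SSIM between an original image and its JPEG-decoded version can be computed along these lines:

```python
# Illustrative full-reference comparison of an original image and its
# JPEG-compressed version using SSIM (assumes NumPy, Pillow, scikit-image).
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

original = np.asarray(Image.open("original.png").convert("L"))   # placeholder file name
decoded = np.asarray(Image.open("compressed.jpg").convert("L"))  # placeholder file name

# An SSIM score close to 1.0 indicates that the decoded image is structurally
# close to the reference; lower values indicate visible degradation.
score = structural_similarity(original, decoded, data_range=255)
print(f"SSIM: {score:.3f}")
```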
Image quality factors
- Sharpness determines the amount of detail an image can convey. System sharpness is affected by the lens (design and manufacturing quality, focal length, aperture, and distance from the image center) and sensor (pixel count and anti-aliasing filter). In the field, sharpness is affected by camera shake (a good tripod can be helpful), focus accuracy, and atmospheric disturbances (thermal effects and aerosols). Lost sharpness can be restored by sharpening (see the unsharp-mask sketch after this list), but sharpening has limits. Oversharpening can degrade image quality by causing "halos" to appear near contrast boundaries. Images from many compact digital cameras are sometimes oversharpened to compensate for lower image quality.
- Noise is a random variation of image density, visible as grain in film and as pixel-level variations in digital images. It arises from basic physics (the photon nature of light and thermal energy) inside image sensors. Typical noise reduction (NR) software reduces the visibility of noise by smoothing the image, excluding areas near contrast boundaries. This technique works well, but it can obscure fine, low-contrast detail.
- Dynamic range (or exposure range) is the range of light levels a camera can capture, usually measured in f-stops, EV (exposure value), or zones (all factors of two in exposure); a worked example follows this list. It is closely related to noise: high noise implies low dynamic range.
- Tone reproduction is the relationship between scene luminance and the reproduced image brightness.
- Contrast, also known as gamma, is the slope of the tone reproduction curve in log-log space (see the sketch after this list). High contrast usually involves loss of dynamic range: loss of detail, or clipping, in highlights or shadows.
- Color accuracy is an important but ambiguous image quality factor. Many viewers prefer enhanced color saturation; the most accurate color isn't necessarily the most pleasing. Nevertheless it is important to measure a camera's color response: its color shifts, saturation, and the effectiveness of its white balance algorithms.
- Distortion is an aberration that causes straight lines to curve. It can be troublesome for architectural photography and metrology (photographic applications involving measurement). Distortion tends to be noticeable in low-cost cameras, including cell phones, and in low-cost DSLR lenses. It is usually very easy to see in wide-angle photos. It can now be corrected in software.
- Vignetting, or light falloff, darkens images near the corners. It can be significant with wide angle lenses.
- Exposure accuracy can be an issue with fully automatic cameras and with video cameras where there is little or no opportunity for post-exposure tonal adjustment. Some even have exposure memory: exposure may change after very bright or dark objects appear in a scene.
- Lateral chromatic aberration (LCA), also called "color fringing", including purple fringing, is a lens aberration that causes colors to focus at different distances from the image center. It is most visible near corners of images. LCA is worst with asymmetrical lenses, including ultrawides, true telephotos and zooms. It is strongly affected by demosaicing.
- Lens flare, including "veiling glare", is stray light in lenses and optical systems caused by reflections between lens elements and the inside barrel of the lens. It can cause image fogging (loss of shadow detail and color) as well as "ghost" images that can occur in the presence of bright light sources in or near the field of view.
- Color moiré is artificial color banding that can appear in images with repetitive patterns of high spatial frequencies, like fabrics or picket fences. It is affected by lens sharpness, the anti-aliasing (lowpass) filter (which softens the image), and demosaicing software. It tends to be worst with the sharpest lenses.
- Artifacts - software (especially operations performed during RAW conversion) can cause significant visual artifacts, including data compression and transmission losses (e.g., low-quality JPEG), oversharpening "halos", and loss of fine, low-contrast detail.
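As referenced in the sharpness item above, a common sharpening technique is unsharp masking: a blurred copy is subtracted from the image to isolate edge detail, and a scaled version of that detail is added back. The sketch below is illustrative only and assumes NumPy and SciPy; the parameter values are arbitrary, and setting `amount` too high reproduces the oversharpening halos described above.

```python
# Illustrative unsharp-mask sharpening (assumes NumPy and SciPy).
# Subtracting a Gaussian-blurred copy isolates edge detail; adding a scaled
# version of that detail back sharpens the image. Too large an `amount`
# oversharpens and produces halos near contrast boundaries.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Return a sharpened copy of a grayscale float image in [0, 1]."""
    blurred = gaussian_filter(image, sigma=sigma)
    detail = image - blurred
    return np.clip(image + amount * detail, 0.0, 1.0)
```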
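As a worked example for the dynamic range item, since stops are factors of two in exposure, the usable range can be expressed as the base-2 logarithm of the ratio between the largest and smallest usable signal levels; the numbers below are assumed values for illustration, not measurements of any particular camera.

```python
# Illustrative calculation: dynamic range in f-stops as the base-2 logarithm
# of the ratio between the largest and smallest usable signal levels.
# The noise floor sets the smallest usable level, so more noise -> fewer stops.
import math

def dynamic_range_stops(max_signal, noise_floor):
    """Dynamic range in stops (factors of two) for assumed signal levels."""
    return math.log2(max_signal / noise_floor)

print(dynamic_range_stops(4000, 1))   # ~12 stops with a low noise floor
print(dynamic_range_stops(4000, 16))  # ~8 stops when the noise floor is 16x higher
```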
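For the contrast (gamma) item, a simple power-law tone curve makes the log-log slope definition concrete; the gamma value below is an assumed example, not a recommendation.

```python
# Illustrative sketch: for a power-law tone curve, output = input ** gamma,
# the log-log slope is constant and equal to gamma, i.e.
# log(output) = gamma * log(input).
import numpy as np

gamma = 2.2                                # assumed example value
scene = np.array([0.25, 0.5, 1.0])         # relative scene luminance
reproduced = scene ** gamma                # reproduced image brightness

# The slope between successive points in log-log space recovers gamma.
slopes = np.diff(np.log(reproduced)) / np.diff(np.log(scene))
print(slopes)  # -> [2.2, 2.2]
```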