Prabu Kumar of e-con Systems takes a deep dive into this important aspect of our industry.

Image quality is crucial for embedded vision applications, determining how accurately cameras capture the real world. This blog breaks down the key factors affecting camera image quality, including colour accuracy, white balance, lens distortion, and dynamic range. Explore what these parameters mean and why they matter when selecting cameras for your applications.
Image quality (IQ) determines how accurately a camera captures the real world. In embedded vision applications, from industrial automation to medical diagnostics, the clarity and accuracy of an image directly impact the performance of the full system.
As a result, image quality becomes a key consideration when selecting a camera for any application. Different components of the camera, such as the lens, sensor, and firmware, contribute to image quality. Many image parameters influence image quality, making it a complex and challenging task to measure accurately. Properly assessing a camera’s image quality requires a deep understanding of these contributing factors.
Understanding these parameters is crucial for producing high-quality images that truly represent reality. In this blog, you’ll learn about the key parameters that define image quality, including signal-to-noise ratio, dynamic range, and quantum efficiency. You’ll also learn how these parameters are objectively evaluated for superior imaging performance across applications.
Understanding Image Quality
Image quality is an assessment of an image’s fidelity to the original scene. Multiple factors, including colour reproduction, distortion, sharpness, and noise levels, influence it. Different lighting conditions affect how images appear, making validation under various conditions a critical need.
Let us see how various parameters contribute to the overall quality of an image.
What are the Key Image Quality Parameters?
Colour Accuracy
Colour accuracy describes how the camera reproduces colours. When light falls on an object, for instance, a red apple, it reflects only the red wavelengths and absorbs the rest of the light spectrum. The reflected red light is then processed by the human brain, allowing us to recognize the object as an apple. Similarly, all colours are perceived through the reflection of specific light wavelengths. White reflects all visible light, while black absorbs it entirely, reflecting none; hence, black is often considered the absence of colour.

Colour accuracy is typically evaluated with a standard colour checker chart consisting of 24 colour patches, including six neutral (grayscale) patches along with primary and secondary colours for a comprehensive evaluation. The grey patches in the last row are particularly useful for gamma and white balance evaluations.
Colour accuracy can also be validated with the ColourChecker Digital SG chart, which provides 140 colours, including the following (a simple evaluation sketch follows the list):
- 24 patches from the original ColourChecker
- 17-step grayscale
- 14 unique skin tone colours
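
As a rough illustration of how colour accuracy can be quantified, the sketch below compares patch values sampled from a captured chart image against the chart's published reference values and reports the colour error as a simple CIE76 Delta E in Lab space. This is a minimal example, not e-con Systems' test procedure; the patch sampling and reference values are placeholders you would supply for your own chart.

```python
import numpy as np
import cv2  # OpenCV, used here only for the sRGB-to-Lab conversion

def patch_delta_e(captured_rgb, reference_rgb):
    """Average CIE76 colour difference between captured and reference patches.

    captured_rgb, reference_rgb: (N, 3) arrays of 8-bit sRGB patch values,
    e.g. the mean pixel value sampled from each of the 24 chart patches.
    """
    def to_lab(rgb):
        # Float input in [0, 1] gives L in [0, 100] and a/b in CIE-like ranges
        rgb = rgb.reshape(-1, 1, 3).astype(np.float32) / 255.0
        return cv2.cvtColor(rgb, cv2.COLOR_RGB2LAB).reshape(-1, 3)

    lab_cap = to_lab(np.asarray(captured_rgb, dtype=np.float32))
    lab_ref = to_lab(np.asarray(reference_rgb, dtype=np.float32))
    # CIE76 Delta E: Euclidean distance in Lab space, per patch
    delta_e = np.linalg.norm(lab_cap - lab_ref, axis=1)
    return delta_e.mean(), delta_e

# Hypothetical usage: the captured values would come from cropping and
# averaging each patch region in the chart image; reference values come
# from the chart vendor's datasheet.
```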

White Balance (WB)


White balance determines how accurately white is retained across different lighting conditions. A light source such as the sun casts an orange tint in the morning and evening and a bluish tint at midday, depending on the time of day. Effective white balance removes these colour casts, ensuring that white remains consistently white.
Auto White Balance (AWB) functionality enables cameras to adjust to different lighting conditions automatically, with the behaviour tuned for the target sensor and environment.
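
As a minimal sketch of the idea behind AWB, the example below implements the simple grey-world assumption: the average colour of a scene is neutral grey, so the red and blue channels are scaled until their means match the green channel. Production camera ISPs use considerably more sophisticated, per-sensor tuned algorithms; this is only an illustration.

```python
import numpy as np

def gray_world_white_balance(image):
    """Simple grey-world AWB: scale R and B so channel means match G.

    image: (H, W, 3) RGB array (uint8 or float in 0-255 range).
    Returns a white-balanced copy clipped to the 0-255 range.
    """
    img = image.astype(np.float32)
    r_mean = img[..., 0].mean()
    g_mean = img[..., 1].mean()
    b_mean = img[..., 2].mean()
    # Gains that pull each channel's mean toward the green channel's mean
    img[..., 0] *= g_mean / (r_mean + 1e-6)
    img[..., 2] *= g_mean / (b_mean + 1e-6)
    return np.clip(img, 0, 255).astype(image.dtype)
```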
Lens Distortion
Distortion is the bending of straight lines in an image. It appears in the following forms (a simple correction sketch follows the list):
- Barrel distortion (negative value): straight lines bow outward from the centre
- Pincushion distortion (positive value): straight lines bow inward toward the centre
- Moustache distortion (wave distortion): a combination of barrel and pincushion distortion, with lines bowing outward near the centre and inward toward the edges
- Keystone distortion: occurs when the camera’s sensor plane is not parallel to the plane of the object being captured, producing a trapezoidal effect in the image
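
As referenced in the list above, here is a minimal correction sketch. It assumes a standard pinhole camera model calibrated beforehand (for example with OpenCV's cv2.calibrateCamera on checkerboard captures); the intrinsics and distortion coefficients shown are placeholder values, not data for any specific lens.

```python
import numpy as np
import cv2

# Placeholder intrinsics and distortion coefficients; in practice these come
# from a calibration step such as cv2.calibrateCamera on checkerboard images.
camera_matrix = np.array([[1000.0,    0.0, 640.0],
                          [   0.0, 1000.0, 360.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.10, 0.0, 0.0, 0.0])  # k1 < 0: barrel distortion

def undistort(image):
    """Remove radial/tangential lens distortion using the calibrated model."""
    h, w = image.shape[:2]
    # Refine the camera matrix so the undistorted image keeps only valid pixels
    new_matrix, _ = cv2.getOptimalNewCameraMatrix(
        camera_matrix, dist_coeffs, (w, h), alpha=0)
    return cv2.undistort(image, camera_matrix, dist_coeffs, None, new_matrix)
```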


Chromatic Aberration
Chromatic aberration happens due to different colours of light bending at slightly different angles when passing through a lens. This causes colours to focus at different points on the sensor, resulting in coloured fringes.
The two types of chromatic aberration are:
- Lateral chromatic aberration: Different wavelengths fall on different points on the image plane
- Longitudinal chromatic aberration: Different wavelengths fall on different image planes

Lateral aberration is more easily visible in images, while longitudinal aberration requires analysing image sequences captured at varying distances. This parameter is evaluated using dot pattern charts.
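
To make the lateral case concrete, the sketch below rescales the red and blue channels slightly about the image centre so their magnification matches the green channel, which reduces colour fringing toward the edges of the frame. The scale factors are hypothetical; in practice they would be derived from a per-lens calibration, for example measured on a dot pattern chart at several field heights.

```python
import numpy as np
import cv2

def correct_lateral_ca(image, red_scale=0.999, blue_scale=1.001):
    """Reduce lateral chromatic aberration by rescaling the R and B channels.

    image: (H, W, 3) RGB array. The scale factors are placeholders; real
    values would come from a per-lens calibration.
    """
    h, w = image.shape[:2]
    centre = (w / 2.0, h / 2.0)

    def rescale(channel, scale):
        # Scale the channel about the assumed optical centre (image centre)
        m = cv2.getRotationMatrix2D(centre, 0, scale)
        return cv2.warpAffine(channel, m, (w, h), flags=cv2.INTER_LINEAR)

    out = image.copy()
    out[..., 0] = rescale(image[..., 0], red_scale)
    out[..., 2] = rescale(image[..., 2], blue_scale)
    return out
```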
e-con Systems’ High Image Quality Cameras for Embedded Vision Solutions
Since 2003, e-con Systems has specialized in designing, developing, and manufacturing high-performance cameras for embedded vision applications. Their cameras offer advanced features such as HDR, low-light optimization, resolutions up to 20 MP, NIR sensitivity, global shutter technology, and more.
Explore their Camera Selector page to find the ideal solution for your needs, or reach out to them for expert guidance on selecting the right camera solution.