The color quality of a light source refers to its ability to faithfully reproduce the colors of the objects it illuminates, as compared with natural light. Luminaires based on light-emitting diodes (LEDs) are becoming ever more popular, owing to rapid developments in LED technology and improvements in brightness and efficiency. An LED is a semiconductor diode containing a p-n junction that, when biased, emits narrow-spectrum light through electroluminescence. Color quality is therefore a vital characteristic of any light source in general, and to consumers in particular, and it has been among the major challenges facing LEDs as a general light source.
Quality artificial lighting generally endeavors to emulate the characteristics of natural light. Color reproduction is an important characteristic of virtually any artificial lighting, including LED lighting. Color reproduction is often measured using the Color Rendering Index (CRI), or average Color Rendering Index (CRI Ra). CRI Ra is a modified average of relative measurements of how the color rendition of an illumination system compares to that of a reference radiator when illuminating eight reference colors; that is, it is a relative measure of the shift in surface color of an object when illuminated by a particular lamp. CRI is essentially a measure of how well the spectral distribution of a light source compares with that of an incandescent (blackbody) source, which possesses a Planckian distribution between the infrared (above 700 nm) and the ultraviolet (below 400 nm). Daylight has a high CRI (Ra of approximately 100), incandescent bulbs are relatively close (Ra greater than 95), and fluorescent lighting is less accurate (typical Ra of 70-80). The selection of general illumination sources for commercial and residential lighting is generally driven by a balance of energy efficiency and the ability to faithfully render colors as measured by the CRI. Certain types of specialized lighting, such as mercury vapor and sodium lamps, exhibit a relatively poor CRI (about 40 or lower). Artificial lighting commonly uses the standard CRI to evaluate the quality of white light. If a white light yields a high CRI relative to sunlight and/or a full-spectrum light, it is assumed to be of better quality in that it is more "natural" and more likely to render a colored surface accurately. Illumination with a CRI Ra lower than 50 is very poor and is used only in applications where there is no alternative for economic reasons. Lamps with a CRI Ra between 70 and 80 are suitable for general illumination where the colors of objects are not important. For most general interior illumination, a CRI Ra > 80 is acceptable. A CRI Ra > 90 is preferable and provides superior color quality. Light sources with a high CRI approaching 100 are desirable in color-critical applications such as photography and cinematography.
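The numerical core of CRI Ra can be summarized briefly. Each of the eight test-color samples receives a special index R_i = 100 - 4.6 * dE_i, where dE_i is the color difference of that sample under the test source versus the reference illuminant (computed in the CIE 1964 U*V*W* space after chromatic adaptation), and Ra is the arithmetic mean of the eight R_i values. The sketch below assumes the dE_i values have already been computed from the two spectral power distributions; the full colorimetric pipeline (tristimulus integration, adaptation, and the U*V*W* transform) is omitted.

```python
def cri_special_indices(delta_e_uvw):
    """Special color rendering indices R_i = 100 - 4.6 * dE_i, where dE_i is
    the CIE 1964 U*V*W* color difference of test-color sample i rendered
    under the test source versus the reference illuminant."""
    return [100.0 - 4.6 * de for de in delta_e_uvw]

def cri_ra(delta_e_first_eight):
    """General color rendering index Ra: arithmetic mean of R_1..R_8."""
    r = cri_special_indices(delta_e_first_eight)
    return sum(r) / len(r)

# Small, fairly uniform shifts across all eight samples give a high Ra.
print(round(cri_ra([1.2, 0.8, 1.5, 2.0, 1.1, 0.9, 1.7, 1.3]), 1))  # 94.0
```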
Nevertheless, the CRI of a light source takes into account only color rendering, as the name suggests, and ignores a number of other attributes that affect overall color quality, including chromatic discrimination and typical observer preferences. CRI Ra (or CRI) by itself is therefore not a satisfactory measure of the benefit of a light source, since it provides little ability to predict color discrimination (i.e., the ability to perceive subtle differences in hue) or color preference. The use of CRI as a reliable color quality metric for solid-state lighting sources, such as those employing light-emitting diodes (LEDs), is particularly problematic given the inherently peaked spectra of LEDs. The Color Quality Scale (CQS), developed by the National Institute of Standards and Technology (NIST), was established to incorporate these other aspects of color appearance and to address many of the shortcomings of the CRI, particularly with respect to solid-state lighting. Rather than using only eight low-chroma samples that do not span the full range of hues, the CQS considers 15 Munsell samples that have significantly higher chroma and are spaced uniformly along the entire hue circle. The CQS also accounts for other characteristics that have been found to affect an observer's perception of color quality. The CQS has a range of 0-100, with 100 being the best attainable score.
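To illustrate how the CQS aggregation differs from the CRI average, the following simplified sketch combines the color differences of the 15 Munsell samples by their root-mean-square rather than an arithmetic mean and then rescales the result so it remains within 0-100. The scale factor of 3.1 is approximate, and the saturation factor, chromatic adaptation, and CCT factor of the full NIST procedure are omitted, so this is a structural sketch rather than the official computation.

```python
import math

def cqs_sketch(delta_e_15):
    """Structural sketch of CQS-style aggregation: RMS of the color
    differences of the 15 Munsell samples, rescaled to stay within 0-100.
    Saturation factor, chromatic adaptation, and CCT factor of the full
    NIST procedure are omitted; the 3.1 scale factor is approximate."""
    assert len(delta_e_15) == 15, "CQS uses 15 Munsell test samples"
    rms = math.sqrt(sum(de ** 2 for de in delta_e_15) / len(delta_e_15))
    q = 100.0 - 3.1 * rms
    # Smooth clamp so strongly color-distorting sources bottom out near 0
    # instead of going negative.
    return 10.0 * math.log(math.exp(q / 10.0) + 1.0)
```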
Correlated color temperature (CCT) is a specification of the color appearance of the light produced by a lamp, relating its color to that of light from a black-body reference source (a Planckian radiator) heated to a specific temperature, measured in kelvin (K). The concept of color temperature is based on comparing a visible light source to an ideal black-body radiator; the temperature at which the heated black-body radiator matches the color emitted by the light source is that source's color temperature. CCT is intended to characterize the apparent "tint" of the illumination (e.g., warm or cool) generated by an electric light source. Color temperatures of 5000 K or more are "cool" and have green to blue tones, while lower color temperatures of 2700 to 3500 K are regarded as "warm" and have yellow to red tones. General illumination can have a color temperature between 2,000 and 10,000 K, with most general lighting devices falling between 2,700 and 6,500 K. For incandescent light sources the light is of thermal origin and closely approximates that of an ideal black-body radiator. For a light source that does not closely replicate a black-body radiator, such as a fluorescent bulb or a light-emitting diode (LED), the CCT is the temperature of the heated black-body radiator whose color most closely approximates that of the light source.
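For chromaticities reasonably close to the Planckian locus, CCT can be estimated directly from CIE 1931 (x, y) coordinates. The sketch below uses McCamy's cubic approximation, which is not mentioned in the passage above but is a commonly cited shortcut; more exhaustive methods instead search for the nearest point on the Planckian locus.

```python
def cct_mccamy(x, y):
    """Approximate correlated color temperature (kelvin) from CIE 1931 (x, y)
    chromaticity using McCamy's cubic formula. Only meaningful for
    chromaticities reasonably close to the Planckian locus."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5520.33

# The chromaticity of CIE illuminant D65 (x = 0.3127, y = 0.3290)
# evaluates to roughly 6500 K.
print(round(cct_mccamy(0.3127, 0.3290)))  # ~6505
```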
The ability of human vision to discriminate colors differs under illumination at different correlated color temperatures, even when the CRI Ra is the same. Such discrimination correlates with the gamut of the illuminating light. The gamut area of a light source can be calculated as the area enclosed within a polygon defined by the chromaticities, in the CIE 1976 u'v' color space, of the eight color chips used to determine CRI Ra when illuminated by the test light source. The gamut area index (GAI) is a simple way of characterizing, in chromaticity space, how saturated the illumination makes objects appear. As mentioned above, CRI represents how well a light source renders the true colors of different objects, and its value is based on how closely the spectral power distribution (SPD) of the test luminaire corresponds to that of the reference illuminant. The SPD of a light source is a graphical representation of the relative energy at each wavelength in the visible spectrum. The ability of a light source to render colors depends on the amount of power it radiates in the regions of the spectrum corresponding to different colors. The color appearance of a lamp is described by its chromaticity coordinates, which can be calculated from the spectral power distribution using standard methods.
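The gamut-area calculation itself is a straightforward polygon area. The sketch below applies the shoelace formula to the (u', v') chromaticities of the eight CRI color chips under the test source, assumed to be ordered around the hue circle; reporting GAI relative to the gamut area of an equal-energy reference spectrum is an assumption here, as the passage does not specify the normalization.

```python
def gamut_area(uv_points):
    """Area of the polygon whose vertices are the CIE 1976 (u', v')
    chromaticities of the eight CRI test-color chips under the test source,
    computed with the shoelace formula. Points must be ordered around the
    hue circle (i.e., around the polygon's perimeter)."""
    area = 0.0
    n = len(uv_points)
    for i in range(n):
        u1, v1 = uv_points[i]
        u2, v2 = uv_points[(i + 1) % n]
        area += u1 * v2 - u2 * v1
    return abs(area) / 2.0

def gamut_area_index(test_area, reference_area):
    """GAI expressed as a percentage of a reference gamut area (commonly
    that of an equal-energy spectrum); the normalization is an assumption."""
    return 100.0 * test_area / reference_area
```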
To create uniform lighting, it is important that all the lights have the same color, or more precisely, are visually matched. Because of manufacturing tolerances, temperature variations, and varying drive conditions, the chromaticity of light sources will differ. Color variations in phosphor-coated white LED arrays are also an important factor to consider. Variation in color is inherent to the LED manufacturing process. LEDs are therefore frequently binned after a production run according to a variety of characteristics derived from their spectral power distributions.
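As a simple illustration of binning, the sketch below sorts measured parts into chromaticity (CCT) and flux bins. The bin boundaries, the duv tolerance, and the MeasuredLed fields are hypothetical values chosen for illustration only; actual bin structures are defined by each manufacturer.

```python
from dataclasses import dataclass

@dataclass
class MeasuredLed:
    cct_k: float    # correlated color temperature derived from the SPD
    duv: float      # signed distance from the Planckian locus
    flux_lm: float  # luminous flux in lumens

# Hypothetical bin boundaries, for illustration only.
CCT_BINS = [(2580, 2870), (2870, 3220), (3220, 3710), (3710, 4260)]
FLUX_BINS = [(80, 100), (100, 120), (120, 140)]
DUV_LIMIT = 0.006  # illustrative tolerance for distance from the locus

def assign_bin(led: MeasuredLed):
    """Return (cct_bin, flux_bin) indices, or None if the part falls outside
    every bin (or too far from the Planckian locus) and would be rejected
    or re-graded."""
    if abs(led.duv) > DUV_LIMIT:
        return None
    cct_idx = next((i for i, (lo, hi) in enumerate(CCT_BINS)
                    if lo <= led.cct_k < hi), None)
    flux_idx = next((i for i, (lo, hi) in enumerate(FLUX_BINS)
                     if lo <= led.flux_lm < hi), None)
    if cct_idx is None or flux_idx is None:
        return None
    return cct_idx, flux_idx

print(assign_bin(MeasuredLed(cct_k=3000, duv=0.002, flux_lm=110)))  # (1, 1)
```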