In our WeChat group today, we discussed a topic: how much brightness does a projector need to meet HDR requirements? That question leads to a few others: how is the brightness of a TV/display measured differently from that of a projector? How do you convert between the two? And how many lumens does a projector need to meet HDR brightness standards? So today, let's do a little science popularization for the group. We've gathered some information to make sure these questions are clear before you make a purchase.
Brightness units for TVs/displays and projectors
A key measure of HDR is brightness, that is, the amount of light output.
HDR aims to support greater light output, making the displayed image closer to the natural light conditions we experience in the real world.
When it comes to brightness, TVs/displays and projectors use two different units: one is cd/m², i.e., nits; the other is lumens.
The brightness of a TV/display is measured in cd/m², that is, nits. Nit: think of direct light, like the sun, when watching a TV. A nit measures how much light the TV screen sends to your eyes within a given area. Technically, a nit is a luminance of one candela (cd, a standardized measure of luminous intensity) per square meter. In short, 1 nit equals 1 cd/m². I have already explained this in a previous article on HDR, which you can refer to for details:
Do you really understand today's HDR products? Be careful not to be fooled!
From this point of view, an ordinary TV may be able to output 100 to 200 nits, while an HDR-compatible TV may be able to output 400 to 2000 nits.
Projector brightness is measured in lumens. Lumen: a general term for describing light output, but for projectors the most accurate term is ANSI lumen (ANSI stands for the American National Standards Institute).
Note that 1000 ANSI lumens is the minimum a projector should be able to output for home theater use, but most home theater projectors average 1500 to 2500 ANSI lumens. A multi-purpose projector (built for a variety of scenarios, which may include home entertainment, business, or education) may be able to output 3000 ANSI lumens or more.
Converting between nits and lumens
The exact conversion between nits and lumens is very complex. However, when comparing TVs with projectors, a common method is to use the approximation that 1 nit equals 3.426 ANSI lumens as a reference.
Using this reference value, to find the approximate ANSI lumens equivalent to a given number of nits, multiply the nits by 3.426. To go the other way, divide the lumens by 3.426.
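This rule of thumb can be written as two small helper functions. A minimal sketch in Python; the function names are my own, and the 3.426 factor is only the article's rough reference value, not a physically exact conversion:

```python
NITS_TO_LUMENS = 3.426  # rough rule-of-thumb factor used in this article

def nits_to_lumens(nits):
    """Approximate ANSI lumens equivalent to a given nit rating."""
    return nits * NITS_TO_LUMENS

def lumens_to_nits(lumens):
    """Approximate nit rating equivalent to a given ANSI lumen output."""
    return lumens / NITS_TO_LUMENS

print(nits_to_lumens(1000))  # a 1000-nit TV ~ 3426 ANSI lumens
print(lumens_to_nits(1713))  # a 1713-lumen projector ~ 500 nits
```

These two calls reproduce the 3426-lumen and 500-nit figures used in the comparisons below.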
For a projector to achieve light output equivalent to 1000 nits (remember, this assumes the same illuminated screen area and the same room lighting conditions), it would need to output as much as 3426 ANSI lumens, which is beyond the range of most home theater projectors.
However, a projector capable of outputting 1713 ANSI lumens is easy to find, and it can roughly match a TV with a light output of 500 nits.
One small hint: besides this approximation, screen size also affects the relationship between nits and lumens. For example, a 65-inch TV with a brightness of 500 nits will have about four times the lumen output of a 32-inch TV at the same 500 nits, because screen area scales with the square of the diagonal.
Taking screen size into account, the formula relating nits, screen area, and lumens is: lumens = nits × screen area × π (3.1416), where the screen area in square meters is the screen width multiplied by the screen height.
For a 500-nit 65-inch TV, the screen area is about 1.167 square meters, giving a lumen output of about 1833.
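The 65-inch figures can be checked with a short script. A sketch only; it assumes a 16:9 aspect ratio (which the article does not state explicitly), so the results land close to, but not exactly on, the article's rounded numbers:

```python
import math

def screen_area_m2(diagonal_inches, aspect=(16, 9)):
    """Screen area in square meters for a given diagonal and aspect ratio."""
    d_m = diagonal_inches * 0.0254             # inches -> meters
    w_ratio, h_ratio = aspect
    diag_ratio = math.hypot(w_ratio, h_ratio)  # diagonal of the aspect triangle
    width = d_m * w_ratio / diag_ratio
    height = d_m * h_ratio / diag_ratio
    return width * height

def lumens_from_nits(nits, area_m2):
    """The article's formula: lumens = nits x screen area x pi."""
    return nits * area_m2 * math.pi

area_65 = screen_area_m2(65)           # ~1.16 m^2, close to the article's 1.167
print(lumens_from_nits(500, area_65))  # ~1830 lumens, close to the article's 1833

# The screen-size effect mentioned above: area scales with the square of
# the diagonal, so a 65" screen has roughly (65/32)^2 = 4.1x the area (and
# lumen output at equal nits) of a 32" screen.
print(screen_area_m2(65) / screen_area_m2(32))  # ~4.13
```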
Light output of TVs and projectors in the real world
Although all of the above "technical" information on nits and lumens provides a relative reference, in practice the numbers are only part of the story.
When a TV or projector is touted as capable of outputting 1000 nits or lumens, that does not mean it always outputs that much light. A frame or scene usually contains a mix of light and dark content, as well as various colors, all of which require different levels of light output.
If you are watching a scene with the sun in the sky, that part of the image may require the TV or projector to output its maximum nits or lumens. However, other parts of the image (such as buildings, landscapes, and shadows) require far less light output, perhaps only 100 or 200 nits or lumens. In addition, the different colors displayed contribute different light output levels within the frame or scene.
The key is that the ratio between the brightest and darkest objects displayed should be as close to the same as possible, in order to produce the same visual impact. This is particularly important for HDR-capable OLED TVs compared with LED/LCD TVs. OLED technology cannot support as much light output (in nits) as LED/LCD technology. However, unlike LED/LCD TVs, OLED TVs can produce much deeper blacks.
Although the official top HDR standard for LED/LCD TVs requires displaying at least 1000 nits, the official HDR standard for OLED TVs is only 540 nits. Remember, though, that these standards apply to peak nit output, not average nit output.
When comparing an HDR TV with 1000-nit output to an HDR projector with 2500 ANSI lumen output, the TV's HDR effect will be more obvious in terms of perceived brightness.
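To see why, apply the same rule-of-thumb conversion from earlier (again only an approximation, using the article's 3.426 factor):

```python
# A 2500-ANSI-lumen projector, expressed in approximate nits
# using the article's rule of thumb (1 nit ~ 3.426 ANSI lumens):
projector_nits = 2500 / 3.426
print(round(projector_nits))  # ~730 nits, well below the TV's 1000 nits
```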
For projectors, there are also differences in light output capability between LCD and DLP technology. An LCD projector can deliver the same output level for white and color, while a DLP projector cannot produce equal levels of white and color light output.
Note: the above comparisons may require, for example, viewing in a dark room rather than a lit one; screen size, screen reflectance (for projectors), and seating distance may all affect the perceived nit or lumen output to some degree.
A light output guide for purchasing TVs and projectors
Reducing the relationship between nits and lumens to a simple measure of light output is difficult and requires a lot of math and physics. So things get confusing when TV and projector companies use terms like nits and lumens to attract consumers without context.
Therefore, when considering light output, keep the following guidelines in mind:
For 720p/1080p or non-HDR 4K UHD TVs, nit numbers generally do not run high, typically between 200 and 300 nits, which is bright enough for traditional source content and most indoor lighting conditions (although the picture dims significantly for 3D viewing). For 4K Ultra HD TVs with HDR, consumers should pay more attention to nit ratings; in short, the higher the light output, the better.
For HDR-compatible 4K Ultra HD LED/LCD TVs, 500 nits is the baseline reference (generally, sets with labels such as HDR Premium pass this), and a 700-nit TV will deliver better HDR results. However, if you are looking for the best light output, 1000 nits is the official reference standard (look for labels such as HDR1000), and the highest-end HDR LED/LCD TVs top out at around 2000 nits.
If you buy an OLED TV, the maximum light output is about 600 nits. Currently, all OLED TVs with HDR must be able to output at least 540 nits. On the other hand, as mentioned earlier, OLED TVs can display absolute blacks, which LED/LCD TVs cannot, so a 540-to-600-nit OLED TV can deliver better results than an equivalent LED/LCD TV. In other words, it may take a 1000-nit LED/LCD TV to match the perceived brightness of a 540-to-600-nit OLED.
Although both a 600-nit OLED and a 1000-nit LED/LCD TV look impressive, the 1000-nit LED/LCD TV still has its advantages, especially in bright rooms. As mentioned earlier, 2000 nits is currently the highest light output found on TVs, but that may make the displayed image too dazzling for some viewers.
If you are buying a projector, consider 1000 ANSI lumens the minimum output, as described above, but most projectors can output 1500 to 2500 ANSI lumens, which provides relatively better viewing in a room that may not be dark enough. In addition, if you are buying a projector with 3D capability, consider one with 2000 or more lumens of output, because 3D images are naturally darker than 2D.
Compared with TVs, HDR projectors lack pinpoint accuracy with small bright objects against a dark background. For example, an HDR TV can display stars (small, low-brightness objects) against the dark, but some HDR projectors struggle to. This is because a projector has difficulty producing high brightness in a very small area relative to the surrounding dark scene. So for the best HDR results currently available (still short of the perceived brightness of a 1000-nit TV), consider a 4K HDR projector capable of outputting at least 2500 ANSI lumens. Of course, there is no formal HDR light output standard for consumer projectors.
In short, when buying a 4K HDR TV or a 4K HDR projector, keep the above categories in mind. With this framework, we believe you will have a much better understanding before you buy.