What Is An HDR Television?
What is an HDR Television? Hmm let’s find out!
If there’s one thing I truly love about home theater, it’s just how broad the topic can be. From speakers to gaming consoles, it really is an all-encompassing subject. What’s even more compelling is how fast things can progress. There’s no better example of this than the television.
Just a few years back (within my lifetime, actually) came the change from standard to high definition. At the time, that was seen as the pinnacle of what was visually possible, at least commercially.

Then, only a few years later, 4K came along and completely changed the game with four times the resolution of 1080p. Now we’re presented with yet another format that promises to revolutionize TV: HDR.
An HDR TV is one that takes advantage of this shiny new format, and if I personally had to use one word to describe it, it’d be awesome.

But to truly appreciate how awesome it is, we need a deeper understanding of what HDR is all about.
What is HDR exactly?
HDR is an acronym that stands for High Dynamic Range. It’s a format that originates in photography, but it has recently made its way over to video and television.
This is extremely fortunate, because what it means for you is a much more true-to-life picture on screen.
Basically, it mimics how the human eye would see an image. It also touts a broader range of colors than could be displayed previously. How many more exactly? Try millions more colors.
It can even display the bright and dark elements of a picture at the same time. At least that’s the simple explanation. Let’s go deeper though, so we can really understand it.
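To put a rough number on “millions more colors”: standard TVs use 8 bits per color channel, while HDR formats use 10 or 12 (more on that later). Here’s a quick back-of-the-envelope calculation, assuming the usual three channels (red, green, blue) per pixel:

```python
# Back-of-the-envelope: how many total colors a panel can show
# at a given bit depth. Each pixel has 3 channels (R, G, B),
# and each channel has 2**bits distinct levels.
def total_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit (standard): {total_colors(8):,}")   # 16,777,216
print(f"10-bit:           {total_colors(10):,}")  # 1,073,741,824
print(f"12-bit:           {total_colors(12):,}")  # 68,719,476,736
```

So jumping from 8-bit to 10-bit goes from roughly 16.8 million colors to over a billion, which is where the “millions more” comes from.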
Why Was HDR Introduced?
So, with the way current video signals work, the dark and light portions of the screen are usually missing elements of the scene they were originally intended to capture.

This is because they’re based on older standards set decades ago that couldn’t adequately represent those elements, due to technical limitations at the time.
Knowing this, the industry set out to solve the dilemma. Fortunately, they didn’t have to look too hard, because the answer already existed on a different platform: photography. If anything, HDR is a borrowed technology, since the concept isn’t new there at all.
But with a little tweaking, the format was made usable with video, so filmmakers could offer a truer presentation of whatever they’re filming. How exactly, you might wonder? Well, it has a lot to do with brightness and contrast.
How Brightness and Contrast Changes With HDR
A television screen’s luminance is measured in candela per square metre (cd/m²), more commonly referred to as nits.

This measures how bright the TV is capable of going: the more nits, the brighter the image. A typical HD 1080p television can reach a little above 450 cd/m², or 450 nits.
With HDR, this number can be as high as 10,000. You might be thinking well isn’t that too bright? Surprisingly no. It’s not like the entire picture is that bright at all times. It simply means that the elements that need to be bright can remain so while the other parts stay dark.
It’s a lot more true to how you would see it in person. Think of what happens when you look at the sky during a sunny day. Do you simply see one bright splotch?
Or are you able to make out details in the sky as well?
It works the same way here. When an image is displayed, you can see all the fine details that are usually left out.
The same thing goes for darkness (the shadowed parts of the image).
Previously, all of these small details would typically get crushed into the shadows. This was the only way the industry could display the image, since showing everything would have taken up too much data to be feasible. However, technology has now advanced far enough for this to be possible.
It allows for portions of an image to get both lighter and darker at the same time, showing much more detail in the shadows while representing much greater detail in the highlights. So something can be at peak brightness on screen at the same moment something else is completely dark, with nothing left out.
But it also allows for a lot more nuanced shades too; meaning it can also reproduce everything in between absolute white, and absolute black in a much smoother gradient. A lot of the little shades you couldn’t see before are now visible.
Think of a gradient running from light to dark across the screen. On current televisions, that transition happens in relatively coarse steps; with HDR, the gradation is much smoother.
That difference between light and dark is referred to as the contrast ratio. The higher the contrast ratio, the better the image looks. With HDR, the contrast ratio increases to a considerable degree.
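As a quick numerical illustration of what contrast ratio means (the figures below are made-up round numbers for the sake of the example, not measurements of any particular panel):

```python
# Contrast ratio compares the brightest white a panel can produce
# with the darkest black it can hold, both measured in nits.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# Illustrative round numbers, not real panel measurements:
sdr = contrast_ratio(450, 0.45)    # modest peak, grayish blacks
hdr = contrast_ratio(1000, 0.05)   # brighter peak, deeper blacks
print(f"SDR example: {sdr:,.0f}:1")  # SDR example: 1,000:1
print(f"HDR example: {hdr:,.0f}:1")  # HDR example: 20,000:1
```

Notice that raising peak brightness and lowering the black floor multiply together, which is why HDR’s improvement feels so dramatic in person.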
As you could probably imagine, this would hypothetically result in a picture that looks vastly better than before. I can certainly vouch for that sentiment.
I have an HDR screen right now, and for me to even try to explain how good it looks in person is honestly rather difficult. I know it’s cliché, but this really is one of those instances where you need to see it for yourself.
It’s pretty unbelievable looking. But do you want to know the interesting part? Not all HDR is the same.
The Different Types Of HDR
Yep, that’s right. To make things even more confusing, there are actually different types of high dynamic range. The good news, though, is that there are only two “main” types available as of right now.

I say main because technically there are five; the other three (Hybrid Log-Gamma, HDR10+, and Advanced HDR by Technicolor) haven’t gained enough traction to be fully commercialized, since HDR is already new enough as is.

Although if you wanted to be really technical, you could say the latter two of those are already offered, since there are televisions that support them, even though there is a severe lack of accompanying content as of right now.
The 2 main ones used right now though are Dolby Vision and HDR10. So what’s the difference? Well I’ll tell you the technical difference, and then I’ll tell you what I found out.
HDR10 is a format backed by an industry group called the UHD Alliance. This is the same group that is pushing UHD (ultra-high definition, otherwise known commercially as 4K).
They’re responsible for a lot of the recent video format pushes into the mainstream.
It’s also typically offered on Ultra HD Blu-rays. HDR10 supports brightness up to 4,000 nits and 10-bit color (an extended color palette). It also has what is called static metadata, meaning everything within the film is mastered at a predetermined brightness.
Dolby Vision, on the other hand, is Dolby’s own version of HDR. If a screen meets their specific criteria, then it’s allowed to tout that it adheres to their standards, thus gaining their certification.
It’s also a lot more tailored, because Dolby Vision is usually calibrated for each display’s maximum capability. So basically, each specific screen can utilize HDR to the best of its ability. It also supports a luminance of up to 10,000 nits and 12-bit color.
While no televisions as of right now can reach that level of brightness, the capabilities of the standard are already set, so once those displays are available, they’ll be able to fully utilize Dolby Vision.
It also has what is known as dynamic metadata. Where HDR10 displays the picture based on parameters predetermined for the entire movie, Dolby Vision can do so on a scene-by-scene basis within the movie (hence the dynamic portion). This theoretically allows for a much more accurate and vivid image.
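To make the static vs. dynamic distinction concrete, here’s a loose sketch in Python. The field names here are invented for illustration; the real formats use standardized metadata structures carried in the video stream (SMPTE ST 2086 for static mastering metadata, ST 2094 for dynamic), not dictionaries like these.

```python
# Loose illustration of static vs. dynamic HDR metadata.
# Field names are made up for clarity; not the actual standards.

# HDR10-style static metadata: ONE set of values for the whole film.
static_metadata = {"max_brightness_nits": 1000}

# Dolby Vision-style dynamic metadata: values can change per scene.
dynamic_metadata = [
    {"scene": 1, "max_brightness_nits": 400},   # dim interior scene
    {"scene": 2, "max_brightness_nits": 1000},  # bright outdoor scene
]

def tone_map_target(scene_number, static, dynamic=None):
    """Pick the peak brightness the TV should map this scene against."""
    if dynamic:
        for entry in dynamic:
            if entry["scene"] == scene_number:
                return entry["max_brightness_nits"]
    return static["max_brightness_nits"]

print(tone_map_target(1, static_metadata))                    # 1000
print(tone_map_target(1, static_metadata, dynamic_metadata))  # 400
```

With only static metadata, the dim scene gets mapped against the film-wide peak of 1000 nits; with dynamic metadata, the TV knows that scene never exceeds 400 nits and can use its range more precisely.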
Sounds like Dolby Vision would be the clear winner right?
Well not exactly…
It’s really not as clear cut as you might think.
Here’s what I found in terms of difference, personally (this is just from my perspective, by the way). To be honest with you, I didn’t really notice that much of a difference between the two. They both looked downright gorgeous on their respective displays, but the gap was negligible. Maybe the Dolby Vision screen was slightly brighter, but I didn’t notice any major differences.
The fact that they were so close is good news for you because you at least know that it’s not going to make the biggest difference in the world with regards to what version you pick.
At least for right now. Once displays start releasing with higher nit ranges and greater bit depths, the difference in image quality should widen pretty dramatically, eventually making Dolby Vision a much better choice.
When this happens remains to be seen of course, so it might be a few years before we start seeing any 10,000 nit panels.
What’s The Difference Between Photo HDR And HDR Television?
You might also be curious what the difference is between photography HDR and the video version, since I did mention that before.
I don’t want to get too technical, but here’s what you should at least know. When you take a picture with a camera, it can only represent one exposure for any given image.

As you can imagine, this can leave a lot to be desired in terms of quality. To combat this, the camera combines multiple exposures together into one image to capture a broader range of light.
This broader range of light is what gives that image depth, and makes it pop. This is not only useful to see all the hidden elements of a picture that are normally lost, but it also looks better in general.
Video HDR is different because it doesn’t need to do that.

It’s able to represent the bright and dark portions at the same time, so that nothing is overexposed or crushed. Naturally, this results in an exceedingly dynamic image. Again, it can’t be overstated how incredible this looks on an adequate display. It makes a world of difference.
Do You Need Anything Special For HDR?
By now you might be asking yourself if you need anything special to take advantage of HDR, and the answer would technically be yes. For one, it’s not something you can just download.

Being a format, a device has to explicitly state that it supports it. This is where you might see the UHD or Dolby Vision certification on the outside of the box. Then you would need a television that says it’s capable of it as well.
Other than that, though, there’s really not much else to it. There aren’t any special ways you have to hook anything up. As long as you have the latest HDMI cables, you’re good to go. This one supports it fully: Zeskit HDMI Cable 6.5ft.

As far as content goes, cable television as of right now doesn’t support it, though this may change in the future. If you want to watch movies with it, then you would need to invest in a 4K Blu-ray player, since UHD Blu-ray is what supports it.
Both the Xbox Series X and the PlayStation 5 can utilize it too meaning the games on those consoles have the capability for it as well.
What Format Might Come Next?
Now, this is going to sound shocking, but Samsung and LG have already introduced 8K displays that are available. That said, 8K is nowhere near prime time. In fact, by my estimate it’ll probably be a good 10 years before it’s mainstream, so don’t worry.
But the fact that there’s already talk of resolutions of 8K and (gasp) above that really boggles the mind. I think it’s going to be so cool to see what new formats start to show in the future.
Then there’s the aforementioned HDR10+, which is actually Samsung’s entry into the HDR battle. What many may not know is that manufacturers who want to use Dolby Vision actually have to pay a licensing fee before being able to implement it.

So Samsung decided to launch their own HDR format where this isn’t required.
Where Dolby Vision adjusts the image on a scene-by-scene basis, HDR10+ takes it a step further by adjusting things on a frame-by-frame basis. This should theoretically allow for an even greater improvement in picture quality.
However since there isn’t exactly a wealth of content to test, we’ll have to play the waiting game to see how it stacks up once enough content is available. This will ultimately decide how successful it becomes.
Or hey, maybe it’s even Micro Led TVs?
What do you think is the next big thing in display technology? I’d love to hear from you down in the comments below (I’m genuinely curious, so please let me know!).
Anyway hopefully all of this helps you to understand a little bit about this new standard and why it’s so exciting. Again without nerding out too much, it really is something you need to see in person; pictures won’t do it justice.
And hey, now when you hear someone ask, “Well, what is an HDR television?” you’ll be able to tell them!
If you need help with choosing, here’s an article that I did that should help you immensely to do just that.
Until next time, make it easy, keep it simple.
Hey everyone, it’s nice to meet you. I’m Jay, and I’ve been in this hobby for many years now. I decided to create this site to share everything I’ve learned from personal experience with you. I also happen to be a huge gamer, a lover of all things tech related, and a major fitness buff (love weightlifting).