Tech InDepth: Understanding HDR and standards like HDR10, HDR10+ and Dolby Vision

HDR10+, HDR10 and Dolby Vision support. If you’ve bought a TV in recent times, you’ve likely seen these terms listed among its key features. These days, many displays boast support for them. But what exactly do these terms mean? Do they really improve your display quality, or are they simply marketing gimmicks?
One thing all experts will agree on is that HDR standards can be really confusing, especially if you’re new to displays. In today’s edition of Tech InDepth, we will answer these very questions about HDR, so you can be a well-informed customer the next time you buy a TV or a monitor.
Understanding HDR
Contrary to popular belief, HDR has nothing to do with the resolution of content, even though ‘HD’ does. HD refers to high definition, which simply means more pixels in the same area, churning out sharper photos and videos. HDR, meanwhile, stands for High Dynamic Range, which is a standard for the picture quality itself. The term means that your display supports a higher ‘dynamic range’ of colours. Let’s understand what this means via an example.
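To put rough numbers on the resolution side of that distinction, here is a small Python sketch. The resolutions and pixel counts are the standard ones, and the snippet is purely illustrative rather than tied to any particular TV.

```python
# Resolution is just the pixel grid: more pixels in the same screen area
# means a sharper image, but it says nothing about the range of colours shown.
resolutions = {
    "HD (720p)": (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")
# HD (720p): 921,600 pixels
# Full HD (1080p): 2,073,600 pixels
# 4K UHD: 8,294,400 pixels
```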

Say you look at a beautiful rainbow, but you can’t identify all seven colours (Violet, Indigo, Blue, Green, Yellow, Orange and Red) that make up the rainbow. Instead, you can identify just Violet, Blue, Yellow and Red. So even though what you’re looking at is seven distinct colours, what your brain interprets is four colours.
If you were to go home and paint a picture of what you saw outside, you’d use only four colours, and end up painting a rainbow that doesn’t look as realistic as the real thing. This is the fundamental difference between Standard Dynamic Range (SDR) and High Dynamic Range (HDR).
HDR allows your device, say a TV, to understand more shades of colours, or a higher range of colours, which can then be dynamically reproduced when you watch content on the screen. HDR also lets your TV understand various contrast levels better in the content you watch, making the device capable of reproducing much better quality pictures and videos compared to SDR.
A good HDR display will show deeper blacks and brighter whites in the same picture compared to an SDR display. The same applies to the various shades of all colours, as HDR displays usually also support a wider colour gamut.
A visual representation of how SDR (left) would look different from HDR (right). Notice the darker and more realistic shadows between the seats and on the wall behind in the HDR image. (Image Source: The Indian Express/ Chetan Nayak)
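For a rough sense of what ‘more shades’ means in numbers, here is a small Python sketch. It assumes the common pairing of 8-bit colour for SDR and 10-bit colour for HDR formats such as HDR10; bit depth is only one part of what HDR covers, so treat this as an illustration rather than a full definition.

```python
# Each colour channel (red, green or blue) is stored with a fixed number of bits.
# More bits per channel means more distinguishable shades, i.e. a wider range.
sdr_shades_per_channel = 2 ** 8    # 256 shades per channel (typical SDR)
hdr_shades_per_channel = 2 ** 10   # 1,024 shades per channel (10-bit HDR)

# Combining the three channels gives the total number of representable colours.
print(f"SDR: {sdr_shades_per_channel ** 3:,} colours")  # 16,777,216 (~16.7 million)
print(f"HDR: {hdr_shades_per_channel ** 3:,} colours")  # 1,073,741,824 (~1.07 billion)
```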
Also note that for content to be viewed in HDR, it needs to be shot in HDR. This means that if you have an HDR-enabled TV or monitor, it will still show the same picture quality as a regular display if what you’re watching was not shot with HDR-specific equipment.
HDR recording imbues the content with extra data that lets HDR-enabled displays like TVs and monitors recognise the higher range of colours and contrast levels. Without this extra data, even the most expensive HDR TVs will output the same dynamic range as a regular TV.
Why HDR can be confusing
Unlike resolution standards like HD, 2K or 4K, where the name implies what you get in terms of pixel count, HDR is not that easy to interpret. The technology relies on several encoding standards that you will find on various products, making it really difficult to gauge whether the TV you’re eyeing at the mall is a good or bad one simply because it ‘supports HDR’.

What exactly does supporting HDR mean? It simply means that your display is capable of receiving and understanding HDR signals, but unfortunately, it says nothing about how well the TV can reproduce these signals when it displays content. That is exactly where the various HDR certification standards like HDR10, HDR10+ and Dolby Vision come in.
HDR standards – HDR10, HDR10+ and Dolby Vision
Here’s a look at the most common HDR certifications you will come across and what exactly they mean. We’ll also look at the pros and cons of each, so you know which certification to look for before putting your money down on a new TV.
HDR10
HDR10 is an open, royalty-free standard that can be considered the entry-level HDR certification. This is likely the certification you will find on many modern budget TVs that claim to be HDR-compliant.
HDR10 is not completely redundant, but it doesn’t offer a lot of enhancement to your picture quality by today’s standards. With HDR10, devices provide ‘Static Metadata’ to your display, basically telling it how bright a movie should be based on two points of reference: the highest and lowest brightness points across the entire footage.
How Static Metadata works on HDR10 displays. (Image Source: The Indian Express/ Chetan Nayak)
As you can probably tell, this isn’t the best way to bring out the best of HDR content, as a two-point reference system is still very limiting. If you watch a movie like ‘The Batman’ (which has an abundance of dark scenes and very few bright scenes) on an entry-level HDR10 display, the display will not do the dark scenes any justice.
This is why you shouldn’t be blown away by HDR10 displays and TVs anymore. What you should be looking for instead is HDR10+ or Dolby Vision.
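To make the two-point limitation concrete, here is a hypothetical Python sketch of the idea behind static metadata. The brightness values and the tone_map() helper are invented for illustration; they are not how any real TV or the HDR10 specification actually processes a signal.

```python
# Hypothetical illustration: with static metadata, one brightness range is
# derived for the whole film and reused for every frame.

def tone_map(frame_nits, content_min, content_max, panel_max=500):
    """Scale a frame's brightness values (in nits) into what the panel can show."""
    span = content_max - content_min
    return [round((n - content_min) / span * panel_max, 1) for n in frame_nits]

# Peak brightness samples from a few frames: one bright daylight scene,
# followed by the kind of dark scenes 'The Batman' is full of.
frames = [[900.0, 950.0], [5.0, 12.0], [3.0, 8.0]]

# Static metadata: a single min/max pair computed over the *entire* film.
content_min = min(n for frame in frames for n in frame)
content_max = max(n for frame in frames for n in frame)

for frame in frames:
    # Every frame, dark or bright, is mapped with the same two reference points,
    # so the dark frames get squeezed into a tiny slice of the panel's range.
    print(tone_map(frame, content_min, content_max))
```

Run it and the two dark frames collapse to values of just a few nits each, which is the two-point reference problem described above.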
HDR10+
HDR10+ fixes the main problem with HDR10 by using something called Dynamic Metadata. As the name suggests, Dynamic Metadata allows your display to gauge brightness levels on a scene-by-scene, or even frame-by-frame, basis. The two-point reference system can now adapt to what is on the screen at that point in a movie or episode, resulting in a dynamically changing HDR picture and a much superior viewing experience compared to HDR10.
How Dynamic Metadata works on HDR10+ displays. (Image Source: The Indian Express/ Chetan Nayak)
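Extending the hypothetical sketch from the HDR10 section (again, the values and the tone_map() helper are invented for illustration, not the actual HDR10+ algorithm), dynamic metadata simply means the reference points travel with each scene:

```python
# Hypothetical illustration: with dynamic metadata, each scene carries its own
# reference points, so dark scenes are no longer crushed by a bright scene's peak.

def tone_map(frame_nits, scene_min, scene_max, panel_max=500):
    """Scale a frame's brightness values (in nits) into what the panel can show."""
    span = scene_max - scene_min
    return [round((n - scene_min) / span * panel_max, 1) for n in frame_nits]

frames = [[900.0, 950.0], [5.0, 12.0], [3.0, 8.0]]

for frame in frames:
    # Dynamic metadata: the min/max pair is recomputed per scene (or per frame),
    # so each frame uses the panel's range according to its own content.
    print(tone_map(frame, min(frame), max(frame)))
```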
However, there is still one point of concern: the open nature of the HDR10+ certification. Since brands are free to use the certification without meeting any mandatory set of hardware specifications, you may run into a bad TV even if it is HDR10+ certified.
Maximum brightness is one such factor that is often compromised on. A good HDR10+ panel should go up to at least 1,000 nits of brightness to make the most of the certification, but you can find many HDR10+ TVs that don’t meet this requirement. On the other hand, good HDR10+ TVs, which can be much more expensive, can be as good as Dolby Vision TVs too. More on that below.

Dolby Vision
The best HDR certification right now for TVs and displays is Dolby Vision. It too uses Dynamic Metadata, allowing brightness and contrast levels to be adjusted per frame or scene, making content more realistic and pleasing to watch. Dolby Vision is also a paid certification, which means that for every TV with a Dolby Vision certification, the manufacturer has to pay royalty fees.
A Dolby Vision certification also comes with a set of specification requirements, which are necessary to make the most out of HDR content. Because of this, it would be really difficult for manufacturers to make an inferior TV with Dolby Vision; instead, the certification is considered by many to be a mark of good image quality. This also makes Dolby Vision TVs more expensive compared to HDR10/HDR10+ TVs.
While Dolby Vision doesn’t have the same content library as HDR10, you will find Dolby Vision-supported content on platforms like Netflix and Amazon Prime Video. Note that content shot in Dolby Vision can also be watched on HDR10 and HDR10+ TVs, but the experience will be somewhat inferior (or significantly so in the case of HDR10).
