HDR TV

The development of the television set has been as relentless as that of any sector in tech over the last half-century, and there’s no sign the pursuit of visual perfection is slowing down. With the 3D fad over and app-loaded Smart TVs now standard, the focus of TV-makers is back on creating the best possible two-dimensional image for viewers to enjoy. While the vast majority of households still use high-definition sets with a pixel resolution of 1920 x 1080, the gradual shift to 4K Ultra HD is gathering pace. These new sets have display resolutions of 3840 x 2160, which offer four times the detail, provided the content is filmed and broadcast at that resolution. While these TVs have been available for a while, broadcast content is now catching up with the technology.
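If you want to sanity-check that “four times the detail” figure, a quick pixel count does it. A minimal sketch in Python (just arithmetic, nothing TV-specific assumed):

```python
# Compare total pixel counts for Full HD and 4K Ultra HD.
full_hd = 1920 * 1080    # 2,073,600 pixels
ultra_hd = 3840 * 2160   # 8,294,400 pixels

print(f"Full HD:     {full_hd:,} pixels")
print(f"4K Ultra HD: {ultra_hd:,} pixels")
print(f"Ratio:       {ultra_hd / full_hd:.0f}x")  # 4x
```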

4K Ultra HD films and TV shows are emerging through streaming services like Netflix, while BT Sport Ultra HD brings live Champions League and Premier League football as well as Premiership rugby at the higher resolution. 4K sets are no longer prohibitively priced, meaning there’s far less of a barrier to adopting the new tech. However, 4K isn’t the only big step forward for TV tech. Many observers argue the emergence of HDR (High Dynamic Range) technology may make a bigger difference to the overall quality of the image than the resolution boost.

What is High Dynamic Range TV?

HDR stands for High Dynamic Range. Basically, it means better contrast, greater brightness levels and a wider colour palette. It’s about making your films and TV shows look that bit more like real life. The idea is that your eyes can perceive brighter whites and darker blacks – a greater dynamic range – than traditional TVs have been able to display, and HDR aims to close that gap. HDR content preserves details in the darkest and brightest areas of a picture that are lost using older standards such as Rec. 709. It also allows for more natural colours, closer to how we see them in real life.

HDR10 is the standard form of HDR and has been around for a while, competing with Dolby’s own version of the technology, Dolby Vision. But now Samsung has its own standard, known as HDR10+, which Amazon Video has just announced it will be supporting, and which we’ll tell you more about later in the article. Now you know the basics, it’s worth keeping in mind that contrast and colour are the two key things to consider when thinking about HDR. Here’s a full breakdown:

HDR compatible

The safest way to tell whether a TV is HDR compatible is to look for the Ultra HD Premium logo. This is a stamp of approval from the UHD Alliance, a group made up of technology firms and content producers. The idea is to limit confusion when it comes to buying new kit, since that confusion is easily exploited.

Previously, HDR was rushed out to consumers before anyone had really agreed on a set of standards to define it, which led to many TVs having an HDR sticker on the box, regardless of specs or quality. TV manufacturers and content providers had very little in the way of clearly defined specs to work to when creating HDR screens and content. With the UHD Premium label, we now know the precise minimum specifications a TV needs to be considered truly HDR compatible.

Colour & Contrast

There are two things that define an HDR TV: its contrast performance and the number of colours it can display. Let’s start with the first.

Contrast – Contrast is one of the most important factors in how good a TV picture looks and it’s a key part of what makes an HDR TV. It refers to the difference between light and dark. The greater the difference, the greater the ‘contrast’.

There are two components to consider here. One is peak brightness, which, rather unsurprisingly, refers to how bright a TV can go, measured in what’s known as ‘nits’. One nit is roughly the brightness of a single candle spread over a square metre. TVs must hit a specific number of nits in order to be given the HDR label.

The other measurement is black level. Similar to peak brightness, black level refers to how dark a TV image can appear and is also measured in nits. So, for example, a TV could have a peak brightness of 400 nits and a black level of 0.4 nits. The ratio between peak brightness and black level is known as the contrast ratio – in that example, 400 divided by 0.4, or 1,000:1. HDR TVs have to meet specific standards for peak brightness and black level, which helps give them their dynamic appearance.
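As a quick illustration of that sum (the figures are just the example above, not any real TV’s spec sheet), here’s how the contrast ratio works out:

```python
def contrast_ratio(peak_brightness_nits: float, black_level_nits: float) -> float:
    """Contrast ratio is peak brightness divided by black level."""
    return peak_brightness_nits / black_level_nits

# The example from the text: a 400-nit peak brightness and a 0.4-nit black level.
print(f"{contrast_ratio(400, 0.4):,.0f}:1")  # 1,000:1
```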

Colour – This is the second of the most important aspects of HDR. When it comes to colour, a TV must be able to process what’s known as 10-bit or ‘deep’ colour. 10-bit colour equates to a signal that includes over a billion individual colours. In comparison, Blu-ray uses 8-bit colour, which amounts to around 16 million different colours. With 10-bit colour, HDR TVs can produce a vastly expanded range of colour shades, reducing the obvious jumps between them. That subtle shading helps to make a scene look far more realistic.
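Those figures fall straight out of the bit-depth maths: each of the red, green and blue channels gets 2-to-the-power-of-the-bit-depth levels, and the three channels multiply together. A minimal sketch:

```python
# Total colours = levels per channel cubed (red, green and blue channels).
def colour_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {colour_count(8):,} colours")   # 16,777,216 (~16.7 million)
print(f"10-bit: {colour_count(10):,} colours")  # 1,073,741,824 (over a billion)
```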

However, as is always the case with these things, it isn’t quite as simple as this. In order to be considered HDR compatible, a TV doesn’t need to be able to display all the colours in a 10-bit signal. It just has to be able to process the signal and produce an image based on that information.

And it doesn’t stop there. If you’re still with us, there’s more colour stuff to go over. An HDR TV must be able to produce a certain amount of what’s known as ‘P3’ colour. P3 refers to a particular range of the colour spectrum. The best way to think about this is to imagine an overall colour spectrum, and within that a set of defined spaces. The P3 colour space is larger than the one standard TVs use, Rec. 709, which means it covers more colours. Essentially, HDR means a TV can cover a wider space within the colour spectrum, and within that space, the various gradations of shades will be much smoother than on current TVs.
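One rough way to picture ‘a larger space’ is to compare the triangles the two sets of colour primaries form on the standard CIE 1931 chromaticity chart. The coordinates below are the published xy primaries for Rec. 709 and DCI-P3 (the ‘P3’ referred to above); treating triangle area as a proxy for gamut size is only an illustration, not a formal measure of colour coverage.

```python
def triangle_area(p1, p2, p3):
    """Area of a triangle given three (x, y) points, via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green and blue primaries as CIE 1931 xy chromaticity coordinates.
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

area_709 = triangle_area(*rec709)
area_p3 = triangle_area(*dci_p3)
print(f"Rec. 709 gamut area: {area_709:.4f}")
print(f"DCI-P3 gamut area:   {area_p3:.4f}")
print(f"P3 covers roughly {area_p3 / area_709:.2f}x the area of Rec. 709 here")
```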

HDR10+

As mentioned, there are two competing standards when it comes to HDR: HDR10 (the dominant standard) and Dolby’s own, more advanced version, Dolby Vision. You can find out more about Dolby Vision in our guide. But now Samsung’s own take on the technology, HDR10+, is gaining some attention, with Amazon announcing it will be supporting the standard. So, what is it?

HDR10+ is an open standard, created by Samsung and available on all the company’s 2017 TVs (it’ll be coming to 2016 models via a firmware update later in 2017). It improves on HDR10 by using dynamic metadata instead of the static metadata used by HDR10. That means it can alter the brightness of individual scenes, and even individual frames, throughout a particular TV show or film. For example, if a scene is meant to be shown at a lower brightness, HDR10+’s dynamic approach can drop the brightness level in real time to match what the director intended.
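This isn’t the real HDR10+ metadata format, just a toy sketch of the difference in principle: with static metadata the display gets one brightness figure for the whole title, while dynamic metadata carries a figure per scene. The scene names and nit values below are made up.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    name: str
    mastered_peak_nits: float  # how bright the scene was graded to appear

film = [
    Scene("dim interior", 120),
    Scene("sunlit exterior", 900),
    Scene("night chase", 60),
]

# Static metadata (HDR10-style): one figure describes the whole title,
# so every scene is tone-mapped against the same worst-case peak.
static_peak = max(scene.mastered_peak_nits for scene in film)
for scene in film:
    print(f"static:  {scene.name:16} mapped against {static_peak} nits")

# Dynamic metadata (HDR10+-style): each scene carries its own figure,
# so a dim scene isn't treated as if it might hit 900 nits.
for scene in film:
    print(f"dynamic: {scene.name:16} mapped against {scene.mastered_peak_nits} nits")
```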

Commenting on Amazon’s adoption of the technology, Greg Hart, Vice President of Amazon Video, Worldwide, added: “At Amazon, we are constantly innovating on behalf of customers and are thrilled to be the first streaming service provider to work with Samsung to make HDR10+ available on Prime Video globally later this year.”

HDR10+’s use of dynamic metadata brings it closer in line with Dolby Vision, which also takes the dynamic approach. Whether HDR10+ will become the dominant standard is entirely unclear at this point, but stay tuned, as the technology seems to be growing in popularity.

The effect of HDR

The two big display technologies in the AV industry are OLED and LED LCD. For a full explanation of these two approaches, check out our ‘OLED vs LED LCD’ feature. In short, LED TVs use lights to illuminate the pixels in a traditional LCD screen, while the pixels in OLED displays produce their own light. LED TVs are capable of producing high peak brightness and, as such, offer the best way for manufacturers to create HDR compatible TVs. Many argue that OLED isn’t a great option for HDR due to its difficulty in producing a very bright image versus LCD/LED. So how can OLED, with its brightness issues, qualify for HDR compatibility? Well, the UHD Alliance has got around the problem by introducing two standards, either of which qualifies a TV for UHD Premium status:

STANDARD 1: More than 1,000 nits peak brightness and less than 0.05 nits black level.

STANDARD 2: More than 540 nits brightness and less than 0.0005 nits black level.

While standard one demands higher brightness and tolerates a higher black level, standard two tolerates lower brightness and demands a lower black level. This means manufacturers looking to make LED HDR TVs, which most are, will abide by standard one, while OLED TVs will be able to gain the Ultra HD Premium label by conforming to standard two. Ultimately, it’s not about how bright the TV gets, but how big the jump is between light and dark. And that’s it. In the grand scheme of things, it won’t matter which type of TV you have as to whether it’s HDR compatible or not. LED TVs will give you an HDR image with higher peak brightness but less deep blacks, while OLED TVs will give you an HDR image with lower peak brightness but deeper blacks.
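To make those two routes concrete, here’s a small sketch that checks which, if either, a set meets. It only covers the brightness and black-level part of the UHD Premium spec (resolution and 10-bit colour requirements still apply), and the example figures are made up rather than taken from real models.

```python
def meets_uhd_premium_contrast(peak_nits: float, black_nits: float) -> bool:
    """Check the two alternative brightness/black-level routes to UHD Premium."""
    standard_1 = peak_nits > 1000 and black_nits < 0.05    # the typical LED LCD route
    standard_2 = peak_nits > 540 and black_nits < 0.0005   # the typical OLED route
    return standard_1 or standard_2

# Hypothetical figures, not real models:
print(meets_uhd_premium_contrast(1200, 0.04))    # True  (meets standard 1)
print(meets_uhd_premium_contrast(600, 0.0001))   # True  (meets standard 2)
print(meets_uhd_premium_contrast(600, 0.04))     # False (meets neither)
```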

Will everything I watch be in HDR?

If only it were that simple. Content has to be mastered for HDR in order to work with the standard. In other words, both the source and the TV have to be HDR compatible. Luckily, with the advent of Ultra HD Blu-ray and advancements in online streaming from Netflix and Amazon, content creators will be able to deliver HDR content more easily.

Should I buy?

Now that there’s an official HDR standard, in the form of Ultra HD Premium, the danger of buying a rubbish TV claiming to be HDR compatible has been minimised. If you buy an Ultra HD Premium TV, you’ll know you’re getting a set capable of meeting the HDR standards set by the UHD Alliance.

It’s still worth doing some research on the product before you buy, just to ensure you’re getting the specs you need for a true HDR experience. That said, there’s never been a better time to invest in HDR. Although 4K has been the big thing so far, HDR content is easier to produce than data-heavy 4K and, because it’s less data-intensive, easier to distribute to consumers, which seems to be exciting content producers in a way 4K alone struggled to.
