Over the last couple of years TV manufacturers have all boasted about their 4K Ultra HD TVs, despite the lack of content available for it. Now that content is finally available in 4K, the likes of LG, Samsung, Sony, Panasonic and Philips are focusing on a new technology: HDR.

At CES 2016 every single TV manufacturer had at least one product supporting HDR; in fact, at many of the major manufacturers, all of their 2016 Ultra HD line-up included the technology in one way or another.

So what exactly is HDR and why is it quickly becoming the must-have feature?

For many of you, the first experience of HDR was likely in photography. Apple and Google quickly jumped on the HDR photo bandwagon, combining multiple images taken at different exposures into a single image that mimics a greater dynamic range.
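The idea behind those photo modes can be sketched in a few lines. The following is a simplified, illustrative exposure-fusion routine – the function names and the mid-grey weighting are assumptions for illustration, not Apple's or Google's actual pipeline. Pixels that are well exposed (close to mid-grey) contribute most to the merged image:

```python
import math

def well_exposedness(value, ideal=0.5, sigma=0.2):
    """Weight a pixel (0.0-1.0) by how close it is to mid-grey."""
    return math.exp(-((value - ideal) ** 2) / (2 * sigma ** 2))

def fuse_exposures(images):
    """Blend several exposures of the same scene, pixel by pixel."""
    fused = []
    for pixels in zip(*images):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights)
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

under  = [0.05, 0.10, 0.20]  # under-exposed shot: highlights preserved
normal = [0.30, 0.50, 0.70]  # normal exposure
over   = [0.80, 0.90, 0.98]  # over-exposed shot: shadows lifted

print(fuse_exposures([under, normal, over]))
```

Each merged pixel leans towards whichever exposure captured it best, which is exactly the 'greater dynamic range' effect the phone cameras mimic.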

That’s somewhat different to how HDR works on TVs, however. TV HDR essentially expands the display’s contrast ratio and colour palette to offer a more realistic, natural image than is possible on current TVs.

Any installer knows that contrast ratio and colour accuracy are two of the most important factors when choosing a TV that can truly wow a customer. Sure, you can pump more pixels through a display and show off an ultra-sharp image, but put it next to a normal 1080p display with better colours and a better contrast ratio and the customer will likely choose the latter.

Thankfully TV manufacturers have opted not to make you choose between two technologies and have instead decided to produce screens that boast both an Ultra HD resolution and HDR.

So how does HDR provide a better, more realistic-looking image than standard 4K or 1080p content?

Much like 4K, HDR needs two pieces of the puzzle in place before it looks better than standard content. Firstly, the TV needs to support the technology – the easy part, as a flood of new TVs boasting HDR support from all of the major manufacturers is on its way.

Ultra HD Premium

For a TV to be HDR-compatible it should be able to produce more light than a normal TV in certain areas of the image; similar to local dimming technology, but with an even greater range. That alone does not make the TV HDR-compatible, however: it also needs a wider colour gamut. Many TVs already have one, but have no way of putting those extra colours to good use.

The UHD Alliance recently announced a higher specification for HDR TVs, dubbed ‘Ultra HD Premium’, which requires the TV to display a resolution of at least 3840 x 2160, reproduce at least 90% of the DCI-P3 colour space and accept a 10-bit signal for colour depth.

Ultra HD Premium has further requirements that depend on the display technology: an LCD needs a peak brightness of more than 1,000 nits with a black level below 0.05 nits, while an OLED needs 540 nits of peak brightness with a black level of 0.0005 nits.
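Those thresholds can be collected into a single check. This is only an illustrative sketch of the figures quoted above – the function name and argument layout are assumptions, not official UHD Alliance tooling:

```python
def meets_uhd_premium(panel, width, height, p3_coverage, bit_depth,
                      peak_nits, black_nits):
    """Check a display against the Ultra HD Premium thresholds."""
    if width < 3840 or height < 2160:          # resolution floor
        return False
    if p3_coverage < 0.90 or bit_depth < 10:   # colour requirements
        return False
    if panel == "lcd":
        # LCDs must pair very high peak brightness with a low black level.
        return peak_nits > 1000 and black_nits < 0.05
    if panel == "oled":
        # OLEDs trade peak brightness for a near-perfect black level.
        return peak_nits >= 540 and black_nits <= 0.0005
    return False

print(meets_uhd_premium("oled", 3840, 2160, 0.96, 10, 600, 0.0005))  # True
print(meets_uhd_premium("lcd",  3840, 2160, 0.92, 10, 800, 0.04))    # False: too dim
```

Note the two panel paths: an 800-nit LCD fails where a 600-nit OLED passes, because the OLED's far deeper blacks deliver a comparable contrast range.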

The second piece of the puzzle is a little more difficult and could slow the adoption of the technology, much like 4K.

As the saying goes, content is king. The same is true for HDR: if movie and TV studios don’t release content in HDR, then no matter what TV you install, the customer won’t get the added contrast and colour benefits that come with it.

So far both Netflix and Amazon have committed to HDR content for streaming services, while a range of Ultra HD Blu-rays should bring the content to physical discs at some point in 2016.

Despite HDR’s huge presence at CES 2016, expect consumers to remain confused for some time yet. While the UHD Alliance has set HDR standards, a number of other technologies are still around promising an even better HDR experience.

One such technology is Dolby Vision, and it may one day be adopted as a standard across all TV manufacturers. Expert reviewers widely regard it as the best HDR format on the market right now, and companies such as LG already support it on some of their 2016 TVs.

Dolby Vision comparison

So how does Dolby Vision differ from standard HDR (known as HDR 10)? With HDR 10, movies are mastered at 10-bit – hence the name – while Dolby Vision uses 12-bit, meaning that if your customer wants the greatest depth of colour, Dolby Vision is the one to go for.
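The difference those two extra bits make is easy to quantify – this is straightforward arithmetic from the bit depths, not vendor data. Each channel gains four times as many shades, and the total colour count is the per-channel figure cubed across red, green and blue:

```python
def colour_count(bits_per_channel):
    """Total colours: shades per channel, cubed across R, G and B."""
    return (2 ** bits_per_channel) ** 3

print(f"HDR 10 (10-bit):       {colour_count(10):,} colours")  # 1,073,741,824
print(f"Dolby Vision (12-bit): {colour_count(12):,} colours")  # 68,719,476,736
```

So HDR 10 allows just over a billion colours, while Dolby Vision’s 12-bit masters allow roughly 68 billion – sixty-four times as many.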

Dolby Vision also has a number of other benefits, including the ability to reach 10,000 nits of peak brightness – far above the 1,000 nits managed by many HDR 10 releases thus far. The technology also delivers frame-by-frame metadata, giving the TV the information it needs to display a movie or TV show just as the director intended.

Like standard HDR, Dolby Vision content will need to be created by the movie and TV studios, but there are already a number who are signed up to create content specifically for the format. Netflix has already thrown its weight behind the technology, while studios like Disney and Warner Bros. have also committed to Dolby Vision.

Unlike standard HDR, however, Dolby Vision won’t be found on every TV supporting HDR content. Specific hardware is required and, thus far, choice is fairly limited: no Blu-ray players currently support the technology, and in the UK only LG has announced a Dolby Vision-compatible television set.
