DataPro's HDR Video Guide and FAQ

What does HDR stand for?

HDR stands for High Dynamic Range.

What is HDR?

HDR stands for High Dynamic Range. Dynamic Range is a concept that refers to the ratio between the lowest and highest values in a range, like dark to light or quiet to loud. If something has High Dynamic Range, it is capable of recording or reproducing a greater portion of that range. In HDR Video, it refers to a display's ability to reproduce a wider range of light levels than a traditional display, whether through brighter highlights, darker blacks, or both.
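
For a sense of scale, dynamic range is often expressed in stops, where each stop is a doubling of the light level. The minimal sketch below computes this from a display's brightest and darkest outputs; the luminance values are hypothetical round numbers, not measurements of any particular panel.

    import math

    def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
        """Express a display's dynamic range in stops (doublings of light level)."""
        return math.log2(brightest_nits / darkest_nits)

    # Hypothetical values: a typical SDR panel vs. an HDR panel.
    print(f"{dynamic_range_stops(300, 0.3):.1f} stops")    # ~10.0 stops (SDR-ish)
    print(f"{dynamic_range_stops(1000, 0.05):.1f} stops")  # ~14.3 stops (HDR-ish)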

What is HDR Video?

HDR Video is both a concept and a collection of technical standards and technologies.

The concept, as explained above, is that HDR provides a greater range of reproduced light levels than Standard Dynamic Range equipment. This allows improved color reproduction, smoother gradients, and better contrast.

HDR-compatible televisions and monitors are capable of reading extra data about the image they're displaying, and translating that into information about light output levels. That extra data, and the process of encoding and decoding it, is what makes up a standard. Currently there are four popular standards:

Spec          Supported By                                                     Color Palette         Bit Depth  Metadata
HDR10         Acer, Asus, Dell, LG, Panasonic, Philips, Samsung, Sony, Vizio   Rec. 2020             10-bit     Static
HDR10+        Samsung, Panasonic                                               Rec. 2020             10-bit     Dynamic
Dolby Vision  LG, Philips, Sony, TCL, Vizio                                    Rec. 2020, Rec. 2100  12-bit     Dynamic
HLG           Google, LG, Panasonic, Philips, Sony                             Rec. 2020, Rec. 2100  10-bit     None

HDR10 is one of the oldest formats, having been announced in 2015. Although its static metadata and Rec. 2020 color space make it less future-proof, its early introduction and royalty-free license have led to wide adoption.

HDR10+ is a format co-introduced by Amazon Video and Samsung in 2017. It has similar specs to HDR10, but is capable of transmitting dynamic metadata, allowing for greater control over light levels.

Dolby Vision is arguably the current state-of-the-art, though its per-unit royalties have led to it being implemented mostly in high-end devices. It can use both static and dynamic metadata, and supports up to 12-bit color depth.

HLG (Hybrid Log-Gamma) comes from a joint BBC and NHK project to create an HDR standard for broadcast use. Unlike other standards, an HLG signal does not include metadata. Instead it uses a unique method of encoding both SDR and HDR data in the same transmission. This allows the same signal to be received and processed by both types of televisions.

What is HLG and why is it different from the other HDR standards?

The BBC and NHK banded together and created an HDR standard intended for broadcast use: the HLG (Hybrid Log-Gamma) standard. Like HDR10, it is license-free. An HLG HDR signal is essentially a regular broadcast signal with some extra data (the HDR part). Since there is no metadata being transmitted, standard dynamic range (SDR) TVs and monitors receiving an HLG signal can simply ignore the extra data and display it as a "normal" image. Other HDR standards cannot do this - a "normal" TV won't know how to display an incoming HDR10 or Dolby Vision video.
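
For the technically curious, the "same signal for both" trick lives in HLG's transfer function, defined in ITU-R BT.2100. Below is a sketch of that curve in Python: the square-root segment at the bottom behaves much like a conventional gamma curve, which is roughly what an SDR set ends up displaying, while the logarithmic segment above it carries the HDR highlight information.

    import math

    # Constants from ITU-R BT.2100 (HLG)
    A = 0.17883277
    B = 1 - 4 * A                   # 0.28466892
    C = 0.5 - A * math.log(4 * A)   # 0.55991073

    def hlg_oetf(e: float) -> float:
        """Map normalized scene light e (0..1) to an HLG signal value (0..1).

        The square-root segment below 1/12 resembles a conventional gamma
        curve, which is why SDR displays can show an HLG picture acceptably;
        the log segment above it carries the HDR highlights.
        """
        if e <= 1 / 12:
            return math.sqrt(3 * e)
        return A * math.log(12 * e - B) + C

    print(hlg_oetf(1 / 12))  # 0.5  -- the SDR/HDR crossover point
    print(hlg_oetf(1.0))     # ~1.0 -- peak scene light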

What is HDR gaming?

Just as High Dynamic Range can improve the color and quality of video, it can also be applied to video games.

On game consoles, HDR is only supported on latest-generation systems, in a limited subset of games. On a PC it requires the right combination of game, graphics card, and monitor.

For lists of available HDR games, see below.

What's the difference between HDR, 4K and UHD?

These refer respectively to a concept, a resolution, and an industry specification.

HDR, as explained above, is the idea of improving an image by reproducing a greater and more realistic range of light levels.

4K is a somewhat broad term that refers to screens and content with a horizontal pixel count of around 4,000 pixels. The cinema "full frame" resolution is 4096x2160, while the most popular resolution on consumer devices is 3840x2160 (four times 1080p's resolution of 1920x1080), as defined by UHD.

UHD - "Ultra-High-Definition Television" is an industry specification that indicates a screen with at least 4K resolution (their definition of which is 3840x2160 pixels), as well as use of Rec. 2020 color space, which allows for improved color reproduction. A newer version of the spec released in 2016 is called UHD Premium and includes support for HDR. UHD also includes a spec for 8K, unsurprisingly called "8K UHD."

What does 10-bit color depth mean?

A bit is the smallest unit of data a computer can process. Each bit has two possible values, typically represented as 0 and 1. With 3 bits, you have 8 possible combinations of 0s and 1s: 000, 001, 010, 011, 100, 101, 110, and 111. With 8 bits, you have 256 (2⁸) possible combinations of 0s and 1s.

Bit:    1  2  3
Red:    R  R  R
Green:  G  G  G
Blue:   B  B

Historically, 8-bit color depth meant that each pixel had 8 bits split between Red (3), Green (3), and Blue (2). In this way, it could display up to 256 colors (2³ x 2³ x 2²).



Bit:    1  2  3  4  5  6  7  8
Red:    R  R  R  R  R  R  R  R
Green:  G  G  G  G  G  G  G  G
Blue:   B  B  B  B  B  B  B  B

24-bit "true color" is also sometimes referred to as having an 8-bit color depth because each of a pixel's components (Red, Green, and Blue) has 8 bits. This makes 24 bits per pixel, enabling 16,777,216 (2²⁴) colors total.

Nearly all HDR specifications use at least a 10-bit color depth, where each color channel uses 10 bits. This can also be called 30-bit color depth, and enables 1,073,741,824 (2³⁰) colors.
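
The arithmetic behind all of these figures is just multiplying out the per-channel combinations; a quick sketch:

    def color_count(r_bits: int, g_bits: int, b_bits: int) -> int:
        """Total displayable colors for a given per-channel bit allocation."""
        return 2 ** (r_bits + g_bits + b_bits)

    print(color_count(3, 3, 2))     # historical 8-bit:            256
    print(color_count(8, 8, 8))     # 24-bit "true color":  16,777,216
    print(color_count(10, 10, 10))  # 30-bit HDR:        1,073,741,824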

Is HDR better than 4K or UHD?

As described above, HDR and 4K are different features: 4K refers to how many pixels are on a screen, and HDR indicates the range of brightness to darkness those pixels can cover. UHD is an industry specification that combines the two (plus some additional features) into an easily recognizable term.

What is HDR resolution?

Dynamic range of a display refers to the light levels it is capable of producing, which is independent of the display resolution. While most HDR-related specs recommend that the screen supporting them be at least 4K (3840x2160), HDR computer monitors and smartphones cover a wide spectrum of resolutions both above and below that.

What is HDR photography?

While HDR video is about displaying content with increased dynamic range, HDR photography seeks to capture more dynamic range, with the intention of viewing the result on a Standard Dynamic Range display. In photography, Dynamic Range refers to the range of light levels captured. One classic example is taking a picture of a building with bright sun behind it - the light level of the sky will be very high, while the shadows on the front of the building will be very low. Standard Dynamic Range equipment would capture only one of those light levels clearly, with the other lost in darkness or blown out to pure white.

What is HDR Pro?

HDR Pro is a marketing term used by LG Electronics to indicate a TV that is HDR10-compatible.

What is an HDR Smart TV?

A "Smart" TV has its own operating system, and can be connected to the internet in order to use various media streaming services (e.g. Netflix, Hulu, Pandora).

Generally speaking, an HDR-capable TV can display images with a much greater range of contrast (darker blacks and brighter whites) and a wider color palette than standard televisions. That said, an HDR TV is not necessarily brighter than a traditional TV, and in some cases achieves its greater dynamic range by allowing darker black levels rather than brighter whites (though most do both).

Industry standards often bundle HDR with other technologies meant to deliver a specific level of quality or user experience, though these are not strictly required for a TV to be HDR-compatible. The following are the minimums recommended by Ultra HD Premium:

  • High Dynamic Range: Brightness of at least 1,000 nits with a maximum black level of 0.05 nits, OR brightness of at least 540 nits with a maximum black level of 0.0005 nits (the contrast ratios these imply are worked out below).
  • Color Bit Depth: 10-bit; more than 1 billion colors.
  • Wider Color Spectrum: Display reproduction of more than 90% of P3 colors.
  • 4K Ultra HD Resolution: Minimum display resolution of 3840 x 2160 pixels.
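
Those two brightness/black-level pairs describe very different kinds of panels, which is why both are allowed: the first tier generally fits bright LED/LCD sets, the second fits OLED sets with near-perfect blacks. The arithmetic:

    # Contrast ratios implied by the two Ultra HD Premium tiers above.
    lcd_tier = 1000 / 0.05    # bright panel with elevated black level
    oled_tier = 540 / 0.0005  # dimmer panel with near-perfect blacks
    print(f"{lcd_tier:,.0f}:1")   #    20,000:1
    print(f"{oled_tier:,.0f}:1")  # 1,080,000:1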
Fun Fact:
A "nit" (from the Latin word nitere, meaning "to shine") is a measure of brightness equal to one candela per square meter. The higher the number of nits, the brighter the output.


How does HDR TV work?

After a video has been filmed, HDR "mastering" is (usually) performed on it manually, and it is then encoded using the PQ or HLG transfer function. Any metadata is bundled with the encoded video signal and transmitted. The TV or computer monitor then decodes the signal and displays it according to any instructions in the metadata.
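
To make the PQ half of that concrete, here is a sketch of the PQ encoding curve (the inverse EOTF) from SMPTE ST 2084 / ITU-R BT.2100, which maps absolute light levels up to 10,000 nits onto a 0-to-1 signal value; the constants come straight from the standard.

    # Constants from SMPTE ST 2084 / ITU-R BT.2100 (PQ)
    M1 = 2610 / 16384        # 0.1593017578125
    M2 = 2523 / 4096 * 128   # 78.84375
    C1 = 3424 / 4096         # 0.8359375
    C2 = 2413 / 4096 * 32    # 18.8515625
    C3 = 2392 / 4096 * 32    # 18.6875

    def pq_encode(nits: float) -> float:
        """Map an absolute luminance (0..10,000 nits) to a 0..1 PQ signal value."""
        y = max(nits, 0.0) / 10_000
        return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

    print(pq_encode(100))     # ~0.51 -- typical SDR peak white
    print(pq_encode(1000))    # ~0.75 -- common HDR mastering peak
    print(pq_encode(10_000))  # 1.0   -- PQ's absolute maximum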

Metadata is data sent alongside the image portion of an HDR signal that gives the monitor extra instructions about how to display the video (e.g. adjusting brightness levels and color range). HDR10 uses "static" metadata, meaning one set of values applies to the entire video. In HDR10+ and Dolby Vision, it's possible to have metadata for each scene, or potentially even for each frame.
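
As a rough illustration of what a static metadata block contains, the hypothetical structure below mirrors the fields HDR10 carries (the SMPTE ST 2086 mastering-display description plus the MaxCLL/MaxFALL content light levels). The field names are invented for readability, not taken from any real library; the example values are the Rec. 2020 primaries and a 1,000-nit mastering display.

    from dataclasses import dataclass

    @dataclass
    class HDR10StaticMetadata:
        """Illustrative sketch of HDR10's static metadata (one block per video)."""
        # Mastering display (SMPTE ST 2086): primaries, white point, luminance range
        red_primary: tuple    # (x, y) chromaticity coordinates
        green_primary: tuple
        blue_primary: tuple
        white_point: tuple
        max_luminance_nits: float
        min_luminance_nits: float
        # Content light levels
        max_cll_nits: int     # brightest single pixel in the entire video
        max_fall_nits: int    # highest frame-average light level

    # One static block describes the whole stream; a dynamic format such as
    # HDR10+ or Dolby Vision could instead carry values like these per scene
    # or per frame.
    meta = HDR10StaticMetadata(
        red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
        blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
        max_luminance_nits=1000.0, min_luminance_nits=0.0001,
        max_cll_nits=1000, max_fall_nits=400,
    )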

Every monitor or TV display works a little differently, but most HDR-compatible displays use LED or OLED technology.

Is my TV HDR compatible?

If your TV has the Ultra HD Premium™ logo on it, then it is capable of HDR.

Even if it doesn't have that logo, it might still be HDR-compatible. You can do a visual check by playing an HDR video - it will be noticeably more vibrant than on a standard TV. If you do this check, make sure that your source video is actually HDR and, if applicable, that any cables connecting your source to your TV are rated for HDMI 2.0 or DisplayPort 1.4.

Do not confuse HDR with HDR+ or HDR Effect. The latter two attempt to mimic an HDR "look" by altering an existing SDR image. Actual HDR formats include: Dolby Vision, HDR10, HDR10+, and HLG.

Does HDR require HDMI 2.0?

No, it can also use DisplayPort 1.4. HDMI 2.0 carries up to 18 Gbps of bandwidth and DisplayPort 1.4 up to 32.4 Gbps, enough for the 4K resolution and 10-bit color that HDR needs. HDMI 1.4, DisplayPort 1.2, DVI, and VGA connections aren't capable of carrying that much information.
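
A back-of-the-envelope calculation shows why: the sketch below works out the raw pixel data rate of a 4K, 60 Hz, 10-bit signal, which already approaches HDMI 2.0's ceiling before any blanking intervals or encoding overhead are added.

    # Raw data rate for 4K @ 60 Hz with 10-bit color (30 bits per pixel).
    width, height, fps, bits_per_pixel = 3840, 2160, 60, 30

    raw_gbps = width * height * fps * bits_per_pixel / 1e9
    print(f"{raw_gbps:.1f} Gbps")  # ~14.9 Gbps before blanking/encoding overhead

    # Commonly quoted maximum link rates: HDMI 1.4 ~10.2 Gbps, HDMI 2.0 = 18 Gbps,
    # DisplayPort 1.4 = 32.4 Gbps -- only the latter two leave enough headroom.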

Does HDR video work with DisplayPort 1.2 or 1.3?

No. Support for HDR metadata was added in DisplayPort 1.4; DisplayPort 1.2 and 1.3 predate it. HDR video therefore requires DisplayPort 1.4 or better.

Where can I get HDR content?

As of mid-2018, most major streaming services support HDR and carry a selection of HDR content. These include Netflix, Amazon Video, iTunes, YouTube, Google Play, and Vudu.

An updated Blu-ray format called Ultra HD Blu-ray supports both 4K and HDR, and is indicated by branding on the package. These discs also require an Ultra HD Blu-ray compatible player.

What PS4 games support HDR?

This full list of PS4 games indicates features such as HDR, PS4 Pro enhancements, and optional VR support.

What Xbox One games support HDR?

This full list of Xbox One games indicates features such as HDR, Xbox Play Anywhere, and Xbox One X enhancements.

What PC games support HDR?

This list of PC games indicates which games support HDR and which operating systems they run on.

Is HDR worth it for gaming?

The answer is necessarily subjective, but the general consensus seems to be that, when properly implemented, HDR can very effectively improve gaming image quality. And if you're not the early-adopter type, don't worry: HDR isn't going anywhere, and will only become more widely available and affordable in the future.

Does HDR cause input lag?

It can! Some early model HDR-compatible TVs experienced additional input lag when in HDR mode.

If this is happening to you, try some of the following solutions:
  • Make sure your TV is in "Gaming" or low-latency mode
  • Disable any motion interpolation (frame rate boosting) features
  • Update your TV's firmware
Looking for more information?
Check out some of our other popular tech guides for even more information.