Creating a system you love shouldn't be difficult. The Acoustic Frontiers blog is here to help.
This comprehensive blog article is intended to demystify some of the technical details behind UHD or Ultra High Definition.
UHD is the next generation of video formats, and has many potentially picture-enhancing features over our current HD format. As we’ll explain, 4K resolution is only part of the UHD concept. There are other important components such as wider color gamuts (more colors), higher color bit depths (smoother color gradations) and brighter image highlights (HDR).
Here are the items covered in our introductory section. Read this if you want the high level details:
Then we get more technical, and explain some of the aspects of UHD:
At this point there are a few hardware options:
Then there are the streaming services, which can be accessed from many hardware devices such as those built into your TV, Roku 4K player, Amazon FireTV2, Nvidia Shield, etc.
For a comprehensive list of sources and available content, see the “Master List of 4K…” thread over at AVSForum.com.
Note that most of the streaming services have compressed audio soundtracks, even the ones advertising Dolby Atmos like Vudu. Since high performance home theater is about video and audio, that only leaves two sources of note: UHD Blu Ray and Kaleidescape’s Store. For more read My Search for Higher-Quality Soundtracks in Streaming Movies.
The Ultra HD Blu Ray spec is as follows (see the rest of this article for details on how to “decipher” these acronyms):
As many of these specifications are optional, a disc labeled Ultra HD Blu Ray will not necessarily have HDR, a wide color gamut or 10 bit color. Hopefully UHD Blu Ray players will have some kind of signal information menu to reveal what is actually on the disc.
Note also that many of the movies announced for release on UHD Blu Ray so far were actually not shot at 4K resolution! See this article for some insightful analysis.
Whilst not strictly necessary for UHD, your display may also have:
Things there is confusion over with respect to UHD displays:
If you want to use your AVR or Pre-Pro to switch UHD sources then it needs to have HDMI 2.0 (2.0a for HDR) and HDCP 2.2.
If your AVR/Pre-Pro does not have this then the workaround is to run one HDMI cable from the source to the display and another from the source to the AVR. Both the Samsung UHD and Kaleidescape Strato players provide this functionality.
This section covers most of the background behind the UHD specifications. We’ve tried to make it comprehensive, but beware that this means quite a few technical details! Where relevant we’ve linked to places where you can do further reading.
If you find any technical inaccuracies or things that require further clarification please leave a comment. It’s likely there are a few!
Resolution refers to the number of pixels displayed. Our current HD standard is 1920 horizontal by 1080 vertical. The new UHD standard, 3840 horizontal x 2160 vertical, doubles this in each dimension, for four times the total number of pixels.
Confusingly, commercial cinema 4K as specified by the Digital Cinema Initiatives (DCI) and VESA (the standard used by still and video cameras) is 4096 x 2160, which is slightly wider than Quad HD / UHD at 3840 x 2160.
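To make the pixel counts concrete, here is a quick illustrative sketch in Python (the format names and numbers are those given above):

```python
# Pixel counts for the resolutions discussed above (illustrative only).
formats = {
    "HD (1080p)":      (1920, 1080),
    "UHD / Quad HD":   (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels "
          f"({pixels / hd_pixels:.2f}x HD)")
```

Running this shows UHD at exactly 4.00x the pixel count of HD, with DCI 4K slightly larger still because of its wider frame.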
Nearly all current consumer content, whether HDTV over cable or satellite, DVD or Blu Ray is created and distributed in the REC.709 color gamut.
The issue with this is that it only represents a small portion of the visible spectrum of colors. The standard used in commercial cinema has more colors and is called DCI P3. There is also an even wider color gamut called REC.2020 which is the “wide color gamut” standard used in UHD.
Note that just because content is labeled as UHD does not mean it has a wide color gamut. Much of the initial UHD content actually has the same REC.709 color gamut that we have today.
WCG content comes in a REC.2020 container. As we mentioned earlier not every display can accept REC.2020. If your display cannot accept REC.2020 then the source should “downconvert” to the REC.709 space. The new Samsung UHD Blu-Ray player does this.
There is some confusion over the P3 color space. Whilst the wide color gamut content actually encoded onto the disc may have a color gamut close or equivalent to the P3 space, in UHD it is transmitted inside a REC.2020 container. There is no consumer P3 color space. The confusion arises because display manufacturers are now advertising % of P3, a specification that has also been codified in the Ultra HD Premium TV certification. Our guess is that, despite the container being REC.2020, for the foreseeable future the actual colors in the content will be equivalent to the P3 space.
Bit depth describes the number of potential values that the encoding of color information in a video signal can have.
Historically, Blu Ray has been 8 bit, which means 256 possible values for red, green and blue. UHD Blu Ray is 10 bit, giving 1024 values for RGB. 12 bit color provides 4096 values for RGB.
One important reason that we have moved to a 10 bit system for UHD Blu Ray is to reduce color banding. This is an image defect where bands of color are visible. It’s more important in the UHD world because of the expanded color space and hence the greater color variations.
HDMI2.0 supports 8, 10 and 12 bit color in various formats, as covered in the following sections.
Sometimes you will see references to 24, 30 and 36 bit color. These references relate to the total color bit depth for all RGB channels, where 24 bit = 8 bits each of red, green and blue, 30 bit = 10 bits per channel, and 36 bit = 12 bits per channel.
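The per-channel and total color counts follow directly from the bit depths; a short sketch:

```python
# Values per channel and total displayable colors for the bit depths
# discussed above: 8/10/12 bit per channel = 24/30/36 bit total.
for bits in (8, 10, 12):
    per_channel = 2 ** bits          # possible values for each of R, G, B
    total_colors = per_channel ** 3  # all RGB combinations
    print(f"{bits}-bit ({bits * 3}-bit total): "
          f"{per_channel:,} values per channel, {total_colors:,} colors")
```

So the move from 8 to 10 bit quadruples the number of steps per channel, which is what smooths out the gradations between adjacent shades.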
Consumer video is stored, transmitted, and processed in a color space called Y’CbCr. The three components stand for:
This standard was defined back at the start of the color TV era as a way of including color information in the broadcast signal.
Encoding in the Y’CbCr color space allows the resolution of the color channels Cb and Cr to be reduced through color or chroma subsampling. This technique takes advantage of the fact that human vision is more sensitive to light differences than to color differences.
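As a concrete illustration of the split between brightness and color, here is a minimal sketch of an R’G’B’ to Y’CbCr conversion using the standard REC.709 luma weights. It assumes full-range values normalized to 0..1; real video encoding adds level scaling and offsets that this sketch omits.

```python
# REC.709 luma coefficients: how much each primary contributes to
# perceived brightness (green dominates, blue contributes least).
KR, KG, KB = 0.2126, 0.7152, 0.0722

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b      # Y': luma (brightness)
    cb = (b - y) / (2 * (1 - KB))     # Cb: blue-difference chroma
    cr = (r - y) / (2 * (1 - KR))     # Cr: red-difference chroma
    return y, cb, cr

# Pure white carries full luma and essentially zero chroma,
# which is why the chroma channels can tolerate heavy subsampling.
print(rgb_to_ycbcr(1.0, 1.0, 1.0))
```

Because all the brightness detail lives in Y’, the Cb and Cr channels can be stored at reduced resolution, which is exactly what the subsampling schemes below do.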
There are three main types of color subsampling used today. These are 4:4:4, 4:2:2 and 4:2:0.
If 4:4:4 is a full bandwidth signal, then 4:2:2 occupies 2/3rds the space and 4:2:0 occupies 1/2 the space. Blu-ray and UHD Blu-ray both store the video signal in the 4:2:0 format. This means every pixel has a Y’ sample, but each 2x2 block of pixels shares a single Cb and a single Cr sample.
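The 2/3 and 1/2 ratios can be derived from the J:a:b notation itself, which counts samples in a reference block two pixel rows high; a sketch:

```python
# J:a:b notation over a block J pixels wide and 2 rows high:
#   J = luma samples per row (always present for every pixel)
#   a = chroma samples (Cb and Cr each) on the first row
#   b = chroma samples (Cb and Cr each) on the second row
def samples_per_pixel(j, a, b):
    luma = 2 * j              # two rows of J luma samples
    chroma = 2 * (a + b)      # Cb and Cr, across both rows
    return (luma + chroma) / (2 * j)

full = samples_per_pixel(4, 4, 4)
for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    rel = samples_per_pixel(*scheme) / full
    print(f"{scheme[0]}:{scheme[1]}:{scheme[2]} -> "
          f"{rel:.2f}x the storage of 4:4:4")
```

This reproduces the figures above: 4:2:2 comes out at 0.67x (2/3rds) and 4:2:0 at 0.50x (half) the storage of a full 4:4:4 signal.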
To get the video displayed on the TV or Projector it typically goes through the following conversions:
Historically the source upsamples to 4:2:2, which is sent over HDMI, and then the display upsamples to 4:4:4 and converts to RGB. The reason for this sequence is that HDMI 1.4 and previous iterations did not support 4:2:0. HDMI 2.0 does support 4:2:0, though only at 50/60 frames per second (FPS). At 24 FPS and 10 bit, only 4:4:4 and RGB are supported.
Note that there appears to be some confusion about exactly what is supported in the HDMI specification, even among manufacturers and industry participants. Some people we discussed this article with said 4:2:0 was supported at 24/25/30 FPS; others said that as of 2.0a, 4:2:2 was supported at 10 bit. In the absence of a clear industry-wide understanding we will stick to the information published on the HDMI website.
There is no intrinsic benefit to the source upsampling to 4:4:4 or converting to RGB. With respect to UHD and HDMI it is actually beneficial if the conversion to 4:2:2, 4:4:4 and RGB is, as much as possible, left to the display, as this reduces the HDMI bandwidth requirements.
Further reading:
Placeholder for future section on HDMI handshaking between source and display and how this impacts data rates, color bit depth, color gamut and color space…
There have been essentially three types of “4K capable” HDMI chipsets on the market. These have been implemented into various TVs, projectors, processors, AVRs and switchers since 4K came onto the market in 2013.
We’ve summarized this information together with the formats and bandwidth requirements in the table below:
This is important information, as we are seeing some sources, such as the Samsung UHD Blu-Ray player, that only support 4:4:4 or RGB at 10 bit, both of which require the whole video chain to be 18Gbps capable. They could easily have used 12 bit at 4:2:2, which would have enabled compatibility with 10.2Gbps chipsets.
HDCP stands for High-Bandwidth Digital Content Protection. The version used in UHD is 2.2.
Part of the deal with UHD is the potential requirement for HDMI chipsets and HDMI cables that support “18Gbps” data rates.
The table below summarizes the data rates for the different frame rates and formats that are part of the HDMI2.0 specification.
As you can see it is possible that a 10.2Gbps chipset and cable infrastructure can support UHD Blu Ray, assuming the transfer medium is 4:2:2 at 12 bit.
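To see where figures like 10.2Gbps and 18Gbps come from, here is a rough back-of-envelope estimate. It assumes the CTA-861 total timings for 2160p (4400 x 2250 at 50/60 FPS, 5500 x 2250 at 24 FPS, including blanking) and the 8b/10b TMDS encoding overhead; treat it as a sketch, not a substitute for the specification tables.

```python
# Rough HDMI TMDS data-rate estimate: character rate x 3 lanes x 10 bits
# (8b/10b encoding). bits_per_pixel is the *effective* depth after
# chroma subsampling: 8-bit 4:4:4 and 12-bit 4:2:2 are both 24 bpp.
def hdmi_gbps(h_total, v_total, fps, bits_per_pixel):
    tmds_clock = h_total * v_total * fps * (bits_per_pixel / 24)
    return tmds_clock * 3 * 10 / 1e9  # 3 data lanes, 10 bits/character

# 2160p60, 8-bit 4:4:4 (24 bpp): needs the full "18Gbps" pipe
print(f"{hdmi_gbps(4400, 2250, 60, 24):.2f} Gbps")  # ~17.82

# 2160p24, 12-bit 4:2:2 (24 bpp effective): fits within 10.2Gbps
print(f"{hdmi_gbps(5500, 2250, 24, 24):.2f} Gbps")  # ~8.91

# 2160p24, 10-bit 4:4:4 (30 bpp): exceeds 10.2Gbps, needs 18Gbps gear
print(f"{hdmi_gbps(5500, 2250, 24, 30):.2f} Gbps")  # ~11.14
```

This is why a 24 FPS UHD Blu Ray signal carried as 12 bit 4:2:2 can traverse a 10.2Gbps chain, while the same disc output as 10 bit 4:4:4 cannot.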
For short runs (say up to 6′ / 2m) most passive cables will be able to support the data rates required for UHD. Between 6′ and 15′ you’ll find some passive cables that can support UHD and others that can’t. Above 15′ you’ll very likely need an active cable.
There are a few independent companies that are providing HDMI cable certification for UHD data rates. These include:
HDBaseT (HDMI over category cable) is capable of data rates up to 10.2Gbps.
The main display certification standard is one created by the UHD Alliance. This is an industry consortium of content creators (e.g. Hollywood Studios), distributors (e.g. Amazon, DirecTV) and hardware manufacturers (e.g. LG, TCL). They have created a set of specifications and a logo to help consumers.
It’s not clear at present if this “badge” will just get applied to consumer displays, or if it’s intended to be used across the content creation and distribution ecosystem as well.
To get an Ultra HD Premium “sticker” a display must meet a set of criteria:
Interestingly, Sony, despite being a member of the UHD Premium alliance, has a different “sticker” that it is putting on its displays. We think they are doing this because they also make projectors, which can’t hit the peak brightness / black level standards required for the UHD Premium “sticker”, so a consistent badge across their lineup requires their own standard.
There are multiple HDR standards at this point, and it is not clear which one will become dominant in the market. HDR10 and Dolby Vision appear to be the front runners, but there are others lurking in the wings such as Hybrid Log Gamma (HLG).
It’s quite the “evolving ecosystem” at this point. Even if you buy a display with both HDR10 and Dolby Vision (which limits you to flat panel TVs), the amount of light output the TV can put out will likely be on an upward trajectory for the next few years. The TVs you can buy now are limited to about 1,000 nits, but the Dolby Vision standard can see a future with 10,000 nit displays!
This section covers the two main standards at a high level. We’d also recommend reading the State of HDR article if you really want the technical details (warning, it’s even more technical than what we’ve written).
Be aware that there are a number of articles out there about HDR, many of which were written earlier in 2015, before the somewhat clearer picture that emerged late in the year with the publication of the SMPTE HDR Imaging Ecosystem report. As such you’ll likely find conflicting and incomplete information. It’s likely that some of the things we have written are incorrect or incomplete too, so please leave a comment if you find things we should update or clarify!
Further reading:
HDR10 is an open standard.
Dolby Vision is one of the competing “standards” for HDR. More details can be found on the Dolby Vision page and in the Dolby Vision whitepaper.
Nyal Mellor, Founder, Acoustic Frontiers