As if 3D TV, LED LCD vs. OLED vs. plasma, 120Hz, and the Soap Opera Effect weren't confusing enough, the last year has seen the rise of a new HDTV technology called 4K, or, to use its official name, Ultra High Definition (UHD). UHD is an umbrella term that encompasses higher resolutions (more pixels) than HDTV, as well as more realistic color and higher frame rates. For now, the only one of those improvements available in new TVs and content is 4K resolution, so that's what we'll focus on here.
Judging from the new TVs shown at CES 2014, manufacturers are tripping over themselves to bring you a new array of 4K-compatible products. But just like 3D and HD before it, 4K has a case of putting the hardware chicken before the software egg. About 15 months after 4K TVs first appeared on the market, there's little consumer 4K content available: no TV channels or Blu-ray discs, just a few specialized video players, YouTube and other clips of varying quality, and promises of streaming video.
Still, the shift from 1080p to 4K TV hardware is inevitable. This year 4K TVs will replace high-end 1080p models as the best-performing LED LCD-based sets on the market, although the reason they're better will have nothing to do with resolution. Confused again? Don't worry, we'll walk you through it, starting with the basic question: what is 4K anyway, and what makes it different from high definition?
Today, the TV industry supports two HD formats, often referred to as 720p (1280x720 pixels) and 1080p (1920x1080 pixels, Full HD). Most HDTVs sold today have Full HD resolution, but content (movies, TV shows, games, etc.) is in many cases 720p. There are, in other words, two sides to the equation: the screen and the content. Both are important, but the truth is that content is often the bottleneck. Here is a short overview of the typical resolution classes for content.
· SD: DVDs, standard TV channels
· HD 720p: Most HD TV channels, some streaming services
· Full HD 1080p: Blu-ray, some streaming services
Ultra HD is the next step on the resolution ladder. It also goes by the names UHD and 4K (4K is actually a movie theater format, but the name is often used for Ultra HD). Just like the HD standard, which covers both 720p and 1080p, Ultra HD covers 4K (also called 2160p) and 8K (also called 4320p). As you may have noticed by now, the naming is not straightforward, and the industry refers to the same things in different ways. Again we can offer a rough overview.
· Full HD = 1920x1080 pixels = 1080p = 2K
· Ultra HD = 3840x2160 pixels = 2160p = 4K
· Ultra HD = 7680x4320 pixels = 4320p = 8K
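The jump between rungs on the ladder is easy to verify with a little arithmetic. Here's a quick sketch in plain Python (the variable names are our own, purely for illustration):

```python
# Pixel counts for the resolution classes listed above.
resolutions = {
    "HD 720p":       (1280, 720),
    "Full HD 1080p": (1920, 1080),
    "Ultra HD 4K":   (3840, 2160),
    "Ultra HD 8K":   (7680, 4320),
}

full_hd = 1920 * 1080  # our reference point

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.2f}x Full HD)")
```

Run it and you'll see why the marketing writes itself: 4K packs exactly four times the pixels of Full HD, and 8K sixteen times.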
What's in a name? '4K' versus 'UHD'
In August 2012, the Consumer Electronics Association introduced the term Ultra High Definition, partly defined as resolutions of "at least 3,840x2,160 pixels". The idea was to replace the term 4K. The CEA's name lasted less than a day, as Sony then announced it was going to call the technology "4K Ultra High Definition". This is the term now used by most other TV manufacturers too, who seem interested in covering all the buzzword bases at the expense of brevity.
In practice, you will often see UHD used interchangeably with 4K, whether describing TVs, source devices, accessories or content. We at CNET say "4K" instead of "UHD" almost exclusively, and our readers and Google strongly agree.
Digital resolutions: A primer
The latest in a line of broadcast and media resolutions, 4K is due to replace 1080p as the highest-resolution signal available for in-home movies and television.
With the arrival of 4K there are four main resolution standards for use in the home: standard definition (480i/p, or 576i/p in PAL regions), high definition (720p), full high definition (1080i/p) and ultra high definition (2160p).
When used in a home context, 4K/UHD means the TV's screen has a minimum resolution of 3,840 pixels wide and 2,160 pixels high, the equivalent of a grid of four 1080p screens, two wide and two tall. This resolution was originally known as "Quad Full HD," and it's used by basically every 4K TV.
Another resolution, known as 4Kx2K (4,096x2,160 pixels), is used by some projectors and many professional cameras. It also falls under the umbrella of 4K/UHD. Other shooting resolutions are also employed in the pro realm, depending on the camera.
While 4K is relatively new, high definition (HD) itself has been with us for about a decade, and is the format used in Blu-ray movies and HD broadcasts. There are three main versions of HD: full high definition 1080p (progressive), 1080i (interlaced), and 720p (also called simply "high definition").
Despite the existence of HD and 4K, many television programs, online videos and all DVDs are still presented in standard definition, loosely defined as 480 lines. In the US, standard definition began life as analog NTSC broadcasts before the switch to digital ATSC broadcasting, which was completed in 2009.
The beginnings of digital cinema and 4K
While it's currently being touted as a new broadcast and streaming resolution, particularly with the appearance of the HEVC (H.265) codec, the roots of 4K are in the theater.
When George Lucas was preparing to make his long-promised prequels to the "Star Wars" movies in the late '90s, he was experimenting with new digital formats as a replacement for film. Film stock is incredibly expensive to produce, transport, and store. If movie houses could simply download a digital movie file and display it on a digital projector, the industry could save a lot of money. In a time when cinemas are under siege from on-demand cable services and streaming video, cost-cutting helps keep them competitive.
How 3D drove the uptake of 4K
Do you remember seeing James Cameron's "Avatar" in 3D in the theater? Cameron's movie about "giant blue dudes" helped drive high-resolution 4K Sony projectors into theaters around the world. Movie studios keen to maintain that momentum then released a slew of 3D films, mostly converted from 2D, and continued the expansion of 4K cinemas. While 3D has declined in popularity, 4K movies are here to stay.
The industry has been quick to distance itself from 3D, and has taken care not to make the same mistake of equating 4K with 3D. But there are obvious benefits for 3D on a 4K TV screen. In our extended hands-on with the Sony XBR-84X900, we saw the best 3D TV we'd ever tested. It delivered the comfort and lack-of-crosstalk benefits of passive 3D, while providing enough resolution (1080p to each eye) to be free of the interlacing and line-structure artifacts inherent in 1080p passive TVs. Higher resolutions like 4K are also necessary for new implementations of glasses-free 3D TVs.
From theater to the home
While 4K resolution makes perfect sense for huge theatrical screens, its benefits are less visible on TVs at home, watched from normal seating distances.
ULTRA HD IS ABOUT MORE THAN RESOLUTION
In the last section we told you what Ultra HD is, measured in pixels. But the Ultra HD standard covers more than resolution. Ultra HD was recently approved as the official name, and the latest BT.2020 recommendation from the ITU (the body responsible for the standards) includes other exciting elements as well.
FRAMES PER SECOND
With Ultra HD it has also been proposed that we increase the number of frames per second (often referred to as frequency, Hz or fps). Once again, we need to look at the history of fps to understand the future.
Today, pretty much all Hollywood movies are shot at 24 fps (24 new pictures per second) and TV programs at 25 or 30 fps (25 pictures per second in PAL countries and 30 in NTSC countries). Games are typically rendered at frame rates between 30 and 60 fps on a game console, and up to 120 fps on a PC. In other words, there is a huge difference between these frame rates, and therefore in how smooth motion will appear. Ultra HD proposes that movies and TV programs can be recorded and reproduced at frame rates up to 120 fps, that is, 120 pictures each second. Ultra HD will support 24, 25, 48, 50, 60 and 120 fps if the full recommendation is implemented in practice.
If you have watched The Hobbit in HFR format, you have experienced a movie shot in true 48 fps, and then you probably know what 48 fps can do for picture quality and the movie experience (read our thoughts on The Hobbit in HFR here). The two upcoming Avatar movies will most likely be shot at 60 fps, so we are not even close to 120 fps yet. Some movie producers even believe that a higher frame rate is a far greater improvement in picture quality than a step up in pixel resolution is right now. And yes, the move from 24 Hz to just 48 or 60 Hz is truly a small revolution in picture quality.
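To see why frame rate matters so much for motion, consider how long each picture stays on screen at the rates listed above. A quick Python sketch:

```python
# Frame interval (how long each picture is held on screen) for the
# frame rates Ultra HD is proposed to support.
rates_fps = [24, 25, 48, 50, 60, 120]

for fps in rates_fps:
    interval_ms = 1000 / fps  # milliseconds per frame
    print(f"{fps:3d} fps -> {interval_ms:5.1f} ms per frame")
```

At 24 fps every frame is held for roughly 41.7 ms, while at 120 fps it's only about 8.3 ms, which is why fast motion looks dramatically smoother at higher rates.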
COLORS
But what about colors? Well, major improvements are proposed in this area, too. With Ultra HD a new color gamut called Rec.2020 is introduced; today, we use the so-called Rec.709 color gamut for Full HD. Don't mind the names, as the concept is relatively simple. It is best illustrated with a graph.
The human eye can only perceive a specific set of colors. We cannot perceive, for example, infrared (you cannot see the infrared light coming out of your TV remote), and there are other "colors" that are invisible to us, too. The typical human eye can perceive all the colors illustrated in the color spectrum graph below.
However, our TV screens, projectors and cameras cannot reproduce all these colors, so the industry has defined a smaller color gamut that movie folks and TV manufacturers can implement in products and productions; a color standard, so to speak. We need this standard to make sure that movie and TV productions appear correctly on our TV screens. In the graph above you can see both Rec.709 (used in the Full HD standard) and Rec.2020 (which can be used in Ultra HD).
Rec.2020 is the larger of the two, and as you can see it gives us many more colors to work with: a larger color gamut to pick colors from, so to speak. Ultra HD therefore also proposes an extension of what is often referred to as color depth: an increase from the current 8 bits to 12 bits per color (and from 24 to 36 bits for all colors combined). This may sound very technical, but with the color gamut explanation in mind it is fairly easy to grasp. When a TV or camera uses 8 bits per color, it can define 256 shades (2^8 = 256) of red, green, and blue, respectively. These three primary colors are combined to create all other colors. In other words, 256 shades of red times 256 shades of green times 256 shades of blue gives us 24-bit color, or roughly 16.78 million (256x256x256 = 16,777,216) colors. Those colors are "picked" from inside the color gamut defined in Rec.709, which is used today for Full HD.
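The arithmetic behind those numbers is straightforward; here's a minimal sketch in Python (the function name is our own):

```python
def color_count(bits_per_channel: int) -> int:
    """Total number of colors a display can define at a given bit depth."""
    shades = 2 ** bits_per_channel  # shades of red, green, or blue
    return shades ** 3              # every combination of the three channels

print(f"{color_count(8):,}")   # -> 16,777,216 (24-bit, Full HD today)
print(f"{color_count(12):,}")  # -> 68,719,476,736 (36-bit, proposed for Ultra HD)
```

So the step from 8 to 12 bits per color multiplies the palette by 4,096: from roughly 16.78 million to about 68.7 billion colors, all pickable from the larger Rec.2020 gamut.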