Will This Be The Year 4K Catches Fire?
How many pixels are enough?
People have a tendency to simplify things down to numerical comparisons. If you cast your mind back to the 1990s, you might remember the “Megahertz Wars,” where PC manufacturers seemed to debut a machine every few months with an ever-faster processor. A similar war has been waged—and continues to this day—in digital cameras, where every successive generation of device boasts more megapixels, despite the fact that other factors—such as the lens or the size of the sensor—now bear much more heavily on the ultimate quality of the image.
And now we have the Video Resolution War. While the number of pixels available in a display has steadily risen since the earliest days of computing, it’s begun to accelerate in the past few years. The fact that we have more devices with screens has only exacerbated it. We’ve had Retina displays in our smartphones and computers, and now we’re on the verge of jumping from HDTV resolutions of 1080p to mammoth 4K displays. Where does it end? How much resolution really matters?
Pixels or it didn’t happen
To understand where resolution starts, you need to think in terms of pixels. These are the tiny colored dots that make up an image, arranged in a grid—think of the Impressionist and Pointillist artists of the 19th century, only on a much smaller scale and much more evenly spaced. From far enough away, your brain resolves those colored dots into a single whole image.
Resolution measures how many of those pixels are present in a display. A 1080p display, for example, has 1080 horizontal lines of resolution. (The “p” stands for “progressive,” meaning that every time a new image is put onscreen, all 1080 lines change; by contrast, in a 1080i or “interlaced” display, the odd and even lines are refreshed separately, which helps reduce the bandwidth needed for transmission.)
Starting with 4K, however, the metric changes a bit: instead of counting horizontal lines, a 4K display takes its name from its roughly 4,000 pixels of horizontal resolution. And in case that seems too simple, don’t worry, it’s not: “4K” is actually itself a blanket term covering several resolutions, since displays can vary in exactly how many pixels they boast. While the cinema standard, DCI 4K, is 4096-by-2160, the standard for ultra-high-definition TV (or UHDTV) is actually a little under 4K, at 3840-by-2160. Either way, we’re talking on the order of 8 million total pixels, compared to the roughly 2.1 million pixels in a 1080p television.
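The arithmetic behind those figures is easy to check yourself. A quick sketch in Python (the labels are just shorthand for the standards mentioned above):

```python
# Back-of-the-envelope check of the pixel counts discussed above.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "UHDTV (consumer 4K)": (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width}x{height} = {width * height:,} pixels")

# Consumer 4K (UHD) has exactly four times the pixels of 1080p,
# since both dimensions double.
assert 3840 * 2160 == 4 * (1920 * 1080)
```

That factor of four, incidentally, is why UHD upscaling of 1080p content is relatively clean: each old pixel maps onto a neat 2-by-2 block of new ones.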
So how do they cram those extra 6 million pixels into the same 50-inch television set? As with most things in technology these days, the name of the game is making those pixels smaller.
The eyes have it
So, more pixels must be better, right? After all, in technology more of everything—gigahertz, RAM, storage space, chocolate cake—is better. In general, the answer is yes: higher resolution means the screen can display a higher-quality image. But as you might suspect, it’s never that simple.
One important factor to consider when you’re thinking about resolution is how far away you are from the screen. After all, a smartphone display that you’re holding about a foot from your eyes is going to be perceived differently from an 80-inch TV that you’re sitting ten feet away from. The pixels on your smartphone need to be a lot smaller, and thus a lot more densely packed, because you’re closer to it. Consider that the iPhone 6 Plus and Samsung Galaxy S4 both have 1920-by-1080 resolution screens—the same as my 55-inch HDTV.
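What distinguishes those screens isn’t raw resolution but pixel density. A rough way to compare them is pixels per inch along the diagonal, sketched here in Python (5.5 inches is the iPhone 6 Plus’s screen size; the 55-inch figure is the TV from the text):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density measured along the screen's diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# The same 1920x1080 grid at wildly different densities:
phone = pixels_per_inch(1920, 1080, 5.5)   # ~401 ppi
tv = pixels_per_inch(1920, 1080, 55)       # ~40 ppi
print(f"Phone: {phone:.0f} ppi, TV: {tv:.0f} ppi")
```

Same pixel count, roughly a tenfold difference in density—which is why one looks razor-sharp at arm’s length and the other is meant to be viewed from across the room.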
In short, the closer you sit to a display and the larger that display is, the more difference extra resolution makes—put another way, the higher the resolution, the closer you have to get before you can discern individual pixels. Then again, I don’t necessarily want to sit two feet from my television.
And, of course, there’s an upper limit to what your eyes can perceive. So-called “Retina displays” are dubbed such because the human eye supposedly can’t perceive individual pixels on them at a normal viewing distance. But part of the complication in establishing exactly what makes for a Retina display—or, in truth, in determining the perceived quality of any display—is that it depends on just how good one’s eyesight is. Many, if not most, people don’t have 20/20 vision, much less 20/10 vision, which means that piling on extra resolution is often gilding the lily.
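One common way to put a number on this: 20/20 vision corresponds to resolving detail about one arcminute across. Under that admittedly simplified model, you can estimate the distance beyond which a display’s pixels blur together; the sketch below uses the roughly 401 ppi and 40 ppi densities of a 1080p phone and a 55-inch 1080p TV:

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)  # rough 20/20 acuity limit

def retina_distance_inches(ppi):
    """Distance at which one pixel subtends one arcminute of visual angle;
    beyond this, a 20/20 eye can no longer pick out individual pixels
    (a simplified model, ignoring contrast, lighting, and content)."""
    pixel_pitch = 1 / ppi  # inches per pixel
    return pixel_pitch / math.tan(ONE_ARCMINUTE)

# A ~401 ppi phone becomes "Retina" at roughly 9 inches;
# a ~40 ppi 55-inch 1080p TV at roughly 7 feet.
print(f"Phone: {retina_distance_inches(401):.1f} inches")
print(f"TV:    {retina_distance_inches(40) / 12:.1f} feet")
```

By this estimate, if your couch is more than about seven feet from a 55-inch set, 1080p is already at the edge of what a 20/20 eye can resolve—which is exactly why the benefit of 4K is hardest to see in an ordinary living room.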
The limitations of human perception aside, there’s a somewhat more immediate problem, and that’s a lack of 4K content. Right now, you have a broad choice of HD content including broadcast TV, HD cable, and a variety of on-demand and streaming services. Thus far, 4K content is pretty limited. Netflix is making some shows, such as House of Cards and Breaking Bad, available in 4K, as is Amazon for some of its original programming. YouTube and Vimeo support 4K, but have a relatively small amount currently available.
That leaves a lot of room to grow. A 4K Blu-ray format is in the works, and at this year’s Consumer Electronics Show, Panasonic showed off the first prototype player that supports it—even though discs won’t start appearing until much later this year at the very earliest. Likewise, Dish recently announced a set-top box that will support 4K content—but only for video on demand, not live broadcasts.
The industry is, of course, interested in pushing the process along. A consortium of big players, including Samsung, Sony, Panasonic, Disney, and Fox, is trying to promote ultra-high-def (UHD) content; the UHD Alliance wants to establish standards for the format in order to help speed its adoption. Roku is aiming to bring 4K support to its Roku TV platform, and LG has launched a line of seven 4K TVs.
But shifting over to 4K is likely to be a lengthy and expensive project for some content providers—consider how long the transition to HD took in the first place, and how much content still hasn’t been remastered in HD. (HBO’s decade-old, critically acclaimed show The Wire, for example, only got the HD treatment this year, though that project was fraught with additional complexities over aspect ratio.)
For the time being, the easiest place to see 4K content might be down at your local multiplex, though once again, that depends on a host of factors, including how good your eyesight is, how big the screen is, and how close you’re sitting to it.
As gigantic as 4K seems, it’s only the beginning. Work is already underway on its successor, 8K, which boasts—you guessed it—roughly 8,000 pixels of horizontal resolution. (It’s known in some circles by the ridiculously hyperbolic moniker of “Super Hi-Vision.”) Right now, about the only place you’ll actually see 8K footage is on a dome screen of the type often found in planetariums, which uses a special format called “8K fulldome.”
So if you’re holding your breath for an 8K HDTV in your living room, then please inhale—otherwise you’re going to pass out. LG showed one off at CES, but the 98-inch monstrosity is still just a prototype. There’s also even less 8K content than 4K content, so most of the companies demoing those high-resolution units are relying on high-quality photos or nature videos.
Moreover, the thing about resolution is that bigger isn’t necessarily better. Take the Oculus Rift, for example. Its latest Developer Kit release has a resolution of “only” 960-by-1080 per eye, but given all of its other technology—head-tracking, 3D, a wide field of view—and the fact that its screens sit just inches from your eyes, it can have a more immersive effect than even a 4K display. And no doubt its imagery will only get better as time goes by.
So, how much farther can we go? Well, there are plenty of improvements to displays that are orthogonal to simple resolution increases. Advances like “quantum dots”—tiny crystals that can be applied to LCDs to greatly boost color fidelity—or organic light-emitting diodes (OLEDs)—which provide better color and are thinner and more power-efficient than LCDs—can arguably have more meaningful effects on our screens than simply increasing resolution. Curved screens, which a number of manufacturers are experimenting with, can reduce glare, which helps increase effective contrast.
The thinking has begun to shift laterally, as it did in the Megahertz Wars when hardware makers started thinking about more processors rather than simply faster processors. The resolution wars will continue for now, but for companies competing for the title of best picture, the straight-up numbers game—pleasingly incontrovertible as it may be—will eventually be beside the point. Because if smartphones have shown us anything, it’s that the future is far smaller, and far more personal, than we’ve ever dreamed.