In the early days of TV, there weren’t many choices beyond the size of the set. Later, color vs. black and white became a consideration, and that choice eventually gave way to one between display technologies: cathode-ray tubes, LCD (which is still common today), plasma, and rear-projection for larger screens. Throughout this period, however, the dimensions and resolution of TV screens were rarely a consideration. TV screens were built in a 4:3 aspect ratio, and “standard definition” (SD) was the resolution of both broadcast television, over-the-air and cable alike, and home video formats such as VHS and Betamax. The specifics of SD differed slightly between the NTSC standard used in North America and Japan and the PAL standard used in Europe and much of the rest of the world, but both were fundamentally the same category of picture.

The first resolution revolution came at the turn of the 21st century. The combined arrival of HD (“high-definition”) television and the DVD disc format shifted home viewing toward something more akin to a home theater. Television sets adopted LCD and LED technologies (like those in the best OLED and best QLED TVs) that allowed for flat panels, resulting in larger, cheaper, and lighter television sets (if all these acronyms look like gobbledygook to you, check out our QLED vs. OLED primer). The shift to HD broadcast signals and DVD also brought a change in aspect ratio, with everyone adopting the “cinematic” 16:9 ratio.

While internet video formats—both downloadable and streamable—have existed since the 1990s, streaming as a way of delivering television and movies didn’t become mainstream until around 2011, four years after Netflix first introduced its “Watch Now” streaming option. That year, Netflix decoupled its streaming service from its DVD-by-mail service, marking the beginning of the streaming era of media content. Since then, nearly every major media company has launched its own streaming service. While television resolution depended on display, video, and broadcast technologies to deliver HD signals, streaming depended on internet bandwidth—basically, how much data could be sent over a connection in a given amount of time. Resolution jumps lagged behind on streaming not because the signal couldn’t be produced, but because it took time to develop the technology to send that signal quickly enough to be delivered effectively. Those barriers have since been broken, and today streams can match the quality of recorded or broadcast video, assuming an adequate internet connection.

HD resolutions were the standard through the first 12 years of the 2000s, but in 2012 the first 4K—also called ultra-high-definition (UHD)—televisions were announced. However, 4K wasn’t as drastic a change as SD to HD; it was more of an evolution than a revolution. Aspect ratios remained unchanged, but pixel counts effectively quadrupled compared to 1080p HD, with screens 3,840 pixels across (that’s nearly 4,000, giving 4K its name). In the late 2010s, 8K—available in televisions and monitors but unadopted by broadcast, video, or gaming tech—doubled that width again. And that brings us to the present day.

[Image: Samsung QN900B 8K QLED TV] There may not be content for 8K TVs yet, but that doesn’t make the Samsung QN900B any less pretty. Credit: Samsung

Tech specs: 4K vs. 1080p

4K and 1080p share the same 16:9 aspect ratio, and the televisions sporting them can come in similar sizes—the difference lies in the sheer number of pixels. A pixel is a single point of digital information, one tiny dot of color and light that, combined with millions of others, produces the image as a whole. When the HD switch happened in the late ’90s and early 2000s, there were two competing HD resolutions: lower-end 720p and higher-end 1080p. 720p offered an image 1280 pixels wide by 720 pixels high; the name comes from the number of horizontal lines (rows of pixels) in the image. 720p is still found in bargain-level televisions and monitors, so it hasn’t completely disappeared from the marketplace, but few manufacturers compete in that space, as 1080p is seen as the bare minimum most consumers expect from a display. A 1080p screen produces an image that is 1920 pixels wide by 1080 pixels high.

With all this pixel talk, you might assume that the “p” in 720p/1080p stands for pixel, but it actually stands for “progressive,” as in progressive scanning, the method by which the display refreshes the on-screen image. Progressive scanning’s counterpart is interlaced scanning and, for a while, both 1080i and 1080p displays were sold. Progressive scanning draws every line of pixels in every frame. Interlaced scanning alternates, drawing every other line on each refresh, producing an image that is less sharp but looks similar thanks to the speed of those refreshes (typically anywhere from 24 to 60 per second, depending on the input or signal being shown). However, 1080p proved superior, and it’s now quite rare to find HD monitors or TVs that use interlaced scanning.
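
To make the distinction concrete, here is a toy Python sketch—purely illustrative, and not how any actual display hardware is programmed—showing which rows of pixels get redrawn on each refresh under the two schemes:

```python
# Toy model of which rows of pixels get redrawn on each refresh.
ROWS = 6  # stand-in for a display's 720 or 1080 rows

def progressive_rows(frame):
    # Progressive scanning: every row is redrawn on every frame.
    return list(range(ROWS))

def interlaced_rows(frame):
    # Interlaced scanning: even rows on one pass, odd rows on the next.
    return list(range(frame % 2, ROWS, 2))

for frame in range(2):
    print(f"refresh {frame}: progressive -> {progressive_rows(frame)}, "
          f"interlaced -> {interlaced_rows(frame)}")

# refresh 0: progressive -> [0, 1, 2, 3, 4, 5], interlaced -> [0, 2, 4]
# refresh 1: progressive -> [0, 1, 2, 3, 4, 5], interlaced -> [1, 3, 5]
```

An interlaced display only ever draws half the rows per pass, which is why a 1080i picture carries less detail at any given instant than a 1080p one, even though both are nominally “1080.”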

The jump to 4K is an effective quadrupling of 1080p. At 3840 pixels across and 2160 up and down, 4K packs four times as much information into the screen, for a whopping total of over 8 million pixels. While not quite the standard yet, 4K TVs in smaller sizes without higher-end connection ports such as HDMI 2.1 are widely available in the $300 range. And while the highest-end 4K TVs reach toward $2,000 at the largest sizes, some of the best models of more modest dimensions are available at a shade under $1,000.
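
If you want to check the math yourself, a few lines of Python lay out the relationship between these resolutions (the widths and heights are the standard values discussed above):

```python
# Back-of-the-envelope pixel math for the resolutions discussed above.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = 1920 * 1080  # 1080p as the point of comparison

for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name}: {width} x {height} = {total:,} pixels "
          f"({total / base:.1f}x 1080p)")

# 1080p: 1920 x 1080 = 2,073,600 pixels (1.0x 1080p)
# 4K UHD: 3840 x 2160 = 8,294,400 pixels (4.0x 1080p)
```

Doubling both dimensions is what quadruples the total pixel count—and doing it again is why 8K works out to four times the pixels of 4K.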

The most recent generation of video game consoles—the PS5 and Xbox Series X/S—has made 4K gaming a reality for non-PC players (though Nintendo’s older Switch console still outputs an HD signal), and 4K streaming is supported by Netflix, Amazon Prime Video, Hulu, Disney+, HBO Max, Apple TV+, Paramount+, and YouTube. While standard Blu-ray discs are not 4K, “premium” 4K UHD Blu-rays are available for those who like to own their media in physical form. But 4K isn’t the bare minimum yet. Some cable television providers offer limited 4K content through a digital cable box, and over-the-air signals are unlikely to go 4K any time soon (they were the last to adopt HD roughly 15 years ago).

[Image: Nintendo Switch docked next to a flatscreen TV] Nintendo’s iconic, exclusive characters, like Mario and Zelda, look good even if the Nintendo Switch only outputs 1080p HD. Credit: Nick Ware

Yes, but do I need 4K?

If you’re shopping for a new television, there are a few activities that in no way require a 4K TV. Most cable television only outputs an HD signal. Older-generation video game systems (the PS3 and original PS4, Xbox 360, and Xbox One), the Nintendo Switch when docked and played on a TV, and DVD and Blu-ray players top out at HD, not 4K. A standard HDMI cable from any of those devices to a 1080p TV will give you the “intended” image quality.

However, in each of these cases, the quality and clarity of the image can benefit from what’s called “4K upscaling.” When you connect one of these devices to a 4K TV, upscaling essentially creates additional lines of pixels to “fill out” the HD image into 4K. The processing hardware inside the TV handles this using algorithms that vary from manufacturer to manufacturer, and not all upscaling is created equal—some companies have developed noticeably better ways of doing it. If every pixel in an HD image were simply doubled to make it 4K, the picture would lose clarity and sharpness and end up looking blocky and stretched, so upscaling instead creates new pixels that bridge the gaps between existing ones rather than simply duplicating lines. So if your media activities are limited to older-generation video games, terrestrial or basic cable television, and DVD/Blu-ray viewing in HD, you’ll lose nothing by having a 1080p television, but you’ll gain a noticeably better upscaled image by doing those same things on a 4K TV. Most 1080p TVs also lack more modern features like HDR, which drastically affects the overall look of the onscreen image.
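
As a rough illustration of the difference between duplicating pixels and interpolating new ones—this is a toy NumPy example, not how any manufacturer’s proprietary scaler actually works—consider a single row of brightness values:

```python
import numpy as np

# One scanline of brightness values standing in for part of an HD image.
hd_row = np.array([10.0, 200.0, 50.0, 120.0])

# Naive approach: duplicate every pixel. Edges stay hard, and the result
# is just bigger blocks of the same information.
doubled = np.repeat(hd_row, 2)

# Interpolated approach: new pixels are blended from their neighbors,
# "bridging the gap" between existing values instead of copying them.
new_positions = np.linspace(0, len(hd_row) - 1, len(hd_row) * 2)
interpolated = np.interp(new_positions, np.arange(len(hd_row)), hd_row)

print(doubled)       # [ 10.  10. 200. 200.  50.  50. 120. 120.]
print(interpolated)  # values ramp smoothly between the originals
```

Real TVs use far more sophisticated (and often proprietary) techniques than this simple linear blend, which is why upscaling quality varies so much from brand to brand.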

With most streaming services offering 4K streams, newer game consoles offering 4K gaming, and high-end disc players offering UHD Blu-ray playback, there’s a world of higher-resolution content that is made for and best experienced in 4K. Certainly, if you own the newest Sony or Microsoft consoles, you’re losing a lot by not having a 4K display, as the disc-based versions of those machines also play UHD Blu-ray. The jump in streaming quality is extremely noticeable as well; 4K is one of those once-you-have-it-you-never-want-to-go-back things, and HD will simply not look as good to you anymore. In this generation, 4K is the best resolution for gaming, the best resolution for streaming, and the best resolution for movies.

At this point, with the entry price for a basic-but-solid 4K set so low, and the vast majority of entertainment options pushing toward 4K as a standard, it doesn’t make a lot of sense to buy a new 1080p television unless it’s an emergency stopgap you want for less than $200, or it’s destined for business use, such as a menu, advertising, or information display in a shop. If it’s a personal television for home use, 4K is the way to go. If you’re currently rocking a perfectly good 1080p set with only a few devices and services that output 4K, there’s no real impetus to upgrade beyond your own desire for higher quality. In the near future there won’t be a TV show that streams only in 4K or a game that can’t output 1080p, even if those settings aren’t ideal. You won’t completely miss out on anything; you’ll just get a lower-quality version of it.

What’s next in the 4K vs. 1080p battle?

While 1080p is currently “enough,” it won’t stay that way for too much longer. Just as HD fully replaced SD, 4K will eventually do the same to HD. After that comes 8K, a resolution with four times the pixels of 4K that is currently available in monitors and television sets at a premium price. While PCs can output some content at beyond-4K resolutions, no game systems, movie players, streaming services, or television providers have created 8K versions of their content. As of right now, 8K viewing is essentially just upscaled 4K because of the dearth of native 8K content. But 1080p sees the writing on the wall. When 4K becomes old hat, 1080p will become, well, not hat. Your habits and desires will let you know when to pull the trigger on 4K, even if you hold out until the last possible moment.