A German guy just came around with a cameraman, asking people in the press room what their favorite thing at CES has been this year. (He did not ask me. I was shooting eye-daggers at him.) Two guys at my table were asked, and both said one of the 4K TVs -- Sony's OLED and Samsung's curved set. 4K is clearly the buzzword at this year's CES.
What is it? 4K is a new, higher resolution for screens and projectors, with roughly 3840 x 2160 pixels. That's about four times as many pixels as 1080p. Cool! That means more detail--you can get your greasy face right up next to a screen and it's still crystal clear. This is a higher resolution even than what's in most movie theaters.
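If you want to sanity-check that "four times" claim, the pixel math is quick (4K UHD at 3840 x 2160 versus 1080p at 1920 x 1080):

```python
# Pixel-count check: 4K UHD vs. 1080p.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

print(pixels_4k / pixels_1080p)  # 4.0 -- four times the pixels
```

Twice the width and twice the height, so four times the pixels; the linear pixel density only doubles.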
Why should you care? The pictures look amazing from up close. I tried out both Sony's OLED and Samsung's curved 4K sets, plus a 4K 3-D set, plus about a thousand other ones. They all look stunning. These are really great TVs. Another reason to care is that this is the wave of the future: Sony's starting to produce content in 4K resolution, and it's one way to get true movie-theater-quality video in your house.
Why should you ignore this? Well, a bunch of reasons. It's not new technology, exactly, but 4K in the consumer space is very, very early, and that comes with a whole mess of caveats. First: assuming you actually can buy a 4K set, they are ludicrously expensive. And you can't really buy one, not yet. Several companies have said they'll come out with 4K sets this year, but we won't hold our breath. Expect the first generation to cost tens of thousands of dollars.
The other big problems are content and distribution. 4K content is incredibly rare; NHK and the BBC both produce some 4K content, and Sony is making a big push to film sports and movies in 4K, but right now there's basically nothing out there to watch.
And even if there were a ton of content you'd want to watch, you wouldn't really be able to. 4K video takes an outrageous amount of storage--we're talking nearly 10 terabytes for a normal-length movie. So you have to figure out how to store that--there's certainly no disk that can handle it, since a Blu-ray tops out at about 100GB, or a hundredth of the space you'd need.
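If you want to check that storage math yourself, here's the back-of-the-envelope version. The bit depth and frame rate are my assumptions, and they swing the answer a lot: plain 24-bit color at 24 fps lands around 4 TB, while deeper color and higher frame rates push toward the 10 TB figure.

```python
# Back-of-the-envelope: uncompressed 4K storage for a 2-hour movie.
# Assumed settings: 3840x2160, 24-bit RGB color, 24 frames per second.
width, height = 3840, 2160
bytes_per_pixel = 3           # 24-bit color
fps = 24
seconds = 2 * 60 * 60         # 2-hour movie

bytes_total = width * height * bytes_per_pixel * fps * seconds
terabytes = bytes_total / 1e12
print(f"{terabytes:.1f} TB uncompressed")  # ~4.3 TB at these settings
```

Double the frame rate and bump the color to 16 bits per channel and you're comfortably past 10 TB, which is why published estimates vary so much.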
The other problem is that America's internet infrastructure is nowhere near robust enough to handle the demands of 4K. You'd need a 100-megabit/second connection to stream a compressed 4K video, which isn't an ideal option since you'll invariably lose some quality in the compression. And basically nobody in the States has a connection that fast anyway; an average Verizon FiOS connection hits about a quarter of that. In South Korea, where they have gigabit connections, 4K would be alright. But not here, not for years.
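The download-time arithmetic is just as grim. Assuming a lightly compressed 4K movie of about 1 TB and the roughly 25 Mbit/s connection described above (a quarter of 100 Mbit/s), you get numbers in the multi-day range:

```python
# Rough download-time math (the file size and link speed are assumptions,
# not measurements): a ~1 TB movie over a 25 Mbit/s connection.
movie_bytes = 1e12               # ~1 TB, lightly compressed 4K movie
link_bits_per_second = 25e6      # ~25 Mbit/s, a quarter of 100 Mbit/s

seconds = movie_bytes * 8 / link_bits_per_second
days = seconds / 86400
print(f"{days:.1f} days")  # ~3.7 days
```

Even at a full 100 Mbit/s, that same file takes the better part of a day.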
To deal with that major problem, Sony is providing a big machine you can plug into your new 4K TV, if you're will.i.am or whoever else buys one this year. It's basically a computer with a ton of storage, and Sony will package 10 movies and a few shorts onto it. That's right, you need a huge hunk of machinery and special installation from Sony just to watch 10 movies. The Sony rep I talked to was pretty vague about what happens after you watch those 10 movies. He said the machine could be "updated" with more content, but when I reminded him that it'd take about three days to download a single movie over America's highest-speed internet connection, he didn't really have a response.
So what's the takeaway? 4K is cool! Sony's big 4K OLED is probably the most beautiful TV screen I've ever seen. Absurdly bright, vivid, clear. Sharp enough to cut diamonds. It's awesome.
But this is a showoff tech, not a "here's what you guys will buy this year" tech. And that's fine! We like amazing futurey stuff! But let's keep it in perspective.
There will come a point (and some say we are either there or close already) where the TV companies will run out of feasible resolution. That is, from any decent viewing distance you won't be able to see pixels. The question is what the TV companies will push then as the latest and greatest reason to pay them a huge premium. For decades resolution has been the biggie, along with size. Now that both of those are getting maxed out, either TVs will become ever more commoditized or the manufacturers will have to find some other feature to convince us that we have to pay a premium.
It will be interesting to see which way it turns. I suspect some of each will happen, with a sharp difference in viewpoints amongst some companies.
I wonder if the Sony 3D goggles will soon provide a 4K experience?
Still, I do enjoy learning about the new technology coming down the pike. Yeah, it's new and expensive, but we all know that with electronics, everything eventually gets cheaper and faster.
One day the resolution and sound will be so awesome, you humans will just give up everything real in life and stay with your multimedia; meanwhile the robots will take over the Earth!
You are all so easily distracted, lol.
As for the storage issue, good thing we have Moore's law: the number of transistors on circuits doubles every two years. This goes for processing power, sensors, resolution, and memory capacity. Within a few years, a normal 4k box could store dozens of 100 terabyte 4k movies.
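That doubling argument works out like this as arithmetic (with the caveat, noted below in the thread, that Moore's law is really about transistor counts, so applying it to hard drive capacity is an assumption):

```python
# Sketch of the doubling-every-two-years argument. Treating storage
# capacity this way is an assumption -- Moore's law is about transistor
# counts, and drive capacity hasn't always kept this pace.
def doubled_capacity(start_tb, years, doubling_period=2):
    """Capacity after `years`, doubling every `doubling_period` years."""
    return start_tb * 2 ** (years / doubling_period)

print(doubled_capacity(4, 10))  # a 4 TB drive today -> 128.0 TB in 10 years
```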
I don't see the point of the extra resolution, given the viewing distance. Instead, I would think that additional dynamic range (contrast) would matter more, given that the eye does perceive a much wider range than displays can presently render.
Brightside (acquired by Dolby) had an interesting product: an LCD TV with an LED backlight array. The combination of the two allowed for very bright (LED fully on, LCD letting much light through) and very dark (LED very low, LCD letting very little through) parts of the image. It was demonstrated at SIGGRAPH 04 under the name HDR Display.
I don't know why there haven't been products along those lines since Brightside, and instead TV makers are focused on 4K ;-)
Indirect good news: this will make all the lesser types of TVs cheaper, YEA!
Here's another fun thing about 4K:
As someone who makes 3D animations, I know how long it takes to render stuff. On my decent computer it takes 6 hours to churn out about a minute of animation at 720p. Imagine the computing power it would take to render a full length animated movie in 4K.
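To put a rough number on it, scale that 6-hours-per-minute figure by pixel count (assuming, crudely, that render time grows linearly with pixels):

```python
# Scaling 720p render times to 4K by pixel count. The linear-scaling
# assumption is crude, and the 6 hours/minute figure is my own anecdote.
pixels_720p = 1280 * 720
pixels_4k = 3840 * 2160
scale = pixels_4k / pixels_720p           # 9x the pixels

hours_per_minute_4k = 6 * scale           # 6 hrs/min at 720p -> 54 at 4K
movie_minutes = 90
total_hours = hours_per_minute_4k * movie_minutes
print(scale, total_hours)  # 9.0x pixels; 4860 hours (~200 days) on one machine
```

Which is why studios render on farms of thousands of machines, not one desktop.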
On the subject of storage though, there's been a technology in the works for a while called "holographic versatile disc" which can hold 6 TB of data and read/write it fast enough to be feasible. Only downside is players are expected to cost $10,000+ and discs around $200.
"insert joke about apple retina tv here"
I will wait until the 64K comes out. 4K, no thanks; I got caught in the Atari-Commodore wars in the eighties. Today 4K, tomorrow 64K, you see where I'm going with this.....
Not in our lifetime. The technology is not in sync with the innovations.
@Mukuro Holmes - Moore's law specifically only applies to circuits on a chip, but it is used as a standard to measure the rate of advancement in other fields to see if they're keeping pace. Hard drive capacity increases have slowed a bit recently; maybe they'll pick up again, but right now we're seeing about a terabyte a year, which is much slower than Moore's law.
@Blarg_King A better idea would be to use that fancy new Windows 8 storage pools feature and buy a bunch of drives, or use a NAS, and if you're really loaded, a SAN, which will cost as much as your holo drive and give you more functionality.
A 4K movie will not take 10 terabytes to store. An average movie would be around 300GB. The 10TB figure comes from other websites that work out the math, and I guess PopSci just feels they can steal those figures without mentioning that the number is only that large if the video is uncompressed. It sounds much more sensational that way. I haven't read PopSci in a while, and now I know why. The writing on this website is even worse than Gizmodo. PopSci consists of short blurbs for articles that simply copy and paste bits from other websites.
A 4k movie takes up ~10TB UNCOMPRESSED.
As a point of comparison, a 2-hour movie at 1080p takes up ~600GB without compression (video only). And the standard dual layer Blu-ray discs we have been using for years only hold 50GB. So how have they been fitting 600+GB of data on a 50GB disc and still have room for audio and special features? They compress the video.
Using the same quality of compression on the average Blu-ray movie, I would guess they could get a 4K movie to well under 1TB. Still more than a standard Blu-Ray disc can handle, but a far cry from this 10TB nonsense.
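That compression argument in numbers (all the figures here are my estimates, not measurements, and the video-only share of a Blu-ray disc is a guess):

```python
# The Blu-ray compression-ratio argument. All inputs are rough estimates:
# ~600 GB uncompressed 1080p, of which a 50 GB disc devotes maybe 40 GB
# to video after audio and extras.
uncompressed_1080p_gb = 600
bluray_video_gb = 40
ratio = uncompressed_1080p_gb / bluray_video_gb    # ~15:1 compression

uncompressed_4k_gb = uncompressed_1080p_gb * 4     # 4x the pixels
compressed_4k_gb = uncompressed_4k_gb / ratio
print(f"{compressed_4k_gb:.0f} GB")  # ~160 GB -- well under 1 TB
```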
@TheZomb Windows 8 storage pool? I have no idea what that is, nor do I think it would be of any use to people like me who use a Mac.
A fantastic TV (that first one), with a horribly simple design. If you want simple, at least make it simple and awesome like Apple and Sony usually do.
sounds like you need to try and sell the Mac on eBay, and come over to the winning team. Windows owns 92% of the market share; that many people can't be wrong ;0)
Right. Or I could keep my Mac because its lasted me better than any PC I ever owned and apple has the largest growth of any computer company right now.
And it's not a matter of being wrong, it's simply a matter of choice, and many people not wanting to try a change.
Also, your name is Imatardbot. Hardly a name to convince people that your opinion is worth listening to.
@Blarg_King "because its lasted me better"
perhaps we should switch names LOL
well considering that Mac and PC are the only real choices out there (oh yes, and Linux too), if Mac grew .001% they would be the fastest growing company out there. The only thing keeping Mac propped up is the fact that you can buy an emulator for Mac so you can run Windows software LOL
When I went off to college in the fall of 1997, my family bought me a computer. It was a 200mhz IBM with a 4.3 gig hard drive and 32 megs of RAM. I was king nerd amongst my friends because I had the fastest computer, and 4.3 gigs was just obscenely huge for a hard drive. If you had a 4.3 gig hard drive kicking around nowadays you wouldn't bother to use it because it wouldn't even be big enough for your music collection. Geez, most modern smartphones have more storage capacity than that.
What I'm getting at is by the time these TVs are mainstream, I don't think storage will be a problem. By that time high capacity solid state drives will be a lot cheaper, and R&D folks will have found yet another trick to cram more data onto a hard drive. However, I predict that physical media (whatever the next generation of that may be) for movies will need to make a comeback. Either that, or people will need to download movies before watching them (download & store vs. streaming) because I don't see everyone getting high bandwidth fiber optic lines in their houses anytime soon.
@tardbot Or, you know, people actually like Macs and buy them. Could be that. They grew 11% one year. That's significant growth.
Once the data becomes so large, so fast, and so extreme and detailed to where we cannot tell the multimedia from reality, the MATRIX will come to life!
And after the humans are fully entertained and plugged in, robots will begin to fully automate and dominate the planet!
And just think, you humans will ask and pay for this too, lol!
The growth you speak of is due to sales of the iPhone and iPad and has nothing to do with the Mac you first made reference to.
So? Who cares? It's a computer! It's a matter of choice! There is no "winning" team, there's just different options. Just because one company sells more copies of its OS than the other doesn't mean it's better.
You are absolutely right, it is a choice, and most people (92%) choose not to buy Macs, and it has always been my experience that if the majority of people buy something, it is simply a better product.
Or maybe people just never try a Mac and therefore have no idea?
A huge percentage of consumers aren't that tech savvy and therefore just buy an inexpensive computer, and most inexpensive computers run Windows.
I've heard plenty of stories of people switching to Mac and being happy. Never heard one story of a guy switching to Windows and being happy. My brother's a die-hard Windows guy; I asked him if he was going to get Windows 8 and he laughed and said he would never spend money on a tablet OS.
Apple is perfect for the non-tech-savvy. From what I have read, most of the new Apple products are non-upgradable... so let's see if I have this right: buy an overpriced computer, use it for 2 years, drill a hole in it and use it for a boat anchor, then run out and buy another overpriced computer.... ya that's about right I think ;0)
I've been using my MacBook Pro for 3 years without upgrades and it still works fine.
It's hardly overpriced for what you get. While my brother's cheap Windows laptops are all scratched and ugly, my nice aluminum MacBook looks as good as the day I bought it.
Premium price for premium ingredients. It's like food. Sure, you can buy a cheeseburger from McDonald's for $1.50 and it will fill you up, but if you want good quality food that tastes good, you're going to have to pay a lot more for it.
The same analogy works with cars. You can buy a Kia for $15,000, but if you had the money you would probably spend $100,000 on a Ferrari or an Aston Martin. Quality is expensive.
Imatardbot, please don't call yourself that or imply that someone is stupid while saying "LOL" and using horrible grammer. This is coming from a teenager.
If you want people to understand what you are talking about you might want to put down a complete sentence.
are you saying LOL is bad grammar, or that I was using bad grammar, or that I need to change my name while using LOL and bad grammar? also are you the teenager or are you talking about blarg_king?
Also you might notice the proper spelling is grammar not grammEr.
you realize that Apple switched over to Intel CPUs, so you are still paying more for the same thing I have in my PC, right? HDDs and memory are from whoever gives Apple the best deal. Same hardware, twice the price ;o)
Yeah I'm talking about build quality not the parts inside.
And come on, I can get a Mac Mini with a Core i7 for $800; that's hardly overpriced, you're going to pay that for any computer with a Core i7. Also, Apple gets Intel's newest CPUs way before the PC market. We got nice Ivy Bridge CPUs while Windows users still had to use Sandy Bridge.