Latest TVs Are Ready for Their Close-Ups (wsj.com) 219
An anonymous reader shares a WSJ article: The latest televisions have more pixels than ever. But can your eyes detect the difference? The answer is yes -- if you sit close enough. Old TVs had 349,920 pixels. High-definition flat screens bumped up the total to 2 million. Ultrahigh-definition sets inflated it to 8 million. And manufacturers are now experimenting with 8K TVs that have an astounding 33 million pixels. More pixels render hair, fur and skin with greater detail, but the benefit depends on viewing the screen from an ideal distance, so the sharpness of the images is clear but the tiny points of illumination aren't individually distinguishable. According to standards set by the International Telecommunication Union, that ideal distance is 3 times the height of an HDTV screen, 1.5 times the height of a UHDTV screen and 0.75 times the height of an 8K screen (Editor's note: the link could be paywalled; here's a PDF copy of the newspaper). Given those measurements, viewers should sit 6 feet away from a 50-inch HDTV with a 24.5-inch tall screen. But they should sit just 3 feet from a UHDTV of the same size, closer than most Americans prefer.
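As a quick sanity check of those numbers (assuming a 16:9 panel; the height multipliers are the ITU figures quoted above, and everything else is just geometry), the arithmetic looks like this:

```python
import math

def screen_height(diagonal_in, aspect=(16, 9)):
    """Height of a screen from its diagonal, assuming a 16:9 panel."""
    w, h = aspect
    return diagonal_in * h / math.hypot(w, h)

def ideal_distance(diagonal_in, height_multiple):
    """ITU-style viewing distance: a multiple of the screen height."""
    return screen_height(diagonal_in) * height_multiple

h   = screen_height(50)                 # ~24.5 inches for a 50" 16:9 panel
hd  = ideal_distance(50, 3.0) / 12      # ~6.1 feet for HDTV
uhd = ideal_distance(50, 1.5) / 12      # ~3.1 feet for UHDTV
print(f"height {h:.1f} in, HD {hd:.1f} ft, UHD {uhd:.1f} ft")
```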
720p (Score:4)
I have a 720 32" TV. Its good enough for the shows and games I play. Does that make me some how evil? The way marketing is going I feel that way sometimes.
I like the higher resolution picture but I prefer content. That might be why I like to buy DVD's a lot of the time over a BluRay. Same content and cheaper.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
22" widescreen 1080p here, and I don't feel the need nor do I want want to upgrade to anything better. I use Netflix at the lowest possible setting because it's good enough.
Who's on top? (Score:2)
Same for me. It's perched on a small wooden stereo cabinet with a DVD player I deliberately purchased with no Blu-ray support.
[*] As soon as they sell a feature to you, they count you as a user—see Google+—and then that statistic is rolled out to the studios to pressure them into dropping the format you actually use.
Our minimal yet adequate television resides in the corner of the room. Once a week we roll the stand very close to the couch, and pogo both of the speaker
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Laurence Fishburne, sitting in the front row of IMAX Matrix Reloaded. /shudder.
Laurence didn't look that good blown up that large either.
Re: (Score:2)
I have some 32" 720p displays and they are fine for TV. Of course, the "HD" TV channels aren't full HD 1920x1080 anyway, and over-compression makes them look pretty bad, so for broadcast 720p is okay.
Slightly odd reaction to feel "evil" at only having 720p... But still, I'm glad we have 4k because even though I'm sticking with my old 2k TV, it's great for computer monitors.
Re: (Score:2)
However in reality what you are, is an intelligent p
Re: (Score:2)
I like the higher resolution picture, but I prefer content. That might be why I like to buy DVDs a lot of the time over a Blu-ray. Same content and cheaper.
So you don't actually care about higher resolution picture at all, otherwise you'd spend the money on it.
Re: (Score:2)
have a 720 32" TV.
We have you beat! We still have an analog TV - 23" I think. Though to be fair, you can watch "TV" on any of our tablets or laptops - and that's what the kids do. We are about to buy a real TV for our living room and I'll properly surround-sound it for movies... but I suspect only us old folks will really use it.
Re: (Score:3)
Content over quality. While there is some merit to what you say, I would be more apt to say it depends.
It depends on what you are watching. If I want to be mindlessly entertained, then to me the quality doesn't really matter. I have a Plex playlist of old TV shows, MASH, Gilligan's Island, Night Court, Hogan's Heroes, ripped from DVD with a few that came right off late night TV. So, yes, mindless entertainment that works.
Same with most Hollywood movies. Most are crap, the ones where quality matter
Re: (Score:2)
Re: (Score:2)
And make sure the TV isn't set in "demonstration mode" because all the changes you make will revert back to the crap settings after a set period of time.
Re: (Score:2)
HDR is an increased color gamut at the source and gives a better true dynamic range. Simulated HDR or dynamic contrast are both obnoxious. Similarly for audio, there is dynamic range compression, which is only good if you need to keep your TV really quiet for others to sleep - not for everyday use. Yes, movies get loud and soft at times without it. It's supposed to.
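For the audio side, a toy sketch of what downward dynamic range compression does: anything above a threshold has its excess level divided down by a ratio, so loud passages get flattened toward the quiet ones. The threshold and ratio values here are made-up illustrations, not anything a real TV exposes:

```python
def compress(samples, threshold=0.5, ratio=4.0):
    """Naive hard-knee downward compressor on normalized samples (-1..1)."""
    out = []
    for s in samples:
        level = abs(s)
        if level > threshold:
            # Only the portion above the threshold is reduced by the ratio.
            level = threshold + (level - threshold) / ratio
        out.append(level if s >= 0 else -level)
    return out

print(compress([0.1, 0.4, 0.9, -1.0]))  # loud peaks pulled toward the threshold
```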
Re: (Score:2)
Turn this crap off
Or you could just let people make up their own mind about it.
Re: (Score:2)
The problem is that they use specially produced content and image slideshows to demo the TVs in the showroom, so the hyper-real soap opera effect is "looks like 4k" to many people. For the most part none of these sets have a refresh low enough to cause ghosting without these
Re: (Score:2)
From what you describe I'd also particularly check anything related to sharpness/noise, and anything you could reasonably guess is a marketing term for an algorithm playing with the same. Also try all the overarching preset options and turn them off as well, a movie or
At least it results in better monitors (Score:4, Insightful)
Re: (Score:3)
Re: (Score:2)
120" projector
So to benefit from this "progress" I'd have to dedicate a room to TV........
Yeah. I'm not going to do that. The best part is that going forward I can safely ignore all future 8K, 16K etc. hype. Wonderful.
Re: (Score:2)
It isn't for everyone. I love movies, and I have a large enough room that there is a 12ft viewing distance in the living room and 20ft from the kitchen, which is open concept, and that large screen fits nicely
Re: (Score:3)
Most people do that,
More things I don't care about.
Re: (Score:2)
Re: (Score:2)
That said, some of us use projectors and 4k, even 8k would make a huge difference in the quality of my 120" screen in the living room.
Re: (Score:2)
almost everything we watch is compressed
This is the real reason I'm a cord cutter. A plain old DVD looks better than an HD cable/satellite channel.
Re: (Score:2)
Generally people can see the difference between 1080p and 4k if they sit pretty close - closer than normal couch distance, but still relevant for real-world viewing.
To see the difference between 4k and 8k the picture basically needs to fill your field of color vision (which is narrower than people think - the brain synthesizes colors at the edges). Relevant for movie projectors, but not much else.
Old TVs? (Score:2)
Which specifications were used to arrive at that number? I remember VideoCD used MPEG-1 at a resolution of 352x240 pixels (a total of 84,480 pixels), and while it looked a bit pixelated on a CRT it didn't look four times worse than usual.
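For what it's worth, the 349,920 figure matches 720x486, the digitized active NTSC frame defined in ITU-R BT.601 (my assumption about where the WSJ number comes from), which also puts VideoCD at roughly a quarter of it:

```python
ntsc_bt601 = 720 * 486   # active NTSC frame per ITU-R BT.601 -> 349,920 pixels
vcd        = 352 * 240   # VideoCD (MPEG-1) frame             ->  84,480 pixels
print(ntsc_bt601, vcd, round(ntsc_bt601 / vcd, 1))   # ratio ~4.1x
```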
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
VideoCD approximates VHS quality, but with a bit more clarity. You're definitely fourfold worse than DVD, though the difference is a lot more noticeable on some of the last-produced SD TVs that had better resolution.
Re: (Score:2)
Re: (Score:2)
640x480 was the original highest definition of IBM's VGA cards for PCs, which is why early CRT monitors often had this definition--which early flat panels copied, as it tended to be the lowest definition supported by graphics cards at that point. It's certainly copied from the definition of standard def NTSC TV minus the overscan.
Usable resolution is 320x360 (Score:2)
Actual TVs couldn't show all 720x486 pixels. The analog bandwidth of NTSC RF video is 4.2 MHz, with roughly 0-3.0 MHz for luma and 3.0-4.2 MHz for chroma. The 3.0 MHz limits the usable resolution to 320 pixels across, as anything higher will start to produce color fringes as luma detail mixes into the chroma. (SD viewers can occasionally see fringing on neckties on news channels.) In addition, fine horizontal lines in interlaced video will shimmer on a CRT if not blurred vertically before interlaced samplin
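A back-of-the-envelope version of that calculation, taking the parent's 3.0 MHz luma cutoff as given and assuming the usual ~52.7 microsecond active portion of an NTSC scan line, with each cycle of bandwidth able to represent two pixels (one light, one dark):

```python
luma_bandwidth_hz = 3.0e6     # usable luma before chroma fringing (parent's figure)
active_line_s     = 52.7e-6   # approximate active portion of one NTSC scan line

# Each full cycle of the highest luma frequency can render two pixels.
horizontal_pixels = 2 * luma_bandwidth_hz * active_line_s
print(round(horizontal_pixels))   # ~316, roughly the 320 quoted above
```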
So what? (Score:3)
So we can see even more clearly now that the programming stinks, that the sitcoms ain't funny and that the thrillers are formulaic, predictable and anything but thrilling?
Seriously, I recently find way more entertainment in 30 year old shows than in the rubbish produced today.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
It actually was better back in the 40's in all ways but two... The acting was better, the direction was better, the writing was way better.... The only thing that wasn't better was the quality of the image and sound.
Hollywood has turned all "good" TV shows into the same tired formula for political correctness reasons and then to pump up the ratings they throw in gratuitous sex or violence which usually adds nothing to the show, except shock value.
However, I think we are actually turning the corner on so
Re: (Score:3)
Back in the early days of radio, movies, and television, all of the talent was focused on producing a relatively small volume of content. Everything had to appeal to general audiences because the movie theaters only showed one movie at a time and there were only a small handful of radio/TV stations.
Today, content has evolved to fill hundreds of different niches. There are entire TV stations devoted to specific hobbies like fishing and podcasts about the most esoteric interests one can possibly think of. Tha
Re: (Score:2)
Depends on what you watch. Since I've signed up for Netflix I've watched a lot of enjoyable shows. What's on cable or satellite mostly sucks, the good shows are made by private networks such as HBO, Showcase, Sy-Fy (hate that name), Netflix, etc.
Re: (Score:2)
And lots of those were shot on film. Anything before about 1990 where the original prints still exist (and not just the telecined broadcast tapes) can be bumped up to at least 4K, though at 8K it's pretty much just sharper film grain. It was around 1990 when non-linear editing really hit its peak that shows were mostly shot directly on video. Everything from about 1990 to 2004 is hit or miss as to whether a high-fidelity transfer is even possible.
And for many of those older shows, they actually had good
Re: (Score:2)
Ironically, in terms of screen-minutes The Andy Griffith Show is the biggest thing in my house. And it stopped production nearly 50 years ago.
Re: (Score:3)
Seriously, I recently find way more entertainment in 30 year old shows than in the rubbish produced today.
There was better TV 30 years ago? A few TV shows off the top of my head: Knight Rider, ALF, Riptide, Manimal, and A-Team. Not great TV as I recall, just a mix of good to awful. I remember A-Team as something I looked forward to watching. ALF was campy fun. Manimal was memorable mostly because of how awful it was.
If you want to talk about formulaic and predictable then look at A-Team, Knight Rider, or just about any other man vs. evil in the world show. Come to think of it the man vs. evil trope is pr
More pixels render hair, fur and skin (Score:2)
Re: (Score:2)
Ow, my Balls! (Score:3)
Doesn't matter (Score:2)
Re: (Score:2)
Free tip: disable any "image enhancing" crap on your TV. Yes, video compression does degrade the image somewhat (that's how it works), but all the "enhancing" done by the TV only makes the compression even more visible than it should be.
And make sure your TV is not stuck in "demonstration mode" otherwise all your settings will revert to the "it looks great for the showroom" with all that crap enabled.
Re: (Score:3)
This is why streaming still isn't as good as physical media. It's hard to see artifacts on a well-mastered Blu-Ray at 1080p. I haven't seen a 4K Blu-Ray, but since they use H.265 they should have plenty of room for a good quality image.
In your face (Score:3)
Unless it's a phone, then just strap the bad boy to my face!
Re: (Score:2)
Is that a VR joke? I don't know anymore.
Americans? Really? (Score:2, Informative)
But they should sit just 3 feet from a UHDTV of the same size, closer than most Americans prefer.
This statement makes no sense. For one thing, what does being an American have to do with anything at all? Why would nationality affect how close people want to sit to their screens? Are the French so busy feigning disinterest that they need to sit that close to see what's on? Do the Brits need to sit that close because they have tiny screens they can quickly hide when the TV tax man comes around to collect? Would all of us Americans sit closer too, if not for the fact that our rampant obesity keeps us from
Re: (Score:2)
Sure, now let's clamp those heads to those chairs so none of those eyeballs move.
The other way to read this study... (Score:2)
... is, "as resolution goes up, you'll want a bigger screen if you don't want to move your couch".
In other words (Score:2)
"Given those measurements, viewers should sit 6 feet away from a 50-inch HDTV with a 24.5-inch tall screen. But they should sit just 3 feet from a UHDTV of the same size, closer than most Americans* prefer."
*Closer than anyone prefers
In other words, in a normal living room application with a 50" TV, HDTV and UHDTV are indistinguishable at comfortable viewing distances. Now if you have a 40" UHDTV as your computer monitor, you can still definitely tell the difference. The industry needs to fin
Way too close to be healthy (Score:2)
But they should sit just 3 feet from a UHDTV of the same size, ...
There are better ways of going blind, like masturbating.
sort of (Score:2)
"...the benefit depends on viewing the screen from an ideal distance..."
Actually, that only matters if FIRST the content is either delivered at that resolution or upsampled to it.
Upsampling can be good, but of course it never reaches the quality of native-resolution material.
Hell, what do I know? I'm still perfectly happy with my hi-def tv; I never jumped on the 3D or 4k bandwagons and don't feel like I missed a thing. Hell, I end up watching most of my films on my computer screen anyway.
I'm skeptical when I hear things like this (Score:2)
Re: (Score:3, Informative)
They had an electron beam that scanned across a shadow mask to the phosphors underneath.
The limit was the bandwidth of the analog signal, resulting in a measure called lines of resolution.
Well, the shadow mask kind of imposed a pixelization of a sort, so it was not simply a limitation of the analog signal bandwidth.
Shadow mask [wikimedia.org] <-- close up of one.
Another thing about old TVs (Score:5, Informative)
People tended to hold onto them for as long as they were functional, which could be a decade or more. We had a 27" tube television which was 16 or so years old and still going strong when we replaced it with an HD set 10+ years ago (that old beast weighed something like 90 pounds too! I had a lot of fun hauling it away...).
And while Slashdotters are always more prone towards acquiring the new shiny toy, I suspect the average television owner still follows that principle... but the manufacturers keep trying (and generally failing) to induce people into treating their TVs as disposable gadgets which should be replaced every couple of years. 3D television was their first attempt; then 4K; now 8K. Meanwhile fewer people than ever are sitting down and staring at a television screen without also constantly texting on their phone or doing Facebook - it's doubtful they'd notice the increase in TV resolution even if they were a foot from the screen.
Re: Another thing about old TVs (Score:4, Interesting)
My 32" 8 year old HDTV suits me fine. Given the amount of compression artefacts and detail loss in the signal I receive, I see little point in replacing it with something better until it dies.
Re: (Score:3)
In the analog TVs and monitors, there was pixelization of a sort.
- In the vertical dimension there were scan lines.
- In the horizontal there was bandwidth / sweep rate, which limited the sharpness of focus, creating what amounted to a "pixel width" and thus a "pixel count", though there weren't discrete pixels with well-defined boundary locations.
In black-and-white that was it. In color it got more complicated:
- The shadow mask and pattern of color phosphors created physical pixels
Re: (Score:2)
There was still a physical phosphors count though. Everything else was, in digital terms, resampling.
Re: (Score:3)
Old TVs still had resolution, measured in lines. The lines happened to be varying intensity, but you still only could see a few hundred lines per frame. (525 total, 483 active, and around 435 visible)
Re: (Score:2)
Re: (Score:3)
That is true for black and white. However, color TVs had phosphors set up to make a pixel, because the analog signal needed to go up and down to create the color.
Even with black and white, the glow of a phosphor on the screen was of a particular size, so this limited its practical resolution.
Re:Dumb (Score:5, Funny)
8k is supposed to be the ultimate, the final form of 2D television.
Son, who are you kidding?
Joe Sixpack knows it, marketing knows it: higher is better
I predict 16k devices in 3...2...1...
Re: (Score:2)
Re:Dumb (Score:5, Informative)
Of course, you're not competing with the real world. You're competing with the past. For the most part, this means film, but it can also just mean "at whatever resolution we eliminate all the digital artifacts we accidentally put in". This matters for both Hollywood movies and for most TV shows.
Say you're watching a DVD source on a 4K screen. You interpolate to fill in the missing data, but that's more missing data than available data, and the contrast is terrible. When your screen resolution is better than your source like this, you have to rely on little (but still visible) tricks like digital grain to make it look less unnatural (as they did relatively successfully with, say, the old A&E/BBC version of Pride & Prejudice, and less successfully with the movie 300). To avoid this entirely, you have to re-sample the original source at a higher rate, which means going back to a higher-resolution master. For older material that means film. For newer material, it may mean you're just SOL.
A case study: All the episodes of the original Star Trek were shot (including special effects), edited, and mastered on film. That master was broadcast using analog technology, or digitized to some resolution for DVDs, Blu-Ray, etc. When screen resolution goes up, you can't just upscale the DVD or Blu-Ray and get good results indefinitely: you have to go back to the master and re-capture a higher-resolution digital version from that. The resolution of 35mm film is roughly equivalent to 20 megapixels.
Most of The Next Generation was shot, edited, and mastered on film, but a few effects were produced and edited in using digital 3D. For those 3D models, they had to do some digital archaeology and re-creation to replace (not scale up) those effects without artifacts. And then you get to Star Trek: Deep Space Nine, where a lot was modeled in 3D, and it was all edited and mastered digitally assuming the TV resolution of the time. There's no film master of higher resolution to go to, so DS9 (even the human actors) will just look worse and worse as the screen resolution goes up, forever.
Theoretically, 8K is approaching the point where you can well and truly digitally re-master most older media -- getting as good as you ever got with film -- and thus the point where the technology tops out... at least until Hollywood starts digitally filming in 16K.
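To make the "interpolating missing data" point concrete, here is a minimal bilinear upscaler in plain NumPy (an illustration only, not how any particular player or TV actually scales): every output pixel is just a weighted blend of its nearest source pixels, so nothing beyond the source resolution is ever recovered.

```python
import numpy as np

def bilinear_upscale(img, factor):
    """Upscale a 2D array by blending the four nearest source pixels."""
    h, w = img.shape
    new_h, new_w = h * factor, w * factor
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]            # vertical blend weights
    wx = (xs - x0)[None, :]            # horizontal blend weights
    top    = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bottom = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

# A 480-line source blown up 4x: smoother, but no new detail appears.
src = np.random.rand(480, 720)
print(bilinear_upscale(src, 4).shape)  # (1920, 2880)
```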
Re: (Score:3)
Not all film is the same. Some directors (Kubrick, Kurosawa) shot their later work in 70mm. So did some hacks.
At the other end, you have 16mm.
Re: (Score:3)
I believe this is the reason we will never see HD episodes of B5 on BR. While all the live-action scenes were shot on 35mm film, all the space shots were CGI, which was primitive at the time. They had banks of Amiga 2000/3000s doing the rendering.
All these scenes were done in 4x3 format and at low resolution. They wouldn't scale well to a new format and would have to be completely redone. It would just be too expensive, and the powers that be don't believe it would be worth the cost.
I believe
Re: (Score:3)
Assuming the original models are available, there's no reason why a modern Beowulf cluster couldn't redo them at a higher res
While I do believe the original models were lost, you still can't just drop a model in and render it at a higher res. Not if the original models were rendered and created at the low resolution. It's like blowing up a low-rez jpeg image and sticking it on the side of a building. There just isn't enough information in the model for it to work.
You could resize the skeleton, but you would still have to manually re-texture all the models. Like the jpeg example, I believe there are algorithms that would help bu
Re: (Score:3)
You missed the point. All of these are CGI space shots and battles. There are no actors involved in this. I believe all the original scenes were rendered at the Amiga's default resolution, which is 320x200. Redoing them to HD standards would take hundreds of man-hours, even with the original models.
Without the original models everything would have to be redone from scratch. I doubt any studio would be willing to foot the bill for a 20+ year old TV series that only a few die hard
Re: (Score:3)
DS9 (even the human actors) will just look worse and worse as the screen resolution goes up, forever.
No, DS9 won't look worse and worse. It just:
* won't look better
* will look perceptually "worse" relative to newer things
but it actually will never look worse per se.
Re: (Score:2)
Isn't that Calvin and Hobbes?
Re: (Score:2)
But that can't go on forever... everyone knows that "640k ought to be enough for anybody."
Re:Dumb (Score:4, Informative)
I used to print color positive slides with a Nikon 8K printer. It was a long time ago, but I remember very well that you could see the difference from 4K with a loupe. Undoubtedly the image will be incredible, though personally I would love one for my desktop.
Re: (Score:2)
Re: (Score:2)
The easiest and fastest way to disable any "image processing" crap is to set the TV in "computer monitor" mode.
Re: (Score:2)
One of the benefits of 8k is that with 120Hz support there is no need for "motion estimation" and interpolation. That gets rid of all the artefacts and allows the director to choose suitable settings for their preferred look.
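The arithmetic behind that claim, assuming the display holds each source frame for a whole number of refreshes: 24, 30 and 60 fps all divide 120 evenly, while 24 into 60 forces the uneven 3:2 cadence that causes judder.

```python
import math

def cadence(source_fps, refresh_hz, frames=6):
    """How many display refreshes each of the first few source frames gets."""
    per_frame = refresh_hz / source_fps
    return [math.floor((i + 1) * per_frame) - math.floor(i * per_frame)
            for i in range(frames)]

print(cadence(24, 60))    # [2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown, visible judder
print(cadence(24, 120))   # [5, 5, 5, 5, 5, 5] -> every film frame held equally long
print(cadence(30, 120))   # [4, 4, 4, 4, 4, 4]
```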
Re: (Score:2)
Given that the actual content is 30fps or less even 60Hz (60 updates per second) is refreshing the screen twice as
Re: (Score:2)
Re: (Score:2)
For the longest time, we had the 72dpi standard for monitors, then the 1080p display. Having a resolution limit can actually lead to progress in different areas.
The 8k TV takes a lot of processing power, which generates a lot of heat and uses more electricity, and in general makes them expensive and lowers frames per second.
Re: (Score:2)
And keeping it at 72ppi (points [wikipedia.org] per inch, not dots) will give a nice sharp image and reduce eye strain.
Few people want tiny print, except for people who don't have room for multiple screens.
Re: (Score:2)
Power consumption is certainly an issue with 8k, as is having powerful enough equipment to edit it. That's one of the main reasons it has taken so long to arrive. The cameras took a long time to get down to usable sizes, and they had to change the way filming was done because, for example, manual focus is basically impossible.
For computers the sweet spot for me is 28"/5k. Good amount of physical space, exactly double the standard HD resolution for perfect 2x scaling. For 24" monitors 4k is exactly double the
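Assuming "5k" here means 5120x2880 and "4k" means 3840x2160 (both assumptions on my part), the pixel densities work out like this, and each is an exact 2x of a common lower resolution, which is what makes integer scaling clean:

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch from panel diagonal and native resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(28, 5120, 2880)))   # ~210 ppi -> exact 2x of 2560x1440 at 28"
print(round(ppi(24, 3840, 2160)))   # ~184 ppi -> exact 2x of 1920x1080 at 24"
```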
Re: (Score:3)
120Hz native for ultra smooth, realistic motion
- Much higher dynamic range and more accurate colour rendering
Neither of these has anything to do with 8k other than sharing the same HDMI standard. Though I'll grant you that cheap 8k panels will have slightly higher color accuracy than their 4k counterparts due to less visible dithering.
Re:Dumb (Score:5, Interesting)
You sound like an audiophile who thinks people can hear 96KHz/24bit audio. People don't even notice that cinema movies are created from less-than-4K masters and blown up on screens the size of a wall. And that most movies are shot in 24p because people want them to be. The biggest shortcoming of current screens is the contrast level and backlight bleeding; if you could get a screen that went from max HDR to perfect black, that would be the biggest improvement. The second biggest improvement is color, and there rec. 2020 is just huge compared to rec. 709, bigger than even reference monitors can provide. And despite stretching it for HDR, the granularity of 10 bit color over 8 bit is also pretty huge. Oh yes, and also the color volume: being able to do not only intense whiteness but also intense color.
Basically, if people saw a well-mastered 4K BluRay on a laser projector (which is as close as we get to a "perfect" image right now) I doubt anyone would care about 8K/12bit/120fps. The problems we have are far more mundane. And that goes doubly so for OTA broadcast, streaming or other bandwidth-limited media. Personally I'm hoping for the "real" electroluminescent QLEDs to steal the show: not Samsung's latest quantum dot-enhanced LCDs, but OLED-style perfect contrast with LED intensity and QD color accuracy. The first working early prototype was shown in May [avsforum.com], so it's at least a few more years out.
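The bit-depth part of that is easy to quantify: 8-bit video gives 256 levels per channel (often further restricted to the 16-235 "limited" range), while 10-bit gives 1024, which is what tames visible banding in smooth gradients. A trivial illustration:

```python
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels/channel, {levels ** 3:,} total colors")
# 8-bit:  256 levels/channel,     16,777,216 total colors
# 10-bit: 1024 levels/channel, 1,073,741,824 total colors
# 12-bit: 4096 levels/channel, 68,719,476,736 total colors
```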
Re: (Score:2)
But whether video exceeds the resolving capability of your eye depends on resolution and screen size and viewing distance. I have a 42" 1080p HDTV which is fine for most of my viewing. But on my projector which throws a 12' image, 1080p is woefully inadequate. The pixels are completely obvious and it's like viewing a movie through
Re: (Score:2)
While I agree with you in some cases, namely the 8K, other aspects are definitely visible. For example, I often see the steps in intensity gradients with 8-bit color. Now video often doesn't use the full 0-255 but a lower range of this, so it's typically not even 8-bit color. 24p is another parameter that is clearly visible. There are plenty of times where I find it jarring because I can see the stepping between frames. It's usually worse with computers because when switching from 24p to 60p using 3:2 pulldown y
Re: (Score:3)
And that most movies are shot in 24p because people want them to be.
And that's because people are used to movies looking like garbage. It's embarrassing when low-budget TV looks much better than juddery, blurry, multi-million-dollar film content.
The only way to enjoy such horrible content is on home TVs that, since the early 2000's, interpolate PTZ motion and generate intermediate frames, giving some semblance of smooth motion on a 60 or 120 Hz TV. Efforts such as Peter Jackson's The Hobbit that were filme
Re:Dumb (Score:5, Insightful)
This entirely misses the point of 8k. It's not just a resolution bump, it addresses multiple use-cases:
There is no "use-case" of sufficient utility to provide value to the vast majority of consumers in the market.
- Very large screens / projectors
Very large as in IMAX large.
Most movie theatres are still running 2k and nobody cares. Heck most movies are not even filmed in 4k.
- 120Hz native for ultra smooth, realistic motion
Most movies are currently filmed at 24 fps. IMAX runs at 48 fps.
Why stop at 120Hz? Why not 240Hz for even better smoother more ultraer, realistic motion? Or even 480Hz?
- Much higher dynamic range and more accurate colour rendering
- Comfortably exceeding the capabilities of your eyes in all situations
4k already does. It's overkill for most users.
8k is supposed to be the ultimate, the final form of 2D television. NHK, the people behind it, skipped over 4k because it's just a stepping stone to perfection. If anything is to blame here, it's 4k being a half measure and 8k not arriving quickly enough.
When you put things into perspective you quickly come to realize resolution of TV is irrelevant.
The limit of human vision for discriminating useful detail is 10 degrees of arc at a resolution of 60 pixels per degree, or 600 x 600 per eye. Anything much more than that is unnecessary, assuming 100% efficiency of projecting photons onto the fovea.
An 80" 4k screen at 5 feet already exceeds the limit of human vision at 64 PPD, as well as most people's budgets for TVs and the space to put them, to say nothing of a natural unwillingness to sit so close.
The actual, current, real-world problem with TV, the one people would actually benefit from having addressed, is not resolution or frames per second or color depth. It's getting content distributors to provide sufficient bandwidth to drive current displays (displays that have been commercially available for the last decade) at the quality they are capable of producing.
The largest national cable companies have in recent years *DOWNGRADED* HD broadcasts from 1080 to 720 (excluding local retransmission) and turned up the compression knob, leaving very noticeable blocking and motion artifacts, in order to maximize profit. Satellite TV broadcasts are a joke, and even OTA is starting to degrade as broadcasters cram more content into the available bandwidth via subchannels. Internet streaming has the advantage of modern and more rapidly upgradable codecs, yet still lacks the bandwidth to practically deliver at the quality limit of the current generation of televisions. It isn't cost effective, and more importantly most people either don't care enough to affect market behavior or can't tell the difference.
I'm not going to hold my breath waiting for content or a delivery mechanism that meets the capabilities of displays that have been commercially available for more than 10 years, let alone 4k and 8k.
8k is the equivalent resolution of 36 720p displays, 720p being the max currently broadcast by major US cable companies. If people are willing to accept 720p with heavy compression, on what planet is a broadcaster going to make the calculus ... hey, we should use the bandwidth we would normally use to transmit to 36 users point-to-point, or 36 channels over a broadcast medium, just to deliver a single 8k channel to the handful of people who would appreciate it? How does THAT generate profit?
My own opinion is that VR/AR/display/lightfield/GPU technology is likely to advance far faster over the next decades, with far better results, than the bandwidth requirements for transmission are likely to be rendered trivial.
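For anyone who wants to check the two figures in the comment above (the ~64 PPD claim and the 36x 720p equivalence), the geometry is straightforward; a 16:9 panel and on-axis viewing are assumed:

```python
import math

def pixels_per_degree(diag_in, width_px, distance_in, aspect=(16, 9)):
    """Approximate horizontal pixels per degree at the center of the screen."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return width_px / fov_deg

print(round(pixels_per_degree(80, 3840, 60)))   # ~64 PPD for an 80" 4k set at 5 feet
print((7680 * 4320) // (1280 * 720))            # 36 -> one 8k frame holds 36 720p frames
```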
Re: (Score:2)
That's what they said about Super-VHS. "It can't get any better than this"
Re: (Score:3)
There's one point many here have overlooked... it doesn't necessarily require a "UHD" TV to directly benefit from it.
Back around 1994, I bought my first "big" TV back when I was in college -- a 27" Daewoo. Nothing earthshaking, but it had higher-end specs... s-video input, decent stereo audio, and significantly better dot pitch than most of the other TVs at BrandsMart USA.
For the next 5 years, connected to a cable box and VCR, it mostly looked like shit. VHS tapes looked marginally less shitty at my house t
Re: (Score:2)
The International Telecommunication Union standard in the summary may say that higher resolution is of no added benefit, but a look at the cone density and focal length of the eye proves otherwise. While there are certainly diminishing returns, to say it is indistinguishable is simply incorrect.
Another issue with the standard analysis of distance v pixel size is that 20/20 vision, although typically used as a measure of normal visual acuity, is not perfect vision. Many people have much better vision than this.
The Wikipedia article on visual acuity says, 'Healthy young observers may have a binocular acuity superior to [20/20 vision]'. I am healthy (although definitely not young!) and throughout my life my visual acuity has consistently been measured by my optician to be around twice 20/20 vision.
So
Re: (Score:2)
Nothing like the crisp details of nasal, monotonous human voices on AM radio. Makes me glad I held onto my newspapers.
Re: (Score:2)
Or increasing the viewing distance to fit more people into the room.
Re: (Score:2)
That doesn't require more resolution.
Re: (Score:2)
If you're one of the people sitting closer, it does. It really depends on your seating arrangement.