Television Technology

Samsung and LG Unveil 8K TVs (cnet.com) 218

The latest TV "must have" that you actually don't really need -- at least right now -- has arrived at the IFA electronics show in Berlin. That's 8K, the super-crisp display technology that has four times the resolution of 4K screens. CNET: Samsung on Thursday showed off the Q900, which packs in more than 33 million pixels. The 85-inch TV will be the first 8K TV to hit the US market when it goes on sale in October, although Samsung didn't specify the price. Its arch rival LG a day earlier announced what it called "the world's first" 8K OLED TV. It showed the 88-inch device to some reporters in January at CES but didn't specify when there would be an actual product for consumers. Meanwhile Sharp began shipping the LV-70X500E 70-inch 8K monitor earlier this year to Europe after launching it in late 2017 in China, Japan and Taiwan. 8K TVs dramatically boost the number of pixels in the displays, which the companies say will make pictures sharper on bigger screens. "We … are confident that [consumers] will experience nothing short of brilliance in color, clarity and sound from our new 8K-capable models," Jongsuk Chu, the senior vice president of Samsung's Visual Display Business, said in a press release.
  • 8K content? (Score:5, Interesting)

    by PeeAitchPee ( 712652 ) on Thursday August 30, 2018 @03:07PM (#57227030)
    So what's the incentive to buy one of these things if the content world is pretty much still on 1080i/p, let alone 4k?
    • by darkain ( 749283 )

      The Olympics is the only content I've heard of thus far that is broadcast in 8K. https://www.broadbandtvnews.co... [broadbandtvnews.com]

    • Wait, you get 1080p? My cable company only supplies HD digital for premium access. I get SD for most of my stuff, including football.

      • by Luthair ( 847766 )
        The real question is why subscribe to cable ;)
    • Re:8K content? (Score:4, Informative)

      by fred6666 ( 4718031 ) on Thursday August 30, 2018 @03:27PM (#57227176)

      Even to see the difference between 1080p and 4K, you need a 100+ inch TV and to move your couch forward a lot.

      • This guy's channel is great.

        https://www.youtube.com/watch?... [youtube.com]
        [Knowing Better - You Don't See In 4K]

      • Clearly you need to have your eyes checked, because most of my friends and I can easily tell the difference on a 55" screen 3 meters away.

        • Clearly you need to have your eyes checked, because most of my friends and I can easily tell the difference on a 55" screen 3 meters away.

          You may well see a noticeable difference, but it isn't because of display resolution. For video, it's the codec and bitrate that make the difference; for computer graphics, it's DSR (dynamic super resolution).

        • Yeah sure, on a static image with text, if you analyze it for a minute, you can tell the difference.
          But I bet you can't during a movie. Except of course if the 1080p movie is over-compressed.

        • http://i.rtings.com/images/opt... [rtings.com]

          At 3 meters on a 55-inch set, you barely benefit from having 1080p over 720p. 4K is a waste.

      • by sycodon ( 149926 )

        HD resolution [studiodaily.com] is about two megapixels (1920 pixels horizontally multiplied by 1080 pixels vertically gives you 2,073,600 pixels). 4K is a little less than nine megapixels.

        The Human Eye Is 576 Megapixels [art-sheep.com]

        • The Human Eye Is 576 Megapixels

          The human eye is about 1 megapixel, and the total bandwidth of the optic nerve is on the order of 10 Mbit/s.

        • It isn't distributed the same way in the eye as on a screen. We have a small area of highly dense vision, then lower resolution around the edges. The color range differs as well. For a laser VR display that targets every cone and rod, that 576-megapixel figure might matter. But for a display that normally takes up at best 25% of your field of vision, outside the high-focal region something like an inverse-square drop-off takes over and perceived detail falls really fast.

        • It doesn't mean the human eye can tell the difference between a 576-megapixel image and a 9-megapixel one, much less that there is any benefit to having such a high resolution.
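
          For reference, here is a quick back-of-the-envelope check of the pixel counts being debated in this sub-thread (a minimal Python sketch, assuming the standard consumer UHD resolutions; it also matches the "more than 33 million pixels" figure in the summary):

            # Pixel counts for common TV resolutions (consumer 16:9 UHD variants assumed).
            resolutions = {
                "1080p": (1920, 1080),
                "4K UHD": (3840, 2160),
                "8K UHD": (7680, 4320),
            }
            for name, (w, h) in resolutions.items():
                pixels = w * h
                print(f"{name}: {w}x{h} = {pixels:,} pixels (~{pixels / 1e6:.1f} MP)")
            # 1080p:  2,073,600  (~2.1 MP)
            # 4K UHD: 8,294,400  (~8.3 MP)
            # 8K UHD: 33,177,600 (~33.2 MP)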

      • There is a noticeable difference on my laptop's 17" screen. Small text is easier to read and crisper. I can actually do ANSI art with fully readable characters that would have fit into a single CGA-VGA pixel.

      • Even to see the difference between 1080p and 4K, you need a 100+ inch TV and to move your couch forward a lot.

        That is a false and commonly repeated myth. Seeing the full detail of 4K requires a large screen and close proximity, but to see a DIFFERENCE you merely need to be able to perceive ANY improvement over 1080p, which is actually quite trivial and does not require being particularly close or having a particularly large screen. I have a 60 inch 4K TV at home and I can tell there is a difference between 4K content and 1080p content from my couch which is 15 feet away. I have to get up close to really see all the detai

        • The difference for text is much more obvious since there are sharp edges. So of course your 4K monitor is better. But for a movie you need to be closer.
          At 15' on a 60-inch TV, 4K isn't worth it. You may perceive a difference (non-geeks won't), but it's not worth it as you need 4x the number of pixels to get an image that's barely better. 1080p has reached a point where it's good enough for this TV size class, and there is no benefit (at least for movies/TV shows/pictures) to increasing the resolution. 8K will be a joke, of co

    • The incentive is that copyright owners want this as a mechanism to control piracy. Make the files so big that it will be cost-prohibitive for anyone but those with the monopolies to serve it up.

      I can't see users going for those, though, for far simpler reasons. Usability starts to tank as resolutions go up. At 720p a file can be skimmed no problem, at 1080p there's a noticeable delay in processing, at 4K it's like mud.

      • by Kaenneth ( 82978 )

        Except the file size of compressed video does not need to increase linearly with pixel count; there will be more redundant data.

          • That's true, it doesn't scale linearly. However, for a 300Mbps HEVC/H.265 stream you'd be looking at ~2.25GB per minute (quick math below). You could get it smaller, but the bitrate/quality will suffer immensely.
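
            The quick math behind that figure (a sketch only; the 300 Mbps bitrate is the parent's example and assumes a constant bitrate, which real variable-bitrate encodes won't hold):

              # Convert a video bitrate into rough storage requirements (decimal units).
              bitrate_mbps = 300                        # example 8K HEVC bitrate from above
              mb_per_second = bitrate_mbps / 8          # 37.5 MB/s
              gb_per_minute = mb_per_second * 60 / 1000 # ~2.25 GB/min
              print(f"{gb_per_minute:.2f} GB per minute")
              print(f"{gb_per_minute * 120:.0f} GB for a two-hour movie")  # ~270 GB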

      • The incentive is that copyright owners want this as a mechanism to control piracy. Make the files so big that it will be cost-prohibitive for anyone but those with the monopolies to serve it up.

        But then making an analog copy becomes highly effective.

    • Content? Who cares about that? A 4K computer monitor makes you feel like a god. It is so much pixel real estate that people have trouble finding things to do with it.

      Instead of 3 or 4 monitors, you can just have one big one set at 4K. Edit 8 pages at once! Workflows for graphics, music, and video are really enabled by all of those pixels!

    • The USA does not have the bandwidth for 8K, and live 4K is really only on DISH/DIRECTV. Cable is limited to 4K PPV events.

    • Maybe for a comeback of passive 3D? Full 4K content for each eye with no flickering. There isn't a lot of 3D content I care about, but I literally can't buy a device to view the content I already have since I can't tolerate the flickering of active 3D.

    • by AmiMoJo ( 196126 )

      It drives the cost down and gets me closer to my 8k computer monitor.

      Thanks.

    • by MpVpRb ( 1423381 )

      Lower-res material can be rendered at a higher resolution; it provides a bit of improvement.

      I would love to have a 40", 8K monitor for my computer, but for the home theater, 1080p works fine for me

      Of course, I'm old, with less than perfect eyeballs

    • by johnnys ( 592333 )

      Let alone the content itself, which is mostly commercials and so-called "shows" that would insult the intelligence of the average 14-year-old.

    • by AHuxley ( 892839 )
      The back catalogue https://en.wiktionary.org/wiki... [wiktionary.org] and the ability to scan film mean there is lots of 8K content.
    • by hey! ( 33014 )

      You know, I've been part of a number of tech adoption waves -- email, the Internet, the Web, mobile computing etc. And each time I have encountered the well-known Rogers adoption curve [wikipedia.org].

      Early adopters in my experience are novelty-driven people. They tend to want new things because having a new thing that most other people don't is exciting. So Samsung and LG are going to be selling these things initially to people who just want to have something not many people have yet.

      This may sound stupid if you're a

    • OK, we can see the difference between 1080p and 4K. But can we see the difference between 4K and 8K at a normal viewing distance?

      4K technology in general has pixels smaller than our eyes can distinguish. We are at a point of diminishing returns due to our biology, just like 20 years ago when 32-bit color depth became the standard. At this phase I would like to see better frame rates, 3D rendering, scaling technology... rather than higher resolution that people can't see.

  • No one can even distinguish pixels at 4K when they're sitting at a reasonable viewing distance (i.e., on the sofa). And there are still very few programming offerings for 4K TV as it is.

      The only use I can see is to turn it into a large computer monitor, not for individual use because it's too big, but to fill up a wall so you can put lots of info up at once (i.e., in a control center). As a computer user, it'd be better to have an ultrawide for lots of windows rather than a TV format.

      A snag is all the bandwidth to kee

      • by mikael ( 484 )

        It is obvious when you watch movies of landscapes. The detail in forests and desert mountain ranges is incredible. 3D TV with those polarized glasses is also impressive. It really depends on what is being played. The worst videos are people having food fights or kicking stuff towards the viewer. The best videos are the visual-effects movies like Transformers or Battleship, where there are things flying around.

    • What was the incentive to buy a 1080p TV when the content was all standard def? There's this thing called early adopters, whom we can thank for subsidizing our future technology.

      Also, why are you not able to get content in 4K? Are you not trying? Do you not have a Blu-ray player, or Netflix?

    • So what's the incentive to buy one of these things if the content world is pretty much still on 1080i/p, let alone 4k?

      Probably none but your incentives are not the same as everyone else's. I watch a fair bit of 4K content (streaming and disc) and all my computer monitors are 4K monitors (I'm looking at three of them as I type this). I could probably make use of an 8K monitor for computing tasks to start and I'm sure some content will be broadcast in 8K eventually. Probably just for bragging rights at the moment for most people but if they don't make it then nobody will ever make the content for it either. Since people

  • I get a lot out of it, especially when those pics are shrunk down to fit in the middle of a web page displayed on 4k or 1080p devices.

    It really conveys a lot of information about how superior the new ones are.

    But what I really like about the advent of 8k crap is that 1080p projectors are going to get even cheaper.

  • by Tailhook ( 98486 ) on Thursday August 30, 2018 @03:16PM (#57227080)

    I guess both 3D TV owners will have to decide if they prefer non-existent 8K content over non-existent 3D content.

    • Nice for a monitor, if you have an office big enough. For video, I can only barely perceive a difference between HD and 4K as it is. On the other hand, when HD first came out, lots of folks said they couldn't tell the difference from SD, laughable in retrospect.

      • Radiologists can tell. You may start to see more of them working remotely from their cabins on cruise ships.
    • >"I guess both 3D TV owners will have to decide if they prefer non-existent 8K content over non-existent 3D content."

      I love 3D content and bought quite a bit of it for my 75" 3D 4K set. At the normal viewing distance of 15-20 feet from a 75" screen, there would be no visible difference between 4K and 8K. None. Zero. It actually exceeds any human's biology. There is already barely any difference between quality 1080p upscaled to 4K and actual 4K content itself. Perhaps on a 200+ inch TV it might ma

  • by snapsnap ( 5451726 ) on Thursday August 30, 2018 @03:16PM (#57227082)

    We barely have any 4k content yet.

    • This is a gimmick. We don't have the storage space or the bandwidth for 8K, and some people say they can't even really tell much of a difference between 1080 and 4K (I can, but I use a 40" 4K TV as my monitor so I'm watching from 3 feet away). One thing it will help with, admittedly, is making dead pixels less noticeable.
    • Maybe it's a new way to need a bigger GPU?

    • We barely have any 4k content yet.

      We have plenty, just not the content you want from the sources you want. Pretty much every major movie has a 4K release on Blu-ray. Nearly every Netflix production has been in 4K.

  • by UnknownSoldier ( 67820 ) on Thursday August 30, 2018 @03:22PM (#57227124)

    Do these TVs support HDR? There's no mention of it in the article. I'd rather have 4K with HDR than 8K without HDR.

    We barely even have content for 4K -- how do they expect to sell them with almost zero content for 8K? I seriously doubt anyone over 65 could even tell the difference between 1080p, 2160p, and 4320p. BluRay just got (relatively speaking) support for 4K -- so who is actually buying these?

    Maybe I should just hold out for the 16K TVs. =P

    • BluRay just got (relatively speaking) support for 4K -- so who is actually buying these?

      Who is buying Blu-ray? I still do, but I genuinely feel like a fossil every time I do. Blu-ray is just plain annoying as a content delivery system, with its crappy menus, unskippable FBI threats, slow loading, obsolete optical format, small capacity and absurdly high price.

    • by AmiMoJo ( 196126 )

      They don't expect to sell many; these are mostly just tech demos to build hype ahead of the 2020 Olympics, which will be shot in 8K. Japan is looking to supply the world with the tech needed to shoot, edit and broadcast 8K.

      • by AHuxley ( 892839 )
        Re "They don't expect to sell many": a lot of old film and many classics can be scanned and restored at 8K from film.
    • by Kaenneth ( 82978 )

      8k 3d 144fps HDR or GTFO.

    • We barely even have content for 4K -- how do they expect to sell them with almost zero content for 8K?

      And 4K TVs are selling quite well, so they expect to sell them the same way.

  • by Rick Schumann ( 4662797 ) on Thursday August 30, 2018 @03:29PM (#57227184) Journal
    Instead of this pointless arms race of resolution that no one actually supports, how about nice, high-quality, reliable 1080p HDTVs at a price point of $100US or less? I think that'd drive more sales than some astronomically priced piece of useless technology.
    • Instead of this pointless arms race of resolution that no one actually supports, how about nice, high-quality, reliable 1080p HDTVs at a price point of $100US or less?

      Never happen. There are minimums. In particular, there's a minimum amount of effort required to assemble an LCD TV. Once the yields on the panels themselves are high enough, and the factory is paid for (and some of them are paid for by now), that's the controlling factor. Some of the steps are still too finicky for robots so humans perform them, which induces a price floor a good deal higher than just the cost of materials + cost of energy to refine and assemble them. Still, you can get a $120 24" 1080 [woot.com]

    • It already looks to me like TV sets are cheaper now than they have ever been. It's hard to compare because the technology has advanced, but growing up, a 50" "big screen" TV was several thousand dollars. Today it's like $300 to get the same size with a much sharper picture. But even the fairly standard 30" sets were still $500-$600 in the '90s.

      BTW: my 12 year old 55" Samsung seems to be holding up just fine. Granted I paid over 2k for it but at the time that seemed like a good deal. So maybe your experience w

  • by ljw1004 ( 764174 ) on Thursday August 30, 2018 @03:32PM (#57227198)

    I don't watch TV much. But I watched the World Cup earlier this year and was *dismayed* at the poor picture quality. There weren't enough pixels to see which player had the ball when the camera was zoomed out. When I sat up close then I saw a load of compression artifacts from transmission. The whole picture also lost clarity when panning - which happens all the time.

    I would love to watch an ultra-high bandwidth transmission of 8K soccer in future. I'd happily go to whichever pub or IMAX could broadcast it, and I'd pay a decent amount for it.

    • That is just poor transmission. Having 4k won't help. In the US soccer matches are usually uplinked using shared transceivers because it saves money on transmission costs.
  • A person with 20/20 visual acuity would need to sit 0.39 screen diagonals away to get the full benefit of the 4,320x7680 resolution. That would be 25 inches on a 65-inch diagonal TV.

    • A person with 20/20 visual acuity would need to sit 0.39 screen diagonals away to get the full benefit of the 4,320x7680 resolution. That would be 25 inches on a 65-inch diagonal TV.

      That's 7680x4320 for most people...

      And that sounds like just what I want. I specifically want a display device that can exceed human visual acuity. When you've successfully exceeded reality, we're done. Until then, we're not done. If I can see the pixels, they're too big. If I can't see the pixels for any reasonable use-case, then they're finally small enough. 4320p might finally be enough. 2160p isn't. Not quite.

      • by SteveSgt ( 3465 )

        So do you really typically sit closer than 0.78 screen diagonals (51" from a 65" TV)? Because that's the distance at which a person with 20/20 eyesight can discern pixels on a 4K (3,840 x 2,160) TV.

        If it's an HDR set, definitely wear shades and sunscreen.

    • by Kjella ( 173770 )

      The only people who need 8K for moving pictures are people with beyond-20/20 sight. If you look at this chart [audioholics.com], the THX recommended viewing angle is 40 degrees, and the most they allow for the front row of the cinema is 53 degrees. If you got a 100" screen, you'd have to sit 7.3' away for that, while to see 8K you'd have to be closer than 6'. That said, 20/20 is just "normal" vision; about 30% have 20/15 vision and 1% has 20/10 vision. That would give you an effective viewing distance of 5.48 and 3.55 feet, meanin

    • That's based on the false premise that the maximum resolution someone can distinguish is a 1:1 mapping of their cone density.
      Sharp did an experiment and proved people can discern the difference in images well beyond that.

      Eyes receive much more information than a 1:1 mapping. One pixel isn't perceived by a single cone; it's seen by multiple rods and cones, and the brain does the math. And there are two eyes doing this in conjunction.
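
      For what it's worth, here is a minimal sketch of the 20/20-acuity math used in this sub-thread, assuming the usual rule of thumb of one arcminute per pixel and a 16:9 panel; it reproduces the 51-inch (4K) and 25-inch (8K) figures quoted above for a 65-inch set, though the Sharp result mentioned above suggests real perception is not quite this simple:

        import math

        # Distance at which a single pixel subtends one arcminute (~20/20 acuity).
        def acuity_distance_inches(diagonal_in, horizontal_pixels, aspect=(16, 9)):
            width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
            pixel_pitch_in = width_in / horizontal_pixels
            one_arcminute = math.radians(1 / 60)
            return pixel_pitch_in / math.tan(one_arcminute)

        for name, px in [("4K", 3840), ("8K", 7680)]:
            d = acuity_distance_inches(65, px)
            print(f'65" {name}: individual pixels blend together beyond ~{d:.0f} inches')
        # 65" 4K: ~51 inches   65" 8K: ~25 inches
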
  • Before everyone goes yelling and screaming about how they have no use for this right now and that the eye can't tell a difference, I would like to say that 8K monitors are something that is desirable for anyone who has to sit in front of a computer all day.

    I remember going from 320x240 to 640x480 to 1024x768 to 1900x1200 to 2560x1600, back down to 1366x768 to 1920x1080 to 2560x1440 and currently at 3840x2160.

    Each increase in resolution brought a general increase in DPI. We have been hanging at around 93DPI
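
    A quick check of that ~93 DPI figure (a sketch with assumed, but typical, panel sizes; the exact numbers depend on the diagonal you actually own):

      import math

      # Dots per inch = diagonal pixel count / diagonal size in inches.
      def dpi(width_px, height_px, diagonal_in):
          return math.hypot(width_px, height_px) / diagonal_in

      print(f'24" 1080p: {dpi(1920, 1080, 24):.0f} DPI')  # ~92
      print(f'27" 4K:    {dpi(3840, 2160, 27):.0f} DPI')  # ~163
      print(f'32" 8K:    {dpi(7680, 4320, 32):.0f} DPI')  # ~275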

    • Re:InB4 Why (Score:4, Interesting)

      by Solandri ( 704621 ) on Thursday August 30, 2018 @05:13PM (#57227802)

      At 3840x2160, the jagged edges of fonts are almost gone. I can still see them but at such a low level that it is almost subconscious. A doubling of resolution should make that all go away PERMANENTLY.

      Jagged font edges can be "fixed" by anti-aliasing. Your brain is incredibly good at making up details that aren't there if it helps it make better sense of what it's seeing. So if it sees what looks like the smooth curve of the letter O, then it will see a smooth curve even if it's actually made up of different-brightness dots [re-vision.com]. The illusion is only broken when other info (non-aliased pixels) makes it obvious that the curve isn't smooth.

      If you don't believe anti-aliasing fixes it, then prepare to have your mind blown. Every TV image you've seen has been displayed at non-native resolution. When you watch a 1920x1080 TV, you're actually only seeing about 1890x1060 pixels. For obscure historical reasons, TVs overscan the video image [cnet.com]. So if a show is recorded at 1920x1080, the image that's displayed on your 1920x1080 TV is actually a crop of the center portion of the original image, enlarged to fit the 1920x1080 pixels of your TV screen. That breaks the 1:1 correspondence between image pixels and display pixels. But it's fixed by anti-aliasing. Usually bicubic interpolation [wikipedia.org], although lately Lanczos has been becoming more popular (it's more processor intensive, but processing power is cheap nowadays). So every TV image you've seen since we moved to digital TVs has had jaggies; they're just hidden from view by good anti-aliasing.

      The real problem with modern displays is that the pixels are square. Pixels aren't supposed to be square. They're supposed to represent an infinitesimally small point, so the most accurate representation is a round blob called a point spread function [fsu.edu]. Brightest (greatest representation of the pixel's color) in the center, with the edges fading out (color info mixing with that of adjacent pixels). This is actually how the old CRT monitors and TVs displayed pixels, which is why you could use them to display any screen resolution.

      But modern displays typically use an LCD grid with fixed-size square pixels. Those squares add nonexistent information to each pixel (the sharp edges and the corners). This extraneous information makes the display appear sharper when displaying perfectly vertical or horizontal lines. But that sharpness is an illusion, and you pay the price in jaggies whenever displaying anything that's not perfectly vertical or horizontal. It also doesn't work when the underlying pixel grid of the image doesn't fall exactly on the physical pixel grid of the monitor. Which is why LCD monitors look fuzzy when displaying a non-native resolution that isn't the native resolution divided by an integer (the only resolutions that maintain the correspondence between image pixel edges and display pixel edges).

      Anti-aliasing can help, but it's just a band-aid rather than a real fix. Moving to higher resolutions makes the band-aid less noticeable, and from a technical standpoint may be easier than a true fix (which I'm not sure can even be done with LCDs or even OLEDs). I use a 1080p projector to display a 150" image. And the reason I'm anxious to move up to a 4k projector is that I can actually see the pixel grid. It's easy to zone out and ignore it when watching a movie, but every now and then I notice it and it becomes annoying.
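
      As an illustration of the Lanczos resampling mentioned above, here is a minimal one-dimensional sketch (a = 3, the common "Lanczos3" choice); real TV scalers are hardware implementations, so this is only meant to show the shape of the kernel and how it smooths a hard edge:

        import math

        # Lanczos kernel: sinc(x) * sinc(x/a) for |x| < a, zero elsewhere.
        def lanczos(x, a=3):
            if x == 0:
                return 1.0
            if abs(x) >= a:
                return 0.0
            px = math.pi * x
            return a * math.sin(px) * math.sin(px / a) / (px * px)

        # Resample a 1-D row of pixel values to a new length with the kernel.
        def resample_row(row, new_len, a=3):
            scale = len(row) / new_len
            out = []
            for i in range(new_len):
                center = (i + 0.5) * scale - 0.5              # position in source space
                lo = math.floor(center) - a + 1
                acc = norm = 0.0
                for j in range(lo, lo + 2 * a):
                    w = lanczos(center - j, a)
                    acc += w * row[min(max(j, 0), len(row) - 1)]  # clamp at the edges
                    norm += w
                out.append(acc / norm)
            return out

        # Upscaling a hard black-to-white edge: the output ramps (and slightly rings)
        # instead of producing a staircase.
        print([round(v) for v in resample_row([0, 0, 0, 255, 255, 255], 12)])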

  • I just (as in this month) bought a 4K TV. And now they tell me it's already obsolete? FML.

  • to chasing CPUs and GPUs. Spending $2.5k every 3 years to buy a new PC was expected.

    My laptop is, what, 5-6 years old? But still fully functional. My PS3 is gonna get replaced in a few months with a PS4, plus a bunch of used games from craigslist.

    In other words, I'm old, I've played that game, I'll leave it to the more gullible younger folks to maintain that bleeding edge, then when it's more a rough, scratchy edge I'll look into buying.
  • by rnturn ( 11092 )
    We're about to dump our ancient Hitachi projection TV and when the 8K sets hit the stores--probably just in time for the Super Bowl--we'll be able to get a 4K set for much less.
  • I saw this technology in 2012 in Akihabara Japan where it was being used to livestream the London Olympics opening/closing. Japan's goal is to have 8K HDTV broadcasting nationwide for the 2020 Tokyo Olympics.
  • With an eye resolution of about 1 arc-minute, you won't be able to tell the difference between 4K and 8K (and likely not even from normal 2K / 1080p) at normal watching distance on these TVs. You'd need to be sitting less than half a meter (or about 1.5 feet) from your 88" screen to see the difference. At a normal distance of 3m, your eye physically cannot see the resolution difference -- the fovea just doesn't have enough receptors.
    If 8K brings in a better colour gamut, like 4K did, then you might notice that (which is why
