High End Graphics Cards Tested At 4K Resolutions

Vigile writes "One of the drawbacks to high-end graphics has been the lack of low-cost, widely available displays with a resolution higher than 1920x1080. Yes, 25x16/25x14 (i.e. 2560x1600 and 2560x1440) panels are coming down in price, but it may be the influx of 4K monitors that makes a splash. PC Perspective recently purchased a 4K TV for under $1500 and set to benchmarking high-end graphics cards from AMD and NVIDIA at 3840x2160. For under $500, the Radeon HD 7970 provided the best experience, though the GTX Titan was the most powerful single-GPU option. At the $1000 price point the GeForce GTX 690 appears to be the card to beat, given AMD's continuing problems with CrossFire scaling. PC Perspective has also included YouTube and downloadable 4K video files (~100 Mbps) as well as screenshots, in addition to a full suite of benchmarks."
  • by Kjella ( 173770 ) on Tuesday April 30, 2013 @07:57PM (#43596465) Homepage

    I've done the distance/size check; I don't need a UHDTV from where I'm sitting. There's no content for it anyway. But I would like a 27-30" 3840x2160 monitor for my computer.

    • Re: (Score:3, Informative)

      by Skapare ( 16644 )

      3840 is not 4K. 4096 is 4K.

    • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Tuesday April 30, 2013 @09:20PM (#43596899) Homepage

      4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10- or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graphic designers: it'll be standard.

      • by crdotson ( 224356 ) on Tuesday April 30, 2013 @09:35PM (#43596997)

        Thank you. Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window! It's the "1920x1080 should be enough for anybody" mentality. You'd think people would learn after a while.

        • Comment removed based on user account deletion
        • Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window!

          I just can't wait until I can open my monitor and get some fresh air into my flat...

      • by dfghjk ( 711126 ) on Tuesday April 30, 2013 @10:30PM (#43597259)

        "The image will look more life-like than any of the common TVs available today..."

        Not because of the wide gamut it won't. Having the gamut on your output device doesn't mean you have it on your input device. Content won't exist that uses it, so it WILL be "relegated to photographers and graphic designers", standard or not. The value is suspect, and the cost is mandatory extra bit depth leading to higher data rates.

        The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better. This is television we are talking about, not Photoshop. Today's HD content won't look the least bit better on a wide gamut display, it could only look worse.

        • by JanneM ( 7445 ) on Tuesday April 30, 2013 @10:57PM (#43597423) Homepage

          It's different for different parts of the business, of course, but the graphic designers I know personally (through a family member) don't care about monitor gamut or colour fidelity at all. Sounds odd, perhaps, but there's good reason for it.

          Most graphic design is not for the web but for physical objects. Anything you see around you that's printed or patterned - kitchen utensils, tools, and household objects; clothes and textile prints; books, calendars, pamphlets; not to mention the cardboard and plastic boxes it all came in - has been designed by a graphic designer. And it's all printed using different kinds of processes, on different materials, with different kinds of inks and dyes.

          A monitor, any monitor, simply can't show you what the finished design will look like, since it can't replicate the effect of the particular ink and material combination you're going to use. So they don't even try. Instead they do the design on the computer but choose the precise colour and material combination from Pantone patches. We've got shelves of sample binders at home, with all kinds of colour and material combinations for reference. As an added bonus, you can take samples into different light conditions and see what they look like there.

          The finished design is usually sent off as a set of monochrome layers, with an accompanying specification on what Pantone colour each layer should use. They do make a colour version of it too, but that's just to give the client a rough idea of what the design will look like.

        • Content won't exist that uses it, so it WILL be "relegated to photographers and graphic designers", standard or not.

          Except every 4K/8K UHDTV broadcast will be using Rec. 2020, which is exactly this wide gamut, and cameras have been able to capture images outside the sRGB gamut for some time. The content will exist.

          The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better.

          Right. This is because the de-facto standard 8-bit output is sRGB. These monitors are doing something outside of this standard and require proper color management to make things look correct.

          The difference here is that we've got a fairly clean slate with 10-/12-bit UHDTV and Rec. 2020. There's no reason for any device to …

        • by AmiMoJo ( 196126 ) *

          TVs will include colour correction as part of the up-scaling process for HD and SD video.

          4K is a stop-gap on the way to 8K. NHK has said they are not going to bother with it and will go directly to 8K instead, which is a huge step up and needs a lot of special equipment to be developed. For example, you can't visually check focus on a studio monitor at 8K; you need auto-focus to stand a chance.

          That's for video, of course; for computers I'd love a 4K monitor.

      • by mjwx ( 966435 )

        4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10- or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graphic designers: it'll be standard.

        In theory yes,

        In practice, people can't tell the difference between 6-bit and 10-bit colour. Besides this, most people won't be able or willing to configure or manage colour on their TV set properly. Most people can't be bothered to set their monitor to the proper resolution.

        It's the same with DVD and Blu-ray: most people can't tell the difference. They only think they can because they know it's Blu-ray. I can easily convince people an upscaled DVD is Blu-ray simply by telling them it is. They think I'm …

        • In practice, people can't tell the difference between 6-bit and 10-bit colour.

          Some particularly problematic scenes involving mostly a single color should still benefit, but I tend to agree, especially for movie watching. The real purpose of moving to 10-bit components is to accommodate the ~3x larger gamut without introducing banding compared to 8-bit.

      • by Luckyo ( 1726890 )

        That's not really true. The current limit on display gamut is typically not the broadcasting spec but the display technology. Essentially all LCDs available on the market have significantly worse gamut than CRTs, and CRTs don't have enough to cover the current HDTV spec.

        The breakthrough that is currently waiting to happen is OLED. It's the only technology, along with plasma, that has a decent chance of actually making use of the gamut available from the signal spec. Considering the difficulties in making …

      • Oooh, it's awesome that 10-bit display depth is the minimum. I've hated banding in 24-bit images for ages.

        Now if we could get 120 Hz standardized we'd be all set. Checking the link, oooh, I see it is. Excellent!

        I really don't think there's enough demand for 4K resolutions, though. It's the chicken-and-egg problem: not enough content because there aren't enough TVs to demand content.

        At least the spec is shaping up VERY nicely.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      A 9.7" retina display costs 55$ off the shelf for horsiest. That's about 40$ BoM or lower. Quintuple\Sextuple that for a 23" 4K and add a thick margin and you end up in the 300-350$ range without taking into account how this production will scale to make it all cheaper.

      As for your 27-30" 3840x2160 desire, it's actually quite easily doable now since it's really not that dense when you consider stuff like 5" devices having 1920x1080.

      I would imagine a small OEM could make an order for these right from an exist…

    • Eh. I could see using this for a PC, but honestly, I drive all of my media through an HTPC, and I'll be darned if I'm going to buy something that can fit a full-height video card just to watch videos, plus the video card itself. My 51" 1080p plasma display at 10 feet looks crystal clear, with no discernible pixels. Maybe in 5-10 years, once the life has been zapped out of this plasma, I'll think about it. But until it is commodity hardware, no thanks. By all means, though, other folks should feel free. I'm more inte…
  • No (Score:5, Insightful)

    by bhcompy ( 1877290 ) on Tuesday April 30, 2013 @08:02PM (#43596507)

    One of the drawbacks to high-end graphics has been the lack of low-cost, widely available displays with a resolution higher than 1920x1080.

    Really? You've never heard of the Dell U2410? Fuck 16:9

    • 1920x1200 displays were around long before the Dell U2410, so it's silly the summary ignored them. But would you really reject a 2560x1440 display because it's 16:9? How about a 4K display? That's just silly.

      People need to get over this 16:9 vs. 16:10 garbage. What matters is the number of pixels. Once you get past 1200 lines or thereabouts, it's all gravy. I'm happily using two 16:9 displays, a pair of Dell U2711s, and I'm well pleased with them. The extra cost to get an additional 160 lines from a 16:10 30" sc…

      • Would I reject it? No. But I'd look to trade it for a 16:10 monitor ASAP. Some games (particularly strategy games) are better with taller screens.
      • by epyT-R ( 613989 )

        Those are your preferences... I think anything over 24" is useless for desktop use, as it requires too much neck panning and eyeball rolling. It's not just the number of pixels that matters; it's how many can be crammed into your visual range at a time. I'd like to see a 3840x2400 panel at 23-24", 120 Hz or better, with no input lag/ghosting and deep color support. Of course, this is unobtainium, along with the GPU to drive it well, but everyone has different priorities.

      • by mjwx ( 966435 )

        But would you really reject a 2560x1440 display because it's 16:9?

        I'd happily pay a few dollars more for a 2560x1600 display because it is 16:10. 16:10 displays are superior to 16:9 for almost all computing purposes. For games it gives me a taller FOV; for work it's exactly two A4 pages side by side and gives me a taller view (yes, an extra 4 cm really does make a difference when working on a large spreadsheet, config file, or script); with video editing you can have the tools on screen without overlaying …

        • Probably because they are close to the golden ratio: 16/10 = 1.6, and the golden ratio is about 1.618, while 16/9 ≈ 1.78.

          We tend to find it pleasing (hence the name), so it makes sense to have a monitor that is around it.

    • by Twinbee ( 767046 )
      I wish we'd all standardize on 1:1. That way, we get a fair compromise between those who need height, and those who prefer width.
  • by Skapare ( 16644 ) on Tuesday April 30, 2013 @08:49PM (#43596761) Homepage

    ... like 4096x1728 (digital cinema size plus a few more pixels to make it mathematically right)? Feel free to make the actual LCD pixels a bit smaller so it can all fit in a decent size (not over 80 cm, please). Hell, I'd be happy even with 2048x1280 for now, so I can avoid the border bumping on 1920x1200.

    • Seriously, whining about a few extra pixels more or less is silly. The "double HDTV" version of 4K is fine, and works well given that it makes scaling a 1920x1080 signal easy. There is nothing special about 2^12 when it comes to monitors. We also wouldn't want a computer monitor with such a wide ratio: when you are doing computer work, vertical real estate matters. 2.39:1 CinemaScope is fine for a movie; it isn't what you want when programming or writing documents.

      If you need one for pro digital …

  • by flayzernax ( 1060680 ) on Tuesday April 30, 2013 @09:28PM (#43596947)

    I've seen pictures of people's massive six-monitor setups...

    Though as someone who's been a gamer since Duke Nukem... and the Ultima games... I don't see what all the hype is about. The colors ought to be much nicer on a 4K display, but I know I won't be spending money on one until they're dirt cheap or I get one as a gift (which means they'll be dirt cheap by then).

    Then again, you can make a pretty game that gets pretty boring pretty fast =) I've played some hideous monstrosities with the worst interfaces known to man just because the actual game was fun.

    • I remember seeing a setup someone made with 21 monitors to play Falcon 4.0 (IIRC). They had 180 degrees of monitors horizontally, one below the desk and several above in the middle area.
      • That's pretty cool. I played Falcon 3.0, and it was very good. Military-grade simulation software =) It was definitely worth the custom treatment.

    • by mjwx ( 966435 )

      I've seen pictures of people's massive six-monitor setups...

      There are a few games where a multi-monitor setup is really good. Flight sims in particular, where you want one or even two monitors to your sides to display the side windows, and maybe one above or one for the instrumentation.

      In fact if you're learning to fly, a multi-monitor setup with HOTAS is a godsend.

      But so few games actually support multi-monitor setups. So for the most part they are just e-peen extensions.

      • Aye, simulations are the perfect application for that kind of setup. Artists like multiple monitors too; the setup that comes to mind is from an artist who is also an avid Eve Online player.

    • They have to do with the backlight and filters. You can already get monitors with much wider gamuts than normal (normal meaning sRGB). This can lead to much more realistic colours, since a wider gamut covers more of what the eye is capable of seeing and, more importantly, can cover nearly all of the colours you are likely to encounter in the world, excluding some special cases like lasers.

      The issue currently is that most software isn't colour-space aware, so it won't render things right; you'll get oversaturation when you use a …

  • Please put spaces around your "/".

    25x16/25x14 is 3 dimensions.
  • Feeding a 4K screen is beyond the bandwidth of the pathetic "broadband" most of us can get.

  • I can play games optimized for 2005-era Xbox 360 hardware in 4K, when at 1280x1024 I can already see the 128x128 textures clear as day?

    What's my motivation here?

  • by L4t3r4lu5 ( 1216702 ) on Wednesday May 01, 2013 @04:25AM (#43598579)
    I swear to Christ, if the next "leap" is sandwiching two 1920x1080 monitors together and making a fucking TV for my computer desk, I will flip my shit.

    Bring Retina-quality displays to the desktop. If I can get 1080p on a fucking mobile phone [wikipedia.org], I expect 4K on my desktop at 20".

    Make it so.
  • I'm actually in the market to replace my existing monitors (2x Dell 2408WFP) with new ones. I'm currently considering three 27" 2560x1440 LED IPS monitors (the Dell U2713HM and LG 27EA83-D are my top two choices right now), so this interests me greatly, and it's something I keep seeing pop up.

    We're approaching "retina" resolution [isthisretina.com] on the desktop at 2560x1440 already. I of course mean retina in the Apple marketing sense: "at the normal viewing distance the human eye cannot resolve an individual pixel".

    So …
