High End Graphics Cards Tested At 4K Resolutions
Vigile writes "One of the drawbacks to high end graphics has been the lack of low cost and massively available displays with a resolution higher than 1920x1080. Yes, 25x16/25x14 panels are coming down in price, but it might be the influx of 4K monitors that makes a splash. PC Perspective purchased a 4K TV for under $1500 recently and set to benchmarking high end graphics cards from AMD and NVIDIA at 3840x2160. For under $500, the Radeon HD 7970 provided the best experience, though the GTX Titan was the most powerful single GPU option. At the $1000 price point the GeForce GTX 690 appears to be the card to beat with AMD's continuing problems on CrossFire scaling. PC Perspective has also included YouTube and downloadable 4K video files (~100 Mbps) as well as screenshots, in addition to a full suite of benchmarks."
Now where's the cheap monitors? (Score:5, Insightful)
I've done the distance/size check; I don't need a UHDTV from where I'm sitting. There's no content for it anyway. But I would like a 27-30" 3840x2160 monitor for my computer.
Re: (Score:3, Informative)
3840 is not 4k. 4096 is 4k.
Re:Now where's the cheap monitors? (Score:4, Informative)
Re: (Score:2)
Yes - which is why I make a point to call it 2160p, not 4K.
Re: (Score:2)
It's not double, it's quadruple, which is why it's called 4k.
1920+1920=3840
1080+1080=2160
It's set in a grid like this:
1080p 1080p
1080p 1080p
That's 4k. Simple geometry error on your part. Understandable.
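For what it's worth, the arithmetic behind the "quadruple" claim, as a trivial Python sanity check (nothing here beyond what the comment above already says):

    # Doubling both dimensions of 1920x1080 quadruples the pixel count.
    hd = 1920 * 1080     # 2,073,600 pixels
    uhd = 3840 * 2160    # 8,294,400 pixels
    print(uhd / hd)      # 4.0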
Re:Now where's the cheap monitors? (Score:4, Informative)
Quadruple is ***NOT*** why it's called 4k.
"4k" is short for 4000, e.g, pixels. The "4" in 4000 has absolutely nothing to do with the quadrupling. It's merely a coincidence.
Re: (Score:2)
or, in this case, "almost 4000".
Re: (Score:2)
3840 is 4k, and 1920 is 2k. 8k is only 7680.
I guess it sounds better than 3.8k and allows perfect pixel doubling for high quality scaling.
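The "perfect pixel doubling" point is easy to show; a minimal Python/NumPy sketch (the all-zero frame is just a stand-in for real 1080p content):

    import numpy as np

    # A 1080p frame as a (height, width, 3) RGB array.
    frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)

    # 2x nearest-neighbour upscale: every source pixel becomes an exact 2x2
    # block of identical pixels, so no interpolation blur is introduced.
    frame_2160p = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
    print(frame_2160p.shape)  # (2160, 3840, 3)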
Re: (Score:2)
Actually in broadcast 2k is 2048 - there's a good number of 2048x1080 standards. 1920x1080 (or 1050 or 1035) are generally referred to as HD.
Re:Now where's the cheap monitors? (Score:5, Informative)
4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.
Re: Now where's the cheap monitors? (Score:5, Insightful)
Thank you. Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window! It's "1920x1080 should be enough for anybody" mentality. You'd think people would learn after a while.
Re: (Score:2)
Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window!
I just can't wait until I can open my monitor and get some fresh air into my flat...
Re: (Score:2)
Smellovision.
Re: (Score:2)
Blame marketing of "HD"
There's been no market demand for higher resolutions.
Re: (Score:3)
That's mostly because we got really good multi-monitor support a few years ago. We can already get 4, 6 or even 9 times the resolution of 1080p. And if you are willing to blow enough money, or cut apart an LCD, you can do it with almost no bezel.
Re: (Score:2)
Because most people can't tell and/or don't care. There's not much money in catering to the resolution queens.
You do know that if you have gold-plated HDMI cables (at only $35K each) then your High Definition Television satisfaction quotient rises by almost 3 points?
Re:Now where's the cheap monitors? (Score:5, Informative)
"The image will look more life-like than any of the common TVs available today..."
Not because of the wide gamut it won't. Having the gamut on your output device doesn't mean you have it on your input device. Content won't exist that uses it so it WILL be "relegated to photographers and graph (sic) designers", standard or not. The value is suspect and the cost is mandatory extra bit depth leading to higher data rates.
The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better. This is television we are talking about, not Photoshop. Today's HD content won't look the least bit better on a wide gamut display, it could only look worse.
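A rough illustration of the "non-colour-managed looks worse" point, as a Python sketch. The two matrices are the standard sRGB/BT.709 and BT.2020 RGB-to-XYZ conversions; the example colour is mine, not anything from the article:

    import numpy as np

    # Linear RGB -> CIE XYZ for the two colour spaces (D65 white point).
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    BT2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                              [0.2627, 0.6780, 0.0593],
                              [0.0000, 0.0281, 1.0610]])

    def srgb_to_bt2020(rgb_linear):
        # Colour-managed path: sRGB -> XYZ -> BT.2020.
        return np.linalg.inv(BT2020_TO_XYZ) @ SRGB_TO_XYZ @ rgb_linear

    red = np.array([1.0, 0.0, 0.0])   # fully saturated sRGB red, linear light
    print(srgb_to_bt2020(red))        # ~[0.63, 0.07, 0.02]: well inside the wider gamut
    # Non-colour-managed path: the same [1, 0, 0] drives the BT.2020 primaries
    # directly, i.e. a far more saturated red than the content ever contained.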
Re:Now where's the cheap monitors? (Score:5, Informative)
It's different for different parts of the business of course, but the graphic designers I know personally (through a family member) don't care about monitor gamut or colour fidelity at all. Sounds odd, perhaps, but there's good reason for it.
Most graphic design is not for the web, but for physical objects. Anything you see around you that's printed or patterned - kitchen utensils, tools, and household objects; clothes and textile prints; books, calendars, pamphlets; not to mention the cardboard and plastic boxes it all came in - has been designed by a graphic designer. And it's all printed using different kinds of processes, on different materials, with different kinds of inks and dyes.
A monitor, any monitor, simply can't show you what the finished design will look like, since it can't replicate the effect of the particular ink and material combination you're going to use. So they don't even try. Instead they do the design on the computer, but choose the precise colour and material combination by Pantone patches. We've got shelves of sample binders at home, with all kinds of colour and material combinations for reference. As an added bonus you can take samples into different light conditions and see what they look like there.
The finished design is usually sent off as a set of monochrome layers, with an accompanying specification on what Pantone colour each layer should use. They do make a colour version of it too, but that's just to give the client a rough idea of what the design will look like.
Re: (Score:3)
Content won't exist that uses it so it WILL be "relegated to photographers and graph (sic) designers", standard or not.
Except every 4K/8K UHDTV broadcast will be using Rec. 2020, i.e. this wide gamut, and cameras have been able to capture images outside of the sRGB gamut for some time. The content will exist.
The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better.
Right. This is because the de-facto standard 8-bit output is sRGB. These monitors are doing something outside of this standard and require proper color management to make things look correct.
The difference here is that we've got a fairly clean slate with 10-/12-bit UHDTV and Rec. 2020. There's no reason for any device to
Re: (Score:2)
TVs will include colour correction as part of the up-scaling process for HD and SD video.
4k is a stop-gap on the way to 8k. NHK has said they are not going to bother with it and will go directly to 8k instead, which is a huge step up and needs a lot of special equipment to be developed. For example, you can't visually check focus on a studio monitor at 8k; you need auto-focus to stand a chance.
That's for video of course, for computers I'd love a 4k monitor.
Re: (Score:2)
4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.
In theory yes,
In practice people can't tell the difference between 6-bit and 10-bit colour. Besides this, most people won't be able or willing to configure or manage colour on their TV set properly. Most people can't be bothered to set their monitor to the proper resolution.
It's the same with DVD and Blu-ray: most people can't tell the difference. They only think they can because they know it's Blu-ray. I can easily convince people an upscaled DVD is Blu-ray simply by telling them it's Blu-ray. They think I'm
Re: (Score:2)
In practice people can't tell the difference between 6-bit and 10-bit colour.
Some particularly problematic scenes involving mostly a single color should still benefit, but I tend to agree especially for movie watching. The real purpose of moving to 10-bit components is to accommodate the ~3x larger gamut without introducing banding compared to 8-bit.
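Back-of-the-envelope numbers for why the extra bits matter once the gamut grows (Python; the ~3x figure is from the comment above, the rest is just level arithmetic):

    levels_8bit = 2 ** 8      # 256 code values per channel
    levels_10bit = 2 ** 10    # 1024 code values per channel

    # If the range a channel spans grows ~3x but stays at 8 bits, each step
    # between adjacent code values covers ~3x more colour distance, which is
    # where visible banding comes from.
    gamut_growth = 3.0
    step_8bit_wide = gamut_growth / levels_8bit    # ~0.0117 (in units of the old gamut)
    step_10bit_wide = gamut_growth / levels_10bit  # ~0.0029
    step_8bit_srgb = 1.0 / levels_8bit             # ~0.0039, today's baseline

    print(step_8bit_wide > step_8bit_srgb)    # True: 8 bits over a wide gamut bands more
    print(step_10bit_wide < step_8bit_srgb)   # True: 10 bits stays finer than 8-bit sRGB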
Re: (Score:2)
In practice people can't tell the difference between 6-bit and 10-bit colour.
That is unfair.
It's not unfair at all.
Most people won't be able to tell. The graphs you linked to tell you the measurable difference, but that's using a device to measure it; we're talking about people here. People are terrible at measuring things.
In a blind test, most people won't be able to tell the difference.
Re: (Score:3, Funny)
You insensitive clod! They are blind! Of course they can't tell the difference!
Jees!
(ducks)
Re: (Score:2)
And technology able to display this signal... is not really coming at all. We took a huge leap backwards in terms of the color gamut displayed by TVs when we made the switch to LCDs, and the only decently promising tech on the horizon with a gamut good enough to compete with CRTs is OLED... which doesn't seem to scale all that well to big screens and has huge problems with longevity (dimming).
What's the point of the awesome signal being able to carry a much larger color space, when you have no m
Re: (Score:2)
That's not really true. The current limit on the display gamut is typically not the broadcasting spec but the display technology. Essentially all LCDs available on the market have significantly worse gamut than CRTs, and CRTs don't have enough to cover the current HDTV spec.
The breakthrough that is currently waiting to happen is OLED. It's the only technology in addition to plasma that has a decent chance of actually making use of gamut available from the signal spec. Considering the difficulties in making
Re: (Score:2)
Oooh, that is awesome that 10-bit display is the minimum. I've hated banding in 24-bit images for ages.
Now if we could get 120 Hz standardized we'd be all set. Checking the link, oooh, I see it is. Excellent!
I really don't think there is enough demand for 4K resolutions, though. Chicken-and-egg problem: not enough content, not enough TVs to demand content.
At least the spec is shaping up VERY nicely.
Re: (Score:2, Interesting)
A 9.7" retina display costs 55$ off the shelf for horsiest. That's about 40$ BoM or lower. Quintuple\Sextuple that for a 23" 4K and add a thick margin and you end up in the 300-350$ range without taking into account how this production will scale to make it all cheaper.
As for your 27-30" 3840x2160 desire, it's actually quite easily doable now since it's really not that dense when you consider stuff like 5" devices having 1920x1080.
I would imagine a small OEM could make an order for these right from an exist
No (Score:5, Insightful)
Really? You've never heard of the Dell U2410? Fuck 16:9
Re: (Score:2)
1920x1200 displays had been around long before the Dell U2410, so it's silly they ignored this. But would you really reject a 2560x1440 display because it's 16:9? How about a 4k display? That's just silly.
People need to get over this 16:9 vs. 16:10 garbage. What matters is the number of pixels. Once you get past 1200 lines or thereabouts, it's all gravy. I'm happily using two 16:9 displays, a pair of Dell U2711, and I'm well pleased with that. The extra cost to get an additional 160 lines from a 16:10 30" sc
Re: (Score:2)
Those are your preferences. I think anything over 24" is useless for desktop use as it requires too much neck panning and eyeball rolling. It's not just the number of pixels that matters, it's how many can be crammed into your visual range at a time. I'd like to see a 3840x2400 panel in 23-24", 120Hz or better, no input lag/ghosting, and deep color support. Of course, this is unobtanium along with the GPU to drive it well, but everyone has different priorities.
Re: (Score:3)
I'd happily pay a few dollars more for a 2560x1600 display because it is 16:10. 16:10 displays are superior to 16:9 for almost all computing purposes. For games it gives me a taller FOV, for work it's exactly 2 A4 pages side by side and gives me a taller view (yes, an extra 4 cm really does make a difference when working on a large spreadsheet, config file or script), with video editing you can have the tools on screen without overlaying
They also look nice (Score:2)
Probably because they are close to the golden ratio. 16/10 = 1.6, and the golden ratio is about 1.618. 16/9 = 1.78.
We tend to find it pleasing (hence the name) so it makes sense to have a monitor that is around it.
Re: (Score:3, Insightful)
Why? Why does 16:10 make a difference at that resolution? I mentioned the 2560x1600 displays, but you know what, they cost hundreds more and they have lower pixel density. The premium for 160 pixels is 30% or more, hell with Dell on Amazon right now it's 50% more.
What exactly are people doing that requires 16:10? I've used 'em, I like 'em, but I'll take 2560x1440 over 1920x1200 any day of the week. Likewise I'll take 3840x2160 over 2560x1600.
If the premium for 16:10 was in the neighborhood of 10-15% fo
Re: (Score:3)
What exactly are people doing that requires 16:10?
There's really only one advantage that 16:9 has over 16:10, and that's smaller black borders (or no borders at all) for widescreen video content. Otherwise, the vertical real estate is very nice to have, and I've found 2560x1600 (which I've used for the last 5 years) somehow really hits the sweet spot between vertical size and widescreen.
Re: (Score:3)
16:10 tends to work out better for office work. Sure, the higher res makes it less important. But it's the physical size that makes it less important ... depending on how much space you have to push the display back.
But once you get up close to the holy grail of true 4k which is 4096, why even bother with 3840? Cinema digital is shot in 4096 (and up). 3840 should be boycotted or even banned.
Re: (Score:3)
I wish they'd be honest about those resolutions. It is annoying that 3840 is being advertised as 4k. Clearly it's not. Sure it's double 1080p, but 1080p ain't 2k, so 3840 shouldn't be called 4k.
It's also annoying that they put 1080p and 2k in the graph, then just labeled this new display 4k. Dammit, so close, they acknowledged 2k as a valid format, but ignored real 4k.
Re: (Score:2)
It has four times the pixels. Because when you double both vertical and horizontal, the total is quadrupled.
Basic math.
Re: (Score:3)
And if they called it 4xHD, I'd be fine with it.
Re: (Score:2)
But once you get up close to the holy grail of true 4k which is 4096, why even bother with 3840? Cinema digital is shot in 4096 (and up). 3840 should be boycotted or even banned.
No black bars. People hated it when we went widescreen; they won't accept another round. There is so much non-cinema 16:9 content that can't be remastered, and even if it could, all existing DVDs/BluRays don't have it. The only thing that could give a hint of 17:9 adoption (4096x2160 seems to be the standard for cinema monitors) is if 4K BluRays come with the ability to ship both 3840x2160 and 4096x2160 on the same disc, like an extra 256x2160 slice added to the picture. Or since you're probably doing a bit o
Re: (Score:2)
Because then you're wasting visible space. 16:10 takes advantage of more vertical space in your field of view.
Re: (Score:3)
I heard at one time that 16:10 came out of the video editing industry. Basically they were working on 16:9 video, so they had displays with extra space at the bottom for controls. These displays then were adapted to the higher end computer market. However once 16:9 displays were being manufactured in large quantities for consumer TVs, I imagine that drove the price down for manufacturing 16:9 computer monitors. I'm fairly certain the decision to use 1920x1080 in the TV industry had nothing to do with co
Re: (Score:2)
You're half right. More vertical space is great, but the ratio doesn't matter. For the work you are talking about what matters is the height.
There's a funky Dell out now that has an even wider aspect ratio, it's around 2.33:1 I believe. Now I'd like that, if it had more pixels. It's 2560x1080. That's a detriment. But imagine if that was more along the lines of 3350x1440. Would you still complain that was too wide? You could have three documents, web pages, whatever up, side by side, and still have a lo
Re: (Score:2)
Resolution and pixel density a
Re: (Score:2)
There's a funky Dell out now that has an even wider aspect ratio, it's around 2.33:1 I believe. Now I'd like that, if it had more pixels. It's 2560x1080. That's a detriment. But imagine if that was more along the lines of 3350x1440. Would you still complain that was too wide? You could have three documents, web pages, whatever up, side by side, and still have a lot of vertical space.
And if it was twice as wide again you could have six documents open side by side, but you'd probably want it curved inwards at each end, or else you'd have to be like a table football player sliding on your chair from side to side.
Re: (Score:2)
I just took a 21" CRT to the recycling place. In 1995, it cost about $2200 new. In 2001, my employer gave it to me as scrap when our building was closed and they decided that a lot of that stuff was cheaper to give away than to move to some warehouse across the country. (Plus it was a tiny bit of good will that the local management could show the laid-off employees when the Big Guys were being callous pricks kicking us to the curb while we were still going to 9/11 funerals.)
ObTopic: $400 is an expensive mo
Re: (Score:2)
I bought a Gateway open-box 21" monitor back in the late 90's for about $1k. I think it could do 1600x1200, but it wasn't real solid at that. That was one heavy beast to move around. I got rid of it some time ago, don't remember how. I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines, but other than that the image was solid. Eventually its color started to go though. I also bought a 17" NEC monitor in '95 or '96 for just under $700
Re: (Score:2)
I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines,
That was true of all Trinitron monitors. Here is what Wikipedia says [wikipedia.org] about them:
Even small changes in the alignment of the grille over the phosphors can cause the coloring to shift. Since the wires are thin, small bumps can cause the wires to shift alignment if they are not held in place. Monitors using this technology have one or more thin tungsten wires running horizontally across the g
When will we get REAL 4k displays ... (Score:3)
... like 4096x1728 (digital cinema size plus a few more pixels to make it mathematically right)? Feel free to make the actual LCD pixels a bit smaller so it can all fit in a decent size (not over 80cm, please). Hell, I'd be happy even with 2048x1280 for now so I can avoid the border bumping on 1920x1200.
Who cares? (Score:2)
Seriously, whining about a few extra pixels more or less is silly. The "double HDTV" version of 4k is fine, and works well given that it makes scaling a 1920x1080 signal easy. There is nothing special about 2^12 when it comes to monitors. We also wouldn't want a computer monitor with such a wide ratio. When you are doing computer work, vertical real estate matters. 2.39:1 CinemaScope is fine for a movie. It isn't what you want when programming or writing documents.
If you need one for pro digital
Re: (Score:2)
Two of the three most popular desktop OS families (Windows and Linux) don't have proper provisions for resolution-independent font and window sizes.
The problem is that they tend to mis-report the physical size of the viewable area of the displays, without which you can't work out the scaling factor. (The low-level font rendering engine wants pixel-based sizes for obvious reasons, though you might well not be normally working at that level.) However, a bigger problem in practice is that the non-text parts of windows are not designed with scaling in mind. The most obvious example of that is where someone uses absolute positions for all the components wit
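For what that scaling factor looks like in practice, a minimal Python sketch (the 24" panel dimensions and the 96 DPI baseline are assumptions for illustration; the physical size would normally come from EDID, which is exactly the data that gets mis-reported):

    def ui_scale(width_px, height_px, width_mm, height_mm, baseline_dpi=96.0):
        # Scale factor a toolkit could apply, given the reported physical size.
        # If the reported size is wrong, the computed scale is wrong too.
        dpi_x = width_px / (width_mm / 25.4)
        dpi_y = height_px / (height_mm / 25.4)
        return ((dpi_x + dpi_y) / 2) / baseline_dpi

    # Hypothetical 24" 3840x2160 panel, roughly 531 x 299 mm viewable:
    print(round(ui_scale(3840, 2160, 531, 299), 2))   # ~1.91, i.e. roughly "2x" UI scaling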
Re: (Score:2)
In reality, almost no desktop software is capable of displaying at this high DPI without messing up font size or layout. Two of the three most popular desktop OS families (Windows and Linux) don't have proper provisions for resolution-independent font and window sizes. It's really leet that you have a bazillion ppi, but if that means that a web page renders at the size of a stamp, you can't properly read it at a distance comfortable for your eyes. I work with computers. That means I have to look at a screen for 8 hours a day and then drive home in busy traffic. If my eyes, neck and back muscles are tired and sore from staring at a monitor all day, that's not going to be a huge success. I need a display that will comfortably display my 20+ application windows at a good arm's length or a bit more. Until operating systems are capable of doing that, even for legacy applications, without depending on 96 PPI screens (still the standard assumption in Windows 7), I don't want insane PPI numbers but actual screen real estate. I currently have two 30" IPS screens at home and those are comfortable. Increasing the PPI on those will not make them more comfortable, and in reality my productivity will not rise even with the added pixel count.
Are you just trolling, or are you ACTUALLY USING OSX? Because Windows and (popular desktops on) Linux have had scaling for fonts, window decorations and other UI elements (for properly built programs) for years (over a decade). It's a must-have if you're using a full-HD resolution on a 15" screen, and it works just fine for most programs (it's an easy way to see whether an application was coded crappily or not).
OS X just has crappy pixel doubling for retina displays.
Re: (Score:2)
OS X just has crappy pixel doubling for retina displays.
Bullshit. The text rendering engine properly uses the available resolution, and the major apps that use their own cross-platform rendering engines (MS, Adobe) have been updated. Vector graphics also get rendered properly at the actual screen resolution. Apple limits the API's notion of screen resolution to "regular" and doubled just to make things easier on developers with regard to bitmapped graphics while avoiding crappy scaling of bitmapped graphics. (Of course applications that don't provide high-res ver
Well there is a niche... (Score:3)
As I've seen pictures of people's massive 6-monitor setups...
Though as someone who's been a gamer since Duke Nukem... and the Ultima games... I don't see what all the hype is about. The colors ought to be much nicer on a 4k display, but I know I won't be spending money on one until they're dirt cheap or I get one as a gift (which means they'll be dirt cheap by then).
Then again you can make a pretty game that gets pretty boring pretty fast =) I've played some hideous monstrosities with the worst interfaces known to man just because the actual game was fun.
Re: (Score:2)
That's pretty cool; I played Falcon 3.0 and their game was very good. Military-grade simulation software =) It was definitely worth the custom treatment.
Re: (Score:2)
As I've seen pictures of people's massive 6-monitor setups...
There are a few games where a multi-monitor setup is really good. Flight sims in particular where you want 1 or even 2 monitors to your side to display the side windows, maybe one above or one for the instrumentation.
In fact if you're learning to fly, a multi-monitor setup with HOTAS is a godsend.
But so few games actually support multi-monitor setups. So for the most part they are just e-peen extensions.
Re: (Score:2)
Aye, simulations are the perfect application for that kind of setup. Artists like the multiple monitors, and the setup that comes to mind is from an artist who is also an avid Eve Online player.
Colours are independent of rez (Score:2)
They have to do with backlight and filters. You can already get monitors with much wider gamuts than normal (normal meaning sRGB). This can lead to much more realistic colours since it can cover more of what the eye is capable of seeing, and more importantly can cover all of the colours you are likely to encounter in the world, excluding some special cases like lasers.
The issue currently is that most software isn't colour space aware, so it won't render things right, you'll get oversaturation when you use a
3D TV??? (Score:2)
25x16/25x14 is 3 dimensions.
problem for usa-ers (Score:2)
feeding a 4K square screen is beyond the bandwidth of the pathetic "broadband" most of us can get.
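Rough numbers (Python; the ~100 Mbps figure is the compressed rate quoted in the summary, everything else is straight arithmetic):

    # Uncompressed 3840x2160 at 24-bit colour, 60 frames per second.
    bits_per_frame = 3840 * 2160 * 24
    uncompressed_gbps = bits_per_frame * 60 / 1e9
    print(round(uncompressed_gbps, 1))   # ~11.9 Gbit/s -- display-cable territory

    # The ~100 Mbps downloadable files are already heavily compressed, and even
    # that exceeds what a lot of US broadband connections can sustain.
    compressed_mbps = 100
    print(round(uncompressed_gbps * 1000 / compressed_mbps))   # ~119x compression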
so for only 2500$ (Score:2)
I can play games optimized for 2005-era Xbox 360 hardware in 4K, when at 1280x1024 I can already see the 128x128 textures clear as day?
What's my motivation here?
We're finally getting higher PPI? (Score:3)
Bring Retina quality displays to the desktop. If I can get 1080p on a fucking mobile phone [wikipedia.org] I expect 4k on my desktop at 20".
Make it so.
Where? (Score:2)
We're already approaching "retina" resolution [isthisretina.com] on the desktop at 2560x1440. I of course mean retina in the Apple marketing sense as "at the normal viewing distance the human eye cannot resolve an individual pixel".
So
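That "cannot resolve an individual pixel" criterion turns into numbers easily (Python sketch using the usual ~1 arcminute acuity rule of thumb; the example distances are mine, not the parent's):

    import math

    ARCMIN = math.radians(1 / 60)   # ~1 arcminute, a common limit for normal visual acuity

    def retina_ppi(viewing_distance_in):
        # Pixel density above which one pixel subtends less than one arcminute.
        return 1 / (viewing_distance_in * math.tan(ARCMIN))

    print(round(retina_ppi(20)))   # ~172 ppi needed at 20 inches
    print(round(retina_ppi(32)))   # ~107 ppi: a 27" 2560x1440 panel (~109 ppi)
                                   # already qualifies from about 32 inches away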
Re:Meh (Score:5, Insightful)
Re: (Score:2)
For people who do technical work with a computer, the ability to have several high definition windows open at once is a tremendous benefit. Integrated circuit design, programming, CAD graphics, etc.
Did you forget "reading slashdot clearly at *any* zoom level?" ;)
Re: (Score:2)
It would help if those apps had better multi-monitor support. A lot of CAD apps don't seem to have updated their interfaces since the move from DOS to Windows 3.11. Having said that, it took until Windows 8 to get proper multi-monitor taskbars.
Re: (Score:2)
I think it comes down to screen size and viewing distance. Personally I think 4K could provide a better movie experience than 3D. Sitting close to an enormous screen is pretty damn immersive. Then again, I don't particularly want my living room dominated by the TV.
I do like a large screen to work with, although I cannot use a 1080 screen on a 15" laptop (everything's too small for my eyes). Increasing the resolution and DPI better not make things smaller!
Re: (Score:2)
No. Just... no.
If I could have my perfect setup, I'd have a 32" 4096x2560 main monitor, with two 27" 2560x1600 monitors to each side. And running each at 144Hz, with full AdobeRGB coverage (or better), while we're at it.
I just bought a 1440p display, and it is hands-down the best single computer component I have ever bought. Better than getting an SSD. Better than any new processor, or new video card, or new sound card, or new RAM.
True, I'm probably never going to watch video at that resolution. And it's li
Re: (Score:2)
That's really the important thing. We've been stuck in a rut with display sizes for a long, long time. It's time to move pixel density forward. The 27" displays that have been on the scene for the past 2 years or so are great, but so far the price hasn't dropped a great deal (disregarding the generic Korean Dell/Apple rejects).
Once 4K TV production ramps up that should lead to more higher density monitors at reasonable prices. Sadly I have to admit that it really seems like Apple was the company that pu
Re: (Score:2)
Oh yeah, I love using HiDPI mode on my 27" iMac to turn a 2560x1440 display into a virtual 1280x720 screen with twice the detail. This lets me sit way back, 3-5 feet or more, and have a nice readable picture. It helps avoid eye strain, and it's really nice how crisp everything is. Of course 1280x720 limits usable screen space and I occasionally have to switch it back, but I really do prefer to use it whenever possible.
It's sort of the opposite of what a true retina iMac would do for me though.
Re:4k for games? (Score:5, Insightful)
A 4K 50" display 4' or 5' away would give you a pretty damn immersive experience. Wouldn't that be nice?
I'm sitting with my eyes about 3' from my 27" 2560x1440 display with about 108ppi. I can make out some pixels as it is in the text. I'm not wearing my glasses, so that helps some. If this was a 4K 27" display, that would be 163ppi. That's a 50% increase right there.
Wasn't that long ago that running 1280x1024 on a 17" LCD was pretty damn nice, and that was 94ppi. So for a decade we've barely improved when it comes to density. Hell, a 24" 16:10 display that so many people love so much has the same density as a 17" LCD.
Of course my very first PC games ran in CGA, and I thought VGA was a huge step up. But at no time have I ever thought to myself "Nope, more wouldn't be better". Not when it comes to graphics, RAM, harddrive size, etc. Give me more and I'll use it.
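The densities quoted above fall straight out of the diagonal; a quick Python check (just the Pythagoras behind the numbers in this comment):

    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1440, 27)))   # ~109 ppi
    print(round(ppi(3840, 2160, 27)))   # ~163 ppi
    print(round(ppi(1280, 1024, 17)))   # ~96 ppi (same ballpark as the 94 quoted above)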
Re: (Score:2)
So for a decade we've barely improved when it comes to density.
Yes, it was a decade long race to the bottom for price and performance. People seemed to just want large and cheap monitors that fit their widescreen DVDs.
NEC and Eizo kept on producing quality displays but, much as I hate to admit it, I think it wasn't until Apple came out with high resolution displays that people started to care again. They only got half way there though. The failure of 3D to really generate a lot of sales is what started the push to 4k.
Re: (Score:2)
Of course my very first PC games ran in CGA, and I thought VGA was a huge step up. But at no time have I ever thought to myself "Nope, more wouldn't be better". Not when it comes to graphics, RAM, harddrive size, etc. Give me more and I'll use it.
As a game programmer I have to say, pushing pixels is damn expensive. I make my 3D data visualization software and games resolution-independent (because, why not?) but some AAA game devs don't (not in the budget). That means when you get a higher resolution monitor and throw more pixels on the screen, your UI text can shrink. Oh, just scale them up? Yep, then what's the point of more pixels if you're just going to upscale the textures?
I've done some experimenting with fractal & procedural generation t
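One common way to keep UI text from shrinking on denser screens, sketched in Python; the reference height and the sizes are hypothetical, not taken from any particular engine:

    REFERENCE_HEIGHT = 1080.0   # resolution the UI was authored for (assumption)

    def scaled_font_px(design_size_px, screen_height_px):
        # Scale an authored size by the ratio of actual to reference height, so
        # text keeps the same on-screen proportion at any resolution.
        return round(design_size_px * screen_height_px / REFERENCE_HEIGHT)

    print(scaled_font_px(24, 1080))   # 24 px at 1080p, as authored
    print(scaled_font_px(24, 2160))   # 48 px at 2160p: same apparent size, more detail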
Re: (Score:2)
Does it matter that much if you play on a 4k or 2k screen? The game's graphics aren't distinguishing between single pixels anyway, and the textures are not optimised for 4k. If you played 2k side by side with 4k (setting aside the GPU power for now), would you notice the difference? 4k makes a significant difference for photography and video!
Support varies by engine; but one (reasonably) common thing that people do with games that weren't originally designed with high-resolution widescreens in mind is mess with the field of view. Some games react badly, with all sorts of distortion effects; but it can also create a nice 'peripheral vision' sense that the game originally lacked.
This would also be engine-dependent, in terms of how much it can be tweaked; but it isn't uncommon for (even comparatively low resolution) games to have decent texture as
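The usual way that field-of-view fiddling works for wider screens is Hor+ scaling: hold the vertical FOV fixed and widen the horizontal FOV to match the aspect ratio. A small Python sketch (the 90-degree 4:3 baseline is an assumption for the example):

    import math

    def horizontal_fov(vertical_fov_deg, aspect_ratio):
        v = math.radians(vertical_fov_deg)
        return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

    # A game authored with 90 degrees horizontal at 4:3 has a ~74 degree vertical FOV.
    vfov = math.degrees(2 * math.atan(math.tan(math.radians(90) / 2) * 3 / 4))
    print(round(vfov))                            # ~74
    print(round(horizontal_fov(vfov, 16 / 9)))    # ~106 degrees on a 16:9 screen
    print(round(horizontal_fov(vfov, 21 / 9)))    # ~121 degrees on a 21:9 ultrawide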
Re: (Score:2)
Does it matter that much if you play on a 4k or 2k screen? The game's graphics aren't distinguishing between single pixels anyway, and the textures are not optimised for 4k. If you played 2k side by side with 4k (setting aside the GPU power for now), would you notice the difference? 4k makes a significant difference for photography and video!
What does that matter when the game devs program for the consoles and port it to the PC? Sure, maybe on the PC you can go a bit higher resolution, but crappy textures still look like crappy textures.
Re: (Score:2)
Unless the game supports community content, like, say, Skyrim.
Then you drop in a high-poly, high-res pack and play at 4k native all day long.
Re: (Score:2)
Skyrim is a pretty rare example. Most games are not modded to the moon the way Skyrim has been.
Most games will look the same on 2k or 4k.
Re: (Score:2)
Uh wait, what? No. Bitmapped data (like an MPEG stream) will not show you more detail if simply scaled up. The detail has to be there in the first place. Some TVs and players can 'add' sharper edges and gradients, as well as add intermediate frames (temporal/motion smoothing) with filters running at the higher resolution, but this is fake data added after the fact. Some people like this and some don't.
However, rendering native 3D graphics at higher resolutions reduces edge aliasing and tightens
Re: (Score:2)
On a side note, I wonder how much work would be needed to get current cards rendering 4k Surround/Eyefinity.
Buy the monitors and cables, and hook them up? My 6970 has 2 DisplayPort outputs, each of which can support up to 4 monitors with the correct cable/splitter. 4K would only take two monitors on each, and the 2-way splitters are fairly easy to get your hands on. I don't even need the splitters, as I also have 2 DVI outputs on the card, so I can drive two monitors with DP, and two with DVI (and I have never seen a monitor that supports DP and doesn't support DVI).
The card can easily handle that resolution, as
Re:Resolution? WHY? (Score:4, Insightful)
Most people read information on computer displays: web pages, emails, Facebook updates, Twitter feeds, Wikipedia, and reference materials; and they work in word processors, spreadsheets, and programming environments. All of these tasks are regularly constrained by vertical resolution.
For people watching cat videos and playing simple games (which comprises almost everybody else not doing the above), neither >1080p resolution nor fidelity matters.
For people doing high-end gaming and watching high-end media, your situation applies. However, it's a pretty tiny sliver of overall computer monitor time, all things considered.
Re: (Score:2)
So, obviously you have not ventured out of your cave for the last 10 years...
Re: (Score:2)
I would take a 1080P display with a 10% improvement in contrast ratio over a 4k TV any day.
I wouldn't. I have a 1680x1050 panel at home. Why would I move to a 1920x1080 panel? That's 30 more pixels vertically. Since most of the time I'm coding, I have 2-3 code windows tiled horizontally. 1680 pixels is more than enough for slightly more than 80 cols per terminal and a usable font size.
A 1080P monitor gives me nothing except the ability to watch high-def video without scaling. Since I don't do that on my PC anyway I don't see the point. Now, if you could give me a proper 1920x1200 monitor that would be a few more vertical lines and that is better.
Rotate your display 90 degrees. There. Problem solved. Send me money.