NVIDIA's New Flagship GeForce GTX 580 Tested 149
MojoKid writes "Even before NVIDIA's GF100 GPU-based GeForce GTX 480 officially arrived, there were myriad reports claiming the cards would be hot, loud, and power-hungry. Of course, NVIDIA knew that well before the first card ever hit store shelves, so the company got to work on a revision of the GPU and the card itself to address these concerns. Today the company launched the GeForce GTX 580 and, as its name suggests, it's a next-gen product, but the GF110 GPU powering the card is largely unchanged from the GF100 in terms of features. However, refinements have been made to the design and manufacturing of the chip, along with its cooling solution and PCB. In short, the GeForce GTX 580 turned out to be the fastest single-GPU card currently on the market. It puts up in-game benchmark scores 30% to 50% higher than those of AMD's current flagship single-GPU, the Radeon HD 5870. Take synthetic tests like Unigine into account and the GTX 580 can be up to twice as fast."
Good write ups, good card (Score:5, Informative)
http://www.pcper.com/article.php?aid=1034 [pcper.com]
http://www.hardocp.com/article/2010/11/09/nvidia_geforce_gtx_580_video_card_review [hardocp.com]
http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580 [anandtech.com]
http://www.legitreviews.com/article/1461/1/ [legitreviews.com]
http://www.techreport.com/articles.x/19934 [techreport.com]
http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/1 [bit-tech.net]
Re: (Score:2)
Always been a big fan of [H]ard|OCP...they definitely have some of the best forums in the enthusiast scene.
Re: (Score:2)
Re: (Score:3, Interesting)
The problem is, how much does it cost? Radeon 5770s can be had for $120 at Newegg after rebate, so why the hell would I need to waste $500 on this card? I could hook up a pair of 5770's for much less and get similar performance.
And what the hell games on the PC is it actually supposed to be required to play?
The last-gen AMD cards do just fine, from back when they were beating NVidia's cards. And I'm willing to bet that the "next gen" AMD card will see a similar performance increase when it hits by next
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
The 5770 will also cost you significantly less in electricity and cooling during the warm months =)
Yeah, I'm sure it's really competing against that 95% efficient multi-stage gas furnace you should have in your house, but don't forget the energy cost of the electric baseboard heat still running in your house, not to mention the electric range, and so on.
Re: (Score:3, Informative)
P.S. My furnace is 93%/16 SEER and my house is only 1200sq ft so in percentage terms it can be a large cost.
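If you want to put actual numbers on the electricity argument, here's a quick back-of-the-envelope sketch; the wattage gap, gaming hours, and electricity rate below are assumptions to plug your own numbers into, not measured figures:
```python
# Rough yearly cost of the extra power a hungrier card draws while gaming.
# All inputs below are illustrative assumptions -- substitute your own.
def yearly_cost(extra_watts, hours_per_week, dollars_per_kwh):
    kwh_per_year = extra_watts / 1000.0 * hours_per_week * 52
    return kwh_per_year * dollars_per_kwh

# e.g. ~140 W more under load, 20 hours of gaming a week, $0.11/kWh
print(round(yearly_cost(140, 20, 0.11), 2))  # ~16 dollars a year
```
So it's real money, but nothing like a furnace bill.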
Re: (Score:2, Insightful)
Re: (Score:2)
"just fine" differs from person to person. No, GTX 580s aren't required to play PC games and most of the time the lower cost GTX 460/HD 6850s are fine. But sometimes more power is just better.
Re: (Score:2)
Precisely the point I was making.
Even Crysis - which at one point was the "go-to" benchmark game - performs very well on a single 5770.
There are "games" which are basically tech demos meant to stress cards, and that's the category into which Crysis falls. For everyone else, the existing games are either a console port (in which case they are tuned for 5 year old hardware anyways) or are tuned down enough to run on 5 year old hardware (sometimes even those crapass Intel-onboard video solutions that come from
Re: (Score:3, Interesting)
By now, we should have had a plethora of different applications running on such a card: audio encoding, compression, encryption, gaming AI. I know about CUDA, but why aren't we seeing such applications?
Flash, web based video, Power DVD, and various others at the consumer end of the spectrum (where accuracy is not important). When I first bought an ION based netbook (about 12 months ago), half the websites on the net could not play video on it without dropping a hideous number of frames. Since I've owned it, there has been a gradual stream of updates to various libs/SDK/apps (flash video was the most obvious!) that have made my netbook usable (by utilising the ION GPU).
Are they held back because of lacking OS support? Lacking driver support? Lacking deployment infrastructure? Lacking developer initiative? Is the GPU architecture (disparate memory) unsuitable? Or is CUDA just woefully inadequate to express parallel problems, seeing as it's based on (one of) the most primitive of imperative languages?
Disappointed minds want to know...
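To make it concrete, this is the shape of problem GPUs eat for breakfast: the same independent floating-point operation applied to millions of values. The snippet below is plain NumPy on the CPU, purely as an illustration (the array size and the arithmetic are made up); a CUDA or OpenCL kernel doing the equivalent is the sort of thing those consumer apps would need to ship.
```python
import numpy as np

# Illustrative only: one independent floating-point operation per element,
# over millions of elements. This "embarrassingly parallel" shape is what
# GPGPU frameworks accelerate well; most desktop software isn't built this way.
samples = np.random.rand(4_000_000).astype(np.float32)
processed = np.sqrt(samples) * 0.5 + 0.25  # element-wise, no branches, no dependencies
print(processed[:5])
```
Most consumer workloads are branchy and latency-sensitive rather than this kind of bulk math, which may be part of the answer to the questions above.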
It's much simpler than that - it's al
Re: (Score:3, Interesting)
Re: (Score:2)
How about rendering a video of your assembly in action? Or the "photorealistic" options in Pro/E or SolidWorks?
Re: (Score:2)
Well, for me having the application utilize the GPU is a big minus, since the cooling fan will spin up and make noi
SLI/Crossfire isn't always valid (Score:3, Insightful)
For one, there are a lot of motherboards that don't support it, even new, reasonably high-end boards. I have an Intel P35 board with a Core 2 Quad at home, but it has only one x16 slot. At work, a Dell Precision T1500 with an i7: again, only one x16 slot. Crossfire/SLI cannot be done in these cases. You have to buy a single, heavier-hitting card if you want performance.
Also you need to do a bit more research if you think multi-card solutions work well all the time. They can, but they also can have some serious
Re: (Score:3, Interesting)
If you have a 30" monitor and want to drive it at its native, beyond HD rez (2560x1600) you need some heavy hitting hardware to take care of that, particularly if you'd like the game to run nice and smooth, more around 60fps than around 30. You then need still more if you'd like to crank up anti-aliasing and so on.
Isn't the point of AA to make things look better at lower resolutions? Running at resolutions beyond the HD rez, even on large screens, eliminates any sort of need for FSAA. At that point, you just don't get jaggies that need to be smoothed.
Re: (Score:3, Informative)
Running at resolutions beyond the HD rez, even on large screens, eliminates any sort of need for FSAA. At that point, you just don't get jaggies that need to be smoothed.
You still get pixel-shimmer though, which FSAA greatly reduces.
Re: (Score:2)
As someone who has gamed at 1920x1200 for many years, I can say that FSAA is not really needed at that end of the market.
Re: (Score:2)
Running at resolutions beyond the HD rez, even on large screens, eliminates any sort of need for FSAA. At that point, you just don't get jaggies that need to be smoothed.
You most definitely have jaggies without AA, even at high resolutions. It's especially noticeable on "thin" objects like grass/foliage, power lines, etc. The more fine detail in the scene, the more jaggies.
Not until the pixels are invisible (Score:2)
On a 30" monitor you have ~100 pixels per inch. At normal viewing distance that means they are a bit smaller than an most monitors, but still plenty visible. They are not as small as on many laptops, and not down to the level you need for them to completely vanish. That is probably in the realm of 300PPI or so. You might be able to get away with 200PPI, but then just leaning in might be enough that they aren't truly blended anymore.
So until the display is that high rez, AA is useful. We've got a long way to
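For anyone who wants to check the ~100 PPI figure (assuming the usual 2560x1600 panel at a 30-inch diagonal):
```python
import math

# Pixel density = diagonal resolution in pixels / diagonal size in inches.
diagonal_pixels = math.hypot(2560, 1600)
ppi = diagonal_pixels / 30
print(round(ppi, 1))  # ~100.6 PPI
```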
Re: (Score:2)
You missed tessellation, which both OpenGL and DX11 offer... Go run the newest Unigine benchmark and watch your card cry....
Re: (Score:2)
Unless you have six monitors, you were probably duped. A single 5870 is a goddamned powerhouse.
Re: (Score:2)
Re: (Score:2)
I can buy two 5770's, AND have $50 in my pocket, for the price of one 5870...
Re: (Score:2)
Re: (Score:2)
So you can't plan ahead when you buy your mobo and PSU to get something that supports SLI ??
I bought a 5770 this year, and will pick up another 5770 either in Dec, or next year.
It isn't rocket science to predict what parts you are going to upgrade over the 2-5 year life cycle of your rig man.
Re: (Score:2)
You are correct, but a lot of the cheap PSUs are the ones with the beefed-up 5V rail to make the PSU look bigger.
If the 5770 needs 40A@12V, that's 480W at full load on its own... in which case either the reviewer with the power meter messed up, or you are full of shit... Are you really saying that a pair of 5770's at full load (FurMark) will suck down 960 watts? SPCR's (www.silentpcreview.com) review of the 5850 shows it using 132 watts (after correcting for PSU power loss when measuring at the wall), so
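Running the numbers from that post (the 40A figure and SPCR's 132W measurement are the ones quoted above; nothing here is newly measured):
```python
# Power implied by the claimed 12V rail requirement vs. SPCR's measured draw.
claimed_amps_at_12v = 40
per_card_watts = claimed_amps_at_12v * 12
print(per_card_watts)       # 480 W per card, if that rail spec were real draw
print(per_card_watts * 2)   # 960 W for a pair -- the implausible figure above
print(132 * 2)              # ~264 W for two cards, going by SPCR's 5850 number
```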
Re: (Score:2)
Alternatively, you could get two Radeon 6870s for slightly less - that's got some really nice Crossfire scaling results in reviews. Sadly this review doesn't include them, but they seem to pull ahead a fair amount. Also, ATI's new top end GPU is due out in a couple of weeks (which is probably why no-one's offering any kind of 6870x2 card).
Competition is good. (Score:5, Insightful)
I am very glad to see the performance crown handed back and forth.
Now if only this was happening in the CPU market...
Re: (Score:2)
It is, the period of oscillation is simply a lot longer.
Re: (Score:2)
Maybe in the ARM vs Atom championships?
When it comes to Atom, AMD is making a bid with their upcoming Ontario lineup [anandtech.com] for netbook/nettop dominance.
Re: (Score:2)
ARM CPUs cannot run Windows 7 (full, not embedded) and cannot run Silverlight, so no media center box for Netflix...
Re: (Score:3, Informative)
I'm gonna feed this troll.
What about Radeon 9700, 9800, x800, 4800, 5800 before Fermi, and 6850 before GF110?
Also, ATI cards play games and do it well. I don't know what driver issues you're talking about.
Re: (Score:2)
Re: (Score:2)
You are doing something wrong. nVidia cards don't just do that.
Re: (Score:2)
try the 260.xx drivers that are out?
258.xx are old now.
Purely out of curiosity... (Score:3, Interesting)
Re: (Score:3, Funny)
August 29th, 1997. At that point we lost all communication with Skynet.
Re:Purely out of curiosity... (Score:4, Funny)
August 29th, 1997. At that point we lost all communication with Skynet.
And Michael Jackson turned 39.
Coincidence? You decide.
Re: (Score:2)
I don't know, but I do know that we can be sure that transistors are not people. Hard drives though are another story. CAVIAR GREEN IS PEOPLE!!!
Re: (Score:3, Informative)
49.4 GigaTexels/sec Fillrate... (Score:2)
Re: (Score:2)
I dunno, why does it remind you of Bill Gates? It's not like he ever said that or anything...
Next gen? (Score:2, Interesting)
That looks like a 480 with the 4 replaced by a 5. Hardly a revolution.
Just watercool the 480, it's how it's supposed to be used.
Re: (Score:2, Informative)
From TFA:
Unigine Heaven Benchmark v2.0: 18% better
580: 879
480: 742
Quake Wars: 14% better
580: 176 FPS
480: 154 FPS
Far Cry 2: 14% better
580: 109 FPS
480: 95 FPS
Aliens vs. Predator: 16% better
580: 43 FPS
480: 37 FPS
Power consumption: 96% of that of the 480
580: 377 W
480: 392 W
Woot, a 15% increase in performance for the same consumption! Clearly the 580 is "as its name suggests, it's a next-gen product".
If you mean same-gen as the 480, right. If you mean next-gen compared
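For what it's worth, the percentages quoted above check out against the raw numbers (recomputed below to one decimal place):
```python
# Recompute the GTX 580 vs GTX 480 deltas from the figures quoted above.
results = {
    "Unigine Heaven 2.0":  (879, 742),
    "Quake Wars":          (176, 154),
    "Far Cry 2":           (109, 95),
    "Aliens vs. Predator": (43, 37),
}
for name, (gtx580, gtx480) in results.items():
    print(f"{name}: {100 * (gtx580 / gtx480 - 1):.1f}% faster")
print(f"Power: {100 * 377 / 392:.1f}% of the 480's draw")
```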
Re: (Score:2)
Mods.... wtf? This is a valid point - this card is _not_ next generation, all it does is _exactly_ the same as the 480, only a little bit faster. It is in essence an overclocked 480 with better cooling and power characteristics. It's basically what the 480 should have been.
Parent is not a troll, and neither is GP - replying to yourself always works wonders, though.
Re: (Score:2)
Re: (Score:2)
I'll go 100% AMD when they get the newest GPU working on day one, with drivers that play nice with WINE on Linux, and never look back. Until then I'm "stuck" with nvidia. Which is a shame, because for the same size and power draw the 5770 is better than the GTS 450. When doing SFF computers, physical dimensions are an issue.
Synthetic Benchmarks - (Score:3, Interesting)
Does anyone assume that the synthetic benchmarks achieved by either AMD or NVIDIA are representative of anything more than these companies' efforts to tweak their driver sets against the pre-existing criteria for getting a "good score"?
Both companies, I believe, have been accused over the years of doing just that, while pointing the finger at the other as taking part in the shenanigans.
So go read some non-synthetic ones (Score:3, Informative)
HardOCP is famous for their real gameplay ratings. They go and actually play through the game while testing performance. They then find the highest settings that the reviewer finds playable. Now while there is some subjectivity to it they do back it up with FPS numbers, and it is the same reviewer trying everything out. So it gives real, in game, actually playing, results. I find it maps nicely to what actually happens when I get a card and play games.
http://hardocp.com/article/2010/11/09/nvidia_geforce_gtx [hardocp.com]
Re: (Score:2)
The quote you provided _gives_ that grain of salt, by explicitly saying that those tests are synthetic, rather than real-world. I don't see the problem here...
Re: (Score:2)
>>Does anyone assume that the synthetic benchmarks achieved by either AMD or NVIDIA are representative of anything more than these companies' efforts to tweak their driver sets against the pre-existing criteria for getting a "good score"?
In short, no.
However, we have sites like HardOCP and AnandTech that run the cards through a variety of games and give the results. You can look at the results and decide if your current card is better or worse than the new card.
If you are trying to decide between a b
Re: (Score:2)
I thought this was why software like 3DMark Vantage has simulations which are basically equivalent to the rendering performed in actual games. There are two full runs of very detailed 3D scenes which must actually be done by the card; there's no way to sneak around actually rendering them. The user is presented with the rendering on screen in real time while running the benchmark.
Also, the large gauntlet of actual game benchmarks helps give weight to any synthetics actually meaning something. I haven't seen a vid
Re: (Score:2)
Remember the FX5900 and 3DMark03?
That's why we don't trust synthetics.
anti-overclocking technology (Score:2)
I hope someone can figure out how to bypass the anti-overclocking tech. Otherwise, AMD is going to have an easier ride this round. Why are all manufacturers so damn evil? What's wrong with a little overclocking to boost speeds? When I'm spending this much money on a video card the least they could do is allow me to boost my speeds a little. They've also made water cooling the card pointless with their new current limiter. It's so easy to hate Nvidia. I'll buy from whichever company has the fastest card (wit
Re: (Score:2)
With a few huge exceptions, like the old 9500 to 9800 mod, overclocking and modding graphics cards has been relatively unrewarding, to the point that I doubt it's much of a market concern.
I bet they neutered overclocking in the 580 so that we wouldn't have card reviewers telling us about melted video cards.
Re: (Score:2)
Could you even cool an OC'ed GTX 580, considering the stock card is a 250W monster....
Re: (Score:2)
There is a difference between motherboards and GPUs/CPUs. The motherboards use overclocking as a feature that you pay extra for. GPUs and CPUs sell based on their clock, so the idea of an end user overclocking their chip is frightening to the manufacturer.
You must be new. Back in the olden days of the PC, the "Turbo" feature was a switch that, when turned off, made your computer slower. Today the popularity of overclocking has become a selling point. Amusingly, I used to overclock my GPUs; now they run hot enough that I can tell there's no headroom. The manufacturers are already pushing them to the point where they're thermally throttling, whereas the last time that happened with CPUs on a regular basis it was the P4 and the whole world declared that it wa
Yeah but... (Score:2)
Is it powerful enough to run Civilization V?
Re: (Score:2)
Is it powerful enough to run Civilization V?
Only if you run a Beowulf cluster of them.
Re: (Score:2)
Re: (Score:2)
only Civ 4
Terrible Summary (Score:3, Interesting)
The /. summary ends with:
It puts up in-game benchmark scores 30% to 50% higher than those of AMD's current flagship single-GPU, the Radeon HD 5870.
But if you read the original article, the one flaw in the (otherwise good) nVidia card is that it still loses to the 5970, which is -- according to the article -- 'about a year old'. So why is that other article mentioned in the summary talking about the 5870 as if it's the flagship? Clearly the 5970 is. Or am I missing something?
Re: (Score:3, Informative)
It puts up in-game benchmark scores 30% to 50% higher than those of AMD's current flagship single-GPU, the Radeon HD 5870.
The 5970 is a dual GPU solution. TBH, it's no surprise that it's faster than a single GPU solution that is a year newer. I would expect the last gen card in a dual GPU setup (this, or SLI/Crossfire) to outperform the latest next gen card, especially when the new card is really just an iteration of the architecture used in the last gen. Nothing really surprising about it at all. And I bet you if you get two of the GTX 580's in SLI, they'll stomp the 5970. That's a bit more of an apples to apples compari
Re: (Score:2)
I would expect the last gen card in a dual GPU setup (this, or SLI/Crossfire) to outperform the latest next gen card, especially when the new card is really just an iteration of the architecture used in the last gen.
Why? Bear in mind that it's not like there are any new features in the "next gen" 580 over the previous generation - the only improvement is performance. Having about the same performance as a card that's been on the market for a year, at a similar power consumption and price tag, isn't exactly great progress.
And I bet you if you get two of the GTX 580's in SLI, they'll stomp the 5970.
I'd hope so, given that each of those two cards individually costs the same price as the 5970 and uses nearly as much power. For a dual-580 setup, we're talking $1,000 just for the cards alone, not includin
And something people seem to forget (Score:2)
Is that multi-GPU solutions are NOT the same as single-GPU solutions, just faster. In some cases, multi-GPU works great: you get nearly a doubling in speed. In other cases, it works OK: you get more speed than a single card, though not double. In still other cases, it doesn't work at all and only one GPU is used. In yet other cases, shit goes really wrong and games won't work right unless you shut down a GPU. It depends on the game in question, the drivers you are using, and which company's GPUs you have. nVidia tends
Re: (Score:2)
Thanks all, I get it now. I didn't see mention of the distinction between the 5970 & the 5870 (which is fair, it's a 580 review, not an ATI review). Though I was skimming, so it's possible they point that out early & I just missed it.
Though why is the dual-GPU card using less power than the single-GPU nVidia card? Is there some subtle interplay between dual GPU processors I'm not aware of that makes them use less power, or is the nVidia 580 just a hog?
Re: (Score:2)
Unless you go by price.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102887&cm_re=radeon_5970-_-14-102-887-_-Product [newegg.com]
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125349&cm_re=gtx_580-_-14-125-349-_-Product [newegg.com]
Fast open source drivers coming.. (Score:3, Interesting)
Re: (Score:2)
So how good is the AMD open source driver? How much luck have you had running 3d games under Wine with it?
Re: (Score:2)
So how good is the AMD open source driver? How much luck have you had running 3d games under Wine with it?
Both the AMD r600c and the new r600g free software drivers are slow, and the Phoronix benchmark story is that their evil binary blob is faster than both of those. Still, there is a very big difference between AMD and Nvidia; AMD worker-drones regularly work on the driver and the OpenGL support through MESA, and they are making documentation available as fast as they can write it. As for Wine: I do not have the license for any 3d games or other Windows software for that matter, so I haven't tried running anything
So then what you are really saying (Score:2)
Is that you think nVidia's driver team should have to write good open source drivers all by themselves, since clearly the OSS community isn't nearly as good at graphics drivers as they pretended. I mean, I remember the rhetoric: just release the documentation and we've got legions of people who will crank out a driver that is better than any of the closed ones in a hurry. Ya, well, we see how that went. Here it is over a year later and you say it still can't stand up to ATi's closed driver, which is not nearly as good
Re: (Score:2)
1) The OSS heads didn't appreciate how damn complex a graphics driver is. You have people who'd written SCSI drivers or something and said "Well that isn't that hard." They forget that a binary SCSI driver is around 20k or something, you are dealing with a simple device. The main ATi Windows driver is 7.6MB and that is just the central driver, never mind all the support files it needs to work right. It is a major job, and the hardware changes fast.
The biggest and foremost reason Linux support has been slow is that the graphics stack was very poor. Before AMD's announcement the only open source player was Intel, who honestly nobody used for more than getting a picture. Since neither nVidia nor AMD gave out any detail on their hardware and instead rolled their own closed source drivers, the open Mesa stack basically still worked on a model from the 90s. If it had only been a matter of dropping AMD support into a modern 3D stack, we'd be much further than we are. Prett
Re: (Score:2)
I think what scares them most is that an open source driver would not intentionally cripple OpenGL rendering. The three to four times markup on Quadro cards is nasty business if you ask me. I'm fine with a card being more expensive because it offers you testing and support on professional apps, but comparing a Quadro 3700 with a 8800GT does not shine a very good light on nVidia.
Mind you, I suspect AMD is equally bad, I just never looked at that market in detail.
Re: (Score:2)
Re: (Score:2)
NVidia GTX N+110 Scrap your room heater edition! (Score:2)
The fastest single GPU consumer card available. TDP 244W! You can safely scrap your room heater now!
Summary inaccurate (Score:2)
Looks like HardOCP hasn't been testing AMD's most recent flagship product, the Radeon HD 6xxx series, of which a single card alone eats a GTX 480 for breakfast.
Re: (Score:2)
That's because the 5870 is more powerful than the 6xxx cards they've released so far.
Re: (Score:2)
Because of AMD changing the numbering scheme, again.
244W TDP? Urgh! (Score:2)
I am not in the market for a room heater! The HD68x0 ATIs take pretty much half that, at half the performance and a quarter of the price.
Again a showy card from Nvidia that basically only supports the ego of their lying boss and is otherwise a waste of money.
Re: (Score:2)
Released? (Score:2)
So its been released now? Why then can I not purchase one?
Stupid distortion of the English Language :/
Re: (Score:2)
Re: (Score:2)
The 8800GTX is not "worth" $200 of gaming performance. It's worth maybe $65 in a few select cases where power consumption isn't a concern.
It costs that much because they're a niche item.
Re: (Score:2)
Re: (Score:3, Interesting)
If you want to see the board, back off a few price/perf
Re:CPU, GPU... (Score:5, Informative)
In a "designing your next gaming build" sense, they largely already have. Unless you are a money-is-no-object-e-penis-must-get-longer type gamer, you can generally get better bang for your buck by going with a cheaper CPU and spending the savings on a nicer graphics card. It depends on the game, and there are situations where a truly epic(2x or 3x of the top of the line GPU ganged together with SLI or crossfire) graphics system will be CPU bound without the best CPU available; but Joe Gamer is, most of the time, better off with a third tier CPU and a second tier GPU, or a 2nd tier CPU and a 1st tier GPU.
In smaller systems (where board footprint really counts) or in cheap systems (where package costs and board size really count), the integration of CPU and GPU into a single package proceeds apace, with AMD rolling low-end ATI tech into certain of their newer parts, and Intel trying to make their GMA stuff suck less. The only real wild card is Nvidia: unlike Intel or AMD, they have no x86 cores to speak of; on the other hand, their GPU-computing initiatives are arguably the most advanced in terms of tool and driver maturity. The question is, will they eventually produce an Nvidia equivalent to AMD and Intel's CPU/GPU combo packages (perhaps by buying VIA, who has adequate-but-deeply-unexciting x86 assets but utter shit GPUs), or will they persist purely as a maker of high end gaming GPUs and GPU-based compute cards?
Unless the hereditary line of the "PC" as we know it is wholly extinguished, there will always be an x86 CPU floating around somewhere in the block diagram (and, in other types of systems, likely an ARM CPU); but it is already the case that, for many applications, the CPU has gotten fast enough to hit diminishing returns, and the GPU (or just the embedded h.264 decoder) is where the action is.
Re: (Score:2)
Re: (Score:3, Informative)
To the best of my knowledge, though, neither Nvidia, with their ARM SoCs, nor Intel with their on-package GMAs, nor AMD with their upcoming on-die ATI tech are creating what you might call a full "hybrid" (i.e. a CPU whose instruction set also includes GPU-esque instructions, like MMX or SSE on steroids). At present, they are all just more heav
Re: (Score:2)
They are falling down on the execution; the additional cost of Ion 2 over GMA seems to push netbooks and nettops into the low-end desktop PC price range.
Re:CPU, GPU... (Score:4, Interesting)
With the prior generation of atoms, the usual pairing was Atom + fairly antiquated Intel chipset with GMA950 and a fairly high TDP. For just a little extra, you could pair the Atom with Nvidia's chipset instead, which had as good or better TDP and much better integrated graphics. Intel wasn't happy; but the end result was good.
With the newer generation, Intel brought most of the chipset functions onboard, and played hardball with licensing, so that "Ion 2" ended up consisting of, in essence, Nvidia's lowest-end discrete GPU added on to the system via the few PCIe lanes available. Unlike Ion, which was a genuine improvement in basically all respects other than OSS linux support, Ion2 meant higher TDP, more board space, and higher BOM.
Intel bears much of the blame for it; but Ion 2 is largely a dog, particularly when compared to the "CULV" options, which will get you a real (albeit low-end) Core 2 or i3 processor and a similar low-end GPU for not much more than the Atom...
About 6 days from never (Score:2)
What may happen, what AMD would like to see happen, is for GPU functions to become a part of the CPU, that GPUs go away because CPUs can do it. However that'll be because CPUs have GPU like logic in addition to their own. GPUs are great and blazingly fast... But only at some things. I've written up the list before and don't feel like doing it again but more or less you find that some things run great on a GPU, others run for shit. More or less it has to be floating point, massively parallel, and have very l
Re: (Score:3, Insightful)
What may happen, what AMD would like to see happen, is for GPU functions to become a part of the CPU, that GPUs go away because CPUs can do it. However that'll be because CPUs have GPU like logic in addition to their own.
The problem, as Intel found out with Larrabee, is that a cache that works well for CPU tasks does not work well for GPU tasks, and vice versa. For a GPU, bandwidth is everything, while for a CPU it's the latency that matters most.
Our CPUs' L1 caches are 32K/64K in size because smaller caches have significantly lower latencies than larger ones. It's quite obvious that a 64K cache is way too small for a GPU, which could literally process 64K of data in only a few of its clock cycles.
Intel never could
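A rough back-of-the-envelope on that "a few clock cycles" claim; the core count and operand width below are illustrative assumptions (roughly GTX 580-class), not spec-sheet figures:
```python
# How many clocks a wide GPU needs to stream through a 64 KB cache,
# assuming 512 shader cores each consuming one 4-byte float per clock.
cores = 512
bytes_per_core_per_clock = 4
cache_bytes = 64 * 1024

cycles = cache_bytes / (cores * bytes_per_core_per_clock)
print(cycles)  # 32.0 -- a few dozen cycles and the whole cache is consumed
```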
Re: (Score:2)
Re: (Score:2)
Just remember to factor in your power usage prices. If you game a lot, it might be more economical to buy more of a card upfront and it'll use less power overall.
Re: (Score:2)
An interesting tidbit you might find helpful:
It looks like if you want awesome framerates and the lowest power draw even under full load, the GTX 460 in SLI is the winner. It's even the quietest in an SLI configuration. Can't remember where I found the tests, but they compared it to all the other cards for power and noise in both single and SLI configurations.
Re: (Score:2)
Why is this modded down and as a troll? I was simply stating how in awe I am at the pace and what's offered today.
Think back. Think of how far we've progressed since the "badass" 9800 GX2, which was only what, 2 years ago? Or even before that.
Re: (Score:2)
LOL. Believe it or not though, that's another thing I'm amazed about. The idle power consumption on these new cards is insanely low too. Check out HardOCP or Tom's Hardware. Pretty crazy.