Despite Game-Related Glitches, AMD Discontinues Monthly Driver Updates
MojoKid writes "Recently AMD announced that it would cease offering monthly graphics driver updates, and instead issue Catalyst versions only 'when it makes sense.' That statement would be a good deal more comforting if it didn't 'make sense' to upgrade AMD's drivers nearly every single month. From 2010 through 2011, AMD released a new Catalyst driver every month like clockwork. Starting last summer, however, AMD began having trouble with high-profile game releases that performed badly or had visual artifacts. Rage was one high-profile example, but there have been launch-day issues with a number of other titles, including Skyrim, Assassin's Creed, Batman: Arkham City, and Battlefield 3. The company responded to these problems by quickly releasing out-of-band driver updates. In addition, AMD's recent Catalyst 12.6 beta driver also fixes random BSODs on the desktop, poor Crossfire scaling in Skyrim, and random hangs in Crysis 2 in DX9. In other words, AMD is still working to resolve important problems in games that launched more than six months ago. It's hard to put a positive spin on slower driver releases given just how often those releases are necessary."
They didn't say slower (Score:5, Insightful)
They didn't say slower, they said as needed. Since they are already releasing 'out of band' updates, they are just normalizing that process. They will release when they have fixes or new functionality instead of on an arbitrary timeline. It seems to make perfect sense.
Re: (Score:2)
Not only that, but releasing 'out of band' patches on a regular basis was probably a huge drag on their responsiveness. If you have a deadline for a new driver patch EVERY SINGLE MONTH, regardless of whether the patch is ready, then you are either testing inconsequential releases and wasting your QA resources on releases that aren't important in order to meet some arbitrary deadline, or you are releasing insufficiently tested patches to meet some arbitrary quota. Neither is a recipe for efficient allocation of your
An artificial release schedule may slow things (Score:2)
Is this nvidia spin? (Score:2, Interesting)
I mean, of course frequent updates are desirable. On the other hand, every release produces overhead, effort that could otherwise go toward fixing the problems at hand. In my experience, monthly update schedules are a terrible waste of valuable time.
Personally, I'm an nvidia user, since I hate the driver issues of AMD... but this news sounds like nvidia spin to me.
Re: (Score:2)
I simplified my entire life by using an xbox 360 and a playstation 3 for gaming, and a blu-ray player for movies. If I have a rip I just burn it and stick it in the blu-ray player.
While there are occasional bugs in the console games, I've rarely experienced them.
Re:Is this nvidia spin? (Score:4, Interesting)
Hah, I was waiting for it. Waiting for the first post to bring up your argument. I've been reading slashdot for way too long.
Don't forget that console games have tons of bugs now too. And big huge flaws. The Skyrim save game issue? Bioshock always messing up widescreen? Rockstar grand theft everything. Silent Hill Downpour--the entire freaking game is full of bugs and hard locks.
Anyway, I go back and forth on this. I don't know which solution is better. I think it simply comes down to personal preference. Both sides have their flaws.
PC flaws:
- I swear to god I'm so sick of updating drivers, for anything. Graphics drivers should just be auto-updated, period. Even having a button in the ATI/Nvidia control panel isn't good enough. As it stands now, there are too many steps. Yeah, yeah, it's safer to do it the current way, where if a driver is broken they can revoke it. But it's the same issue Windows was having. Either deal with that, or deal with most users not upgrading at all.
- All games should have built-in patching mechanisms. Steam does this right now, as do EA games or GFWL. But what if a game isn't on one of those? Or what if I want to buy a game from, say, GOG or Gamersgate? They don't auto-patch. So you're stuck back in the days of yore, hunting down patches from FilePlanet or something. That's bull and I flat out refuse.
- Small-dev QA problems. I love freakin' Red Orchestra 2 and Arma 2. Amazing games. But the bugs. Oh, the bugs. Jesus, it's terrible. Don't even bother playing a game until it's been out 6 months.
Console flaws:
- No support for alternative games, i.e. MMOs or F2P. Aside from DC Universe on PS3 or Free Realms, you're out of luck. But that's a big segment of the future and part of the solution to keeping online communities big while providing a steady, not one-off, revenue stream. And consoles could OWN this market, but they don't. They could make badass-looking MMOs or F2P titles (compared to the PC F2P games right now, which have to be simple enough to run on IGPs). But nooo.
- No digital downloads for everything. There are a few games I want. Can't get 'em. Both PS3 and Xbox only have like 20% of their titles available. Even on the PS3, you can't download most Sony games. Pathetic (Resistance 3, I'm looking at you). And the prices are atrocious: $60 or $40 for games that are only $30 or $15-20 at online retail. And more money for a game that has no shipping or physical presence AND is locked to an account? Who in their right mind would buy it?
- No mouse/keyboard support. I'm not saying they should do it across the board like most people who throw out this argument do. No, consoles are meant to be played with a controller. HOWEVER, add keyboard support for just some genres, like strategy. There are barely any RTS games on consoles; this would give them access to that market.
Re:Is this nvidia spin? (Score:5, Insightful)
I'd rather have to download patches than have the thing auto-updating when I don't want it to just yet. Same thing with drivers. Those are things that really should be managed by the user. There are plenty of circumstances where latest_version = best choice is a horrible assumption, especially with people who have older hardware. Some drivers just don't like some hardware configs, either.
One of the biggest selling points of PCs is that the user controls the software. Take that away and it's just another stupid console like everything else is nowadays. I don't mind having an option for autoupdate, but I would not want it mandatory. I still want to have the installers available for local storage.
Re: (Score:3)
Oh, I agree with you on the legacy part. I'm not buying any more games for the consoles right now, because we can't trust MS not to effectively brick our machines by no longer supplying updates for games once the new console comes out. I feel bad for the people who keep on buying stuff. Didn't MS completely drop the original Xbox the day the 360 was released? And nowadays (and I laugh now that I think of it) console owners feel the PC user's plight. Most games are borderline bricks or unplayable upon release an
Re: (Score:2)
The argument between HP and nVidia over defective GPUs is between HP and nVidia, not between me and nVidia. My HP laptop with QuadroFX1500M had a known problem. I had to fight with HP for more than 24 total hours on the phone to get them to admit it and issue me a replacement, but I did so. I had no problem with nVidia. HP had a problem with nVidia. I had a problem with HP. Replacement had a newer GPU (and everything else) and I sold it and bought some netbooks.
One of the netbooks I bought has an AMD Athlon 64
Considering nVidia's actions, do you feel safe? (Score:4, Interesting)
The way nVidia has acted in the past is an indication of how it may act in the future. See one of the many articles, for example: Dell and HP balk at replacing bad Nvidia chip. [windowssecrets.com]
If you buy something with an nVidia product in it, you may get involved with enormous hassles like that. People who weren't following the sneakiness and dishonesty closely didn't get their computers replaced because there was a very limited period in which customers needed to act.
Both AMD and nVidia need better management, in my opinion.
Comment removed (Score:4, Informative)
Re: (Score:2)
Thanks for the advice, I lost my Win7 key but if I get another one later I'll try it sometime. I bought the laptop on the assumption that surely I would be able to upgrade to Win7, or if not, run Linux since the GPU core is antiquated. Boy, was I wrong.
Re: (Score:2)
I don't think they're trying to drive themselves out of business, they just aren't that competent but their competitors are.
This smells like a cost savings move to me, I guess the positive spin is that they might take longer to go out of business if they cut costs.
What'll actually happen is they'll get a bad rep for having problems that sit around too long. Seems they have OCZ disease...too excited to release new products and start making money and not smart or thorough enough to validation test their prod
Re:Which is worse, AMD or nVidia? (Score:5, Insightful)
If as a programmer I can do something that crashes your driver or blows up your machine, then the problem was with the driver, not the application programmer.
I was a systems programmer for 30 years. I wrote a ton of OS and driver code, especially drivers. If you could break the machine or cause stupid things to happen by having your app do something improper with the driver, then that was my fault.
Let me get this straight: (Score:5, Insightful)
You fix third-party software... by modifying drivers?
How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?
Re:Let me get this straight: (Score:5, Insightful)
On one hand, AMD should fix it. On the other hand, AMD graphics cards are pretty popular. Their game should be designed to work on what they can reasonably expect their users to have.
Re:Let me get this straight: (Score:5, Insightful)
It's easier to just release the game and let gamers and video card manufacturers fight over who is in the wrong. By the time someone figures it out the developers have made their money and run off.
Re: (Score:3)
+1, Depressing.
Re: (Score:2)
Their game should be designed to work on what they can reasonably expect their users to have.
And that's another reason why many developers switch to consoles. Because you cannot predict what configuration the user has. It's not only videocards, it's motherboards, processors, RAM... All kinds of bugs. My own PC reboots from time to time, I have no way of knowing why that happens. And notebooks are even worse.
Passing the handling of hardware related bugs to developers is stupid. In that case videogames would support only specific system configurations and refuse to run on a different hardware. Do you want that?
Re: (Score:2)
And that's another reason why many developers switch to consoles. Because you cannot predict what configuration the user has. It's not only videocards, it's motherboards, processors, RAM... All kinds of bugs. My own PC reboots from time to time, I have no way of knowing why that happens. And notebooks are even worse.
Passing the handling of hardware related bugs to developers is stupid. In that case videogames would support only specific system configurations and refuse to run on a different hardware. Do you want that?
Since the developers can't even make their games run correctly on consoles [doomworld.com], I think it's fair to blame the problem on them.
And is your problem with mystery reboots from Windows automatically updating? Fix that here [microsoft.com].
Re:Let me get this straight: (Score:5, Interesting)
Really? Bitcoin mining??? You think people are buying ATI cards to mine bitcoins? And not for gaming? Maybe a few people are reusing their old cards for mining, but the bitcoin fad has pretty much passed... I'd be shocked if even .1% of the AMD graphics cards sold are for bitcoin.
Re: (Score:2)
"On the other hand, AMD graphics cards are pretty popular."
They are. Although to what extent that is the result of Bitcoin mining (at which AMD/ATI cards excel, and Nvidia cards suck), I'll leave as an exercise for the reader...
Is Bitcoin mining *really* that significant a part of the market as a whole? I suspect it probably seems more prominent on Slashdot than it actually is.
Besides which, from what I understand, the increasing difficulty of solving new "problems" to generate Bitcoins meant that some time back we passed the point where the electricity needed to power the computations outweighs the generally accepted value of the Bitcoins generated.
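For a rough feel for that break-even point, here is a back-of-the-envelope C++ sketch; every input below (hashrate, wattage, electricity price, difficulty, block reward, exchange rate) is a made-up placeholder for illustration, not a real figure for any card or date:

    #include <cstdio>

    int main() {
        // All inputs are hypothetical placeholders, for illustration only.
        const double hashrate     = 400e6;   // hashes per second
        const double watts        = 250.0;   // card power draw
        const double kwh_price    = 0.10;    // dollars per kWh
        const double difficulty   = 1.5e6;   // network difficulty (placeholder)
        const double block_reward = 50.0;    // BTC per block (placeholder)
        const double btc_price    = 5.0;     // dollars per BTC (placeholder)

        const double secs_per_day = 86400.0;
        // Expected blocks per day = hashrate * seconds / (difficulty * 2^32)
        const double blocks_per_day =
            hashrate * secs_per_day / (difficulty * 4294967296.0);
        const double revenue    = blocks_per_day * block_reward * btc_price;
        const double power_cost = watts / 1000.0 * 24.0 * kwh_price;

        std::printf("expected revenue: $%.2f/day, power cost: $%.2f/day\n",
                    revenue, power_cost);
        return 0;
    }

Whether the revenue beats the power bill flips as difficulty climbs or the exchange rate falls, which is the point being made above.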
Re: (Score:3)
How about forcing the game makers to TEST THEIR DAMN GAME before releasing?
This.
Unfortunately, though, a forum full of "It doesn't work on AMD cards! OMG!!!" makes AMD look bad, not the game developer. AMD then have to go about emulating NVIDIA's driver bugs.
Re: (Score:3)
Bahahahha. I'm not denying that nVidia has had driver bugs, but complaining about AMD having to emulate nVidia's driver bugs is like complaining that Intel had to implement AMD64. nVidia is so much better at drivers than AMD that your comment looks like the insane rantings of a madman.
Re:Let me get this straight: (Score:5, Interesting)
nVidia is so much better at drivers than AMD that your comment looks like the insane rantings of a madman.
Oh, yeah? I program 3D graphics for a living so I have to deal with this stuff on a daily basis. I'm working around a bug right now.
Question: Are occlusion queries supposed to return number of samples or number of pixels in Direct3D?
A certain company's "pro" graphics cards seem to differ from their "consumer" graphics cards over this.
The only way I've found to get my program working is to do a dummy occlusion query when I create the framebuffer and see what happens.
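For concreteness, a minimal Direct3D 9 sketch of that sort of dummy-query probe could look like the following; 'device', the drawing helper, and the 50% tolerance are illustrative assumptions, not the poster's actual code:

    // Dummy occlusion-query probe (Direct3D 9). Assumes 'device' is a valid
    // IDirect3DDevice9* with a multisampled render target bound, and that
    // draw_known_quad() renders a quad covering exactly 'expected_pixels'
    // pixels with depth/stencil tests passing. Error handling is minimal.
    #include <d3d9.h>

    bool query_counts_samples(IDirect3DDevice9* device,
                              DWORD expected_pixels,
                              void (*draw_known_quad)(IDirect3DDevice9*))
    {
        IDirect3DQuery9* query = nullptr;
        if (FAILED(device->CreateQuery(D3DQUERYTYPE_OCCLUSION, &query)))
            return false;   // occlusion queries not supported on this device

        device->BeginScene();
        query->Issue(D3DISSUE_BEGIN);
        draw_known_quad(device);
        query->Issue(D3DISSUE_END);
        device->EndScene();

        DWORD visible = 0;
        // Spin until the result is available; fine for a one-off startup probe.
        while (query->GetData(&visible, sizeof(visible), D3DGETDATA_FLUSH) == S_FALSE)
            ;
        query->Release();

        // A count well above the pixel area suggests the driver is reporting
        // per-sample coverage rather than per-pixel coverage.
        return visible > expected_pixels + expected_pixels / 2;
    }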
Re:Let me get this straight: (Score:5, Informative)
Direct3D technically allows for both; the XNA game dev framework, however, specifies number of pixels, for performance reasons. The number-of-samples method tends to be more accurate but very slow. It's the same thing on the OpenGL side. CAD and 3D applications such as Maya, compositing programs, etc. tend to use samples over pixels, for more accuracy.
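For comparison, the OpenGL side is short to write down; a bare-bones sketch (not from any particular codebase), assuming a context is already current and using a placeholder draw_scene() routine:

    // Bare-bones OpenGL occlusion query (GL 1.5+ / ARB_occlusion_query).
    // Assumes a current context; draw_scene() is a placeholder for whatever
    // geometry you want to test.
    #include <GL/glew.h>

    extern void draw_scene();   // placeholder, defined elsewhere

    GLuint count_samples_passed()
    {
        GLuint query = 0, passed = 0;
        glGenQueries(1, &query);

        glBeginQuery(GL_SAMPLES_PASSED, query);   // GL_SAMPLES_PASSED is the core query target
        draw_scene();
        glEndQuery(GL_SAMPLES_PASSED);

        // Blocks until the GPU has the result; check GL_QUERY_RESULT_AVAILABLE
        // first if you want to avoid the stall.
        glGetQueryObjectuiv(query, GL_QUERY_RESULT, &passed);
        glDeleteQueries(1, &query);
        return passed;
    }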
Re: (Score:3, Informative)
Question: Are occlusion queries supposed to return number of samples or number of pixels in Direct3D?
Occlusion queries are supposed to return number of pixels in both Direct3D and OpenGL.
A certain company's "pro" graphics cards seem to differ from their "consumer" graphics cards over this.
In both APIs or just one? If it's just one, then the problem is actually within Direct3D and isn't the card at all.
The only way I've found to get my program working is to do a dummy occlusion query when I create the framebuffer and see what happens.
Then you're doing something else wrong and have misidentified the source of your trouble. I won't get into it here, but these might prove helpful to you:
http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter06.html
http://http.developer.nvidia.com/GPUGems/gpugems_ch29.html
Re: (Score:2)
They also lost the ability to set overscan on S-Video out. I have a 9800 that I use for that, and I still game a little with it. I needed to update the driver and lost that ability. Yeah, I know, who uses S-Video still? Well, I do, on one TV that I haven't replaced yet, and a driver update shouldn't mean I lose features.
Point being, it's one of those things they won't go back and fix either.
Re:Let me get this straight: (Score:5, Interesting)
Even more unfortunately, NVidia have realised this and have been paying off video game developers not to test their games on AMD graphics cards prior to release and not to allow AMD access to pre-release versions to do it themselves.
Re:Let me get this straight: (Score:4, Interesting)
This needs to be modded up ^^
The dollars put into this amount to approximately 40% of nvidia's entire "marketing" budget.
Basically they've started doing something that puts the industry even more in the hands of the content providers, when previously the hardware vendors had a bit more pull.
Back in the days of Voodoo and even for the first while of the ATI vs Nvidia era it was normal for game vendors to approach card makers for help debugging their games but there was no way in hell a card maker would pay for the privilege. Hell, back in the voodoo days they even PAID for the extra help making their games compatible with the cards in some cases.
ATI started caving and doing the same thing, which is part of what reduced their margins to the point where they just said the hell with it and sold out to AMD. AMD is refusing to play the game now so you get 1-2 week post-release bug fixes.
Re:Let me get this straight: (Score:5, Interesting)
It's not always driver bugs. Many of the fixes are things that tapdance around bad, buggy code within the game itself. Oftentimes the studio's devs play fast and loose with shader parameters or API compliance, and NVidia handles those cases differently than AMD does, etc.
Any time you see a "MAY" within a standards document, it really ought to be treated as a "SHALL" unless you know you're working ONLY on a target environment where the "MAY" doesn't affect you. A prime example would be something along the lines of VBO mapping into host address space. The spec says that it MAY stall the pipeline if you do this while you're in the middle of a rendering pass. Well, NVidia's implementation knows which VBOs are in flight for a rendering pass and will stall only if the buffer is known to be about to be used by the current pass in progress. AMD's drivers took the other, in fact sensible, approach, because it's easier to implement and gains you performance overall if you don't have devs doing stupid things: they stalled ANY time you mapped any VBO involved with the rendering pass in progress.
A major studio (who shall not be named, nor shall the game... who knows, maybe you can guess the title...) did this in their GL code: they recycled VBOs, but did it intra-frame instead of inter-frame. The latter is relatively safe and produces pretty good performance; the former is very much not, based on the lead-in I gave just now. I should know, I've used it with some of the games I've done porting work on (because the studio did the same thing in DirectX... which has the same restrictions here...). When you do it intra-frame on NVidia, it slows the render pass down, but not unacceptably, because it only stalls as long as needed to assure you're not corrupting the render pass. AMD, until they re-worked their VBO implementation, would plummet to seconds-per-frame slide-show rendering on an X1950XTX, when it was THE hottest, fastest card out there, because it would stall the pipeline, taking milliseconds to recover, each and every time they re-mapped the VBO they were re-using to conserve card memory during the frame's rendering pass.
Was it the driver's fault? Not even remotely. But people will blame the driver, calling it "buggy". In fact, that's exactly what happened.
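To make the two patterns concrete, here is a rough OpenGL sketch of the difference, assuming a current context and a streaming VBO created elsewhere; the second function uses the standard buffer-orphaning workaround rather than the inter-frame recycling described above, and all names and sizes are illustrative:

    // Two VBO-update patterns. 'vbo' is assumed to have been created elsewhere
    // with glGenBuffers; error checking omitted for brevity.
    #include <GL/glew.h>
    #include <cstring>

    GLuint vbo;
    const size_t kChunkBytes = 64 * 1024;

    // Risky pattern: map the same buffer again mid-frame, after draws in this
    // frame have already sourced it. A conservative driver may stall the whole
    // pipeline here; a driver that tracks in-flight ranges may not.
    void upload_intra_frame(const void* verts, size_t bytes)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        void* dst = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);   // possible stall
        std::memcpy(dst, verts, bytes);
        glUnmapBuffer(GL_ARRAY_BUFFER);
    }

    // Safer pattern: "orphan" the storage first so the driver can hand back
    // fresh memory while earlier draws still consume the old contents.
    void upload_with_orphaning(const void* verts, size_t bytes)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, kChunkBytes, nullptr, GL_STREAM_DRAW); // orphan
        void* dst = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
        std::memcpy(dst, verts, bytes);
        glUnmapBuffer(GL_ARRAY_BUFFER);
    }

Orphaning sidesteps the stall regardless of how conservative a given driver's heuristics are, which is why it became the usual streaming idiom.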
Re: (Score:3)
Indeed, they should test their damn games. SWTOR has some of the worst issues I have ever encountered in an MMO. It's so sloppy, and I am so pissed at the hype. What burns me is that the computer game industry is monstrous in scope and size, yet there isn't an iota of gamer rights advocacy at all. If any other industry foisted off such shoddy, broken-on-purchase products, they would be rotting in prison. Can you imagine how things would be if the other industries had such slacker, shitty standards?
Consumers rule and they need to get their collective shit together and start cracking whips.
Re: (Score:2)
Consumers rule and they need to get their collective shit together and start cracking whips.
Have you boycotted distributors and/or development teams whose games have had this problem?
Re: (Score:2)
Some of us have. Everyone wants to know why I didn't buy Skyrim, despite putting up with Fallout 3's/New Vegas' problems - and it's because of how Bethesda handled the clusterfuck that was the Rage launch.
you have got to be fucking kidding me (Score:2)
The people you bought this game from probably don't have health insurance, and you are worried about YOUR rights. Here are your rights:
1. If you receive a defective product, return it to the store for your money back.
Eventually the 'invisible hand' of capitalism will take care of the problem - bad companies will go bankrupt and good companies with good products will succeed, because consumers like you refuse to buy products that don't work.
Or...
are you saying you don't believe in capitalism?
Re: (Score:2)
What kind of bullshit straw man/red herring opening was that? Health care, Christ, you play that card? OK, fine. It's sad but irrelevant to the subject.
1. You obviously haven't purchased a video game this century. Getting a refund from the company for a digital download is going to happen right after hell freezes over and the Devil learns to ice skate. Also, it's computer software; the bugs don't fall out of the package when you break open the plastic seal. You don't get to throw your hands up in the air an
Re: (Score:2)
I tend to switch back and forth between ATI and Nvidia every upgrade - not that I do it on purpose, I just seem to upgrade whenever one or the other has the better-performing ~$250 card at the time (X1650, 8800GT, 5770 were the last couple). I have never had a "driver problem" with any game until Rage - which still didn't work after AMD pushed a hotfix, id Software issued their only two patches before they abandoned the game for dead, and still didn't work despite five months of AMD updates when I tried it
Re:Let me get this straight: (Score:5, Informative)
How about forcing the game makers to TEST THEIR DAMN GAME
Games often expose driver bugs. Major game developers are in communication with GPU vendors and when they discover bugs, the ones which turn out to be in the driver or the microcode sometimes get fixed, depending on how new the product is and whether the GPU is from Intel, AMD, or nVidia. nVidia has by far the best record in terms of working drivers, and also in terms of improving support for old hardware in new driver revisions. AMD is by far the worst. They have abandoned whole platforms while they were still shipping, for example R690M. I'm using a subnotebook based on it right now. Only thing it will run without shitting itself is Vista. And fglrx didn't support it when it was brand new, and still doesn't support it, and never will.
Don't be so quick (or anonymous, or cowardly) to assume that it's the game developer's fault when a problem "with the game" is fixed with a driver update.
Re: (Score:3)
I keep hearing people claim this about ATI/AMD; I must be the luckiest SOB in the world when it comes to buying hardware from ATI, because I've never had trouble with any of my cards. Granted, I run them under Windows.
Nvidia, on the other hand: I have a single GFX sitting in my laptop and it is the crappiest piece of shit I've ever owned. The GFX driver keeps locking up, keeps crashing and has extremely poor performance compared to its competitors.
Re: (Score:2)
Get another driver version. I've rolled mine back to the one from 2011 and it's pretty stable. It took me 4 new installs to get the one that worked, though...
Note: I could choose between the one supplied by MS through Windows Update for my laptop, the one supplied by HP for my laptop (whose latest release had a lower version number than the MS one) and the ones from NVidia. Since the older HP one refused to remove the latest update, I ended up with the older NVidia one. Pretty happy with it, it works okay now.
But anyw
Re: (Score:2)
Heh... They also expose bugs within themselves. About half to two-thirds of the bugs worked on when I was with AMD as a contract dev were workarounds for screwed-up shader implementations and other fast-and-loose-with-the-standards coding within the games themselves.
It's a completely fair point that many developers play cute tricks for performance which later bite them, and that some are just lame. I just don't want people assuming it's always the game developer's fault.
Re: (Score:2)
You fix third-party software... by modifying drivers?
How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?
Do you mean to tell us that all vendors combined only have four different graphics cards available?
Re:Let me get this straight: (Score:4, Informative)
You fix third-party software... by modifying drivers?
How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?
Having been on both sides of this.
There are some functions, usually DirectX functions, that just do not behave properly with certain drivers. There is, in many cases, nothing you can do except ask the company to fix it. This is a double problem, because a lot of the time they won't look at your game until it's finished; so if you finish on Friday and release on Tuesday, guess how much it's been looked at by nVIDIA or AMD.
While you are writing your game, nVIDIA and AMD are writing new drivers and changing how their drivers behave, usually to accommodate someone else's release, but not necessarily. That's incredibly frustrating, because you may not know whether the bug is on your end or theirs, especially if it behaves differently between driver releases.
Anyone who got the original version of The Witcher 2 could see the problem with 'test their damn game'. There was a problem where ubersampling interfered with the ability to interact with objects. So the game came out with this problem, which was actually rarely encountered because almost no one had a card capable of doing ubersampling (even a new GTX 680 today has slowdown with it). So AMD and the Witcher devs got onto fixing this problem. I think the problem was actually in how AMD was handling the sampling, but I'm not 100% sure. CD Projekt did a hack workaround patch that changed how they did the sampling slightly, and at the same time AMD issued a fix that wasn't compatible with the workaround. So you ended up in this situation where, as an end user, you're not even sure which solution you should be using.
Sure, a lot of the releases basically exist to clarify which codepath a particular game should be rendered with, or which SLI/Crossfire profile it should use, which is relatively minor in the scale of things. But it really is a problem on the driver end that games are all treated inconsistently, or maybe that's a feature. Depends on your perspective. Treating games differently is a massive pain in the ass for development, but makes the experience much better for players, so take your pick.
Not always or even often the game's fault (Score:3)
If it has graphical glitches, yeah, that's probably the game. Poor performance? It depends. The problem with Rage is that it is OpenGL, and AMD has shitty GL drivers; they have for a long time. nVidia has long had GL and DX drivers that performed equally well, while AMD has long had GL problems (it used to be much worse than it is now).
If it is BSODs or GPU driver crashes though? No, that is 100% on the graphics drivers. No matter what the program does, it shouldn't bring the system down. Anything running in Ring 3 can't bring the syst
Re: (Score:3)
...aaaaand PC gamers are wondering why I went Mac and console only for games rather than PC.
The games are still buggy, but now they're CONSISTENTLY buggy!
Re: (Score:2)
Wrote this as a reply in another post here:
Don't forget that console games have tons of bugs now too. And big huge flaws. The Skyrim save game issue? Bioshock always messing up widescreen? Rockstar grand theft everything. Silent Hill Downpour--the entire freaking game is full of bugs and hard locks.
Anyway, I go back and forth on this. I don't know which solution is better. I think it simply comes down to personal preference. Both sides have their flaws.
PC flaws:
- I swear to god I'm so sick of upd
Re: (Score:2)
So do both. That might not be economically feasible. But if it is, then do both.
Re: (Score:2)
Can't agree here at all. AMD (formerly ATI) had mediocre D3D drivers and was always notorious for having absolutely craptacular OpenGL drivers. That was over a decade ago, and it's still the case, as seen with recent OpenGL releases (Brink and Rage). And most of those users freaked out blaming the developer of the game when it's the driver's fault. And it's not modifying the drivers, it's goddamn fixing them!
I remember years ago Carmack said if he encountered a bug while testing on an nvidia machine he as
The NVIDIA Transition? (Score:5, Insightful)
As someone who is generally an AMD fan - their processors and video cards generally provide much better performance for much cheaper - their driver support, or lack thereof, is frustrating. NVIDIA consistently has far better driver support, and features, than their AMD counterparts, even if their cards don't provide as much bang for the buck.
If AMD falls even further behind in that game, I may just bite the bullet and switch to NVIDIA just to stop having to worry about driver-related frustrations altogether.
Re: (Score:3)
As someone who very recently switched back to AMD because recent Nvidia cards (including my own) have been giving me and others some annoying and only occasionally recoverable [microsoft.com] Purple Screens of Death*, I can't wait for a decent Company #3** to kick both their asses on driver size and reliability.
*In my case, a GTX 460, after a year of use. After it started interrupting my Terraria games (even with motherboard settings changes) I thought it was time to recheck what others experienced; and after that, time f
Re: (Score:2)
Along with waiting for new and high-quality cellular companies and cable companies, don't hold your breath.
Re: (Score:2)
After years of frustration with crap drivers for ATI video cards and crap drivers for AMD chipsets from third parties (any of them) I finally switched to Intel CPU, Intel chipset, and Nvidia graphics cards. I even bit the bullet and got an Intel model motherboard and made sure the RAM I bought was on the list of tested RAM [intel.com].
I have had zero problems since I bought it in 2009. Intel DP55WG, Intel i7 860, EVGA GeForce GTX 260, 8GB of a supported SKU of Kingston RAM. The biggest problem I've had (knock on woo
Re: (Score:2)
if your ethernet controller is crappy you'd better disable it and use an old 3COM 100Mb card or something.
Re: (Score:2)
I have to agree with this. I've had so, so many problems with nForce hardware and drivers it's not even funny. nForce is a redheaded stepchild, to be sure. I've had quite a few problems, with pretty much anything that matters on the platform: the ethernet, chipset/disk, and video. Independent cards are still the way to go with nVidia.
As a whole, I'd say ATI is much better about integrated products, but I'll take independent components which work over something which tends to like to fail, thanks. It's been
Re: (Score:2, Informative)
Yes they did. See the forcedeth [wordpress.com] driver (nVidia also provided a binary driver, called "nvnet" in Linux).
Re: (Score:2)
I got sick and tired of AMD/ATI back when Voodoo was still a pass-through board, and while their hardware has improved I'm never surprised to hear about bullshit with their drivers.
Re: (Score:2)
Funny, EVGA (3 graphics cards in succession having huge flaws: one with a fan not working properly, one with capacitors falling off even though the computer hadn't been moved in 6 months, and the third with bad RAM chips) is on my list of "hardware to avoid at all costs", just like Antec PSUs (none of them lasted more than 6 months, while the Q-tec, the cheap piece of shit they were meant to replace, is still alive to this day, 11 years after I bought it...), Gigabyte motherboards, etc.
Re:The NVIDIA Transition? (Score:5, Interesting)
Here's irony for you:
-AMD supposedly releases driver updates on a monthly basis, though they haven't quite managed it for the last couple of years: sometimes not making the deadline, sometimes just releasing basically the same driver two months in a row, then releasing out-of-band updates when games break their cards.
-nVIDIA has always released drivers "as needed".
-AMD switches to releasing drivers "as needed".
-Everyone complains, and threatens to switch to nVIDIA.
Re: (Score:2)
My NVIDIA card was having heating problems and needed a replacement so one day I finally said okay this ATI card is such a better deal (I think it was an X1900) and got that instead of a new NVIDIA card. Boy did I regret it. The card would overheat because the drivers wouldn't turn the fan up when it got hotter.
Re: (Score:2)
"I may just bite the bullet and switch to NVIDIA just to stop having to worry about driver-related frustrations altogether"
I wouldn't bother, I did and I've had more issues with the GTX 280 than I ever did with the HD 4870.
I'd consider buying Nvidia but (Score:3, Interesting)
Re: (Score:2)
While I agree it's pretty annoying, and certainly confusing for many consumers, it's not as if you can't tell what's inside a particular model, since it's pretty easy to find that information by googling.
Re: (Score:2)
Well, if you really are unable to do a minimal amount of research to find out, OK, I guess that's a reason not to buy, but I would think it wouldn't be too hard to just, you know, look shit up. nVidia's site is a good place and not hard to get to.
Also if you are talking desktops, and I assume you are from the use of the term board, then you are talking nonsense. The rebranding has been in the laptop space, not the desktop space. With laptops they do have some mixed naming as there are 600 series parts from the
Re: (Score:2)
Ultimately it really doesn't matter as what you should check are features and speed, not an arbitrary choice of what technology they use.
Indeed. And with that in mind, I would be very interested if anyone can cite even a single credible source that compares "workstation" and "gamer" cards objectively from nVidia and/or AMD. You'll find a load of people who parrot the line that you "must" use the far more expensive workstation cards for certain kinds of professional applications, but few can really tell you why, and even those who do generally refer to drivers rather than any difference in the hardware. And that's before you even get into nVi
Bias much? (Score:5, Insightful)
AMD says that they're moving from a monthly release cycle to a release-when-needed cycle and someone decides to write this piece of trash about it?
It's not a bad thing; it makes sense to do it like this. As the summary points out, AMD currently releases out-of-band updates when a high-profile title has an issue or needs launch-day performance increases, so it doesn't make sense to make another release that month that doesn't change much. It's just confusing and frankly unnecessary. Doing it "as needed" just means that when a driver release comes out, it's worth updating to. If that means I only have to update my drivers once every few months, I'm fine with that - even if it occasionally means there are 2 or 3 updates in the space of a month because a lot of games happened to come out then. Overall, it's better for everyone.
Article is a big load of FUD and should be ignored.
Disclaimer: I've currently got a Geforce 560 Ti in my desktop and my laptop uses a Geforce 555M chipset - frankly, I'm an nvidia fanboy and this article still disgusts me.
Re: (Score:2)
So when will Linux drivers be 'needed'? It sure won't be to fix Windows games under Wine... Anyway, that is my concern about this statement. I'm still not running xorg-server-1.12 on my one AMD machine because fglrx doesn't support it yet.
Re: (Score:2)
I don't think this statement has any bearing on their linux driver support. Linux driver support from both nvidia and AMD could be a lot better than it currently is, but I don't see how it's going to make support any worse.
Re: (Score:3)
The article at Anandtech is less ominous and explains why this is actually a good thing with video chips and drivers as complicated as they are today.
http://www.anandtech.com/show/5880/amd-discontinues-monthly-driver-updates-releases-catalyst-126-beta [anandtech.com]
What the summary and article from the submitter are missing is the term WHQL. AMD has and always will be releasing beta drivers to fix games as needed, just as Nvidia does. What they are stopping with this announcement is the monthly WHQL releases. To ge
Maybe they would not have as many issues (Score:2)
If they released software and patches when they were done instead of on an artificial time schedule.
Crysis 2 in DX9 (Score:3)
heh ok, wonder how bad the demand for that is...
"check it out I got a i7 extreme fucking overclocked, 32 gigs of ram overclocked, quad ATI's also overclocked, 4 SSD's in RAID, and Windows XP cause DX9 is the shit yo"
'cause no one plays Crysis for the game, it's an e-peen ruler.
Re: (Score:2)
Not the case for Crysis 2. Crytek scaled it back a lot, and actually focused on making a game rather than just a tech demo. When it came out, it was DX9 only. Later they released a patch that introduced DX11 support and had some bigger textures, but at launch it was a DX9 title.
Re: (Score:2)
I take issue with that. Did you even play Crysis 1? Most people who've said that here on /. are just repeating that meme and haven't played it.
- The freedom to choose multiple routes on entire freaking huge areas instead of on-rails.
- Freedom unlike pretty much every shooter out now to kill in different ways. Want to be stealthy? Go cloak, sneak up behind a guy, turn on maximum strength, drag him to a cliff and throw him off. Or be entirely a different way: hop on top of a hut, turn on maximum strength, pou
Re: (Score:2)
Actually, yes, I played Crysis 1. It was just like every other shooter on the market at the time, and fuck you if you dared stick your head out of a bush for half a millisecond, because a sniper 6 miles away would headshot you every single time. Wonky vehicle physics, dumb-as-shit AI... aside from looking pretty and having a hype machine set into overdrive, it really was a just-below-average shooter for the time.
Never mind that we're talking about Crysis 2 here and not the first game.
I want a stable driver (Score:2)
Not everyone is a 3D gamer who wants to be on the absolute cutting edge of everything. Not everyone thinks trading off stability against a few extra FPS is a good deal.
Would it be too much to give us a stable driver, with maybe one update per year? By stable I mean no dodgy hacks, and no game-specific "optimizations". I mean a driver that won't crash, and that isn't afraid to be a little slower in order to do things right. Is there really no one else out there who cares about stability?
What are the options? (Score:2)
What difference does it make (Score:2)
if your OEM locks you out of driver updates in the first place? I've had no end of frustration with my Lenovo laptop and the fact that they unlock a new AMD driver about once per year.
I see nothing has changed (Score:2)
I have been building computers since the early 1990s. There has always been a segment of the community that prefers AMD to (Intel / Nvidia / etc.) and I have never understood why. I have tried both AMD CPUs and video cards over the years and always end up going back to Intel, and more recently Nvidia. It seems like AMD just cannot get it right when it comes to the gaming market. They often win on price point, but completely fail on issues like what this article mentions.
Why do people continue to support AMD? All I can figure is that it has to do with an irrational hatred of Intel. Intel is monopolistic. Intel is anti-competitive. Intel is expensive. So, rather than supporting Intel and getting the best products on the market, people go with AMD and suffer in smug self-righteousness for doing the "right thing" and not supporting the companies that are dominating the field.
Re: (Score:2)
Why do people continue to support AMD? All I can figure is that it has to do with an irrational hatred of Intel. Intel is monopolistic. Intel is anti-competitive. Intel is expensive. So, rather than supporting Intel and getting the best products on the market, people go with AMD and suffer in smug self-righteousness for doing the "right thing" and not supporting the companies that are dominating the field.
This article is about graphics drivers, not CPUs. Intel doesn't make any discrete graphics cards. And
There's always places like Guru3D (Score:2)
You can still get the latest 'leaks' at places like guru3d.com. It sucks for people who don't know about this and people counting on (say) Steam's Catalyst auto-update, but if I'm having issues with ATI or Nvidia drivers I go there first.
10 years ago... (Score:2)
About 10 years ago, I was using a Radeon 9600XT. At the time, it was well-known that ATI's drivers sucked, but that their hardware was better for the price you paid. So, with the release of the 9000 series, ATI (now AMD, as you probably know) made a big deal about how they were starting to overhaul and vastly improve their driver quality.
Well...?
10 years later, they're still not giving their drivers nearly the attention they deserve, and it seems evident that they simply don't consider them a high priorit
Linux support? (Score:2)
What about Linux drivers and support? ATI/AMD is supposed to be better in this area, but I haven't seen much compared to NVIDIA's awesome driver support. What about the rest of you?
Re: (Score:2)
You are correct, that makes intel irrelevant.
Re: (Score:2)
Really? Odd that developers seem to be abandoning consoles, and even console gamers seem to be abandoning them in favor of the PC. The last two years have seen a pretty good resurgence. If you want to see some odd stats, look at some of the indie titles that sold so poorly on consoles but sold so well on PC's that they made enough to make a sequel.
Re: (Score:2)
Very true. But what will really happen is pretty much all devs and games will go to the PC. They can just make so much more money and have access to more gamers, and with more freedom. Prediction: F2P, indie, and mid-size will rule the PC and the majority of the gaming world. AAA will be marginalized. But AAA games will remain on the consoles and may possibly even be eliminated on the PC in the future. The costs and art assets involved are going to be even more ridiculous than they are now. If anything, we'll
Re: (Score:3)
Regardless of how you feel about their products, it HAS been nice knowing ATI - considering how bad Nvidia's price-gouging is now, think how bad it would be without ATI.