Despite Game-Related Glitches, AMD Discontinues Monthly Driver Updates

MojoKid writes "Recently AMD announced that it would cease offering monthly graphics driver updates and instead issue Catalyst versions only 'when it makes sense.' That statement would be a good deal more comforting if it didn't 'make sense' to upgrade AMD's drivers nearly every single month. From 2010 through 2011, AMD released a new Catalyst driver every month like clockwork. Starting last summer, however, AMD began having trouble with high-profile game releases that performed badly or had visual artifacts. Rage was one high-profile example, but there have been launch-day issues with a number of other titles, including Skyrim, Assassin's Creed, Batman: Arkham City, and Battlefield 3. The company responded to these problems by quickly releasing out-of-band driver updates. In addition, AMD's recent Catalyst 12.6 beta driver also fixes random BSODs on the desktop, poor Crossfire scaling in Skyrim, and random hangs in Crysis 2 in DX9. In other words, AMD is still working to resolve important problems in games that launched more than six months ago. It's hard to put a positive spin on slower driver releases given just how often those releases are necessary."
  • by macemoneta ( 154740 ) on Sunday June 03, 2012 @11:46AM (#40201315) Homepage

    They didn't say slower, they said as needed. Since they are already releasing 'out of band' updates, they are just normalizing that process. They will release when they have fixes/functionality instead of on an arbitrary timeline. It seems to make perfect sense.

    • Not only that, but releasing 'out of band' patches on a regular basis was probably a huge drag on their responsiveness. If you have a deadline for a new driver patch EVERY SINGLE MONTH regardless of whether the patch is ready, then you are either wasting your QA resources testing inconsequential releases in order to meet some arbitrary deadline, or you are releasing insufficiently tested patches to meet some arbitrary quota. Neither is a recipe for efficient allocation of your resources.

      • Plus it means buggier drivers. I noticed, especially in the first releases of the 10 and 11 series of drivers, that the monthly releases were more likely to be glitchy. Now that I know this it makes sense: most of the dev team were probably spending the majority of their time on fixing some major bug and just didn't have the time to do proper QA on the monthly release. After all, if you have a major game like Skyrim screwing up, you aren't gonna be wasting as much time filling some monthly quota as you are fixing that.

    • An artificial release schedule, one tied to the calendar rather than bug fixes, can actually slow things down. It can cause a certain amount of disruption when a team is in the middle of taking care of a bug. It seems somewhat similar to having to put together a demo when you are in the middle of implementing a feature. I'd say try for a monthly release, but don't necessarily let that goal interrupt fixes underway; let in-progress fixes delay the release when it makes sense to do so.
    • Exactly and I'd add that as an owner of AMD cards that the monthly on time drivers? Really didn't do much. They'd add a little better support for some codec, a little bit better support for game X but frankly there was no reason why a lot of those monthly updates couldn't have been handed out with the bug fixes.

      Personally I'd rather download a new driver when they've fixed bugs than have to go "Oh, it's the second Thursday of the month or whatever, time for an AMD driver". The only thing I don't like about

  • Is this nvidia spin? (Score:2, Interesting)

    by Anonymous Coward

    I mean, of course frequent updates are desirable. On the other hand, every release produces overhead which could be used to fix the problems at hand. In my experience, monthly update schedules are a terrible waste of valuable time.

    Personally, I'm an nvidia user, since I hate the driver issues of AMD... but this news sounds like nvidia spin to me.

    • I simplified my entire life by using an xbox 360 and a playstation 3 for gaming, and a blu-ray player for movies. If I have a rip I just burn it and stick it in the blu-ray player.

      While there are occasional bugs in the console games, I've rarely experienced them.

      • by MogNuts ( 97512 ) on Sunday June 03, 2012 @06:11PM (#40204019)

        Hah, I was waiting for it. Waiting for the first post to bring up your argument. I've been reading slashdot for way too long.

        Don't forget that console games have tons of bugs now too. And big huge flaws. The Skyrim save game issue? Bioshock always messing up widescreen? Rockstar grand theft everything. Silent Hill Downpour--the entire freaking game is full of bugs and hard locks.

        Anyway, I go back and forth on this. I don't know which solution is better. I think it's gotta be down to simply personal preference. I think both sides have their flaws.

        PC flaws:

        - I swear to god I'm so sick of updating drivers, for anything. Graphics drivers should just be auto-updated, period. Not even having a button in the ATI/Nvidia control panel is good enough. As it stands now, there are too many steps. Yeah, yeah, it's safer to do it this way now, where if a driver is broken they can revoke it. But it's the same issue Windows was having. Either deal with that, or deal with most users not upgrading at all.

        - All games should have built-in patching mechanisms. Steam does this right now, as do EA games or GFWL. But what if a game isn't on one of those? Or what if I want to buy a game from, say, GOG or Gamersgate? They don't auto-patch. So you're stuck back in the days of yore, hunting down patches from FilePlanet or something. That's bull and I flat out refuse.

        - Small dev QA problems. I love freakin' Red Orchestra 2 and Arma 2. Amazing games. But the bugs. Oh, the bugs. Jesus, it's terrible. Don't even bother playing a game until it's been out 6 months.

        Console flaws:

        - No support for alternative games. Read: MMO or F2P. Short of DC Universe for PS3 or Free Realms, you're out of luck. But that's a big segment of the future and part of the solution to keeping online communities big, with a steady, not one-off, revenue stream. And consoles could OWN this market, but they don't. They could make badass-looking MMOs or F2Ps (compared to the PC F2Ps right now, which have to be simple enough to run on IGPs). But nooo.

        - No digital downloads for everything. There are a few games I want. Can't get 'em. Both PS3 and Xbox only have like 20% of their titles available. Even on the PS3, you can't download most Sony games. Pathetic (Resistance 3, I'm looking at you). And the prices are atrocious: $60 or $40 for games that are only $30 or $15-20 at online retail. And more money for a game without shipping or a physical presence AND that's locked to an account? Who in their right mind would buy it?

        - No mouse/keyboard support. I'm not saying they should do it across the board like most people who throw out this argument do. No, consoles are meant to be played with a controller. HOWEVER, make keyboard support only for some genres, like strategy. There are barely any RTS games. This would allow them access to the market.

        • by epyT-R ( 613989 ) on Monday June 04, 2012 @01:21AM (#40206041)

          I'd rather have to download patches than have the thing auto-updating when I don't want it to just yet. Same thing with drivers. Those are things that really should be managed by the user. There are plenty of circumstances where latest_version = best choice is a horrible assumption, especially with people who have older hardware. Some drivers just don't like some hardware configs, too.

          One of the biggest selling points of PCs is that the user controls the software. Take that away and it's just another stupid console like everything else is nowadays. I don't mind having an option for autoupdate, but I would not want it mandatory. I still want to have the installers available for local storage.

  • by Anonymous Coward on Sunday June 03, 2012 @11:47AM (#40201331)

    You fix third-party software... by modifying drivers?

    How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?

    • by 91degrees ( 207121 ) on Sunday June 03, 2012 @12:03PM (#40201435) Journal
      I'm in two minds over this, assuming it is an actual glitch in the drivers causing the problems.

      On one hand, AMD should fix it. On the other hand, AMD graphics cards are pretty popular. Their game should be designed to work on what they can reasonably expect their users to have.
      • by Anonymous Coward on Sunday June 03, 2012 @12:14PM (#40201503)

        It's easier to just release the game and let gamers and video card manufacturers fight over who is in the wrong. By the time someone figures it out the developers have made their money and run off.

      • Their game should be designed to work on what they can reasonably expect their users to have.

        And that's another reason why many developers switch to consoles. Because you cannot predict what configuration the user has. It's not only videocards, it's motherboards, processors, RAM... All kinds of bugs. My own PC reboots from time to time, I have no way of knowing why that happens. And notebooks are even worse.

        Passing the handling of hardware related bugs to developers is stupid. In that case videogames would support only specific system configurations and refuse to run on different hardware. Do you want that?

        • And that's another reason why many developers switch to consoles. Because you cannot predict what configuration the user has. It's not only videocards, it's motherboards, processors, RAM... All kinds of bugs. My own PC reboots from time to time, I have no way of knowing why that happens. And notebooks are even worse.

          Passing the handling of hardware related bugs to developers is stupid. In that case videogames would support only specific system configurations and refuse to run on a different hardware. Do you want that?

          Since the developers can't even make their games run correctly on consoles, I think it's fair to blame the problem on them.

          And is your problem with mystery reboots from Windows automatically updating? Fix that here.

        • Dude I've been building machines since Win 3.x was the OS of the day and frankly? Drivers have never been better. I honestly can't even remember the last time I saw a BSOD in the shop that wasn't caused by a piece of hardware actually failing.

          On the other hand I have seen a LOT of programs, especially games, where it was obvious they were kicked out the door with minimal testing to make some deadline. Quite obviously game-related bugs: voice sync issues, textures popping in and out, games just slamming the hel

    • How about forcing the game makers to TEST THEIR DAMN GAME before releasing?


      Unfortunately, though, a forum full of "It doesn't work on AMD cards! OMG!!!" makes AMD look bad, not the game developer. AMD then have to go about emulating NVIDIA's driver bugs.

      • Bahahahha. I'm not denying that nVidia has had driver bugs, but complaining about AMD having to emulate nVidia's driver bugs is like complaining that Intel had to implement AMD64. nVidia is so much better at drivers than AMD that your comment looks like the insane rantings of a madman.

        • by Joce640k ( 829181 ) on Sunday June 03, 2012 @12:58PM (#40201833) Homepage

          nVidia is so much better at drivers than AMD that your comment looks like the insane rantings of a madman.

          Oh, yeah? I program 3D graphics for a living so I have to deal with this stuff on a daily basis. I'm working around a bug right now.

          Question: Are occlusion queries supposed to return number of samples or number of pixels in Direct3D?

          A certain company's "pro" graphics cards seem to differ from their "consumer" graphics cards over this.

          The only way I've found to get my program working is to do a dummy occlusion query when I create the framebuffer and see what happens.
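
          (For what it's worth, a rough, untested sketch of that kind of startup probe, using Direct3D 9 for illustration; `DrawKnownQuad` is a hypothetical caller-supplied helper that draws geometry covering a known number of pixels into the current multisampled render target:)

```cpp
// Untested sketch. Assumes an initialized IDirect3DDevice9* and a bound
// multisampled render target; DrawKnownQuad() is a hypothetical helper
// that draws geometry covering exactly 'knownPixels' pixels.
bool OcclusionQueryCountsSamples(IDirect3DDevice9* dev,
                                 DWORD knownPixels,
                                 DWORD samplesPerPixel)
{
    IDirect3DQuery9* q = NULL;
    if (FAILED(dev->CreateQuery(D3DQUERYTYPE_OCCLUSION, &q)))
        return false; // no occlusion query support at all

    q->Issue(D3DISSUE_BEGIN);
    DrawKnownQuad(dev);          // covers exactly knownPixels pixels
    q->Issue(D3DISSUE_END);

    DWORD result = 0;
    // Busy-waiting is tolerable for a one-off probe at startup.
    while (q->GetData(&result, sizeof(result), D3DGETDATA_FLUSH) == S_FALSE)
        ;
    q->Release();

    // A driver that counts samples reports roughly
    // knownPixels * samplesPerPixel instead of knownPixels.
    return samplesPerPixel > 1 && result > knownPixels;
}
```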

          • by Shinobi ( 19308 ) on Sunday June 03, 2012 @01:45PM (#40202153)

            Direct3D technically allows for both; the XNA game dev framework specifies number of pixels, however, for performance reasons. The number-of-samples method tends to be more accurate but very slow. It's the same thing on the OpenGL side. CAD and 3D applications such as Maya, compositing programs, etc. tend to use samples over pixels for more accuracy.

          • Re: (Score:3, Informative)

            by Anonymous Coward

            Question: Are occlusion queries supposed to return number of samples or number of pixels in Direct3D?

            Occlusion queries are supposed to return number of pixels in both Direct3D and OpenGL.

            A certain company's "pro" graphics cards seem to differ from their "consumer" graphics cards over this.

            In both APIs or just one? If just one, then the problem is actually within Direct3D and isn't the card at all.

            The only way I've found to get my program working is to do a dummy occlusion query when I create the framebuffer and see what happens.

            Then you're doing something else wrong and have misidentified the source of your trouble. I won't get into it here, but this might prove helpful to you:

        • by sandytaru ( 1158959 ) on Sunday June 03, 2012 @01:24PM (#40202025) Journal
          nVidia has the mother of all driver bugs and they've refused to fix it for years. If you run a DVI to HDMI cable from an nVidia card with no native HDMI support, the driver recognizes the HDMI cable anyway, assumes it can run sound, and attempts to run sound via the nonexistent sound chip on the video card. In essence, it overrides the onboard sound and sometimes even a discrete sound card in the computer. Since native HDMI support was introduced in newer cards, nVidia has felt no need to address this glitch in their older cards. I ended up recycling an otherwise perfectly good GeForce 9800 GT because the computer it was in was hooked up to the 40" television, but any time I had the video card driver installed I had no sound!
      • by makomk ( 752139 ) on Sunday June 03, 2012 @01:15PM (#40201951) Journal

        Even more unfortunately, NVidia have realised this and have been paying off video game developers not to test their games on AMD graphics cards prior to release and not to allow AMD access to pre-release versions to do it themselves.

        • by Ironhandx ( 1762146 ) on Sunday June 03, 2012 @10:22PM (#40205369)

          This needs to be modded up ^^

          The dollars put into this amount to approximately 40% of nvidia's entire "marketing" budget.

          Basically they've started doing something that changes the industry even more to be in the hands of the content providers... When previously the hardware vendors had a bit more pull.

          Back in the days of Voodoo and even for the first while of the ATI vs Nvidia era it was normal for game vendors to approach card makers for help debugging their games but there was no way in hell a card maker would pay for the privilege. Hell, back in the voodoo days they even PAID for the extra help making their games compatible with the cards in some cases.

          ATI started caving and doing the same thing, which is part of what reduced their margins to the point where they just said the hell with it and sold out to AMD. AMD is refusing to play the game now so you get 1-2 week post-release bug fixes.

    • by EdZ ( 755139 ) on Sunday June 03, 2012 @12:20PM (#40201541)
      It depends on where the problem lies: If the game is using the directX (or openGL) libraries correctly but the driver is mucking things up, then the game developer should not need to code around driver bugs. Conversely, if the game developer is using a 'clever hack' to eke out some more performance, this creates a headache for the driver developers to keep this hack working in one instance but stop it working for things written to the word of the API in other instances.
      • by Svartalf ( 2997 ) on Sunday June 03, 2012 @01:53PM (#40202207) Homepage

        It's not always driver bugs. Many of the fixes are things that tapdance around bad, buggy code within the game itself. Oftentimes the studio's devs play fast and loose with shader parameters or API compliance- and NVidia does it differently than AMD, etc.

        Any time you see a "MAY" within a standards document, it really ought to be treated as a "SHALL" unless you know you're working ONLY on a target environment where the "MAY" doesn't affect you. A prime example would be something along the lines of VBO mapping into host address space. The spec says that it MAY stall the pipeline if you do this while you're in the middle of a rendering pass. Well... NVidia's implementation knows what VBOs are in-flight with a rendering pass and will stall only if one is known to be about to be used by the current pass in progress. AMD's drivers took the other, in fact sensible, approach because it's easier to implement and gains you performance overall if you don't have devs doing stupid things: they stalled ANY time you mapped any VBO involved with the rendering pass in progress.

        A major studio (who shall not be named, nor shall the game... who knows, maybe you can guess the title...) did this in their GL code: they recycled VBOs, but did it intra-frame instead of inter-frame. The first is relatively safe, producing pretty good performance; the other's very much not so, based on the lead-in I gave just now. I should know; I've dealt with it in some of the games I've done porting work on (because the studio did the same thing in DirectX... which has the same restrictions here...). When you do it intra-frame on NVidia, it slows the render pass down, but not unacceptably, because it only stalls as long as needed to assure you're not corrupting the render pass. AMD, until they re-worked their VBO implementation, would plummet to seconds-per-frame slide-show renderings on an X1950XTX card when it was THE hottest, fastest card out there, because it would stall the pipeline, taking milliseconds to recover, each and every time they re-mapped the VBO they were re-using to conserve card memory during the frame's rendering pass.

        Was it the driver's fault? Not even remotely. But... people will blame the driver, calling it "buggy". In fact, that's what happened, even.
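
        (For illustration, a rough, untested sketch of the two recycling patterns described above; buffer orphaning via glBufferData(..., NULL, ...) is one common way to make intra-frame reuse cheap, though not necessarily what the studio or AMD eventually did:)

```cpp
// Untested sketch, GL 1.5-era buffer API; 'n' bytes per batch.

// Risky: intra-frame recycling. The first draw may still be in flight
// when the buffer is mapped again, so a conservative driver stalls the
// whole pipeline on the second glMapBuffer().
glBindBuffer(GL_ARRAY_BUFFER, vbo);
void* p = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
memcpy(p, batchA, n);
glUnmapBuffer(GL_ARRAY_BUFFER);
glDrawArrays(GL_TRIANGLES, 0, vertsA);

p = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);   // potential stall here
memcpy(p, batchB, n);
glUnmapBuffer(GL_ARRAY_BUFFER);
glDrawArrays(GL_TRIANGLES, 0, vertsB);

// Safer: "orphan" the storage first. glBufferData with a NULL pointer
// lets the driver hand back fresh memory while the old contents drain
// through the pipeline, so the map no longer waits on the earlier draw.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, n, NULL, GL_STREAM_DRAW); // orphan
p = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
memcpy(p, batchB, n);
glUnmapBuffer(GL_ARRAY_BUFFER);
glDrawArrays(GL_TRIANGLES, 0, vertsB);
```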

    • Indeed, they should test their damn games. SWTOR has some of the worst issues I have ever encountered in an MMO. It's so sloppy and I am so pissed at the hype. What burns me is that the computer game industry is monstrous in scope and size, yet there isn't an iota of gamer rights advocacy at all. If any other industry foisted off such shoddy, broken-on-purchase products, they would be rotting in prison. Can you imagine how things would be if other industries had such slacker, shitty standards?

      Consumers rule and they need to get their collective shit together and start cracking whips.

      • Consumers rule and they need to get their collective shit together and start cracking whips.

        Have you boycotted distributors and/or development teams whose games have had this problem?

        • Some of us have. Everyone wants to know why I didn't buy Skyrim, despite putting up with Fallout 3's/New Vegas' problems - and it's because of how Bethesda handled the clusterfuck that was the Rage launch.

      • Thanks for that post, nice to know I'm not the only one bit by that. My oldest wanted that game soooo bad but finally gave up when even an upgrade to a hexacore and an HD4850 wasn't enough to keep the game from slowdowns. Considering the not-very-fancy graphics, having it slow down on even a dual-core wasn't acceptable, but on a hexacore with a 256-bit GPU? Give me a break! I finally showed him several other MMOs that had better graphics and more going on and said "Look bud, if these games don't glitch but that one does? it

      • the people you bought this game from probably don't have health insurance, and you are worried about YOUR rights. here are your rights.

        1. if you receive a defective product, return it to the store for your money back.

        eventually the 'invisible hand' of capitalism will take care of the problem: bad companies will go bankrupt and good companies with good products will succeed, because consumers like you refuse to buy products that don't work.

        are you saying you don't believe in capitalism?

        • What kind of bullshit straw man/red herring opening was that? Health care, Christ, you play that card? OK, fine. It's sad but irrelevant to the subject.

          1. You obviously haven't purchased a video game this century. Getting a refund for a digital download is going to happen right after hell freezes over and the Devil learns to ice skate. Also, it's computer software; the bugs don't fall out of the package when you break open the plastic seal. You don't get to throw your hands up in the air an

    • by Alarash ( 746254 )
      I don't know. Since (literally) the start of ATI, I've heard news like this one, or that some specific (often popular) games don't 'work properly.' It's one of the reasons I've never owned a single ATI video card and always went the 3dfx/nvidia route. I'm baffled that some people keep buying ATI, even if they are cheaper on a power:price comparison.
    • by drinkypoo ( 153816 ) on Sunday June 03, 2012 @12:30PM (#40201609) Homepage Journal

      How about forcing the game makers to TEST THEIR DAMN GAME

      Games often expose driver bugs. Major game developers are in communication with GPU vendors and when they discover bugs, the ones which turn out to be in the driver or the microcode sometimes get fixed, depending on how new the product is and whether the GPU is from Intel, AMD, or nVidia. nVidia has by far the best record in terms of working drivers, and also in terms of improving support for old hardware in new driver revisions. AMD is by far the worst. They have abandoned whole platforms while they were still shipping, for example R690M. I'm using a subnotebook based on it right now. Only thing it will run without shitting itself is Vista. And fglrx didn't support it when it was brand new, and still doesn't support it, and never will.

      Don't be so quick (or anonymous, or cowardly) to assume that it's the game developer's fault when a problem "with the game" is fixed with a driver update.

      • by Splab ( 574204 )

        I keep hearing people claim this about ATI/AMD; I must be the luckiest SOB in the world when it comes to buying hardware from ATI, I've never had trouble with any of my cards. Granted I run them under Windows.

        Nvidia on the other hand: I have a single GFX card sitting in my laptop and it is the crappiest piece of shit I've ever owned. The GFX driver keeps locking up, keeps crashing, and has extremely poor performance compared to its competitors.

        • Get another driver version. I've rolled mine back to the one from 2011 and it's pretty stable. It took me 4 new installs to get the one that worked, though...

          Note: I could choose between the one supplied by MS through Windows Update for my laptop, the one supplied by HP for my laptop (whose latest driver had a lower version number than the MS one), and the ones from NVidia. Since the older HP one refused to remove the latest update, I ended up with the older NVidia one. Pretty happy with it, it works okay now.

          But anyw

    • by mwvdlee ( 775178 )

      You fix third-party software... by modifying drivers?

      How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?

      Do you mean to tell us that all vendors combined only have four different graphics cards available?

    • by Sir_Sri ( 199544 ) on Sunday June 03, 2012 @12:38PM (#40201673)

      You fix third-party software... by modifying drivers?

      How about forcing the game makers to TEST THEIR DAMN GAME before releasing? Is it really so hard to throw together four test-beds with GPUs from different vendors?

      Having been on both sides of this.

      There are some functions, usually DirectX functions, that just do not behave properly with certain drivers. There is, in many cases, nothing you can do except ask the company to fix it. This is a double problem because a lot of the time they won't look at your game until it's finished, so if you finish on Friday and release on Tuesday, guess how much it's been looked at by nVIDIA or AMD.

      While you are writing your game, nVIDIA and AMD are writing new drivers and changing how their drivers behave, usually to accommodate someone else's release, but not necessarily. That's incredibly frustrating, because you may not know whether the bug is on your end or theirs, especially if it behaves differently between driver releases.

      For anyone who got the original version of The Witcher 2, you could see the problem with 'test their damn game'. There was a problem where ubersampling broke the ability to interact with objects. So the game came out with this problem, which was actually rarely hit because almost no one had a card capable of doing ubersampling (even a new GTX 680 today has slowdown with it). So AMD and The Witcher devs got onto fixing this problem. I think the problem was actually in how AMD was handling the sampling, but I'm not 100% sure. CD Projekt did a hack workaround patch that changed how they did the sampling slightly, and at the same time AMD issued a fix that wasn't compatible with the workaround. So you ended up in this situation where you're not even sure which solution you should be using as an end user.

      Sure, a lot of the releases basically exist to clarify which codepath a particular game should be rendered with, or which SLI/Crossfire profile it should use, which is relatively minor in the scheme of things. But it really is a problem on the driver end that games are all treated inconsistently... or maybe that's a feature; depends on your perspective. Treating games differently is a massive pain in the ass for development, but makes the experience much better for players, so take your pick.

    • If it has graphical glitches, yeah, that's probably the game. Poor performance? It depends. The problem with Rage is that it is OpenGL, and AMD has shitty GL drivers; they have for a long time. nVidia has long had GL and DX drivers that performed equally; AMD has long had GL problems (it used to be much worse than now).

      If it is BSODs or GPU driver crashes though? No, that is 100% on the graphics drivers. No matter what the program does, it shouldn't bring the system down. Anything running in Ring 3 can't bring the system down.

    • ...aaaaand PC gamers are wondering why I went Mac and console only for games rather than PC.

      The games are still buggy; now they're CONSISTENTLY buggy!

      • by MogNuts ( 97512 )

        Wrote this as a reply in another post here:

        Don't forget that console games have tons of bugs now too. And big huge flaws. The Skyrim save game issue? Bioshock always messing up widescreen? Rockstar grand theft everything. Silent Hill Downpour--the entire freaking game is full of bugs and hard locks.

        Anyway, I go back and forth on this. I don't know which solution is better. I think it's gotta be down to simply personal preference. I think both sides have their flaws.

        PC flaws:

        - I swear to god I'm so sick of upd

    • by Sark666 ( 756464 )

      Can't agree here at all. AMD (formerly ATI) had mediocre D3D drivers and were always notorious for having absolutely craptacular OpenGL drivers. That was over a decade ago and it's still the case, as seen by recent OpenGL releases (Brink and Rage). And most of those users freaked out blaming the developer of the game when it's the driver's fault. And it's not modifying the drivers, it's goddamn fixing them!

      I remember years ago Carmack said if he encountered a bug while testing on a nvidia machine he as

      • Actually Carmack admits that he built Rage for the consoles and that the PC simply wasn't given the same priority, so surprise! The PC version is shit. Water is wet, day comes after night, console ports suck balls, film at 11.

        Kinda ironic considering that Id "Games" (I'd call a lot of their later stuff fancy tech demos) have never been big hits on consoles, but instead have had the PC modding community make decent games out of their tech demos. But hey! Piss on your core audience and you shouldn't be surprised

  • by deweyhewson ( 1323623 ) on Sunday June 03, 2012 @11:48AM (#40201333)

    As someone who is generally an AMD fan - their processors and video cards generally provide much better performance for much cheaper - their driver support, or lack thereof, is frustrating. NVIDIA consistently has far better driver support, and features, than their AMD counterparts, even if their cards don't provide as much bang for the buck.

    If AMD falls even further behind in that game, I may just bite the bullet and switch to NVIDIA just to stop having to worry about driver-related frustrations altogether.

    • by DWMorse ( 1816016 ) on Sunday June 03, 2012 @11:51AM (#40201349) Homepage
      To date, my nForce motherboard can't hit sleep mode without the network card going full retard. You NEVER go full retard. For shame, Nvidia. It's been over 2 years and they still haven't released a fix. Nvidia has their share of issues too.
      • As someone who very recently switched back to AMD because recent Nvidia cards (including my own) have been giving me and others some annoying and only occasionally recoverable Purple Screens of Death*, I can't wait for a decent Company #3** to kick both their asses on driver size and reliability.

        *In my case, a GTX 460, after a year of use. After it started interrupting my Terraria games (even with motherboard settings changes) I thought it was time to recheck what others experienced; and after that, time f

        • Along with waiting for new and high-quality cellular companies and cable companies, don't hold your breath.

      • After years of frustration with crap drivers for ATI video cards and crap drivers for AMD chipsets from third parties (any of them) I finally switched to Intel CPU, Intel chipset, and Nvidia graphics cards. I even bit the bullet and got an Intel model motherboard and made sure the RAM I bought was on the list of tested RAM.

        I have had zero problems since I bought it in 2009. Intel DP55WG, Intel i7 860, EVGA GeForce GTX 260, 8GB of a supported SKU of Kingston RAM. The biggest problem I've had (knock on woo

      • If your ethernet controller is crappy, you'd better disable it and use an old 3Com 100Mb card or something.

      • by CAIMLAS ( 41445 )

        I have to agree with this. I've had so, so many problems with nForce hardware and drivers it's not even funny. nForce is a redheaded stepchild, to be sure. I've had quite a few problems, with pretty much anything that matters on the platform: the ethernet, chipset/disk, and video. Independent cards are still the way to go with nVidia.

        As a whole, I'd say ATI is much better about integrated products, but I'll take independent components which work over something which tends to like to fail, thanks. It's been

    • If you do, I suggest EVGA. Lifetime warranty on the cards, and if they manage to send a lemon replacement out (mutter) they'll send a Fedex guy to your door to retrieve and replace it.

      I got sick and tired of AMD/ATI back when Voodoo was still a pass-through board, and while their hardware has improved I'm never surprised to hear about bullshit with their drivers.

      • by Shinobi ( 19308 )

        Funny, EVGA (three graphics cards in succession with huge flaws: a fan not working properly, capacitors falling off while the computer hadn't been moved in 6 months, and the third had bad RAM chips) is on my list of "hardware to avoid at all costs", just like Antec PSUs (none of them lasted more than 6 months, unlike the Q-tec, the cheap piece of shit they were meant to replace, which is still alive to this day, 11 years after I bought it...), Gigabyte motherboards, etc.

    • by TheEyes ( 1686556 ) on Sunday June 03, 2012 @03:16PM (#40202779)

      Here's irony for you:

      -AMD supposedly releases driver updates on a monthly basis, though they haven't quite managed it for the last couple of years: sometimes not making the deadline, sometimes just releasing basically the same driver two months in a row, then releasing out-of-band updates when games break their cards.
      -nVIDIA has always released drivers "as needed".
      -AMD switches to releasing drivers "as needed".
      -Everyone complains, and threatens to switch to nVIDIA.

    • I've always bought 3dfx, then NVIDIA when they bought them out. I was always tempted to get ATI cards because of the price and performance but was always hesitant because of their driver issues.

      My NVIDIA card was having heating problems and needed a replacement so one day I finally said okay this ATI card is such a better deal (I think it was an X1900) and got that instead of a new NVIDIA card. Boy did I regret it. The card would overheat because the drivers wouldn't turn the fan up when it got hotter.
    • by sa1lnr ( 669048 )

      "I may just bite the bullet and switch to NVIDIA just to stop having to worry about driver-related frustrations altogether"

      I wouldn't bother, I did and I've had more issues with the GTX 280 than I ever did with the HD 4870.

    • Just don't download every driver; it's as simple as that. I've found that if I stick with the even driver numbers there are no hassles at all, but often their odd-numbered drivers are glitchy. Now that we know about TFA, it was probably because they were fixing bugs and just shoved the odd ones out the door, but I have 3 ATI cards in my family's PCs as well as an AMD APU in my netbook, and by just sticking with the even releases it's been smooth sailing all the way.

      Oh and if you like AMD you might want to snatch

  • by NotSoHeavyD3 ( 1400425 ) on Sunday June 03, 2012 @11:51AM (#40201353)
    With their constant rebranding of old boards I can never keep straight what the hell I'd be buying. (Is that 600-series board Kepler- or Fermi-based? Who can tell?)
    • by Trepidity ( 597 )

      While I agree it's pretty annoying, and certainly confusing for many consumers, it's not as if you can't tell what's inside a particular model, since it's pretty easy to find that information by googling.

    • Well, if you really are unable to do a minimal amount of research to find out, OK, I guess that's a reason not to buy, but I would think it wouldn't be too hard to just, you know, look shit up. nVidia's site is a good place and not hard to get to.

      Also if you are talking desktops, and I assume you are from the use of the term board, then you are talking nonsense. The rebranding has been in the laptop space, not the desktop space. With laptops they do have some mixed naming as there are 600 series parts from the

      • I'm mostly thinking of the NVidia 640 series (admittedly OEM), where some of them are Kepler and some are Fermi. (At least that's what I read on the Wikipedia list of all those units.) I mean, they have multiple GeForce 640s that are different cards with different chipsets.
      • Ultimately it really doesn't matter as what you should check are features and speed, not an arbitrary choice of what technology they use.

        Indeed. And with that in mind, I would be very interested if anyone can cite even a single credible source that compares "workstation" and "gamer" cards objectively from nVidia and/or AMD. You'll find a load of people who parrot the line that you "must" use the far more expensive workstation cards for certain kinds of professional applications, but few can really tell you why, and even those who do generally refer to drivers rather than any difference in the hardware. And that's before you even get into nVi

  • Bias much? (Score:5, Insightful)

    by neokushan ( 932374 ) on Sunday June 03, 2012 @11:53AM (#40201365)

    AMD says that they're moving from a monthly release cycle to a release-when-needed cycle and someone decides to write this piece of trash about it?
    It's not a bad thing; it makes sense to do it like this. As the summary points out, AMD currently releases out-of-band updates when a high-profile title has an issue or needs launch-day performance improvements, so it doesn't make sense to make another release that month that doesn't change much. That's just confusing and frankly unnecessary. Doing it "as needed" just means that when a driver release comes out, it's worth updating to. If that means I only have to update my drivers once every few months, I'm fine with that, even if it occasionally means there are 2 or 3 updates in the space of a month because a lot of games happened to come out then. Overall, it's better for everyone.

    Article is a big load of FUD and should be ignored.

    Disclaimer: I've currently got a Geforce 560 Ti in my desktop and my laptop uses a Geforce 555M chipset - frankly, I'm an nvidia fanboy and this article still disgusts me.

    • Ya, I read it pretty much the same way: this isn't so much tech journalism as it is an opinion piece about a recent decision by AMD.
    • by cynyr ( 703126 )

      So when will Linux drivers be needed? It sure won't be to fix Windows games under Wine... Anyway, that is my concern about this statement. I'm still not running xorg-server-1.12 on my one AMD machine because FGLRX doesn't support it yet.

      • I don't think this statement has any bearing on their Linux driver support. Linux driver support from both Nvidia and AMD could be a lot better than it currently is, but I don't see how this is going to make support any worse.

    • by DudemanX ( 44606 )

      The article at Anandtech is less ominous and explains why this is actually a good thing with video chips and drivers as complicated as they are today. []

      What the summary and article from the submitter are missing is the term WHQL. AMD has and always will be releasing beta drivers to fix games as needed just as Nvidia does. What they are stopping with this announcement is halting the monthly WHQL releases. To ge

  • If they released software and patches when they were done instead of on an artificial time schedule.

  • by Osgeld ( 1900440 ) on Sunday June 03, 2012 @12:32PM (#40201627)

    heh ok, wonder how bad the demand for that is...

    "check it out I got a i7 extreme fucking overclocked, 32 gigs of ram overclocked, quad ATI's also overclocked, 4 SSD's in RAID, and Windows XP cause DX9 is the shit yo"

    'cause no one plays Crysis for the game; it's an epeen ruler.

    • Not the case for Crysis 2. Crytek scaled it back a lot, and actually focused on making a game rather than just a tech demo. When it came out, it was DX9 only. Later they released a patch that introduced DX11 support and had some bigger textures, but at launch it was a DX9 title.

    • by MogNuts ( 97512 )

      I take issue with that. Did you even play Crysis 1? Most people who've said that here on /. are just repeating that meme and haven't played it.

      - The freedom to choose multiple routes on entire freaking huge areas instead of on-rails.

      - Freedom unlike pretty much every shooter out now to kill in different ways. Want to be stealthy? Go cloak, sneak up behind a guy, turn on maximum strength, drag him to a cliff and throw him off. Or be entirely a different way: hop on top of a hut, turn on maximum strength, pou

      • by Osgeld ( 1900440 )

        Actually, yes, I played Crysis 1. It was just like every other shooter on the market at the time, and fuck you if you dared stick your head out of a bush for half a ms, 'cause a sniper 6 miles away would headshot you every single time. Wonky vehicle physics, dumb-as-shit AI... aside from looking pretty and having a hype machine set into overdrive, it really was a just-below-average shooter for the time.

        Never mind, we're talking about Crysis 2 here and not the first game.

  • Not everyone is a 3D gamer who wants to be on the absolute cutting edge of everything. Not everyone thinks trading off stability against a few extra FPS is a good deal.

    Would it be too much to give us a stable driver, with maybe one update per year? By stable I mean no dodgy hacks, and no game-specific "optimizations". I mean a driver that won't crash, and that isn't afraid to be a little slower in order to do things right. Is there really no one else out there who cares about stability?

  • I've had stability problems with both ATI cards that I've owned recently. The older one was a 4xxx series that was totally inadequate, and my current one is a Radeon HD 6670 which should be adequate for most things, but really doesn't provide a smooth experience in Skyrim under Windows 7 nor is it the best in Linux. Compositing under KDE is not stable with this card. I don't use the closed source driver, however, under Linux, but I don't feel that I should need to use the Catalyst driver just to get KDE'
  • if your OEM locks you out of driver updates in the first place? I've had no end of frustration with my Lenovo laptop and the fact that they unlock a new AMD driver about once per year.

  • I have been building computers since the early 1990s. There has always been a segment of the community that prefers AMD to (Intel / Nvidia / etc) and I have never understood why. I have tried both AMD CPUs and video cards over the years and always end up going back to Intel, and more recently Nvidia. It seems like AMD just cannot get it right when it comes to the gaming market. They often win on pricepoint, but completely fail on issues like what this article mentions.

    Why do people continue to support A

    • Why do people continue to support AMD? All I can figure is that it has to do with an irrational hatred of Intel. Intel is monopolistic. Intel is anti-competitive. Intel is expensive. So, rather than supporting Intel and getting the best products on the market, people go with AMD and suffer in smug self-righteousness for doing the "right thing" and not supporting the companies that are dominating the field.

      This article is about graphics drivers, not CPUs. Intel doesn't make any discrete graphics cards. And

  • You can still get the latest 'leaks' at places like - it sucks for people who don't know about this and people counting on (say) Steam Catalyst auto-update, but if I'm having issues with ATI or Nvidia drivers I go there first.

  • About 10 years ago, I was using a Radeon 9600XT. At the time, it was well-known that ATI's drivers sucked, but that their hardware was better for the price you paid. So, with the release of the 9000 series, ATI (now AMD, as you probably know) made a big deal about how they were starting to overhaul and vastly improve their driver quality.


    10 years later, they're still not giving their drivers nearly the attention they deserve, and it seems evident that they simply don't consider them a high priority.

  • What about Linux drivers and support? ATI/AMD is supposed to be better in this area, but I haven't seen much compared to NVIDIA's awesome driver support. What about the rest of you?

"We don't care. We don't have to. We're the Phone Company."