AMD Graphics Technology

AMD Next-Gen Graphics May Slip To End of 2013

MojoKid writes "AMD has yet to make an official statement on this topic, but several unofficial remarks and leaks point in the same direction: contrary to rumor, there won't be a new GCN 2.0 GPU out this spring to head up the Radeon HD 8000 family. This breaks with a pattern AMD has followed for nearly six years. AMD recently refreshed its mobile product lines with HD 8000M hardware, replacing some old 40nm parts with new 28nm GPUs based on GCN (Graphics Core Next). On the desktop, it's a different story. AMD is already shipping 'Radeon HD 8000' cards to OEMs, but these cards are based on HD 7000 cores with new model numbers; RAM, TDP, core counts, and architectural features are all identical to the HD 7000 lineup. GPU rebadges are nothing new, but this is the first time in at least six years that AMD has rebadged the top end of a product line. Obviously any delay in a cutthroat market against Nvidia is less than ideal, but consider the problem from AMD's point of view. We know AMD built the GPU inside the Wii U. It's also widely rumored to have designed the CPU and GPU for the Xbox Durango, and possibly both of those components for the PS4 as well. It's possible, if not likely, that the company has opted to focus on the technologies most vital to its survival over the next 12 months." Maybe the Free GNU/Linux drivers will be ready at launch after all.
  • Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.

    Whatever happened to the Unlimited Detail guys?

    • by Anonymous Coward

      Whatever happened to the Unlimited Detail guys?

      They seem to have bailed on the gaming side of things. The whole concept has problems when you consider things like animation, opacity, reflections, multiple varying light sources, shadows, etc. I'm not saying they couldn't solve it, but everything they demoed was the sort of stuff we can already do - just look on YouTube for voxel renderers - and they omitted all the tricky things. Moreover, their explanation of how it works means reflections, opacity, and shadows don't even work in their paradigm. Some of...
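
      To put rough numbers on that cost argument - the figures below are purely illustrative assumptions, not anything from their demos - here is a sketch of how secondary effects multiply the per-frame workload for any trace-the-point-cloud renderer:

        # Rough illustration of why secondary effects are the hard part for a
        # "search the point cloud once per pixel" renderer. All numbers are
        # hypothetical; the point is the multiplication, not the exact figures.

        width, height = 1920, 1080
        lights = 4           # one shadow query per light per hit (assumed)
        reflect_bounces = 1  # one mirror bounce per pixel (assumed)

        primary = width * height                                # one search per pixel
        shadows = primary * lights                              # hit -> light visibility
        reflections = primary * reflect_bounces * (1 + lights)  # bounce + its shadows

        print(f"primary only:  {primary:>12,}")
        print(f"with shadows:  {primary + shadows:>12,}")
        print(f"all together:  {primary + shadows + reflections:>12,}")
        # ~2M searches for primary visibility alone; four lights and one
        # reflective bounce push it past 20M searches per frame.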

    • Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.

      Whatever happened to the Unlimited Detail guys?

      In a way, we do need a more powerful GPU, but not the way they are doing it.

      Simply adding shader units or ramping up the clock speed no longer does the job.

      A total overhaul of the GPU mindset must take place, but that takes much more than the hardware guys (AMD/Nvidia); it also takes a paradigm shift among graphics programmers to push for real change.

    • Comment removed based on user account deletion
  • I have a Southern Islands card that will likely never have a usable open source graphics driver, so I am never buying AMD again. I can get way better video from Intel integrated graphics and those nice Intel open source drivers than I can from a 6-core AMD proc with an SI card. I am done with AMD.

    • What do you mean, "never"? It's already usable for 2D. 3D will probably take a while longer, but it's still a very recent card by open driver development standards. Support will probably only get better with time, and I'm hoping that talk on Phoronix forums about syncing the development of open drivers with Catalyst for the 8xxx or 9xxx cards will bring us better support.

      While I agree with you that right now Intel is the only way to go if you're dead set on using open drivers, making future purchase plans...

  • ...if it means those of us with Radeon HD 5000 through 8000 series GPUs get a little more life before AMD arbitrarily labels them "legacy" so that they can stop paying their engineers to develop drivers for them (like they recently did with the HD 2000 through 4000 series)... :p
    • Yeah. I'm the fearful owner of an HD 5xxx. If we can expect only about five years of support, we're fucked. GPUs should be supported for about ten years, minimum. Especially now, when pretty much any discrete card from the past decade is sufficient for compositing. If they did like Nvidia and released updated legacy drivers whenever Xorg needed them, I wouldn't be pissed. (Having said that, Nvidia refuses to release a fix for the FX and 6xxx lines on Gnome 3/Cinnamon/Unity, which is disconcerting.)

      • I'm not convinced that paying $50 for a graphics card should qualify you for 10 years of active driver support. Anybody using a card that old doesn't care about performance, which means it was likely a low-end model in the first place. AMD has enough financial issues at the moment; funding driver development so users don't need to pay for an upgrade is not in its best interests.
      • Only five years of support is shitty, no question. Out of curiosity, what are the GeforceFX and Geforce 6 cards doing wrong in Gnome 3? The Geforce6 cards were just bumped to legacy support last month, but the FXes have been in limbo for a lot longer... and Nvidia's failure to ever release a driver better than beta quality for NT 6 was pretty fucking irritating.
        • Geforce FX, 6000 series, and 7000 integrated chipsets draw all kinds of multicolored garbage in GTK3 DEs with open drivers and, with closed drivers, they draw garbage and either hang or are unusably slow (not being hyperbolic - I mean actually taking minutes to draw a window). Nvidia acknowledged the issue, but stated it's not their problem; legacy support is only for Xorg ABI changes. Nouveau, on the other hand, is understaffed, receives no official help from them, and has been going through a rewrite, so...

          • Yeesh. That's almost as heartbreaking as finding an ancient box I'd built for college two years ago, booting it up, installing Slackware, and discovering that no one ever bothered to fix the driver for its Rendition Verite 2200 to support 2D acceleration. At this point I wouldn't hold my breath for a fix; I'd just switch to Xfce and swear oaths of vengeance. It's probably more productive than waiting for this to get fixed on either side of the driver support pool.
    • Wow... am I really getting modded-down by AMD shills??!
      • Of course, I could've been unfairly modded-up by nVidia and Intel shills... just the luck of the draw, I guess. :p
        • by Anonymous Coward

          How do you know it was shills? It could have been fanboys.

      • Wow... am I really getting modded-down by AMD shills??!

        They may not be shills, they may just be fanboys. Happens to me when I tell the truth about AMD/ATI, too.

      • Wow... am I really getting modded-down by AMD shills??!

        Meanwhile, at AMD headquarters:
        Peterson: Sir, sir!
        Rory Read: What is it Peterson?
        Peterson: It's terrible! There is a guy...a free thinking radical! On slashdot, he is suggesting we...
        Rory Read: We what!?
        Peterson: ...we support our products a little longer.
        Rory Read: Oh my god! Quickly Peterson, hire some people to get onto this 'slash dot', you must find a way to suppress this person! We need to devote resources to silencing such an opinion!

    • Comment removed based on user account deletion
      • I'm gaming with an HD 4850 now and it's fine...

        Apparently you haven't tried using the HD 4850 with any kernels higher than 3.4...

        • Comment removed based on user account deletion
          • Correct me if I'm wrong but I was under the impression that this issue isn't directly related to kernels 3.5 and above per se, but rather that AMD is refusing to compile their legacy driver for x.org 1.13 (which is perhaps indirectly related to the kernel version). Maybe someone can shed additional light on the subject...?
            • Comment removed based on user account deletion
              • To be completely fair, that's not a just comparison. Windows should be compared to a distribution, not to the Linux kernel. Windows $version is a fixed release, with well-defined ABIs. The same is true for any stable version of Debian or CentOS. What happens is that Linux is in constant development and the myriad distros advance too fast. If we all ran RHEL, we would have absolutely no problem with Xorg ABI changes and drivers. But developers build for the latest libraries, thus the distros have to keep fairly...

              • they work fine in windows

                What the fuck are you smoking?!

                • Okay, admittedly Radeon HD 2000 through 4000 drivers work great in Windows (aside from the fact that they require .NET for full functionality), but apparently you haven't been paying attention to the recent fiasco involving their more recent GPUs under Windows.

                  Anyway, you've made it perfectly clear that you haven't actually looked into the HD 2000-4000/Linux/X.Org issue; it's about time you bowed out stage left on the subject...

  • by rrhal ( 88665 ) on Monday February 11, 2013 @09:09PM (#42867647)

    AMD announced today that it would have a statement clarifying this. Apparently these rumors are not all true.

    • I'm just gonna put it on the table...

      GTA5 delayed until Sept 17...
      Rayman Legends delayed until Sept 17...
      Feb 20 Sony PS4 announcement, AMD chips, scaled up production (for Sept release?)

      And to top off the wishlist category:
      Valve will release a console in 2013... PS4 will be a Steam "Premium" unit.
      /end_wild_speculation

  • So, what I'm hearing is that AMD will be releasing its new line of video cards right around the Christmas season, when a lot of people get new systems anyway? I've never understood why Nvidia and ATI release their first cards around spring. Sure, get the bugs out early, I guess, and there's got to be a bunch of young kids with summer jobs willing to put all their profit towards a new gaming rig, but I still find it hard to believe that it isn't more profitable to just release the cards around October-ish...
  • AMD has definitively said that it will not be releasing 8000-series GPUs this quarter, and possibly not even this year... No need for "several unofficial remarks"...

  • I was really looking forward to selling my HD 6990 with a waterblock for enough to offset the cost of a new [or a couple new] 8xxx series card(s) but now it doesn't look so promising. Dangit, dagnabbit, GRRRR.... cry..... I was GOING to ebay it for about 550 bucks for the combo [well worth it] which means a new one would have only cost me about 300 bucks or so minus the water block. Now it's going to be about a hundred less and that really does suck.
    • I have been looking to get a second-hand graphics card (either a 7970 or a GTX 680) on eBay, and I can tell you that cards with waterblocks typically sell for less than ones with stock air cooling. There are not a whole lot of people with full-loop watercooling in the first place, and they are a bunch that typically wants the latest and greatest in hardware. The only ones that would be interested in your card are a handful of people looking to add a second 6990 who already have one with the same waterblock...
    • by Omestes ( 471991 )

      Don't take this as an attack; I'm curious why you actually need an 8000 series card, and why you need water cooling on your present card?

      I have a single, stock-cooled, non-OC 5770, and can run pretty much every game on maximum settings (or rather, any game that doesn't choke on AMD GPUs). Why would you need much more, unless you're using your GPU for calculation, or mining bitcoins or something? I used to be a big graphics bleeding-edger, but thanks to everything being tied to ancient console hardware, I...

      • I have two rigs, one with an OC'd 7870 and one with an OC'd 6970, and neither of them can run the newest games with full AA at 30+ fps. Unless you are gaming at 800x600 or consider 10 fps a playable frame rate, there is no way your 5770 can run "pretty much every game on maximum settings".
        • Re: (Score:2, Flamebait)

          by Khyber ( 864651 )

          AA and AF are shit things to concern yourself with.

          With those off, every game I play can be maxed out on everything else on my GTX 460 at 60+ FPS. Hell, I can almost reach that on my old 9800 GTX+.

          And on a 32" 1080p monitor, sitting 5 feet away, using a GPU with a huge chunk of RAM, you don't need to worry about AA or AF. You're not seeing jaggies unless the models suck.
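
          For what it's worth, that claim roughly checks out on paper. A quick back-of-the-envelope calculation (assuming the usual rule of thumb that 20/20 vision resolves about one arcminute):

            import math

            # Can you resolve a single pixel on a 32" 1080p panel from 5 feet?
            diagonal_in = 32.0
            h_pixels, v_pixels = 1920, 1080
            viewing_distance_in = 5 * 12  # 5 feet

            aspect = h_pixels / v_pixels
            width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
            pixel_pitch_in = width_in / h_pixels  # ~0.0145" (~0.37 mm)

            # Angle one pixel subtends at the eye, in arcseconds
            angle_arcsec = math.degrees(math.atan2(pixel_pitch_in,
                                                   viewing_distance_in)) * 3600
            print(f"one pixel subtends ~{angle_arcsec:.0f} arcsec")
            # ~50 arcsec, just under the ~60 arcsec that 20/20 vision resolves,
            # so individual jaggies really are near the edge of visibility there.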

        • by Omestes ( 471991 )

          ... full AA at 30+ fps

          That might be it; I keep AA down a notch since it is the feature with the highest requirements for the smallest effect. I honestly can't tell the difference (in game) between all the new alphabet-soup AAs and bog-standard AA. I've come to the conclusion that they are largely a marketing thing, though most of the time I can use whatever FXAA DMAA PPAA WTFBBQAA they have. And generally autodetect throws me into max, at least for the games I play. Perhaps I've saved as well because I don't just do "max...

      • Don't take this as an attack; I'm curious why you actually need an 8000 series card, and why you need water cooling on your present card

        I don't take anything as attacks on this site. I really don't care what people think, say, or do :P But the reason I want to sell it is not so much lack of performance, as it is still a really fast card, but worth and age. I've had this one for well over a year and a half now, and one of the games I play, SWTOR, hates it, and by hates it I mean HATES it [oh, it still gets 100+ FPS with everything maxed, but it is anything BUT stable :(]. I play all games at my primary monitor's native resolution [and s...

  • by LordNimon ( 85072 ) on Monday February 11, 2013 @11:25PM (#42868373)

    I live in Austin. The only thing that AMD is known for around here is layoffs. I'm surprised they have any engineers left to work on their products. Why anyone would work for them is a mystery to me.

    • Perhaps engineers who wish to get a job would work for them? Those that understand AMD isn't firing people for the lulz?
      • Perhaps engineers who wish to get a job would work for them? Those that understand AMD isn't firing people for the lulz?

        Well no, AMD is firing people for the lulz. They hired 'em on the same basis. This is not your father's AMD.

    • Not sure how they've reorganized themselves; however, AMD *was* a CPU maker, not a GPU maker. They bought out the Canadian company ATI, which was Nvidia's only real competitor, and eventually rebranded the whole thing as AMD. ATI makes the GPU. So unless AMD is new to Austin, or they have combined operations across locations, they are likely not one and the same. From what I understand, ATI was a pretty cool company.

  • by Anonymous Coward

    AMD uses TSMC for its stand-alone GPUs, as does Nvidia. TSMC has been having the greatest difficulty making these very complex chips. Meanwhile, other foundries, like GF, are making great strides in chip technology.

    Nvidia and AMD have the choice of going for another round of parts on the same process at TSMC, with only modest improvements at best, or waiting for a 'shrink'. Neither AMD nor Nvidia feels much market pressure at this time, since their high-end parts are already way too powerful for all the current...

    • by Anonymous Coward

      Powerful PC GPU hardware will set far render distances

      That approach is old hat now. Modern games don't have far clip planes anymore, but render everything to "infinity". Objects just become less distinct with distance, same as in real life.

      Guild Wars 2 is a typical example of an MMO with a modern rendering engine. You can stand on a high mountain pass and see everything to arbitrary distances, and objects don't suddenly "pop" into view as you approach like in the bad old days. The technology doesn't even...
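
      The usual trick behind "no far clip plane" is distance-based level of detail with a cross-fade so nothing pops. A minimal sketch (toy numbers of my own, not GW2's actual engine):

        # Pick a level of detail by distance and cross-fade between levels.
        LOD_RANGES = [50.0, 200.0, 1000.0]  # meters; beyond the last -> impostor
        FADE_BAND = 0.15                    # fade over the last 15% of each range

        def select_lod(distance):
            """Return (lod_index, blend); blend in [0, 1] fades toward
            the next-coarser level near each range boundary."""
            for lod, limit in enumerate(LOD_RANGES):
                if distance <= limit:
                    fade_start = limit * (1.0 - FADE_BAND)
                    if distance <= fade_start:
                        return lod, 0.0
                    return lod, (distance - fade_start) / (limit - fade_start)
            return len(LOD_RANGES), 0.0  # billboard/impostor all the way out

        for d in (10, 190, 5000):
            print(d, select_lod(d))
        # 10 m -> full detail; 190 m -> blending toward the next LOD; 5 km -> impostor.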

  • AMD has also recently said it has neither the ability nor plans to compete with Intel on high-end desktop processors, either. Its top-of-the-line FX-8350 is only modest competition for Intel's midrange.
