AMD Next-Gen Graphics May Slip To End of 2013
MojoKid writes "AMD has yet to make an official statement on this topic, but several unofficial remarks and leaks point in the same direction. Contrary to rumor, there won't be a new GCN 2.0 GPU out this spring to head up the Radeon HD 8000 family. This breaks with a pattern AMD has followed for nearly six years. AMD recently refreshed its mobile product lines with HD 8000M hardware, replacing some old 40nm parts with new 28nm GPUs based on GCN (Graphics Core Next). In desktop, it's a different story. AMD is already shipping 'Radeon HD 8000' cards to OEMs, but these cards are based on HD 7000 cores with new model numbers. RAM, TDP, core counts, and architectural features are all identical to the HD 7000 lineup. GPU rebadges are nothing new, but this is the first time in at least six years that AMD has rebadged the top end of a product line. Obviously any delay in a cutthroat market against Nvidia is a non-optimal situation, but consider the problem from AMD's point of view. We know AMD built the GPU inside Wii U. It's also widely rumored to have designed the CPU and GPU for the Xbox Durango and possibly both of those components for the PS4 as well. It's possible, if not likely, that the company has opted to focus on the technologies most vital to its survival over the next 12 months."
Maybe the Free GNU/Linux drivers will be ready at launch after all.
C'mon, folks. (Score:1)
Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.
Whatever happened to the Unlimited Detail guys?
Re: (Score:1)
Whatever happened to the Unlimited Detail guys?
Seem to have bailed on the gaming side of things; the whole concept has problems when you consider doing things like animation, opacity, reflections, multiple varying light sources, shadows, etc... I'm not saying they couldn't solve it, but everything they demoed was the sort of stuff we can already do - just look on YouTube for voxel renderers - and they omitted all the tricky things. Moreover, their explanation of how it works means reflections, opacity and shadows don't even work in their paradigm. Some of
Re: (Score:2)
Personally I like both more polygons and good art direction. Maybe some talented artists as well.
Re: (Score:2)
Personally I like both more polygons and good art direction. Maybe some talented artists as well.
Are you willing to pay two or three times more per copy for such a game?
Re: (Score:2)
Are you willing to pay two or three times more per copy for such a game?
Yes. I'll even pony up for a new video card if I like the game enough.
Re: (Score:2)
Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.
Whatever happened to the Unlimited Detail guys?
In a way, we do need a more powerful GPU, but not the way they are doing it.
Simply adding shader units or ramping up the clock speed no longer does the job.
A total overhaul of the GPU mindset must take place, but it takes much more than the hardware vendors (AMD/Nvidia); it also takes a paradigm shift among graphics programmers to push for real change.
No thanks (Score:1)
I have a Southern Islands card that will likely never have a usable open source graphics driver, so I am never buying AMD again. I can get way better video from Intel integrated graphics and those nice Intel open source drivers than I can from a 6-core AMD processor with an SI card. I am done with AMD.
Re: (Score:1)
Performance? You sure you didn't mean Nvidia? ;)
Re: (Score:2)
The OP looks for open source drivers.
Re: (Score:3)
What do you mean, "never"? It's already usable for 2d. 3d will probably take a while longer, but it's still a very recent card by open driver development standards. Support will probably only get better with time, and I'm hoping that talk on Phoronix forums about syncing the development of open drivers with Catalyst for the 8xxx or 9xxx cards will bring us better support.
While I agree with you that right now Intel is the only way to go if you're dead set on using open drivers, making future purchase plans
Re: (Score:2)
What do you mean, "never"? It's already usable for 2d. 3d will probably take a while longer,
"Never" means "certainly not while the card still runs modern software".
but it's still a very recent card by open driver development standards.
But not by any reasonable, objective standard.
Re: (Score:2)
What do you mean, "never"? It's already usable for 2d. 3d will probably take a while longer, but it's still a very recent card by open driver development standards.
I was under the impression that 3D was required before 2D on the SI cards due to them relying on Glamor
Re: (Score:2)
Either that's no longer true or they got enough 3d stuff working to support 2d already:
http://cgit.freedesktop.org/xorg/driver/xf86-video-ati/commit/?id=a60d2152e928a7011fc7c44a885a34c3cdd4f0fe [freedesktop.org]
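For context, the core Glamor idea (implementing 2D X rendering on top of the GPU's 3D engine) can be sketched in miniature. This is a toy illustration with invented names; real Glamor sits on OpenGL/EGL, not on a software rasterizer like this one. The point is just that a 2D rectangle fill can be expressed as two triangles pushed through a 3D-style pipeline:

```python
# Toy sketch of the Glamor approach: 2D operations implemented via a 3D
# rasterizer. All names here are invented; real Glamor drives OpenGL/EGL.

def edge(ax, ay, bx, by, px, py):
    """Cross-product sign: which side of the directed edge a->b is p on?"""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def raster_triangle(fb, w, tri, color):
    """Fill one clockwise-wound triangle (y-down screen coords) into a
    flat, row-major framebuffer that is w pixels wide."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    for y in range(min(y0, y1, y2), max(y0, y1, y2)):
        for x in range(min(x0, x1, x2), max(x0, x1, x2)):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            if (edge(x0, y0, x1, y1, px, py) <= 0 and
                    edge(x1, y1, x2, y2, px, py) <= 0 and
                    edge(x2, y2, x0, y0, px, py) <= 0):
                fb[y * w + x] = color

def fill_rect_via_3d(fb, w, x, y, rw, rh, color):
    """The Glamor-style trick: a 2D rect fill becomes two triangles."""
    raster_triangle(fb, w, [(x, y), (x, y + rh), (x + rw, y + rh)], color)
    raster_triangle(fb, w, [(x, y), (x + rw, y + rh), (x + rw, y)], color)

fb = [0] * (8 * 8)
fill_rect_via_3d(fb, 8, 2, 2, 4, 4, 1)  # fills exactly the 4x4 region
```

Whether the 2D path needs the full 3D stack working first (the question above) comes down to exactly this dependency: the "2D" driver is just issuing 3D draws.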
This may not be so bad... (Score:2)
Re: (Score:1)
Yeah. I'm the fearful owner of an HD 5xxx. If we can expect only about five years of support, we're fucked. GPUs should be supported for about ten years, minimum. Especially now, when pretty much any discrete card from the past decade is sufficient for compositing. If they did like Nvidia and released updated legacy drivers whenever Xorg needed them, I wouldn't be pissed. (Having said that, Nvidia refuses to release a fix for the FX and 6xxx lines under Gnome 3/Cinnamon/Unity, which is disconcerting.)
Re: (Score:2)
GeForce FX, 6000 series and 7000 integrated chipsets draw all kinds of multicolored garbage in GTK3 DEs with open drivers and, with closed drivers, they draw garbage and either hang or are unusably slow (not being hyperbolic - I mean actually taking minutes to draw a window). Nvidia acknowledged the issue, but stated it's not their problem. Support for legacy is only for Xorg ABI changes. Nouveau, on the other hand, is understaffed, receives no official help from them and has been going through a rewrite, so
Re: (Score:1)
How do you know it was shills? It could have been fanboys.
Re: (Score:2)
Did you consider that the "legacy" label isn't arbitrary and is based on a predetermined lifetime? And that they aren't going to continue supporting old technology forever?
Did you consider that you might not have a clue what you're talking about? Radeon HD 4000 GPUs were shipping in brand new machines up until very recently...
Re: (Score:2)
Wow... am I really getting modded-down by AMD shills??!
They may not be shills, they may just be fanboys. Happens to me when I tell the truth about AMD/ATI, too.
Re: (Score:2)
Wow... am I really getting modded-down by AMD shills??!
Meanwhile, at AMD headquarters:
Peterson: Sir, sir!
Rory Read: What is it Peterson?
Peterson: It's terrible! There is a guy...a free thinking radical! On slashdot, he is suggesting we...
Rory Read: We what!?
Peterson: ...we support our products a little longer.
Rory Read: Oh my god! Quickly Peterson, hire some people to get onto this 'slash dot', you must find a way to suppress this person! We need to devote resources to silencing such an opinion!
Re: (Score:2)
Shills, fanboys, whatever. Geez! :)
Re: (Score:2)
I'm gaming with an HD4850 now and it's fine...
Apparently you haven't tried using the HD 4850 with any kernels higher than 3.4...
Re: (Score:2)
To be completely fair, that's not an apt comparison. Windows should be compared to a distribution, not to the Linux kernel. Windows $version is a fixed release, with well-defined ABIs. The same is true for any stable version of Debian or CentOS. The problem is that Linux is in constant development and the myriad of distros advance too fast. If we all ran RHEL, we would have absolutely no problem with Xorg ABI changes and drivers. But developers build for the latest libraries, thus the distros have to keep fairl
Re: (Score:2)
they work fine in windows
What the fuck are you smoking?!
Re: (Score:2)
Anyway, you've made it perfectly clear that you haven't actually looked into the HD2000-4000/Linux/X.Org issue; it's about time you bowed out to stage left on the subject...
Wait a week (Score:4)
AMD announced today that they would have a message clarifying this. Apparently these rumors are not all true.
Re: (Score:3)
I'm just gonna put it on the table...
GTA5 delayed until Sept 17...
Rayman Legends delayed until Sept 17...
Feb 20 Sony PS4 announcement, AMD chips, scaled up production (for Sept release?)
And to top off the wishlist category: Valve will release a console in 2013... PS4 will be a Steam "Premium" unit.
/end_wild_speculation
Might be better for profits (Score:1)
Re:Might be better for profits (Score:5, Interesting)
failure to fact check (Score:2)
AMD has definitively said that they will not be releasing 8000 series GPUs this quarter, or possibly not even this year.... No need for "several unofficial remarks"....
Sigh...... (Score:1)
Re: (Score:1)
Don't take this as an attack; I'm curious why you actually need an 8000 series card, and why you need water cooling on your present card?
I have a single, stock cooled, non-OC 5770, and can run pretty much every game on maximum settings (or rather, any game that doesn't choke on AMD GPUs). Why would you need much more, unless you're using your GPU for calculation, or mining bitcoins or something? I used to be a big graphics bleeding-edger, but thanks to everything being tied to ancient console hardware, I
Re: (Score:2, Flamebait)
AA and AF are shit things to concern yourself with.
With those off, every game I play can be maxed out on everything else on my GTX 460, 60+ FPS. Hell, I can almost reach that on my old 9800GTX+
And on a 32" 1080p monitor, sitting 5 feet away, using a GPU with a huge chunk of RAM, you don't need to worry about AA or AF. You're not seeing jaggies unless the models suck.
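The parent's viewing-distance claim is easy to sanity check numerically. A rough sketch, assuming the stated 5-foot (60-inch) distance and the common rule of thumb that 20/20 acuity resolves about 60 pixels per degree of visual angle (the threshold value is the assumption here, not anything from the thread):

```python
import math

# Pixels per degree of visual angle for a 32" 16:9 1080p panel viewed
# from 60 inches. Above roughly 60 px/deg, aliasing becomes hard to see.

def pixels_per_degree(diag_in, h_res, aspect_w, aspect_h, dist_in):
    # Physical panel width recovered from the diagonal and aspect ratio.
    width = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    # Horizontal field of view the panel subtends at the eye, in degrees.
    fov = 2 * math.degrees(math.atan((width / 2) / dist_in))
    return h_res / fov

ppd = pixels_per_degree(32, 1920, 16, 9, 60)
print(round(ppd, 1))  # ~73 px/deg, above the ~60 px/deg acuity mark
```

So the geometry does back the parent up: at that size and distance, the pixel grid sits just past what a typical eye resolves.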
Re: (Score:2)
... full AA at 30+ fps
That might be it; I keep AA down a notch since it is the feature with the highest requirements for the smallest effect. I honestly can't tell the difference (in game) between all the new alphabet-soup AAs and bog-standard AA. I've come to the conclusion that they are largely a marketing thing. Though most of the time I can use whatever FXAA DMAA PPAA WTFBBQAA they have. And generally autodetect throws me into max, at least for the games I play. Perhaps I've saved as well because I don't just do "max
Re: (Score:2)
Don't take this as an attack; I'm curious why you actually need an 8000 series card, and why you need water cooling on your present card
I don't take anything as attacks on this site. I really don't care what people think, say, or do :P But the reason I want to sell it is not so much for lack of performance, as it is still a really fast card, but for worth and age. I've had this one for well over a year and a half now and one of the games I play hates it, SWTOR, and by hates it I mean HATES it [oh it still gets 100+ FPS with everything maxed but it is anything BUT stable :(]. I play all games at my primary monitors native resolution [and s
Do they have any engineers left? (Score:3, Informative)
I live in Austin. The only thing AMD is known for around here is layoffs. I'm surprised they have any engineers left to work on their products. Why anyone would work for them is a mystery to me.
Re: (Score:2)
Perhaps engineers who wish to get a job would work for them? Those that understand AMD isn't firing people for the lulz?
Well no, AMD is firing people for the lulz. They hired 'em on the same basis. This is not your father's AMD.
Re: (Score:2)
Not sure how they've reorganized themselves, but AMD *was* a CPU maker, not a GPU maker. They bought out the Canadian company ATI, nVidia's only real competitor, and eventually rebranded the whole thing as AMD. ATI makes the GPU. So unless AMD is new to Austin, or they have combined operations across locations, likely they are not one and the same. From what I understand, ATI was a pretty cool company.
Waiting for the process shrink (Score:1)
AMD uses TSMC for its stand-alone GPUs, as does Nvidia. TSMC has been having the greatest difficulty making these very complex chips. Meanwhile, other foundries, like GF, are making great strides in chip technology.
Nvidia and AMD have the choice of going for another round of parts on the same process at TSMC, with only modest improvements at best, or waiting for a 'shrink'. Neither AMD nor Nvidia feels much market pressure at this time, since their high-end parts are already way too powerful for all the curre
No more distant clip planes and popups (Score:1)
That approach is old hat now. Modern games don't have far clip planes anymore, but render everything to "infinity". Objects just become less distinct with distance, same as in real life.
Guild Wars 2 is a typical example of an MMO with a modern rendering engine. You can stand on a high mountain pass and see everything to arbitrary distances, and objects don't suddenly "pop" into view as you approach like in the bad old days. The technology doesn't eve
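What the parent describes is usually plain distance fog plus level-of-detail rather than anything exotic: distant fragments are blended toward the sky color so objects fade rather than pop. A minimal sketch of exponential fog blending (function names and the density constant are invented for illustration, not taken from any particular engine):

```python
import math

# Exponential distance fog: instead of clipping geometry at a far plane,
# blend each fragment toward the fog/sky color as its distance grows, so
# far-off objects fade out instead of popping into view at the clip plane.

def apply_fog(obj_color, fog_color, distance, density=0.002):
    # Fog factor falls from 1.0 (no fog, up close) toward 0.0 (fully fogged).
    f = math.exp(-density * distance)
    return tuple(f * o + (1 - f) * s for o, s in zip(obj_color, fog_color))

sky = (0.6, 0.7, 0.9)
near = apply_fog((1.0, 0.0, 0.0), sky, distance=10)    # keeps its own color
far = apply_fog((1.0, 0.0, 0.0), sky, distance=5000)   # fades into the sky
```

Because the blend is continuous in distance, there is no single depth at which an object appears or disappears, which is exactly the "no popup" effect described above.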
Re: (Score:2)
We've had skybox/skydome in games for years now.
Can you see a plane at 33,000 feet from the ground? Not the vapor trails it leaves, but the plane itself? 33,000 feet is a lot less than 17 miles. Yeah....
AMD really truly no longer a player in the desktop (Score:1)
AMD has also recently said it has no ability or plans to compete with Intel on high-end desktop processors either. Its top-of-the-line FX-8350 is only modest competition for Intel's midrange.