AMD Demos DirectX 11-Capable ATI Graphics Card
An anonymous reader writes "Today at a press conference in Taiwan, AMD demonstrated the world's first GPU capable of DirectX 11. The demonstration shows the major improvements DirectX 11 offers over DirectX 10, and previews a DirectX 11-capable ATI graphics card AMD has in store for release before the end of 2009. AMD showed off three primary features of DirectX 11: a tessellator, which allows for less blocky, more fluid, and more realistic detail; compute shaders, which allow for less restricted programming; and a design that better takes advantage of multiple CPU cores."
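For the curious, here is a rough host-side sketch of what dispatching a DX11 compute shader looks like in C++ (a minimal example assuming the Windows SDK's d3d11.h/d3dcompiler.h; error handling omitted and the shader is a do-nothing placeholder):

    #include <cstring>
    #include <d3d11.h>
    #include <d3dcompiler.h>  // link with d3d11.lib and d3dcompiler.lib

    // A trivial compute shader: 64 threads per group, does no real work.
    static const char* kSource =
        "[numthreads(64, 1, 1)]\n"
        "void main(uint3 id : SV_DispatchThreadID) {}\n";

    int main() {
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* ctx = nullptr;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx);

        // Compile the HLSL source against the DX11 compute profile (cs_5_0).
        ID3DBlob* code = nullptr;
        D3DCompile(kSource, std::strlen(kSource), nullptr, nullptr, nullptr,
                   "main", "cs_5_0", 0, 0, &code, nullptr);

        ID3D11ComputeShader* cs = nullptr;
        device->CreateComputeShader(code->GetBufferPointer(),
                                    code->GetBufferSize(), nullptr, &cs);

        // Bind the shader and launch 16 groups of 64 threads each.
        ctx->CSSetShader(cs, nullptr, 0);
        ctx->Dispatch(16, 1, 1);

        cs->Release(); code->Release(); ctx->Release(); device->Release();
        return 0;
    }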
Direct X11? (Score:5, Funny)
Re: (Score:1, Interesting)
Re: (Score:2)
Will programmers be able to utilize? (Score:3, Interesting)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Unlike console gaming, very few graphically intensive PC games are designed to work at a specific quality with a specific frame rate for a specific consumer card. Rather, they're designed to be able to harness power from cards that don't exist at the time of development, and make good use of the features they know of at the time of development.
On a console, the system you design on is the system your users play on and direct optimization for the platform is both necessary and worthwhile. In a PC gaming sc
Re:Will programmers be able to utilize? (Score:5, Interesting)
I think tessellation will be controllable on the driver side; in that case, you won't need to write specialized code to take advantage of it.
From what I understand, it is basically point-based curve matching using differential calculus - a fundamental change in the way models are rendered. So even for existing games, you would just turn on tessellation in your graphics card driver and take advantage of it: since it only changes the rendering method, the models themselves and other parameters should remain the same.
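To make the "curve matching" idea concrete, here's a toy C++ sketch (my own illustration, not AMD's actual algorithm) that tessellates a cubic Bezier curve with de Casteljau's algorithm - the sample density is a knob the driver could turn without the model data changing:

    #include <cstdio>

    struct Vec2 { float x, y; };

    // Linear interpolation between two points.
    static Vec2 lerp(Vec2 a, Vec2 b, float t) {
        return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
    }

    // de Casteljau evaluation of a cubic Bezier curve at parameter t.
    static Vec2 bezier(const Vec2 p[4], float t) {
        Vec2 a = lerp(p[0], p[1], t), b = lerp(p[1], p[2], t), c = lerp(p[2], p[3], t);
        Vec2 d = lerp(a, b, t), e = lerp(b, c, t);
        return lerp(d, e, t);
    }

    int main() {
        // Four control points roughly tracing a quarter circle.
        const Vec2 p[4] = { {0, 0}, {0, 0.55f}, {0.45f, 1}, {1, 1} };
        // Same control data, smoother output: just raise the sample count.
        const int samples = 8;
        for (int i = 0; i <= samples; ++i) {
            Vec2 v = bezier(p, (float)i / samples);
            std::printf("%.3f %.3f\n", v.x, v.y);
        }
        return 0;
    }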
Re: (Score:2, Informative)
That would produce some bad-looking results, as the driver wouldn't know the difference between a model that is intentionally polygonal and one that is not.
Re: (Score:2)
Not necessarily - you could define an object using some form of CSG and have the tessellation tool create the actual geometry (which is pretty much what CAD software does today). It won't be as simple as the grandparent suggests, though - it will require some sort of primitive input unless it is designed to work with some sort of level-of-detail scheme (an area I know Microsoft holds patents in).
A very simple instance of CSG is a sphere, which can be defined with a point and a radius. You could then do so
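To sketch that idea (a hypothetical illustration with made-up function names), a CSG sphere defined by only a center point and a radius could be tessellated into vertices like this in C++, where the detail level is purely a tessellation parameter:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Tessellate a CSG sphere (center + radius) into a latitude/longitude
    // grid of vertices. More rings/segments = smoother sphere.
    std::vector<Vec3> tessellateSphere(Vec3 c, float r, int rings, int segments) {
        std::vector<Vec3> verts;
        const float pi = 3.14159265f;
        for (int i = 0; i <= rings; ++i) {
            float phi = pi * i / rings;                // 0..pi, pole to pole
            for (int j = 0; j < segments; ++j) {
                float theta = 2 * pi * j / segments;   // 0..2pi around the axis
                verts.push_back({ c.x + r * std::sin(phi) * std::cos(theta),
                                  c.y + r * std::cos(phi),
                                  c.z + r * std::sin(phi) * std::sin(theta) });
            }
        }
        return verts;
    }

    int main() {
        // The whole sphere is defined by one point and one radius;
        // only the tessellation parameters change between these two calls.
        auto coarse = tessellateSphere({0, 0, 0}, 1.0f, 8, 16);
        auto fine   = tessellateSphere({0, 0, 0}, 1.0f, 64, 128);
        std::printf("coarse: %zu verts, fine: %zu verts\n", coarse.size(), fine.size());
        return 0;
    }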
Re:Will programmers -be willing- to utilize? (Score:2)
You can utilise the (agreed) good performance of the PS3, as witnessed by the number of eggheads who have hacked them to serve as cheap supercomputing clusters (see /. posts ad nauseam).
I'd personally rephrase your comment more along the lines of "is it financially viable?"
Of course, the PS3 is a notorious horror to code for, but the other factor - market share - should be up there too.
Naturally, the two are related.
AMD & DirectX11 - sounds like a similar Pyrrhic victory...
Now if only OpenGL etc. had t
Oblig (Score:1, Funny)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
The funny thing is I'm running XP, so I'm on DX9, and I really saw no need for DX10. Just another graphics card upgrade that doesn't make the games any better.
Now if they could put out a card that improved gameplay and it only worked in Vista I'd be upgrading in a heartbeat!
Re: (Score:2)
It was the same here, until I saw Stalker Clear Sky in DX10.1.
The day after, I had Vista installed.
Re: (Score:2)
I've got Stalker CS on my DX9 XP install. I'd like to see some comparisons to get an idea of what DX10 can do for a game like that.
Re: (Score:2)
Oh ... never mind. Yowch.
http://www.pcgameshardware.com/aid,658913/Stalker-Clear-Sky-Exclusive-DX9-vs-DX10-screenshots/News/ [pcgameshardware.com]
Re: (Score:2)
Man... this is like the DVD vs. HD-DVD/Blu-ray thing.
I can barely tell a difference between DX9 and DX10. In some cases, DX9 seems easier to see, while DX10 is more realistic (but harder to see/much busier).
Of course, the only game I've played in the last year was Wii Sports so it doubly doesn't matter.
I guess it is hard for me to appreciate the differences since in my lifetime, graphics have gone from Apple IIe to Vectrex to the nearly unplayable BattleMech (due to clipping of really big triangles) to R
Re: (Score:2, Interesting)
That depends on how you play the game. In Stalker Clear Sky, I used to hike back through an area after I had cleared it of enemies, because the graphics were just that beautiful.
Re: (Score:2)
Exactly. If you come from a position of having devoted hundreds of hours to games such as Bubble Bobble, Pirates!, Defender of the Crown etc. in the C64 and XT days, the last 2% of graphical improvement is hardly detectable. Only recently was
Re: (Score:2)
Not all the areas in S:CS show differences; the differences are not subtle, but they are sparse.
/s.t.a.l.k.e.r. ubergeek
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
There was a shot with wood planks, but that just looked like a higher resolution texture or bump map was being used, not really very exciting. Makes
Re: (Score:2)
Just another graphics card upgrade that doesn't make the games any better.
And how is that any different from any other DirectX increment? Did DirectX 9 provide you improved gameplay over DirectX 8? Doubtful. It was just another graphics card update that didn't make the games any better.
Re: (Score:2)
Re: (Score:2)
DX11 is a superset of DX10, so there's no reason for Microsoft to wait. Basically, it brings a few more interfaces, but most importantly much better multi-threading performance, all on the driver side. All DX10 games will run just fine under DX11, and the minor performance hit we saw with DX10 is again being made irrelevant by faster cards.
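For what it's worth, here is a minimal sketch of the superset in practice (assuming the Windows SDK's d3d11.h; error handling trimmed): one DX11 device-creation call that falls back to DX10-class feature levels on older cards.

    #include <d3d11.h>   // link with d3d11.lib

    int main() {
        // Ask for DX11 first, then fall back to DX10-class hardware:
        // the same API serves both, which is the "superset" part.
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,
            D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0,
        };
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL got;
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            wanted, 3, D3D11_SDK_VERSION, &device, &got, &context);
        if (SUCCEEDED(hr)) {
            // 'got' now holds the highest feature level the card supports.
            context->Release();
            device->Release();
        }
        return 0;
    }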
This one goes up to 11 (Score:3, Funny)
How about they work on their DX 10 performance first.
Because this one goes up to 11, so obviously it's better.
Linux drivers? (Score:2, Insightful)
Re: (Score:2)
Re:Linux drivers? (Score:4, Insightful)
Re: (Score:2)
You're not kidding. What's truly amazing is that it's *utterly* random.
Sometimes I'm running WOW, playing a DVD, and have 5-6 browser windows open and my computer's solid as a rock for days. Sometimes, I've got nothing but a single browser window, and bam-- "Vista has detected your graphics driver has crashed."
The good news is that Vista can recover from it 9 times out of 10. Even without crashing WOW, which is pretty impressive.
Re: (Score:2)
I've heard this a lot, but I can't say it matches my experiences.
3 years on a laptop with Win2000 and a very old ATi chip (7xxx series): no video-caused crashes.
2 years on a laptop with WinXP and an old-ish ATi chip (9xxx series): no video-caused crashes.
6 months on a Vista laptop with (now outdated) Mobility 200M: no video-related crashes, EVEN WITH BETA DRIVERS (for comparison, nVidia's Vista drivers weren't what I would call release quality until more than 6 months after Vista's release. ATi was there ov
Re: (Score:1)
When has ATI ever had solid drivers for anything? Even the Windows drivers cause BSoDs for no apparent reason.
Very true. The only positive thing I have to say about ATI drivers is that they are a shitload better than Nvidia drivers, but that is like being smarter than the brain-damaged kid at the back of the bus.
Re: (Score:2)
Nvidia, on the other hand - I've failed to get hardware that works long enough to comment on the drivers since they started making chipsets for something more advanced than plain old PCI.
4 different systems and 5 different cards (6xxx and
Re: (Score:2)
Sadly, I have to concur. Things have gotten pretty bad in both camps these days. Particularly when it comes to older title support.
FUD (Score:2)
Their drivers are fine. That's the first thing AMD fixed after acquiring ATI.
On the contrary,
When has Nvidia ever had solid drivers for anything? Even the Windows drivers cause BSoDs for no apparent reason.
See how easy that was?
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
That might also be the Linux community's fault. Linux never provided a solid driver development kit.
Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.
Re:Linux drivers? (Score:5, Informative)
Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.
nVidia basically overrode the lower third or so of X11 (it's a big function-pointer table) and wrote their own implementation; ATI did the same, with less success. AMD and Intel are now trying to build a proper open source stack: the Graphics Execution Manager (GEM) for memory management, kernel mode setting (KMS) for flicker-free boots and more, a low-level state-tracking framework called Gallium3D to expose modern shaders, a better Direct Rendering Infrastructure (DRI2), redirected direct rendering (RDR), and various other improvements - but you're talking about things only 1-2 years old. nVidia has succeeded, yes, but for most intents and purposes they wrote the whole thing themselves. There's a reason it's a sore point for open source fanatics: it's not merely a blob addon, it basically ripped out a whole chunk of open source, said "not good enough" and replaced it with their own blob.
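For a taste of what that new stack exposes, here's a minimal C++ sketch of querying KMS through libdrm (assuming a driver exposing /dev/dri/card0; build with -ldrm; error handling trimmed):

    #include <cstdio>
    #include <fcntl.h>
    #include <unistd.h>
    #include <xf86drm.h>      // from libdrm
    #include <xf86drmMode.h>  // KMS interfaces

    int main() {
        // Open the first DRM device node exposed by the kernel driver.
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) { std::perror("open"); return 1; }

        // Ask KMS what connectors this card has.
        drmModeRes* res = drmModeGetResources(fd);
        if (!res) { std::fprintf(stderr, "no KMS support?\n"); close(fd); return 1; }

        for (int i = 0; i < res->count_connectors; ++i) {
            drmModeConnector* conn = drmModeGetConnector(fd, res->connectors[i]);
            if (conn) {
                std::printf("connector %u: %s, %d modes\n",
                            conn->connector_id,
                            conn->connection == DRM_MODE_CONNECTED ? "connected"
                                                                   : "disconnected",
                            conn->count_modes);
                drmModeFreeConnector(conn);
            }
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }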
Re: (Score:2)
ATI did the same, with less success.
Which was entirely my point. That ATI is still to this day unable to release a half-decent driver for Linux is their own fault since nVIDIA was clearly able to do so.
Re: (Score:3, Interesting)
There's a reason it's a sore point for open source fanatics: it's not merely a blob addon, it basically ripped out a whole chunk of open source, said "not good enough" and replaced it with their own blob.
No one gives a shit if Nvidia said the open source part was "not good enough" - we give a shit because what Nvidia replaced it with is broken and can't be fixed. I wasted over $600 on two top-end Nvidia cards due to their supposed "great Linux support", only to have them fail to work with my high-end monitor because of an extremely simple TMDS configuration bug in their driver.
When I jumped through all the hoops of their ultimately bullshit support on a freakin webforum and gave them all the debug output they
Re: (Score:1)
Also, the open source drivers are progressing nicely. http://xorg.freedesktop.org/wiki/RadeonFeature [freedesktop.org]
meh, we go through this every few years (Score:2, Funny)
Re: (Score:2)
Closer to the ultimate goal (Score:2, Interesting)
Realistic 3D CGI porn. Of course.
3D CGI Porn (Score:3, Insightful)
I guess that's for people who find it's just too creepy to have actual porn actresses in their downloaded mpg's... watching them... laughing at them... judging them...
With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.
WOO.... hoo?
Re: (Score:2)
No, it's for people who are all like "yea baby, oh, touch yourself, yea more of that, NO NO DON'T LICK THE TITTY! who told you that licking your own titty is sexy? It's not, so stop that. God if I were the director, I would have slapped you for that. Now look what you did, you killed my boner."
With computer CGI porn, no actress will lick her own titties ever again.
Re: (Score:2)
Hmm, yeah: in addition to unrealistic depictions of what having sex is like and unrealistic standards of beauty, let's top that off with some mind-reading ability too. I wonder if any of that could make it hard to connect with real girls, who need to be actually pleasured, look average instead of like bombshells, and have minds of their own. No wonder RealDolls sell; if they could make them semi-intelligent sex robots too, they'd be just the thing...
Re: (Score:2)
My girlfriend may not be able to read my mind, but she does know what I'm thinking when I tell her what I'm thinking. Porn lacks that ability. :-P
Whoops, this is slashdot! Replace "girlfriend" with "Fleshlight" or something.
Re: (Score:2)
Re: (Score:2)
And they do what you want without having to pay them.
Well, anything someone on a screen could do and say.
Re: (Score:2)
With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.
Say what you will about it being impersonal. CGI is the only way I could afford to complete my masterpiece: Lord of the Cock Rings: The Battle of Purple Helm's Deep Penetration.
Re: (Score:1)
Re: (Score:2)
5 seconds per frame? My first modem was 2400 baud-- 240 characters a second, or about 0.12 fps on a standard 80*24 terminal.
So... (Score:2, Insightful)
(In fact, I hope that they finally do something about this. I've been forced to avoid ATI hardware for over 5 years now, just because of driver incompatibilities. It's just sad.)
Re: (Score:1, Funny)
Re: (Score:2)
Then why post whinging about it? Are you the same sort of person that complains about being able to find porn on bing after you disable the filters?
Re: (Score:2)
And we salute you~
Jeez dude, no one cares and it doesn't belong in this thread.
Yet Another Feature... (Score:3, Interesting)
that isn't in XP, hence nobody cares. You'll have what, the 30% market segment with Vista, and maybe 10% of those are regular gamers who will be using this.
This will just encourage the further brokenness that Windows is turning the PC gaming platform into. Good job!
PS: Before everyone jumps in to say that everyone will jump to Win7, I think you're mistaken. The only way Microsoft will kill XP for most existing users would be to introduce a critical bug that they choose not to fix. I played with Win7 for a few days and can safely say that it doesn't add anything I've ever wanted to use for which a trivial Google search wouldn't find an as-good or better alternative. And maybe it's just me, but pretty much every single UI 'enhancement' since circa Win2k has been a step backwards in terms of -my- productivity.
It's lucky that I'm Linux-competent, since Fedora/GNOME makes practically everything I need easy and uncluttered. If the barrier to entry were a little lower, I could see mass-exodus potential as XP users take an honest look at what they -really- want to upgrade to.
Re: (Score:2)
Dude, that wasn't a PS. That was practically a P on its own merits.
Not an FP, but a P none the less. A troll P, at that.
Troll's pee!
(Sorry ... long day.)
Re: (Score:1)
I'm pretty sure that it's not just you who feels this way about Windows 7, but that's mainly because this site is full of people who hate Microsoft for the sake of hating Microsoft. Amongst the general population Windows 7 is gonna own. Personal opinion, to be sure, but historically I'm pretty good at judging hype on its own merits.
Re: (Score:1)
Re: (Score:2)
Yes, the bug is called "Not having DirectX 11"
There will never be a mass exodus to Linux as long as corporations keep ties to an OS and developers write code to take advantage of specific items in that OS.
If applications were self-contained, as they should be, then the OS wouldn't matter nearly as much.
Re: (Score:1)
The end of the CPU/GPU divide (Score:1, Interesting)
From the article:
Lastly, DX11 is better designed to take advantage of multiple CPU cores. This should allow developers to offload some of the work on to the processors that are typically there not doing as much work, freeing up the GPU to do the more important processing and rendering.
Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that newer GPUs are allowed to offload their computation back to the CPU again.
This is further evidence that the CPU/GPU divide is being eliminated, and that there will likely be no such distinction among processors in the near future.
Re: (Score:2)
"Now it appears that that newer GPUs are allowed to offload their computation back to the CPU again."
This is more an artifact of programmers not knowing how to utilize the extra cores of modern processors than it is really "offloading"; it's more like taking full advantage of the CPU. GPU cards and drivers have always shared the load between CPU and GPU; the fact of the matter is that with many-core CPUs, many programmers haven't learned to utilize them effectively.
Re: (Score:2)
Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that newer GPUs are allowed to offload their computation back to the CPU again.
No, that is not what's going on. The GPUs have become so fast at doing their jobs that the CPU can't feed them fast enough. That's where the new features in DX11 will help: they make it possible to efficiently use multiple threads to feed the GPU. This has been an issue in DX10 and earlier.
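Concretely, DX11 introduces deferred contexts and command lists for this. A minimal sketch (assuming d3d11.h; device creation, thread spawning, and error handling omitted):

    #include <d3d11.h>   // link with d3d11.lib

    // Worker-thread side: record draw calls into a deferred context.
    // In a real engine, each worker thread owns one deferred context.
    ID3D11CommandList* recordWork(ID3D11Device* device) {
        ID3D11DeviceContext* deferred = nullptr;
        if (FAILED(device->CreateDeferredContext(0, &deferred)))
            return nullptr;

        // ... issue state/draw calls on 'deferred' here, off the main thread ...

        ID3D11CommandList* list = nullptr;
        deferred->FinishCommandList(FALSE, &list);  // bake into a command list
        deferred->Release();
        return list;
    }

    // Main-thread side: submit the pre-recorded commands to the GPU.
    void submitWork(ID3D11DeviceContext* immediate, ID3D11CommandList* list) {
        if (list) {
            immediate->ExecuteCommandList(list, FALSE);
            list->Release();
        }
    }

Each worker thread records into its own deferred context, and only ExecuteCommandList touches the immediate context, so command submission parallelizes across cores instead of serializing on one.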
The real question is... (Score:4, Funny)
Re: (Score:2)
No.
Re: (Score:3, Funny)
Re: (Score:2)
Well, yeah. Its developers going bankrupt had no effect on the chances of DNF being released. It's equally likely that it will be finished in 2011 as it was to be finished in 2008. So, you know, anything (equal to nothing) could happen!
Finally! (Score:1)
I can now play my favorite game of all time with decent performance: 3DMark.
Useless, stupidly written information-free article (Score:2)
Come on, this is the depth of comprehension that the author has about what tessellation is?
One of the technologies in DirectX 11 is something called tessellator.
Tessellator allows for more smoother, less blocky, and more organic looking objects in games. Anti-aliasing shouldn't be confused with this, as AA does a descent job at smoothing out sharp edges but tessellator actually makes it look more fluid and frankly much more realistic. Tessellator makes things look more "rounded" instead of chunky and blocky. Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.
Tech Fragments is an appropriate name for the site, I guess, seeing as they can't even get the tense of the word right.
Worst quote ever (Score:1)
Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.
Yeah, I'm sure turning on tessellation won't cause any performance hit at all.
Tech Fragments has the most sensationalist writers ever.
Yes, but... (Score:1)
Obligatory (Score:1)
This one goes to 11.
Nvidia on the Run (Score:2)
Excuse Me But... (Score:2)
And isn't this the reason why there never was a DirectX 10.1-capable Nvidia card, while ATI ran it just fine?