Dual GPU graphics solution from ATi? 359
Graphics Guru writes "Last week TweakTown posted an exclusive picture of the ATi Radeon 8500 MAXX, along with believable accompanying information regarding the highly anticipated ATi R300. 3DChipset is today reporting that they have confirmation that the 8500 MAXX is indeed real and is due to ship fairly soon. Here's what someone from ATi told them: "The ATI Radeon 8500 Maxx is for real and the card is already in full production and about to be shipped soon. ATi has finally nailed certain issues with the dual chip. Final testing has been done and you should hear noise from ATi regarding this offering." You decide if it is real or not; a solid dual GPU solution would surely rock the industry to massive proportions!"
Rocking v2.0. (Score:3, Insightful)
--saint
petition worked? (Score:1)
ATI hopefully for Linux Doom III (Score:1, Interesting)
Stereo (Score:4, Funny)
Re:Stereo (Score:3, Funny)
Re:Stereo (Score:3, Insightful)
D'oh! (mod me up)
Comment removed (Score:4, Insightful)
Re:Bull (Score:3, Insightful)
Re:Bull (Score:2)
ATI drivers are bollocks. As are Matrox's.
I'm glad someone points this out every time there is a story about these jokers releasing some whiz-bang hardware that probably won't work because their drivers are poop!
-
uberfag, and proud of it.
AFR (Score:3, Informative)
GPU1: renders a frame
GPU2: renders a frame - GPU1: displays the frame it just rendered
GPU1: renders a frame - GPU2: displays the frame it just rendered
etc.
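For anyone who wants the interleaving spelled out, here's a minimal C++ sketch of the AFR scheduling idea described above. It is not ATI's driver code, just a toy simulation in which two "GPU" objects take turns rendering whole frames while the previously finished frame goes to the screen:

#include <cstdio>

// Toy model of Alternate Frame Rendering (AFR): two GPUs alternate
// rendering whole frames; while one renders, the other's finished
// frame is displayed.
struct Gpu {
    int id;
    int render(int frame) const {
        std::printf("GPU%d: renders frame %d\n", id, frame);
        return frame;                     // pretend this is the finished frame
    }
    void display(int frame) const {
        std::printf("GPU%d: displays frame %d\n", id, frame);
    }
};

int main() {
    Gpu gpus[2] = { {1}, {2} };
    const int totalFrames = 6;

    for (int frame = 0; frame < totalFrames; ++frame) {
        Gpu& current = gpus[frame % 2];   // the GPUs alternate frame by frame
        int finished = current.render(frame);

        // While 'current' was rendering, the other GPU's previous frame
        // is what actually goes to the screen.
        if (frame > 0) {
            gpus[(frame - 1) % 2].display(finished - 1);
        }
    }
    return 0;
}

The printed sequence matches the pattern above: GPU2 renders frame 1 while GPU1 displays frame 0, and so on.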
Re:AFR (Score:2)
Well... (Score:4, Informative)
The problem with the Rage Fury MAXX had nothing to do with the interleaving method or the presence of two GPUs, but rather with the way ATI chose to bridge the two chips, which isn't permitted by the NT AGP code. ATI couldn't find a way around it, so they abandoned the card. Sad, since it was a nice performer under Win9x...
However, many other implementations do 2 graphics chips right with NT support, such as the Voodoo 5 5500 and the high-end multi-chip Quantum3D boards.
So, ATI could easily do a Radeon MAXX part with WinXP support, since they know what mistakes not to make in silicon this time around...
ATi (Score:2, Redundant)
-=Crazy stuff happens w/ the Bong and me.=-
Re:ATi (Score:5, Informative)
You mean like the linux drivers that ATI wrote for the Radeon 8500 and 8800? Guess you'll be switching to ATI now, right?
Dinivin
HW drivers vs HW specs (Score:3, Informative)
In ATI's defense, unlike nVidia (who are strictly proprietary), ATI do make the chipset details available so anyone can write open source drivers for whatever esoteric OS they happen to be using - there are more OSes than just Windows and Linux, you know!
Of course, it would be nice if ATI released both specs and drivers, but IMHO, it is better in the long term for open source OSs if the specs are released.
Which industry? (Score:5, Insightful)
Which industry would that be? The gaming industry is slowing down as far as graphics go. Mark my words, there is going to be a shift soon from graphics intensity to gameplay innovation. People don't want games to be any prettier (or don't notice much of a difference). Notice how the mod community is getting bigger and better? It's because they take the graphics engines and add innovation.
I'm rambling, but I think that these new video cards aren't going to be the big explosion they were in the past. Sure, they are big and powerful, but people aren't going to fork over the cash for this one when they can get a good GeForce2 that plays their games just as well.
Re:Which industry? (Score:2, Interesting)
Re:Which industry? (Score:3, Insightful)
If you do both, either your innovation or your graphics will be outdated by the time you finish.
A perfect Example... (Score:2)
They are trying to perfect both; in the meantime, everyone else will have passed them by.
Re:Which industry? (Score:2)
Plus, if anyone remembers, ATI did this before with another MAXX card that suffered horribly from limited, crappy drivers that assumed too much and didn't work. Good idea, bad in real life.
Re:Which industry? (Score:2)
I had two ATI All-In-Wonder cards, and I finally abandoned them when I got tired of constantly hearing about bugfixes coming "soon" that never came. That, and I really hated their application software for TV and video capture. I hate overdone-metaphor GUIs.
Re:Which industry? (Score:5, Insightful)
Which industry would that be? The gaming industry is slowing down as far as graphics go.
Hi, FortKnox, I'd like to disagree with your "marked words". I'd say that anything that dramatically increases the performance of graphical gaming will be welcome. I grew up playing PC-Man, Friendlyware PC Arcade and other ASCII games. I eventually progressed to Doom, to Quake (by way of Heretic, which I liked more than my friends). On opening day, I bought Warcraft III. Turns out my 16 meg Radeon doesn't play it perfectly smoothly, so I may end up buying a new video card by the end of the year for my dual G4. In my case, the vid card is clearly the bottleneck.
People have been talking a great deal over the years about how consumers don't want anything prettier (or won't notice), how they're happy with what they have, and how gameplay is what's important. Don't try to tell me that Myst was more than an eye-candy excuse. Sure, consumers are happy with what they have now, but it turns out that pretty is what sells in stores -- nobody wants to read a novella describing "gameplay"; they want to see screenshots!
I'm rambling, but I think that these new video cards aren't going to be the big explosion they were in the past. Sure, they are big and powerful, but people aren't going to fork over the cash for this one when they can get a good GeForce2 that plays their games just as well.
You actually remind me of the neigh-sayers (or nay-sayers, depending on where you're from) of a decade, two, or three ago, saying that games were nothing more than a distraction on computers: nobody ever designs a computer for the things, and certainly nobody would fork out over $500 just to play games! Can you tell me which industry is driving which? I won't say that id is single-handedly responsible for Intel's bottom line over the past 10 years, but I will say that consumers' demand for "prettier, better, smoother" has been responsible for a great many computer sales. They don't need 2GHz Pentiums / Athlons to balance their checkbooks or play with Mozilla (yet).
The ultimate in gameplay, I'm willing to bet, is an eyepiece or two that behaves as a huge, high-res screen but takes up the entire field of view. That will be a great number of pixels (dare I guess 10k horizontal by 5k vertical per eye?). Maybe we'll have some gameplay innovations along the way, but there is certainly a need for more graphics innovation too. Perhaps we'll stop by 32" LCDs (or OLEDs) on the way, or banks of seamlessly tiled, conventionally sized screens, or even something different. The fact of the matter is, consumers are happy with what they have -- until they see something better. That's where the bucks are. Where would Gateway/ Dell/ Compaq/ Toshiba/ HP/ Apple be without those consumers wanting the next pretty thing?
Oh, and yes, I do see the mod community getting bigger. They even have some great successes. But, do you see them modding Wolf 3d? Doom I? Or do they move on, exercising the latest engines to the fullest of their abilities? Would they prefer engines that allow them to show on the screen what they have in their heads?
Re:Which industry? (Score:2)
I suspect a lack of Mac optimization from Blizzard.
Re:Which industry? (Score:2)
"Games are the only legitimate use for computers."
Re:Which industry? (Score:2)
Yes, there are still people modding Doom 1 and 2... go over to doomworld.com and check it out. Helluva lot more gameplay than today's "walk around corner and kill single enemy" bullshit.
Surely you are kidding? (Score:2)
I predict the curve won't break until realtime computer graphics are far more convincing than the computer graphics we see in movies today.
That will be a short while
Re:Which industry? (Score:2)
Once you can get multi-pass polygon/pixel resolution at acceptable performance levels, the need for textures becomes moot, and very interesting things will start to be produced.
Re:Which industry? (Score:2)
I remember when Quake 3 first came out - it was amazing. Now, it "looks like every other game out there". Utterly incredible. I remember the days of playing Dangerous Dave and Zaxxon on my 8088, and thinking that they were incredible. In past years, graphic quality has been critical because there has been so much room for expansion. However, now, there's only so much you can do with eye candy.
The next area of innovation will be in gameplay. FPSes were innovative... when Wolfenstein and Doom came out. There have been a few definitive games since, but for the most part there has been little innovation in that arena. Now there's room for it.
Doom 3, while looking amazing, will not do well if the gameplay isn't up to par. It's possibly the best-looking game to date, but gamers are becoming increasingly disillusioned with graphical candy and are craving better gameplay.
Re:Which industry? (Score:2)
Re:Which industry? (Score:2, Informative)
FOV is based on the "camera angle" (which is probably hardcoded in NWN) and width/height ratio. In 800x600, you actually have a wider FOV than in 1280x1024.
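To put rough numbers on that, here's a quick C++ sketch of the usual horizontal-FOV-from-aspect-ratio formula. The fixed 60-degree vertical FOV is an assumption for illustration, not NWN's actual setting:

#include <cmath>
#include <cstdio>

// Horizontal FOV for a given vertical FOV and resolution:
// hfov = 2 * atan(tan(vfov / 2) * (width / height))
double horizontalFovDeg(double verticalFovDeg, double width, double height) {
    const double pi = 3.14159265358979323846;
    double v = verticalFovDeg * pi / 180.0;
    double h = 2.0 * std::atan(std::tan(v / 2.0) * (width / height));
    return h * 180.0 / pi;
}

int main() {
    std::printf("800x600  : %.1f deg horizontal\n", horizontalFovDeg(60.0, 800.0, 600.0));
    std::printf("1280x1024: %.1f deg horizontal\n", horizontalFovDeg(60.0, 1280.0, 1024.0));
    return 0;
}

With those assumptions, 800x600 (4:3) works out to roughly 75 degrees horizontal versus about 72 degrees at 1280x1024 (5:4), which is why the lower resolution actually shows more to the sides.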
Dual GPUs (Score:3, Funny)
Re:Dual GPUs (Score:2)
Re:Dual GPUs (Score:5, Funny)
I'll start up a computer company to capitalize on the high demand for such systems. We'll put all the horsepower into killer graphics and put them in cool-looking boxes.
I think I'll call the company SGI. With a plan like this, we'll be around FOREVER.
Re:Dual GPUs (Score:2)
Dual CPUs on the desktop are a waste of money; by the time you need the extra horsepower, it's cheaper to buy a new chip that will be faster (Moore's law). Though I have found that using dual-CPU servers gives them about an extra year of life at a fraction of the cost. I'm currently using a dual 500MHz Celeron box, which is 2 years old, and there is still no load on the system; having that extra processor to handle tasks on a true multithreaded OS rocks!
Re:Dual GPUs (Score:5, Insightful)
That's not necessarily true. I've been running dual for a couple of years now, and the benefits I see to it are far deeper than adding 'extra horsepower'.
I'm a 3D artist. I use Lightwave primarily, but also use Photoshop and After Effects quite extensively. I spent a LOT of time waiting for stuff to get done. My boss got me a dual Athlon 1600 with a gig of RAM early this year. She didn't get it for me because she wanted me to halve my rendering times, but rather she wanted me to make better use of my time while the computer was busy.
Lightwave is multithreaded, but not very elegantly. As a matter of fact, I rarely enable the multithreaded option. Instead, while it's rendering, I set up processes on the other processor to continue on with what I'm doing. Sometimes I'm building the next model, sometimes I'm generating a texture in Photoshop, or I'm setting up a composition in After Effects.
So while my computer is busy rendering, I'm still busy being productive. Some of you are saying, "Yeah, but you'll never get 2x the processing out of it." And you know what? That's basically true, at least from a benchmark point of view. I get close to double the throughput when I have a render running on each processor, but I doubt I actually hit 2x. I don't need 2x anymore, though. About a year ago I started layering my animations: my computer renders elements of a scene, which render much faster than the entire scene, and as each frame is generated, it gets added to the composition in After Effects. So while my computer is rendering, I'm busy in After Effects putting it all together. This sure beats waiting for the rendering to finish. Heck, thanks to this technique (and the dual proc), I rarely have 'over-the-weekend' renderings with the potential to go horribly wrong.
Would I be better off with a second machine? No. For the amount of money that was spent on my machine (roughly $1,500 sans monitor and hard drives), I probably could have gotten more 'pixels rendered' per minute. But it'd be a huge blow to my workflow to switch between two computers, and it wouldn't take long for the 100Mbit connection between them to become a huge bottleneck. As a matter of fact, I'm not sure it would have been all that much cheaper; we'd still have had to get me high-end video cards and monitors for each machine.
Are dual proc desktops for everybody? Not really. The best benefit you'd see is that Windows 2000 behaves a LOT better. Explorer and IE are both very multithreaded, and are much more responsive. As a matter of fact, my Athlon 1.2 gig machine at home felt sluggish compared to my old Dual P3 550. It kicked the 550's butt at rendering, but when it came to browsing the web, doing email, etc, the dual 550 was much more responsive.
In short, dual processor machines have their place. If you primarily play games, you probably won't care much. But if you do CPU intensive work, it'll make your life a lot easier. Unless, of course, you like having nothing to do while your machine is busy.
Really now... (Score:2, Troll)
"Optimization, shmoptimization! Just cram a second GPU in there and we'll be fine."
I really wish people would just stop coming out with new hardware for a couple of years, so that we can all save a few upgrade bucks, and the software industry can get their act together, and start writing clean, well optimized, stable programs, instead of trying to always catch up to the bleeding edge that nobody really asked the hardware companies to push.
Sorry, but I disagree (Score:2)
Which is not an excuse to write *unnecessarily* bloated code. But personally, I'd be a lot happier if it's feature-rich and stable rather than fast. Usually you only get to pick at most two out of three, at least on a sane budget.
Kjella
Re:Really now... (Score:3, Informative)
If you can spend $1M on hardware development and come out with a new chip that's 20% faster or spend $1M on software and put out drivers that are 5% faster, where is the money better spent? Besides which, you can charge for the new hardware. Charging for the new drivers is not acceptable to consumers.
Freezing the hardware for "a couple years" is not acceptable; companies would simply cease to exist. Upgrades are part of the industry's business model, and the fact that modern systems are already capable of virtually every task home and business users require of them is part of the reason for the technology bust of the past couple of years.
Look, it's simple. If you don't play the latest and greatest games, or don't care if you can play those games at uber-high res with all the effects turned on, then you don't need to upgrade. And yes, you can generally play the new games just fine on an older computer (my system is an Athlon 750, 512MB PC133, 32MB GF2 and runs DS and NWN just fine. Plays Q3 just fine. Will it play UT2 just fine? I doubt it... but it's 2 years old now).
As for "nobody asked hardware companies to push" -- speak for yourself. Go look at the Doom3 demo. You simply can't do that on current hardware with any semblance of speed. Yeah, you can run it on a GF3/4/ATI 8500, but you'll have to run it at a lower resolution and turn off features. Run it on a GF2? An ATI 7500? An MX anything? Maybe. It won't have anywhere near the eye candy.
Once we're to photorealistic scenes being rendered in realtime with no drops below ~60 fps on large, outdoor scenes you can say we've gone far enough. And by that time we'll probably want 3D or something else that will continue to push the bleeding edge.
Until then, there is room for improvement. And there's a lot more room on the hardware side than there is on the software side.
Re:Really now... (Score:2)
With today's constant hardware upgrade cycles you can be forced to upgrade to a new OS because your old one does not support USB 2.0c, it only supports USB up through 2.0b. And you're screwed, since that new digital camera you got only works with 2.0c. So you end up spending money on new software and new hardware.
With the hardware part of the equation frozen, you would only be forced to spend on the software, and that spending would be motivated by tangible improvements rather than broken back-compatibility.
This is all a pipe dream of course. But I sometimes wonder... Why is my 1.4GHz Pentium 4 not even 10 times faster than my 33MHz 386 was in '92?
Re:Really now... (Score:2)
Probably because you don't have DOS 3.3 and the applications from 1992 installed. Who knows, you might be able to get by with the old WordPerfect and VisiCalc "office suite". I'm sure the framerate in Microsoft Flight Simulator v2 would be impressive!
support (Score:2, Interesting)
Re:support (Score:2)
Spelling and drivers are related? (Score:2)
Not only can't the people at ATi write drivers, they can't write or spell very well.
I doubt I'll ever buy another ATi product again; my AIW128 Pro hasn't been the same since I tried to install new drivers for it. I can't even get it fully back to the old drivers now. *sigh*
Re:Spelling and drivers are related? (Score:5, Funny)
Re:Spelling and drivers are related? (Score:2)
Yeah, too bad you didn't read the article, or you would have seen that it was quoted as being from a source at ATi.
single CPU cards are a better choice (Score:2, Interesting)
Here's a good article with some benchmarks on this great value card:
http://www.hothardware.com/hh_files/S&V/abit_ti [hothardware.com]
This is a nice concept card, but it's not going to put ATI on top.
money [freedkp.com] for moogles [kupoflux.com]
Re:single CPU cards are a better choice (Score:2)
Re:single CPU cards are a better choice (Score:2)
Re:single CPU cards are a better choice (Score:2)
If there are specs... (Score:2)
Cheers
Re:If there are specs... (Score:2)
As for the 8500 MAXX, no word yet. Odds are it will be a while before XF86 supports both GPUs.
Wasted Cycles (Score:2)
I mean, if people can trick the TCP stack into doing distributed math [slashdot.org], they can certainly trick these GPUs into doing it too...
Promises, promises... (Score:5, Funny)
Yeah, they said this about the final Kiss tour as well.
Re:Promises, promises... (Score:2)
Argh! (Score:2)
Stop Slamming ATI (Score:5, Informative)
For all you GNU/Linux junkies: ATI has been much more forthcoming with information for developing XF86 drivers than NVidia (proprietary, binary-only).
If my original Radeon VIVO (Score:2)
But when you write a driver and refuse to run a machine with it for more than an hour, and then, worse than that, ship the product and try to sell it for $600 upon release, you do get a bad name.
ATI deserves every flame they get until my radeon supports VfW without an ungodly amount of hacks. And video capture is the absolute least amount of the problems with the driver that shipped (the fact that your DVD support is gone if you lose/ruin your driver disc would be number 2 on the list).
ATI can keep their crappy products. Of course, now that I've switched over to Linux, I'm starting to buy their products again (it looks like third-party drivers written without full specs of ATI's cards are more stable than ATI's own -- who'd-a-thunk-it?).
>ATI has been much more forthcoming in information for developing XF86 drivers
Which would explain why the third-party X11 drivers are better than their Windows drivers. Man, you have to have one really poor set of coders to be beaten on the quality of drivers for your own product by people hacking out code as a hobby.
Re:Stop Slamming ATI (Score:3, Insightful)
People who whine about this just prove how inane and stupid the free software movement can be.
Look at this, and virtually every other thread, regarding ATI. See how many complaints there are about the poor drivers despite the superior hardware. Contrast to nVidia, whose drivers support every card made going back 3 years, have great performance, and are usually very stable.
Now tell me again how there aren't trade secrets in that driver code?
Re:Stop Slamming ATI (Score:2)
They aren't trying (in a serious fashion) to hide any tricks from ATI or Matrox. Binaries aren't that hard to disassemble, especially when you can feed known data to them over and over again. It's hard for a layman to understand, but to a driver programmer, this is part of the job.
Perspective (Score:2)
I also happen to think that people get way too hung up over the 2% of the time they are playing games, when the other 98% of the time they would be better off with an ATI or Matrox card.
Not to mention the times I was hosed by crappy NVidia drivers in Linux until NVidia got their crap working better.
And in a year or so (Score:2)
Picture (Score:4, Informative)
http://www.jeffwilhelm.com/files/r250.jpg
hyperlink (Score:2)
Re:hyperlink (Score:2)
Re:Picture -- FAKE (Score:5, Informative)
For one, the heatsink fans are exactly the same - right down to the positioning of the fan fins.
For two, the wires from one of the fans are not casting a shadow.
In fact, if you do a Google image search you can find the original image.
see original here:
http://home.earthlink.net/~doniteli/radeon
Re:Picture -- FAKE (Score:2)
Maybe fake, but not the way you mention. (Score:2)
The board layouts are very similar, and it's likely that someone did take a stock Radeon picture and mod it a bit, but those are still different boards entirely.
DVI (Score:2)
Feel the heat (Score:4, Funny)
Besides, who needs clean clothes when you're getting 200 fps?
Re:Feel the heat (Score:2)
Cool, does that mean everyone in Europe (220-240V is standard here) will get to play games faster than you guys?
Washing machine only 120V (Score:2)
What about heat? (Score:2)
What's ATI doing to keep TWO of these in one box from overheating?
Multifunction Card? (Score:2, Funny)
The ATI Radeon 8500 Maxx: Your High-End Desktop Graphics and Affordable Home Heating Solution from ATI!
Rock the industry? (Score:3, Funny)
(emphasis added)
When did Slashdot start hiring cheezy '80s Hair Metal band rejects to post stories to the front page?
It's "News for Nerds", not "News for Mullet-Sporting Losers Who Can't Get Over Their High School Glory Days".
Flamebait? Maybe a little.
Re:Rock the industry? (Score:2)
I doubt it matters. (Score:3, Interesting)
I've seen this card work. It runs fast and it looks gorgeous.
You know what the Parhelia tried to do? Fragmented AA? Voodoo could've torn that up years ago. The V5 6000 did 8x Full screen AA. Fast. At 1024. It's amazingly gorgeous.
Think about it. This card is 2-3 years old. The architecture is what matters. Not the amount of GPUs. The GeForce4 4600 can't even consider 8xAA. The V5 6K does, and it does it well. On 128M of SDRAM. I'd still maybe take the 4600 over the V5 6K. But it would be a hard decision. The 4600, with its DDR memory and GPU, can handle some things better. Some. Not all.
This card just proves that it really doesn't matter how much RAM or how many GPUs a card has. It's in the way the card is built. There aren't many cards I'd take over the V5 6K. If I could get one, for myself to keep, I'd pull my GeForce 3 out in a heartbeat. The GPU isn't a factor here. The RAM (DDR over SDR) isn't a factor. The V5 6K is just that well built, even 3 years later.
There. I've said my piece. After seeing the V5 in action, I don't care to get the least bit excited about the "latest greatest" graphics cards ever again.
Re:I doubt it matters. (Score:2)
What the Hell?!? (Score:5, Insightful)
Think about it. This card is 2-3 years old. The architecture is what matters. Not the amount of GPUs.
Again, this card had 4 processors!
It sort of had 128M of RAM -- it actually has 32MB of RAM per processor. So, all the latest games that use more than 32MB of RAM for texture / geometry caching will run really slowly on the V5. Also, for those that don't remember, this was the card you had to plug into the wall separately from the computer.
Don't get me wrong, I've used the V5 5500 (2 GPU version), and it was really cool at the time. But I'll take a GF4 any day of the week over any voodoo you offer me (unless of course I can sell it at the collector's item price :)
Re:I doubt it matters. (Score:2)
Re:I doubt it matters. (Score:2)
But you are just a moron. Granted, I should have specified what this was running on, but you're taking the V5 5500's benchmarks from its release! Yeah, the V5 6K won't do quite so well on a PII 400. It does a helluva lot better on an Athlon XP. Helllooo? It's not all in the graphics card. Chipset, processor, RAM. You know, I hear those are important components too!
PII 400MHz != 1.4GHz Athlon
And NO, the 4600 can't stomp it in any benchmark. 8xFSAA. You moron. The 4600 can't even render in 8xFSAA. Even if it only got 15fps (it was running smooth, at least 30), 15 > 0? New math! Hooray for new math! It won't do you a bit of good to review math!
I've seen this thing RUN. You haven't. Once you have, you can tell me what you thought it looked like.
Do what you will. I've got Karma to burn.
Re:I doubt it matters. (Score:2)
Who gives a flying fuck if the GF4 can't do 8xFSAA when the Voodoo5 6000 couldn't do it at a playable fps? The whole argument is moot, and besides, 4xFSAA at 1600x1200 is going to look a hell of a lot better than 1024x768 no matter what kind of FSAA you throw at it.
The Voodoo 5 6000 was not the pinnacle of graphics cards you make it out to be. I suspect you may be confusing it with Rampage, 3DFX's true next-gen chip which would have kicked a whole lot of ass.
Re:I doubt it matters. (Score:2)
Pixel and vertex shaders.
No Voodoo card can do them regardless of how many of those chips you stick on a board. Fixed function pipelines suck once you get used to being able to change all of the lighting and shading code that the graphics board runs.
Re:I doubt it matters. (Score:2)
You only had four words.
AA & memory bandwidth (Score:2)
It has little to do with the number of GPUs you've got. The Voodoo 5 probably had to have several just to keep up with the bandwidth it had.
So what was so great about the Voodoo 5 6000? They put a huge amount of bandwidth into a card when it just wasn't economically viable. I'm sure that nVidia and ATI both had internal test setups that could equal it, but they both had the sense not to try to make a commercial product out of it until the cost of fast RAM came down.
As far as I am aware, the V5 6000 didn't have any particularly special AA tricks, which nVidia seem to have now (compare GeForce3 AA performance with GeForce4 Ti...), so I'd imagine that the Ti4600 would beat the V5 nowadays, at 4x AA at least. Shame they don't have a higher AA mode, but with the next gen of games coming out, you wouldn't be able to afford it anyway, even with a Ti4600 or a V5 6000.
Of course, the V5 had no pixel or vertex shaders (which is gonna hurt image quality) and no hardware T&L. As the majority of current games are still CPU-bound, that's gonna hurt the Voodoo 5.
Let's go to the numbers (Re:I doubt it matters.) (Score:3, Informative)
Q3A, 1600x1200x32 bit (no FSAA)
GF4 Ti4600 : 160.6 fps
V5 6000 : 58.7 fps
Expect roughly linear scaling with FSAA. Note that at 4x, the GF4 would be pushing out around 40 fps. The 6K? About 13. At 8x? Let's be generous and call it 8. Yes, the machines being tested are very different (a 1.3GHz Athlon vs. an 800MHz P3), but at those resolutions you're very close to being 100% GPU bound.
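For anyone who wants the arithmetic, here's a back-of-the-envelope C++ sketch of the linear-scaling estimate used above. Treating fill rate as the only limit and dividing by the sample count is a simplification (real scaling depends on memory bandwidth and the AA method), but it reproduces the ballpark figures:

#include <cstdio>

// Crude FSAA estimate: assume frame rate drops linearly with the
// number of samples per pixel (fill-rate-limited approximation).
double estimateFps(double baseFps, int aaSamples) {
    return baseFps / aaSamples;
}

int main() {
    const double gf4Base = 160.6;   // GF4 Ti4600, Q3A 1600x1200x32, no FSAA
    const double v5Base  = 58.7;    // Voodoo5 6000, same test, no FSAA
    const int samples[] = { 2, 4, 8 };

    for (int s : samples) {
        std::printf("%dx FSAA: GF4 ~%.0f fps, V5 6000 ~%.0f fps\n",
                    s, estimateFps(gf4Base, s), estimateFps(v5Base, s));
    }
    return 0;
}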
I admire the meaningless iconoclasm that would lead one to tout an evolutionary dead-end like the 6000 as the be-all end-all of video cards, but in the future you would be better served by appealing to the Voodoo's superior blast capacity, or the "warmth" of its image, rather than trying to make a technical argument without even the slimmest of legs to stand on.
Best,
'jfb
Links:
V5 6k benchmarking: http://www.voodooextreme.com/hw/previews/v5_6000/
GF4 numbers: All over, but I used these:
http://www.anandtech.com/showdoc.html?i=1
Why I think this is fake (Score:3, Insightful)
I apologize in advance for the AC post, but being as my company has a working relationship with ATI, blah, blah, etc etc.
On to why I think this is fake:
1. Look at the heatsinks/fans. From the picture, it looks like they are using different model fans for the two GPUs, judging by the positions the power wires come from. As a board designer, I can tell you this would never happen; you use the same fan on both in order to keep the bill of materials down.
2. On the very bottom right of the card, under the last SRAM chip, there is a small device (a regulator?) that looks like it's overlapping the edge of the board. This would never pass board layout verification, because there are certain clearances you need to observe when laying out PCBs.
3. It looks like the lower GPU is violating the AGP spec for connector keepouts. I'm not sure on this, as I don't have the AGP design guide handy, but that GPU looks like it's positioned extremely low.
4. Silkscreen for some of the parts further down the board (compare some of the electrolytic can & SRAM silkscreens) seems to be conspicuously absent.
5. Look at the ATI symbol silkscreen. Right above it is a fiducial (these are used during assembly as a way for the assembly machine to calibrate its position to the board), and part of a silkscreen that looks exactly like the assembly guide for the SRAMs! This is the thing that stands out the most to me as being doctored.
Re: (Score:2)
I'd buy it if it didn't have any fans... (Score:2)
it's a good thing (Score:2)
HardOCP says this is a definite fake (Score:2)
What about the Voodoo2? (Score:2)
Agh! Marketing Splooge, attacking from 3 o'clock!
Dual "GPU" configurations have been around since early '98, when the Voodoo2 came out. Sure, 3Dfx called it SLI, but it was essentially two 3D cards working as one - and someone (Quantum3D?) made single cards with dual Voodoo2's on them. Not to mention the Voodoo5 which had 4 GPUs on it.
I remember seeing someone (could have been Quantum3D again) who was promising a 16-GPU version of the Voodoo5 for mass $$$.
Multiple GPUs are nothing new, and it's definitely not going to "shake the gaming industry to its core only on PAY PER VIEEEEEEEEEW....."
If ATi are so close to releasing this product... (Score:2)
Yes, the geek in me thinks "Dual GPU Radeon card. Sweet!". But the realist in me thinks "Well, the ONLY 'proof' of it we've seen are unconfirmed leaks and a badly photochopped photo of a product that ATi supposedly already have in full production."
Ahuh. I'll believe it when I see it, in person.
Re:bad news for Linux? (Score:2)
Re:Cool (Score:2, Insightful)
Reading over Carmack's finger, one can see that he is currently having the most success with NVidia drivers and the card will therefore not only be fast, but should support every feature he builds in.
I'm currently building my own system focused on Doom III. My best recommendation for a graphics card is by Gainward [gainward.com]: the GeForce 4 PowerPack! Ultra/750 XP Golden Sample [gainwardusa.com]. It retails for right around $350.00, and (as far as I know) it still holds the best benchmarks of the GeForce4 series.
There is something to be said about Carmack's opinion on ATI hardware (which is good overall), but for the sake of assuring your computer will be running Doom III perfectly I must stand by the opinion that the GeForce4 TI 4600 is what you need...
Re:Dual GPUs aren't new (Score:2)
Think about it: the GeForce4 Ti4600 only needs one GPU chip to achieve its amazing 3-D performance; why bother with the engineering and chip cooling hassles of a dual-GPU setup?
Re:Linux 3d Support (Score:2)
Re:Linux 3d Support (Score:2)
People really need to do a little research before posting here...
In fact, there are two sets of 3D drivers for the Radeon 8500 currently available:
1) Closed source DRI drivers available from ATI
2) Open source DRI drivers available from the DRI development team (very new, probably very unstable).
Dinivin
Re:Linux 3d Support (Score:2)
The drivers for the FireGL also work on the Radeon 8500, which has the same 2D and 3D core.
Dinivin
Re:Two really good reasons to avoid ATi (Score:3)
awful driver support.
About 4 years ago, when I was buying my new PC (for the time... a 400MHz P2), I had never heard of nVidia. I had two options: an nVidia card (cheaper) or an ATi card (more expensive). Well, never having heard of nVidia, I went ATi. HAH! There were never any drivers for my poor ATi Rage LT Pro... MAYBE if I was lucky, I could get a half-decent framerate in the DirectX sample... but there was NO OpenGL acceleration at all!! Mind you, this was under 98, NOT 2000 or any other platform that is especially difficult. The Linux drivers (UTAH-GLX) weren't much better either - every other launch of a 3D app, the machine would go down, requiring me to SSH in and reboot. (I couldn't just restart the X server; that didn't work. ARRRGGHHH!)
But we also had another machine with an ATi card, that time a P2-450 with (I believe) the same card, except with digital out. That machine ran 2000. Well, there were drivers once, all right... the "recommended" drivers were the ones that came with 2k (they said it themselves!), and THEY SAID that those drivers didn't provide acceleration. So, they also had the "alternate" drivers. I gave those a shot - couldn't hurt anything, no? I grabbed those. Now I have an ATi logo sitting in the taskbar, allowing me to adjust nothing since it's an LCD flat-panel. It looked promising... I ran my little app, Snake3D. [sourceforge.net] The machine locked hard. "#$!%!!" I remember not even trying DirectX, and I immediately went back to the software drivers.
So, learn from my fuck-ups, and don't buy it until you see it running in your favorite OS (whatever it may be), running your favorite game. (Preferably something graphically intensive - make sure it's not software rendering, and turn it to high quality.)
Or, if you just want to screw off ATi for screwing everyone else over with high-priced cards and missing drivers, go for a GF4. You don't need a duallie GFX card anyway.
But whatever you choose, have fun!
--j
Re:Two really good reasons to avoid ATI (Score:2)
That's your evidence? 8500 gets slightly lower scores than Ti4200, and costs almost 50% less. True, 4400 and 4600 are faster, but price/performance easily goes to 8500.