ATI Rage Fury MAXX Review 166
Johan Jonasson writes "There's an excellent review of the ATI Rage Fury MAXX over at Tom's Hardware. For those unfamiliar with the product, it's a monster graphics board with two Rage 128 PRO chips, each with its own 32MB of memory, adding up to 64MB on one board. There's another review of the same board at Sharky Extreme. I've got to get me one of these."
Single-card SLI, anyone? (Score:1)
Either way, it simply looks like two cards on one. No real forward progress, for me.
.. 'course, I'm also waiting for the fscking Guillemot Prophet DDRs (GeForce) to hit the shelves locally.. hardware T&L... *sigh*
(pirst fost?)
TnL (Score:1)
Thats nice and all.. but... (Score:1)
Linux support? Nope! (Score:2)
From the Sharky review:
As the Rage Fury MAXX is meant for gamers, ATi has written drivers for Windows 98 only. While we'd agree that most gamers don't run NT4, it will be interesting to see the impact Windows 2000 has on the gaming community. Other than that, Linux users will have to look elsewhere and Win 3.x users should definitely think about upgrading.
When will these people learn???? Sounds like a nice card, but I'm certainly not in the market for Windows-only hardware.
Specs? (Score:1)
-Vel
Why is this review important? (Score:1)
Why is this news? Lots of video cards have been reviewed before but were never news on Slashdot.
Am I missing something here, or is this just a slow news day?
I dunno if I want it... (Score:3)
As far as I can remember, both articles mentioned that for online gamers the card would have a slight delay because of the dual-chip design.
They said something about each chip rendering about half as fast as an nVidia GeForce, with the card making up the speed by handing each new frame to the idle chip. So frame one is rendered by chip 1, frame two by chip 2, frame three by chip 1 and so on...
Anyway, what they said was that at a framerate of 50 frames/sec, a normal nVidia card gives you a time difference of about 0.02 seconds between an action (movement, shooting and stuff) and the rendering of the actual frame. Since the ATI is dual-chip, it takes about 0.04 seconds.
According to SharkyExtreme you would certainly "feel" the difference.
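For what it's worth, here's that arithmetic as a tiny sketch (illustrative Python only; it assumes the alternate-frame scheme and the 50 fps figure described above, nothing more):

    fps = 50.0
    frame_time = 1.0 / fps                # 0.02 s between displayed frames

    # Single-chip card: input sampled for a frame shows up roughly one frame later.
    latency_single = frame_time           # ~0.02 s

    # Dual-chip alternate-frame rendering: each chip needs two frame intervals to
    # finish its frame, so the same input shows up roughly two intervals later.
    latency_dual = 2 * frame_time         # ~0.04 s

    print(latency_single, latency_dual)   # 0.02 0.04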
Anyway, another reason I personally would prefer nVidia is their good native OpenGL support.
Re:Sounds great (Score:1)
I have 32 megs and the onboard video takes up 4 megs... sigh
Re:Why is this review important? (Score:1)
"Some smegger's filled in this 'Have You Got A Good Memory?' quiz!"
Re:Linux support? Nope! (Score:2)
I'm sure that their marketing team is unaware of the open source team. It'll hit us eventually, I should think.
Why buy one? (Score:2)
2) Great DVD BUT no TV output.
3) Slower frames/s than GeForce and marginally better than TNT2 Ultra for some games.
Seems enough for me to leave it alone for a while.
2 chips for roughly the same performance... (Score:3)
Re:Specs? (Score:1)
It's simple (Score:2)
Put simply, why support something that you can't use? Yeah, "you gotta get one of those," but what are you gonna do with it? The holidays are over, so hanging it on your tree as an ornament would be rather redundant.
Re:Specs? (Score:1)
Or try this http://www.atitech.com/ca_us/corporate/press/1999
I've lost all respect for tom.. (Score:3)
"I would love to tell you the real story here but I don't think these hardware companies would appreciate it"
Tom lost all my respect. And if you feel you need to read the surrounding text, look at the URL below. Wow. http://www6.tomshardware.com/graphic/99q4/991230/fury-14.html [tomshardware.com]
I've gotta get one of these... (Score:1)
--Evan
Re:Specs? (Score:1)
Re:Linux support? Nope! (Score:2)
Wait a few months... (Score:1)
Re: (Score:1)
drivers (Score:1)
Basically, you have 2 driver options for the Rage Fury: 1) release drivers, which are slower than old people fucking, or 2) beta drivers, which are faster, but ridiculously buggy. Damned if you do, damned if you don't.
Don't bother emailing ATI for Linux drivers. They aren't worth having. Except for you poor souls who bought Rage Furies. (my condolences to you all)
Commercial! (Score:1)
How much were you paid to post this crap up here? If I wanted to read about HARDWARE REVIEWS, I would go to a HARDWARE REVIEW (read: commercial) site.
Kindly do not defile Slashdot with this tripe.
Another biased review by an idiot (Score:5)
This review seems almost as biased as the last one, which was based on the board before it was released. Do they think that nVidia will be sitting around and not have something better by the time ATi releases their next-generation chip? If I'm going to spend upwards of $250 on a graphics card, I don't want to be shelling out for another card later in the year to get the new features. Stupid reviewer! The ATi card is no cheaper than the GeForce DDR, but it has lower performance and fewer features. It's obvious which card to get when choosing between the two. Besides, who cares about the hi-res results? No serious gamer would play at 1280x1024: the framerate is half what I consider the minimum for games like Quake 3 - where the difference between 50 and 60 fps is noticeable, let alone playing at 30 fps! - and with Quake 2 I would suggest there is no need to go above 640x480 or 800x600, as there is no real gain.
The cheapest Creative Labs 3D Blaster Annihilator Pro (GeForce DDR) is available for $233, according to computers.com (can't find the MAXX yet):
Creative Labs 3D Blaster Annihilator Pro, sorted by price [cnet.com]
Re:It's simple (Score:1)
The problems with this card (Score:2)
Some people have raised the concern that there will be additional latency in first-person shooters that some gamers would notice, since the card is rendering the next frame ahead of time, before you've hit the keys that decide your actions in that frame. Maybe not noticeable to some people, but for the hard-core gamer....
The Voodoo2 SLI and multichip Voodoo4/5 cards don't have this problem because they render portions of the same frame.
It's also very inefficient to have 32 MB per chip rather than a shared 64MB pool.
You're better off going for a Voodoo5 if you want the absolute highest fill rate or a GeForce DDR if you want maximum geometry throughput.
Lets be fair here (Score:2)
Seems to be a lot of ATI slamming going on. I for one know that ATI has NEVER made the fastest card on the market (although their marketing department seems to think so).
But I do have an ATI AIW 128 (not Pro). Boy is it nice to watch TV, or record a TV program for later. It's nice to be able to broadcast video in NetMeeting or CU-SeeMe. It's nice to be able to do all of this on one board. I have OpenGL driver support (even in NT!). I have DirectX support. It has hardware motion compensation for DVD playback. It's a very well-rounded board. And what's more, I get very decent game play at 32-bit (whereas most traditional 3D boards cringe at 32-bit color and stick to 16-bit).
Now, there were two ways to proceed in enhancing game performance: add hardware T&L, or throw more fill rate at the problem with a second chip.
GeForce did one and ATI did the other. The reviews I've seen place them very close. Now which one of these guys will figure out how to use TWO T&L chips first?
Anyways, I just wanted to point out that ATI is first and foremost a marketing company selling to an OEM market. And they're doing a DAMN GOOD job at that. Their boards are not the best, but they're certainly far from the worst.
Re:2 chips for roughly the same performance... (Score:1)
whats wrong with these companies?? (Score:2)
=======
There was never a genius without a tincture of madness.
Why invest in ATI instead of nVidia? (Score:5)
Sure, you can buy a MAXX product for $200~250 and have yourself a kick-ass video card. Or, you could shell out $200~300 for a GeForce-based card and get a kick-ass video card that might just have a longer lifetime in it.
S3's and nVidia's new chipsets support hardware transformation and lighting--done right on the video card instead of on the CPU (which would be software). 3dfx's and ATI's new products don't. Now, it depends on game developers' support for this new technology, but chances are good that many games over the next couple of years will count on offloading these calculations to the video card on hardware T&L-enabled cards. If that happens, then owners of these cards will see serious performance boosts or be able to run games their non-T&L-card-owning brethren can't.
Don't be fooled by the 64 Megs of RAM on the MAXX, either. It doesn't increase the total textures the card can handle, because each chip has to keep track of (almost) all the textures simultaneously. The RAM on this video card is not a particular selling point compared to other 32MB cards.
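A toy illustration of why the two pools don't add up (illustrative Python; the duplicated-texture assumption follows from the alternate-frame design described in the reviews):

    per_chip_mb = 32
    chips = 2

    total_on_box = per_chip_mb * chips        # 64 MB printed on the box

    # Each chip renders whole frames on its own, so each local 32 MB pool has to
    # hold (almost) the full texture set -- the second copy is a duplicate, not extra room.
    usable_unique_texture_mb = per_chip_mb    # still roughly 32 MB of distinct textures

    print(total_on_box, usable_unique_texture_mb)   # 64 32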
One point ATI might be able to score on is price. The MAXX is expected to retail for less than GeForce products, and may offer a better deal. Only time and the market will tell.
Of course, MAXX products will really succeed in the OEM market, where ATI's strength is. And when (if) this technology gets ported to the Mac, it'll be a major boon to Mac gaming. Given ATI's current stranglehold on the Mac 3D video card market, I expect this card will find its way there soon enough.
Re:Why is this review important? (Score:1)
Mostly in the Macintosh world (remember, some of us use supplemental operating systems), where this is the supposed solution to Apple being cheap and only giving "power users" 4 slots (most need 6+ for some reason unknown to people such as me... then again I don't do graphics!)
This was supposed to be the card that put ATI back on the map... apparently not!
Why??? (Score:1)
video newbie (Score:1)
---
Moderators - what the hell? (Score:1)
Why on earth is that post - a reasonable summary of the two articles - moderated DOWN as a troll?
Hoping to meet you in a dark metamoderation alleyway....
Thank you. (Score:1)
I really would like to know why the person submitting was so excited by this card.
Gah. (Score:3)
Is it just me, or has Sharky been infected with the "suck up to our advertisers" disease that hit Tom a while back? Get this quote from here [sharkyextreme.com]:
Well, excuse me, since ATI has thrown two chips at the problem compared with one for the NVidia card, I would expect the words "raw power" to be applied to the GeForce. On top of that, he says that the ATI card "almost overtakes" the GeForce DDR; the framerate differences between the ATI and the SDR card on the three tests on this page were 0.4 FPS, 0.1 FPS and 0.4 FPS again, whereas the gaps between the ATI and the DDR card were, respectively, 5 FPS, 5.4 FPS and 5.1 FPS. Since we're talking about a nearly 20% difference in F/R between the ATI and the DDR cards, his comments strike me as being just this side of dishonest. He then goes on to say that the DDR GeForce card has better bandwidth and T&L, as if NVidia were cheating or something.
If you look at the tests, many of them show the ATI card getting its ass well and truly kicked by the GeForce cards, sometimes by margins of 100% or more, yet Sharky skims by these figures as if they were of little importance, even though he's the one who did the tests. Faugh. Show us your list of advertisers, Sharky.
My advice,... (Score:2)
lay low for a couple of months,
and buy yourself a Playstation 2
J.
Re:I've lost all respect for tom.. (Score:1)
There is a difference between being libelous and reporting the truth. However, isn't it already too late, since he is posting benchmarks in which only one company or product will appear the victor? Or perhaps saying "It was the program that stated your video card sucked, not me" is his way of avoiding lawsuits.
Either way, if he knows something we don't, he should be obligated to let us, the consumers, know, as that is the goal of his site. But it is HIS site, and he is free to do what he wants with it.
Re:It's simple (Score:1)
I don't care if you use Linux or not. In fact, I don't care what you use, as it is unimportant to me. However, if Slashdot is to cater to the Linux community, you shouldn't see a problem with such posts. I wasn't aware that this was a universal end-all website. Maybe I should start reading elsewhere for Linux-related material.
It seems that as more *nix/linux/bsd people post about their experiences and how a company and/or person affects our community, we offend the "other".
With that, I plan to make this the last time I read Slashdot. It has become something I don't wish to be a part of any longer, and I will explore other options, if any. Sad.
Theater Chip or Not? (Score:2)
Re:Thats nice and all.. but... (Score:1)
Mad ATI Product Names (Score:4)
*Rage* 128 *Fury* - whoa!
*Rage Pro* (how to be professionally mad?)
Is the next board going to be a budget version?: *Rage 128 Mildly-Upset*
And of course don't forget their next board:
*RAGE 256: HOMICIDAL MANIAC*
Re:It's simple (Score:1)
What was the point in replying? Also, I wasn't aware that speaking of previous experiences and what has happened to me as a Linux user was something of an illegal post.
Slashdot has now become a site for news. Technical-related news, and less of a site that deals with Linux at all. Patents/Laws/Tech News/Rants disguised as news. It's pathetic.
No, that's NOT what we need. (Score:1)
What we need is a public commitment by major companies to have Linux-ready drivers at the get-go. Preferably open-source, but whatever tickles their whiskers... If they can't handle open-sourcing their drivers, then they can release binary only, and release specs on request for the hardware.
Re:TnL (Score:1)
Hardware reviews linked on /. (Score:1)
Re:It's simple (Score:1)
Obviously you're not very observant. What do Jon Katz's rantings (stories from the hellmouth), the perpetual stories on nanotechnology, or The Who have to do with Linux? If you look at the
"However if Slashdot is to cater to the Linux community you shouldn't see a problem with such posts. "
I have a problem with the original post. It was whining and the opening question/statement was unnecessary.
"With that I plan to make that the last time I read Slashdot. It has become something I don't wish to be a part of any longer and will explore other options; if any."
Fine. Cut off your nose to spite your face. You won't become a martyr.
Re:Linux support? Nope! (Score:1)
-Chris
(proud owner of a Mach32 that was fried by X)
Re:LINUX 3D POWERHOUSE, WHERE IS IRIX NOW??? (Score:1)
Let's be fair (Score:2)
But I agree. The PSX2 is going to kick severely large amounts of ass. And I'd rather shell out for a GeForce now anyway.
Re:Why is this review important? (Score:1)
#2 - Video Capture Card
#3 - Better Sound Card
#4 - SCSI card
With 10/100 Ethernet onboard, the need for slot 5 has been alleviated. There used to be a day when people would plop two Apple QuickDraw 3D cards into a 9500, but video cards have eliminated that need. But some people still like a 3rd monitor (one for editing, one for previewing, and one for other apps... Or else, one for sound tracks, one for laying out, and one for video...)
And yet others seem to like 2 SCSI cards.... With the possibility of two or three live video sources, only now, with that 160 MB/sec variant of SCSI, has it been possible to shove all that data down one SCSI channel.
And there's more possibilities, too... but i don't want to ramble for too long about this
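To put a very rough number on the multi-stream case (illustrative Python; it assumes uncompressed 4:2:2 NTSC-resolution capture, which is only one possible workload):

    # One uncompressed 720x486, 30 fps, 2-bytes-per-pixel (4:2:2) stream:
    bytes_per_frame = 720 * 486 * 2
    stream_mb_s = bytes_per_frame * 30 / 1e6       # ~21 MB/s per stream

    streams = 3
    total_mb_s = streams * stream_mb_s             # ~63 MB/s for three live sources

    # Ultra Wide SCSI tops out around 40 MB/s; the 160 MB/s flavour has headroom.
    print(round(stream_mb_s), round(total_mb_s))   # 21 63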
Re:No, that's NOT what we need. (Score:2)
Re:It's simple (Score:1)
but the bottom line is that if Slashdot catered exclusively to the Linux community they wouldn't get much audience...
You know? If I were going to post something this blatantly stupid, I'd post as an AC too! Do you not have any idea of the history of /.? Or did you just follow a link from Wired last week?
But more to the point, both in reply to your post and to those of the others flaming people concerned about the lack of Linux drivers, I've got a harsh reality for you - not every post on /. is going to be relevant to you. I don't go around flaming the hell out of everyone who posts something about a Palm Pilot, even though (!!!) I don't own one...
Off topic . . . (Score:1)
my lame ass attempt at a public service announcement
Re:Thats nice and all.. but... (Score:1)
Re:LINUX 3D POWERHOUSE, WHERE IS IRIX NOW??? (Score:1)
I have an Indigo2, 250MHz Extreme, 128MB RAM.
That sunuvabiatch kicks the living piss out of my dual PII-400 in doing rendering / graphics work.
(Admittedly, it's a little sluggish in a regular state, but that's not what I use it for.)
At work I use an O2 - 256MB RAM (not sure of the clock speed). (I called it PapaSmurf on the LAN, 'cuz it's blue.)
This sucker is quite the powerhouse. Rendered a 2-minute 3D animation at 1600x1600x32-bit in my lifetime. (I went to lunch and it was done; not sure how long it took.)
IRIX may not be as robust as other Unices, nor as fancy-looking as Linux, but it does what it does.
(And I like 4dwm, if you must know... it doesn't look like Windows.)
Now go back to your cave, AC.
Re:I've lost all respect for tom.. (Score:2)
not in the least. He isn't "obligated" to tell you $hit. However, if he knows info but can't share it (fear of lawsuits, NDAs, etc) then he shouldn't even bring it up, otherwise small minded people get pissed off because their "right" to know is being violated.
From what it sounds like, Tom has pissed off more than a few folks with biased reporting. Let that be a lesson to any would-be hardware pundits in the crowd (from both directions).
Re:Sounds great (Score:2)
Re:TnL (Score:1)
Interesting that you mention John in your post... (Score:2)
It's not so much the chips themselves but the drivers that make the chipsets seem worse than they actually are. Yes, the ATI offerings are nowhere near as good as the Matrox, NVidia, etc. offerings - but they're everywhere, cheap, and serviceable. As for this card, we'll have support for the basic configuration shortly - all we need for full support is the info from ATI on how to interlace the two chips.
Re:2 chips for r....--SMP? (Score:1)
Then again, to make real use of a good machine, you need a real operating system, and this card can only run with Win98?
It's a start, but it definitely needs work.
Re:I've lost all respect for tom.. (Score:1)
Because of better Open Source support... (Score:1)
Re:I've gotta get one of these... (Score:2)
'Fine' is insufficient. If I can't see every individual rendered blood droplet when I blow your head through the back of the screen from all the way across the level without any slow down with ALL of the eye candy turned on then it isn't good enough!
>:)
Kintanon
Great Review (Score:1)
Another review (Score:1)
Pablo Nevares, "the freshmaker".
Re:I've lost all respect for tom.. (Score:1)
Where was this?
Pablo Nevares, "the freshmaker".
Re:Another biased review by an idiot (Score:2)
There goes all your credibility. Hi-res results are what actually demonstrate raw hardware speeds. Low-res scores reveal little about the actual speed of the card, because few chips are fill rate bound at low resolutions.
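A quick sketch of that fill-rate point (illustrative Python; the overdraw factor of 3 and the 100 fps target are just assumed for illustration):

    def pixels_per_second(width, height, overdraw, fps):
        # Fill rate demanded = screen pixels * average overdraw * target frame rate.
        return width * height * overdraw * fps

    low  = pixels_per_second(640, 480, 3, 100)    # ~92 Mpixels/s
    high = pixels_per_second(1280, 1024, 3, 100)  # ~393 Mpixels/s

    # At 640x480 the demand sits well under what these chips can fill, so the CPU and
    # geometry setup become the limit; only the high-res run actually stresses fill rate.
    print(low / 1e6, high / 1e6)                  # 92.16 393.216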
Even though high-resolution game scores are a much more effective way to measure a chip's fill rate, they aren't the be-all-and-end-all of the chip's capability. I'd like to see how this ATI handles a 500,000 poly scene typical in the CAD world...
________________________________
Re:I've lost all respect for tom.. (Score:1)
Where was this?
It is still there; I just checked it. I would cut and paste the whole paragraph, but for some reason Netscape won't let me select anything on the page. The sentence I quoted is in the first paragraph, about 3/4 of the way down.
Re:Linux support? Nope! (Score:1)
No NT/Win2000 support sucks from a BUSINESS perspective. So many graphics card makers ignore the NT market despite the fact that game developers almost universally use NT, or now Win 2000, to develop in... Who wants to write and debug code under 98? Not me. By not supplying an OpenGL driver for NT4 and not supplying OpenGL and DirectX 7 drivers for Windows 2000, companies are shooting themselves in the foot as far as getting good developer support and optimization for their cards.
Only NVidia and 3DFX seem to really grasp this concept, which is probably why all the other card makers continue to exist only because of OEM deals where users don't really know what they are getting.
Re:LINUX 3D POWERHOUSE, WHERE IS IRIX NOW??? (Score:1)
The IR2 wasn't created for running Quake2. You're comparing apples to oranges... However, it does stand to reason that current and future consumer-level PC cards will start beating the "big iron" for rendering of a few years ago. Yet more Moore's Law in action, this time applied to the graphics processor rather than the CPU.
As much as I like Linux, Linux doesn't specifically do 3D better than any other operating system. In fact, technically, you could call Windows a better 3D OS, since (as of Win98 B) it has come with a standard 3D API (OpenGL), whereas Linux has no true standard (Mesa is a bit of a de facto standard... XFree4 should fix the whole issue, but I digress...)
Also, while Linux might get some of the big name 3D packages (Maya, etc), don't hold your breath for them to become Open Source.
Re:Another biased review by an idiot (Score:1)
Sure, it would be interesting to see how these cards work in the CAD world (I don't think you'll be seeing that in Sharky's or Tom's reviews). That is a world I am not familiar with, but I was under the impression that these cards would be very low-end in that arena. I believe these cards are aimed at gamers, but please correct me if I'm wrong. I do see the benefit of a cheap card for people trying to do CAD work outside of a workplace, but the demand is obviously lower than in other parts of the market.
Re:LINUX 3D POWERHOUSE, WHERE IS IRIX NOW??? (Score:1)
Re:2 chips for r....--SMP? (Score:1)
Re:2 chips for roughly the same performance... (Score:1)
To be fair to ATI... (Score:1)
However, faulting a company for releasing a graphics card right now without a Linux driver is a bit unfair. Let's wait until XFree4 is widely available (at which point Linux will finally have something like a standard 3D driver system) before we bash anyone for lack of drivers, and this includes NVidia, 3DFX, etc. as well as ATI.
Re:2 chips for roughly the same performance... (Score:1)
One tip -- make sure you have the processor(s) to support high-intensity gaming. I'm already starting to see the age on my K6-2 450 -- I'd recommend getting something with better FP (K7, P3 (shudder), or dual Celeron (yay!)).
Just my thoughts.
Re:I've lost all respect for tom.. (Score:1)
Fastest way to get there, AFAICS, is to go to the intro, click "Benchmark results" down at the bottom, which puts you on page 8... then keep clicking to the next page until you get to page 14...
What an annoying site.
---
Re:!flame (Score:1)
Damn, I have a Rage II+. POS.
Enough Already!! (Score:1)
A wealthy eccentric who marches to the beat of a different drum. But you may call me "Noodle Noggin."
Quote is on the 3DMark 2000 page. (Score:1)
It sounds above like he doesn't know what's going on, then he says that he wants to tell but he can't.
I've been disappointed in Tom's site lately, not really because he's arrogant, or has 10 banner ads on his homepage (though that doesn't help). The site has just dried up. The reviews are too slow, too late, and too few and far between. There are too many other good sites to read now and his isn't at the cutting edge anymore. I read about the MAXX somewhere else before I read Tom's.
His bread and butter was the Celeron overclocking stuff that he covered, and now there's not a peep about overclocking the P3-x5x0E's. He did some really good stuff but he's been slipping lately.
Re:2 chips for roughly the same performance... (Score:1)
the big problem... (Score:2)
1. We can't have any more than X triangles per frame, limiting geometric complexity.
2. Nearly all (90%) of the computing power is going toward rendering, leaving precious little left for AI, physics, or anything else.
The future is obviously in cards with T&L, and it will become clear in the next year that games that expect a T&L card will run MUCH faster. With a T&L engine, we can now fit many more tris on the screen (5x? 10x?) at nearly the same frame rates. We can also have much more complex worlds.
So while the MAXXXXX might be OK for now, it will lose out to the GeForce. Maybe not today, but it will. While most companies are pushing fill rate beyond the max (1600x1200 at 120 fps? who needs that?), the GeForce is the only card that could run Myst in real time at 60 fps.
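A toy per-frame budget makes the second point concrete (illustrative Python; the 90% figure is from the point above, while the 60 fps target and the hardware-T&L share are assumptions made up purely for illustration):

    frame_ms = 1000.0 / 60                     # ~16.7 ms per frame at 60 fps

    # Software T&L: ~90% of the frame goes to transforms and rendering setup.
    left_for_ai_soft = frame_ms * (1 - 0.90)   # ~1.7 ms for AI, physics, everything else

    # Hardware T&L (assumed to take most of that load off the CPU):
    left_for_ai_hw = frame_ms * (1 - 0.30)     # ~11.7 ms freed up for the game itself

    print(round(left_for_ai_soft, 1), round(left_for_ai_hw, 1))   # 1.7 11.7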
Not even worth it... (Score:1)
For $270 of *my* dollars, I'd rather spend $20 more and get a GeForce DDR card - Creative, Diamond, and Guillemot all make fantastic solutions that blow the pants off the performance of ATI's unspectacular MAXX.
They messed up, basically... they have a hard time beating TNT2 Ultra cards, which run $100 cheaper and are well established (with good OpenGL)...
Not to mention their lame driver support right now, versus nVidia's solid existing drivers, their commitment to driver optimizations, and their production of stable cards...
The only thing the GeForce is weak at is its relatively slow SDRAM (only 150MHz) and its slim memory interface (64Mbit).
An intelligent consumer would skip the ATI Rage Fury MAXX (and the new S3) and go with a good TNT2 Ultra or shell out the big bucks for the GeForce DDR.
End of story.
Re:Another biased review by an idiot (Score:1)
I'm a gamer; I demand performance from my hardware.
Hello guys? Can you say "shitty software support"? (Score:1)
OS supported: Win9x, Win9x, and Win9x. Not even NT/Win2K.
No T&L, and no fancy VSA-100 effects. You're gonna be suffering on slower CPUs. While it doesn't suck, it doesn't offer anything over the GeForce cards. And it costs too much in that respect, too.
Re:TnL (Score:1)
Re:Why invest in ATI instead of nVidia? (Score:2)
Have you noticed that when a new card comes out, Microsoft scrambles to release a new Direct3D, and game makers scramble to use the new API in their games to take advantage of the card? In contrast, OpenGL has been through all the stages of 3D acceleration that are just now finding their way into cheap consumer cards. Why keep learning new APIs and inferior ways of doing things when OpenGL has been ready for years?
Re:Not even worth it... (Score:1)
Which I just did. Not like I can really afford it, but that's what credit cards are for, right?
what an, uh, "eleet" name (Score:3)
Re:Another biased review by an idiot (Score:2)
50 fps is noticeable to me, but I can get used to it and get my railgun shots back on target after about 15 minutes of play (I often have to lower my max framerate to 50 to cut down network lag).
Q3, on the other hand, is entirely different. I've gone from being a reasonably good Q2 player to a crap Q3 player. I think it's all those funny angles, so now I have to relearn with the mouse.
Re:Mad ATI Product Names (Score:1)
***************************************************
Re:I dunno if I want it... (Score:1)
Tom's had a "preview" and a full review of the release version. In the full review he went in looking for this latency and failed to find any. He mentions that his search for latency was subjective, but he withdrew his earlier [statements?][predictions?]. Are you sure you read the latest version of Tom's review?
(My employer's URL blocker won't let me look at the sharky's site.)
Re:I've lost all respect for tom.. (Score:1)
It sounds like the card makers are hacking the benchmark rather than writing better general-purpose drivers/hardware. I seem to recall some issues with Number 9 doing something like this half a decade or so back (obtaining benchmark results 50X better than anybody else, or some such).
People depend on benchmarks. Hardware companies can force-fit their stuff to match a particular benchmark. I think he was saying this between the lines, and implying that he thinks it is cheating.
Re:2 chips for roughly the same performance... (Score:1)
Matrox could put 2 G400 chips together. Trident could put together 10 8900 chips. You must be thinking that integrating 2 chips is technologically nothing. Go back and smoke some more crack.
Re:I've lost all respect for tom.. (Score:1)
And if you renamed a real game to that name you got nasty graphical errors.
And on another note, Tom fell from grace a long time ago. Around when he couldn't accept that 3dfx made better cards than nVidia. (Nowadays nVidia is on top, but they weren't in the beginning.)
are you sure? (Score:1)
Yes, yes, it's a really old card. But it was The-Sh*t in its day!
Re:Why invest in ATI instead of nVidia? (Score:2)
Perhaps that's why Microsoft and SGI are folding Direct3D into OpenGL [microsoft.com]? (I make no claims that this will ever actually happen, of course, but that is the plan.)
Re:Linux support? Nope! (Score:2)
Yeah, they have a free, closed-source app now that lets you watch TV on an All-in-Wonder. Big deal.
Re:Not Much Better (Score:2)