Graphics Software

ATI Rage Fury MAXX Review

Johan Jonasson writes "There's an excellent review of the ATI Rage Fury MAXX over at Tom's Hardware. For those unfamiliar with the product, it's a monster graphics board with two Rage 128 PRO chips, each with its own 32MB of memory, adding up to 64MB on one board. There's another review of the same board at Sharky Extreme. I've got to get me one of these. "
This discussion has been archived. No new comments can be posted.

  • Okay, not interleaving scanlines.. Frame interleaving?

    either way, it simply looks like two-cards-on-one. No real forward progress, for me.

    .. course, I'm also waiting for the fscking Guillemot Prophet DDRs (geForce) to hit the shelves locally.. hardware t&l... *sigh*

    (pirst fost?)



  • by BadERA ( 107121 )
    TnL. What more can I say? Oh, maybe: Voodoo5.
  • by Anonymous Coward
    ATI just really REALLY blows chunks, IMO. They've got the goofiest named video cards and ads and keep releasing junk with extra features that no one really needs. I think of them as a budget Matrox. Maybe if this card is 'all that' then they might improve their rep a bit.
  • From the Sharky review:

    As the Rage Fury MAXX is meant for gamers, ATi has written drivers for Windows 98 only. While we'd agree that most gamers don't run NT4, it will be interesting to see the impact Windows 2000 has on the gaming community. Other than that, Linux users will have to look elsewhere and Win 3.x users should definitely think about upgrading.

    When will these people learn???? Sounds like a nice card, but I'm certainly not in the market for Windows-only hardware.
  • How about a release of the tech. specs so we can write our own driver and make it GPL?

    -Vel

  • Why is this news? Lots of video cards have been reviewed before but were never news on Slashdot.

    Am I missing something here or is this just a slow news day?

  • by Banpei ( 113822 ) on Wednesday January 05, 2000 @05:56AM (#1401621) Journal
    I already read both reviews a week ago...

    As far as I can remember, both articles mentioned that for online gamers the card would have a slight delay because of the dual-chip design.

    They said something about each chip rendering twice as slowly as an nVidia GeForce, but making up speed by rendering alternate frames on the otherwise idle chip. So frame one would be rendered by chip 1, frame two by chip 2, frame three by chip 1 and so on...

    Anyway, what they said was that at a framerate of 50 frames/sec, a normal nVidia gives you a time difference between an action (movement, shooting and stuff) and the rendering of the actual frame of about 0.02 seconds. Given the ATI's dual-chip design, it takes about 0.04 seconds.

    According to SharkyExtreme you would certainly "feel" the difference.
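
    A back-of-the-envelope version of that arithmetic (a sketch assuming alternate-frame rendering queues exactly one extra frame; the real pipeline may differ):

        # Rough input-to-display lag at 50 frames/sec.
        fps = 50
        frame_time = 1.0 / fps            # 0.02 s between displayed frames
        single_chip_lag = frame_time      # input shows up roughly one frame later
        afr_lag = 2 * frame_time          # the other chip already queued the next frame
        print(single_chip_lag, afr_lag)   # 0.02 0.04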

    Anyway, another reason why I personally would prefer nVidia is their good native OpenGL support.

  • Damn, it's got more memory than my K6 box. I only have 32 megs and the onboard video takes up 4 megs... sigh
  • If you listen carefully, you can hear the barrel being scraped... ;)

    "Some smegger's filled in this 'Have You Got A Good Memory?' quiz!"
  • We just need a good OpenGL driver ... that's all we ask ;-)

    I'm sure that their marketing team is unaware of the open source team. It'll hit us eventually, I should think.
  • 1) No Linux drivers, win98 only.
    2) Great DVD BUT no TV output.
    3) Slower frames/s than GeForce and marginally better than TNT2 Ultra for some games.

    Seems enough for me to leave it alone for a while.

  • by shang ( 130364 ) on Wednesday January 05, 2000 @06:02AM (#1401627) Homepage
    That's a sure sign of a weak card. When you have to put 2 of your best chips in one card to equal the competition, that just means you have a real shoddy chip. I don't see a big performance increase gained over its competition. I'm sure Matrox can put 2 G400 chips on the same board and kill ATi on performance.
  • ATI promised to have the specs available. This was very big news a while ago. While they are not as committed as some companies, they do a little bit. A great benefit for those of us running LinuxPPC!
  • Why are you talking about a card that has no Linux drivers? The company has made it a point to support Linux just barely enough to use the card. The 3D support for the Rage Fury chipset isn't even finished yet. Why bother? I will never buy another ATI card again. Hell, I would have been rid of the card if SuSE hadn't released a free X server for it. Enough ranting.

    Simply put, why support something that you can't use? Yeah, "you gotta get one of those," but what are you gonna do with it? The holidays are over, so hanging it on your tree as an ornament would be rather redundant.

  • Damn I didn't preview. The press release is HERE! [atitech.com]

    Or try this http://www.atitech.com/ca_us/corporate/press/1999/4241.html
  • by mistalinux ( 78981 ) on Wednesday January 05, 2000 @06:07AM (#1401633) Homepage
    I have no respect for Tom anymore. Look at this statement:

    I would love to tell you the real story here but I don't think these hardware companies would appreciate it

    Tom lost all my respect. And if you feel you need to read the surrounding text, look at the URL below. Wow. http://www6.tomshardware.com/graphic/99q4/991230/fury-14.html [tomshardware.com]

  • Is it just me, or is it that *EVERY TIME* a new graphics board comes out, everyone is like "I've gotta get one of these"? Exactly how much video processing power do most people need? I have an ATI All-in-Wonder Pro (AGP) with a VooDoo II add-on. Quake runs fine. :)
    --Evan
  • That's all we need. Thanks for the info...
  • ATI does support Linux and Open Source. Here [ati.com] is a press release concerning it. Also, their web site assures developers that technical data is available upon request. It is only a matter of time before a Linux driver can be written.
  • 3dfx is releasing the Voodoo5 6000 later this spring. It's a 128-bit/128MB card with video out, unlike the ATI card. It won't ship with drivers for anything but Microsoft, but have hope.
  • Comment removed based on user account deletion
  • by aphr0 ( 7423 )
    Many people are complaining about the lack of Linux support. Don't feel too bad, linux types. The drivers for the current Rage Fury for windows are crap. Pitiful quality. I'd LOVE to get rid of my Rage Fury due purely to the unacceptable quality of the drivers, but I'm a bit short on cash at the moment.

    Basically, you have 2 driver options for the Rage Fury: 1) release drivers, which are slower than old people fucking, or 2) beta drivers, which are faster, but ridiculously buggy. Damned if you do, damned if you don't.

    Don't bother emailing ATI for Linux drivers. They aren't worth having. Except for you poor souls who bought Rage Furies. (my condolences to you all)
  • Ok, Andover, this site is for tech news, geek news, and weird political/anti-privacy stuff only.

    How much were you paid to post this crap up here? If I wanted to read about HARDWARE REVIEWS, I would go to a HARDWARE REVIEW (read: commercial) site.

    Kindly do not defile Slashdot with this tripe.

  • by Malc ( 1751 ) on Wednesday January 05, 2000 @06:18AM (#1401642)
    "It offers instant gratification for the games that are out now. When T&L-enabled titles start hitting the shelves later in 2000, ATi's next generation chip should be ready for them." - Sharky Extreme

    This review seems almost as biased as the last one, based on the board before it was released. Do they think that nVidia will be sitting around and not have something better by the time ATi release their next generation chip? If I'm going to spend upwards of $250 on a graphics card, I don't want to be shelling out for another card later in the year to get the new features. Stupid reviewer! The ATi card is no cheaper than the geForce DDR, but with lower performance and fewer features. It's obvious which card to get when choosing between the two. Besides, who cares about the hi-res results: no serious gamer would play at 1280x1024, the framerate is half what I consider the minimum for games like Quake 3 - where the difference between 50 and 60 fps is noticeable, let alone playing at 30 fps! - and with Quake 2 I would suggest that there is no need to go above 640x480 or 800x600 as there is no real gain.

    The cheapest Creative Labs 3D Blaster Annihilator Pro (geForce DDR) is available for $233, according to computers.com (can't find the MAXX yet):
    Creative Labs 3D Blaster Annihilator Pro, sorted by price [cnet.com]
  • Yes, it's very simple. Not everybody on /. uses Linux. Stop whining.
  • There's a few problems with this card.

    Some people have raised the concern that there will be additional latency in first-person shooters that some gamers would notice, since the card is rendering the next frame before you've hit the key that decides your actions in that frame. Maybe not noticeable to some people, but for the hard-core gamer....

    The Voodoo2 SLI and multichip Voodoo4/5 cards don't have this problem because they render portions of the same frame.

    It's also very inefficient to have 32 MB per chip rather than a shared 64MB pool.
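
    Roughly, since both chips have to hold (almost) the same textures under alternate-frame rendering, the usable texture space is closer to one chip's share than to the sum (a simplified model, not ATI's spec):

        board_ram_mb = 64
        chips = 2
        usable_texture_mb = board_ram_mb / chips   # ~32 MB of unique textures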

    You're better off going for a Voodoo5 if you want the absolute highest fill rate or a GeForce DDR if you want maximum geometry throughput.



  • Seems to be a lot of ATI slamming going on. I for one know that ATI has NEVER made the fastest card in the market (although their marketing department seems to think so).

    But, I do have an ATI AIW 128 (not Pro). Boy, is it nice to watch TV, or record a TV program for later. It's nice to be able to broadcast video in NetMeeting or CUSeeMe. It's nice to be able to do all of this on one board. I have OpenGL driver support (even in NT!). I have DirectX support. It does motion compensation for DVD playback. It's a very well-rounded board. And what's more, I get very decent game play at 32-bit (whereas most traditional 3D boards cringe at 32-bit color and stick to 16-bit).

    Now, there were two ways to proceed in enhancing game performance:

    1. Put more of the rendering onto the chip (i.e. T&L)
    2. Use more chips

    GeForce did one and ATI did the other. The reviews I've seen place them very close. Now, which one of these guys will figure out how to use TWO T&L chips first?

    Anyways, I just wanted to point out that ATI is predominantly a marketing company selling to the OEM market. And they're doing a DAMN GOOD job at that. Their boards are not the best, but they're certainly far from the worst.

  • So... Any recommendations for a new AGP graphics card for my dual boot (RH Linux & NT4.0) machine? Damned Unreal just doesn't run _quick_ enough!!!
  • Let me get this straight: the card ONLY works in Win98, doesn't have native OpenGL support, has a SLOW framerate, and it's ungodly expensive... Boy, sounds like a real winner to me. Why don't we all just go buy some old Cirrus chips, put 50 of them on a board, and sell it. Oh yeah..
    =======
    There was never a genius without a tincture of madness.
  • by Dwindlehop ( 62388 ) on Wednesday January 05, 2000 @06:27AM (#1401651) Homepage

    Sure, you can buy a MAXX product for $200~250 and have yourself a kick-ass video card. Or, you could shell out $200~300 for a GeForce-based card and get a kick-ass video card that might just have a longer lifetime in it.

    S3's and nVidia's new chipsets support hardware transformation and lighting--done right on the video card, instead of on the CPU (which would be software). 3dfx's and ATI's new products don't. Now, it depends on game developers' support for this new technology, but chances are good that many games in the coming couple of years will count on offloading these calculations to the video card in hardware T&L enabled cards. If that happens, then owners of these cards will experience serious performance boosts or be able to run games their non-T&L-card-owning brethren can't.
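
    For the curious, the "transformation" half of T&L is per-vertex matrix math that either the CPU or the card has to grind through; a minimal software version (illustrative only, not any driver's actual code) looks like:

        # Transform one vertex by a 4x4 row-major matrix, then perspective-divide.
        def transform(m, v):
            x, y, z = v
            out = [m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(4)]
            w = out[3] if out[3] else 1.0
            return (out[0] / w, out[1] / w, out[2] / w)

    Doing that for every vertex of every frame on the CPU is exactly the load a hardware T&L card takes off your processor.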

    Don't be fooled by the 64 Megs of RAM on the MAXX, either. It doesn't increase the total textures the card can handle, because each chip has to keep track of (almost) all the textures simultaneously. The RAM on this video card is not a particular selling point compared to other 32M cards.

    One point ATI might be able to score on is price. The MAXX is expected to retail for less than GeForce products, and may offer a better deal. Only time and the market will tell.

    Of course, MAXX products will really succeed in the OEM market, where ATI's strength is. And when (if) this technology gets ported to the Mac, it'll be a major boon to Mac gaming. Given ATI's current stranglehold on the Mac 3D video card market, I expect this card will find its way there soon enough.

  • There has been a lot of hype about this card for the last N months.

    Mostly in the Macintosh world (remember, some of us use supplemental operating systems), where this is the supposed solution to Apple being cheap and only giving "power users" 4 slots (most need 6+ for some reason unknown to people such as me... then again, I don't do graphics!)

    This was supposed to be the card that put ATI back on the map... apparently not! :( [Personally I've only had good experiences with my ATI card, but then again I've only used the one(1).]
  • What's the point? Dual-processor graphics cards?? I don't get it. I get 55 frames per second in Forsaken. Not exactly impressive, but it is easily enough for decent gameplay. My hardware: K6-2 450, 32MB RAM, Diamond Monster Fusion AGP. Not very heavy-duty hardware. Yet it will play anything acceptably well, and when I get a new 64MB module it will be more than enough. But maybe my memories of the time when a gamer debated whether to go for the excellent resolution of Hercules mono or the 4 colors of CGA, and when you wondered who would really need one disk to hold 1.44MB, affect my views of minimum vs. excessive computer power.
  • Is there a "beginner's guide to video cards" anywhere on the 'net? I seriously don't understand the profusion of card types out there and I need to get something I can import/export video with.
    ---
  • Why on earth is that post - a reasonable summary of the two articles - moderated DOWN as a troll?

    Hoping to meet you in a dark metamoderation alleyway....
  • I agree the moderation seems to have been done by one of the AC idiots on this one. The post was certainly not a troll. Redundant I could understand, but not troll.

    I really would like to know why the person submitting was so excited by this card.

  • by BJH ( 11355 ) on Wednesday January 05, 2000 @06:45AM (#1401666)

    Is it just me, or has Sharky been infected with the "suck up to our advertisers" disease that hit Tom a while back? Get this quote from here [sharkyextreme.com]:

    ...the MAXX is a direct contender once again with the SDR card, however it almost overtakes the DDR board as well in high resolutions. Once again, the raw power of the two Rage 128 Pro chips stands up well to the extra high bandwidth and T&L of the DDR GeForce.

    Well, excuse me, since ATI has thrown two chips at the problem compared with one for the NVidia card, I would expect the words "raw power" to be applied to the GeForce. On top of that, he says that the ATI card "almost overtakes" the GeForce DDR; the framerate differences between the ATI and the SDR card on the three tests on this page were 0.4 FPS, 0.1 FPS and 0.4 FPS again, whereas the gaps between the ATI and the DDR card were, respectively, 5 FPS, 5.4 FPS and 5.1 FPS. Since we're talking about a nearly 20% difference in F/R between the ATI and the DDR cards, his comments strike me as being just this side of dishonest. He then goes on to say that the DDR GeForce card has better bandwidth and T&L, as if NVidia were cheating or something.
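
    Checking that percentage against the gaps quoted above (a quick derivation; the absolute framerates are inferred, not restated in the review text here):

        # If a ~5 FPS gap is "nearly 20%", the cards must be running around:
        gap, pct = 5.1, 0.20
        base = gap / pct        # ~25 FPS -- i.e., these are the high-res, low-fps tests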

    If you look at the tests, many of them show the ATI card getting its ass well and truly kicked by the GeForce cards, sometimes by margins of 100% or more, yet Sharky skims by these figures as if they were of little importance, even though he's the one who did the tests. Faugh. Show us your list of advertisers, Sharky.
  • for that price,
    lay low for a couple of months,
    and buy yourself a Playstation 2

    J.
  • Yeah, Tom is a real dick for not walking himself into libel and defamation lawsuits. So you can feel better about yourself, mistalinux, I think I'll bring a few suits against you because that would obviously make you really happy...

    There is a difference between being libelous and reporting the truth. However, isn't it already too late, since he is posting benchmarks in which only one company or product will appear the victor? Or perhaps by saying "It was the program that stated your video card sucked, not me" is his way of avoiding lawsuits.

    Either way, if he knows something we don't, he should be obligated to let us, the consumer, know, as it is the goal of his site. But, it is HIS site, and he is free to do what he wants with it.

  • Again it's become obvious to me that instead of being able to post to "Linux" users only, I have to cater to "others". Not that others matter to me anyway. Shall we proceed with a background of Slashdot and who Slashdot tries to cater to? This is one of the problems I have with reading Slashdot. It was founded in the Linux community, yet when someone posts about a problem they see with hardware/software with respect to Linux, they are blasted with replies such as these.

    I don't care if you use Linux or not. In fact I don't care what you use as it is unimportant to me. However if Slashdot is to cater to the Linux community you shouldn't see a problem with such posts. I wasn't aware that this was a universal end all website. Maybe I should start reading elsewhere for linux related material.

    It seems that as more *nix/Linux/BSD people post about their experiences and how a company and/or person affects our community, we offend the "other".

    With that I plan to make that the last time I read Slashdot. It has become something I don't wish to be a part of any longer and will explore other options; if any. Sad.
  • From Tom's Hardware:
    The Rage Fury MAXX also sports the ATI Rage Theatre chip that enhances DVD playback and offers impressive video encoding.
    From Sharky Extreme:
    Since the gamer is ATi's projected customer, no TV-out support was considered necessary. This means the Rage Theatre chip present on the Rage Fury Pro and All-In-Wonder 128 Pro cards is absent on the otherwise power-packed MAXX.
    So who's right?
  • ATI just really REALLY blows chunks, IMO. They've got the goofiest named video cards and ads and keep releasing junk with extra features that no one really needs. I think of them as a budget Matrox. Maybe if this card is 'all that' then they might improve their rep a bit.
    ATI Rage Maxx Wifebeater, ATI Rage Maxx Road Warrior, ATI Seething Anger Pro, ATI Super Street Rage Alpha Hyper EX Movie Edition, etc. I had an ATI All-in-Wonder Pro AGP 32MB for a few days, and the 3D on it was OK, but the freakin' TV tuner hung my box whenever I used it. Supposedly a bad card, but the retailer didn't have a replacement, so I got a Diamond *mumble* II. It's just to tide me over, though :) As for the card, it's OK, but per video chip/GPU, the GeForce is the winner. The ATI board just managed to keep up with the low-end GeForce with two high-end ATI procs on it. It's still not as ludicrous as the new Voodoo cards -- those things have external power supplies! Kicking the power strip can now bork over your video card in a Special Way. Whoever came up with that idea should be shot.
  • by reality-bytes ( 119275 ) on Wednesday January 05, 2000 @06:57AM (#1401677) Homepage
    ATI seems to have a very definite anger in its product names.

    *Rage* 128 *Fury* - whoa!
    *Rage Pro* (how to be professionally mad?)
    Is the next board going to be a budget version?: *Rage 128 Mildly-Upset*

    And of course dont forget their next board:
    *RAGE 256: HOMICIDAL MANIAC*
  • Read the previous poster and my reply above. As said: I don't care what you use, as the post shouldn't relate to you if you don't use Linux.

    What was the point in replying? Also I wasn't aware that speaking of previous experiences and what has happened to me as a Linux user was something of an illegal post.

    Slashdot has now become a site for news. Technical-related news, and less of a site that deals with Linux at all. Patents/laws/tech news/rants disguised as news. It's pathetic.
  • As long as drivers are available for Windows first, and then other products later, the "lag" will lead to a general public perception that Linux is not as "well-supported" as Windows is - and they're right.

    What we need is a public commitment by major companies to have Linux-ready drivers at the get-go. Preferably open-source, but whatever tickles their whiskers... If they can't handle open-sourcing their drivers, then they can release binary only, and release specs on request for the hardware.
  • While the Voodoo 5 sounds like a great card, I think it's unfair to compare a card to be released in 3-5 months to one that is out now (either the ATI or the GeForce cards). What 3dfx has out now is what matters. The graphics market is changing month to month.

  • As others have said, hardware reviews are really not something I go to Slashdot for; there are enough game/hardware sites out there to keep me informed of those. It's not even a revolutionary card, either.
  • "I wasn't aware that this was a universal end all website"

    Obviously you're not very observant. What have Jon Katz's rantings (stories from the hellmouth) or the perpetual stories on nanotechnology or the The Who have to do with Linux? If you look at the /. home page right under the logo, you will see that it is "News for Nerds", not "News for Linux People".

    "However if Slashdot is to cater to the Linux community you shouldn't see a problem with such posts. "

    I have a problem with the original post. It was whining and the opening question/statement was unnecessary.

    "With that I plan to make that the last time I read Slashdot. It has become something I don't wish to be a part of any longer and will explore other options; if any."

    Fine. Cut off your nose to spite your face. You won't become a martyr.
  • Oh, come on. ATI has been the most difficult company to work with since the beginning. Why would they change now? :)

    -Chris
    (proud owner of a Mach32 that was fried by X)
  • When was the last time you tried an OCTANE with 4MB of TRAM and two R12000 CPUs? I've yet to meet a PC that can do floating point like that, or real-time 3D. There is more to life than playing Quake... it will be some time before I can recommend Linux as a production animation workstation. SGI caters not to the gaming market, last I checked.
  • The PSX2 is set for a Christmas 2000 release in the US, Spring in Japan. I could wait "a couple months", but what good is a game if I can't read what it says? And even so, a PSX2 is not going to make Drakan or any of the other pc games I play run any quicker, until the software is available for it.

    But I agree. The PSX2 is going to kick severely large amounts of ass. And I'd rather shell out for a GeForce now anyway.
  • #1 - 2nd Video card (for two monitors)
    #2 - Video Capture Card
    #3 - Better Sound Card
    #4 - SCSI card

    With 10/100 Ethernet onboard, the need for slot 5 has been alleviated. There used to be a day when people would plop two Apple QuickDraw 3D cards into a 9500, but video cards have eliminated that need. But some people still like a 3rd monitor (one for editing, one for previewing, and one for other apps... or else, one for sound tracks, one for laying out, and one for video...)

    And yet others seem to like 2 SCSI cards... With the possibility of two or three live video sources, only now, with the 160 MB/sec variant of SCSI, has it been possible to shove all that data down one SCSI channel.
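
    A rough bandwidth check (assuming uncompressed 640x480, 16-bit, 30 fps sources; real capture formats vary):

        bytes_per_frame = 640 * 480 * 2               # 16-bit color
        mb_per_source = bytes_per_frame * 30 / 1e6    # ~18.4 MB/s per live source
        three_sources = 3 * mb_per_source             # ~55 MB/s before disk writes

    Add simultaneous disk writes on the same channel and it's easy to see why the 160 MB/sec flavor of SCSI is the first with comfortable headroom.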

    And there are more possibilities, too... but I don't want to ramble for too long about this.
  • patience young Jedi, the world at large just realized Linux existed last year. Buy your Linux copy of Q3 and wait a little while longer. BTW, good 3d drivers won't do much for Linux without LOTS of good 3d games.
  • but the bottom line is that if Slashdot catered exclusively to the Linux community they wouldn't get much audience...

    You know? If I was going to post something this blatantly stupid, I'd post as an AC too! Do you not have any idea of the history of /.? Or did you just follow a link from Wired last week?

    But more to the point, both in reply to your post and those of the others flaming people concerned about the lack of Linux drivers, I've got a harsh reality for you - not every post on /. is going to be relevant to you. I don't go around flaming the hell out of everyone who posts something about a Palm Pilot, even though (!!!) I don't own one...

  • if you're bored and you don't read at -1, you HAVE to read the comment this is in reference to. It's pretty funny, if you've ever found a troll funny (I personally do sometimes).


    my lame ass attempt at a public service announcement ;-) And of course this is only good until the moderators wipe this away too . . . at least some funny points to the original, k? :-)
  • Heck, I have an ATI RAGE Pro that consistently gets beat by a single Voodoo2 when it comes to QII framerate. My old G200 even comes really close to outperforming it! I'm skipping the MAXX in favor of a newer Matrox DualHead; respectable 3D performance AND it frees a PCI slot!
  • Momo.

    I have an Indigo2, 250MHz Extreme, 128MB RAM.

    That sunuvabiatch kicks the living piss out of my dual PII-400 at rendering / graphics work.
    (Admittedly, it's a little sluggish in a regular state, but that's not what I use it for.)

    At work I use an O2 - 256MB RAM (not sure of the clock speed). (I called it PapaSmurf on the LAN, 'cuz it's blue.)
    This sucker is quite the powerhouse. It rendered a 2-minute 3D animation at 1600x1600x32bit in my lunchtime. (I went to lunch and it was done; not sure how long it took.)

    IRIX may not be as robust as other Unices, nor as fancy-looking as Linux, but it does what it does.
    (And I like 4dwm, if you must know... it don't look like windows)

    Now go back to your cave, AC.
    Either way, if he knows something we don't, he should be obligated to let us, the consumer, know, as it is the goal of his site.

    not in the least. He isn't "obligated" to tell you $hit. However, if he knows info but can't share it (fear of lawsuits, NDAs, etc) then he shouldn't even bring it up, otherwise small minded people get pissed off because their "right" to know is being violated.

    From what it sounds like, Tom has pissed off more than a few folks with biased reporting. Let that be a lesson to any would-be hardware pundits in the crowd (from both directions).
  • It's not that exciting! The graphics card only really uses 32MB (two lots of 32MB with the textures, etc., duplicated in both). It sounds like your computer uses its memory more efficiently! ;)
  • in one aspect, you're entirely correct -- it would be completely unfair to compare the two. however, I was not stating a comparison; rather, I was putting forth the fact that, as the ATI card lacks TnL support, there's no reason to buy one. Instead, wait the 3-5 months.
  • Considering that between him and Gareth Hughes, there's pretty much an alpha driver for the Rage PRO available for the brave at heart to play Q3 and other OpenGL games on. He doesn't like the chip much (it seems it's still missing some things - but you can apparently get by with it), but they've gotten the framerates close to what a G200 does right now. We're going to clean it up and use that driver as a reference Utah-GLX driver, because it's the cleanest one to date. Shortly, you can expect a RAGE 128 driver to pop up (being that they've given us a hell of a lot more info for it...).

    It's not so much the chips themselves as the drivers that make the chipsets seem worse than they actually are. Yes, the ATI offerings are nowhere near as good as the Matrox, NVidia, etc. offerings- but they're everywhere, cheap, and serviceable. As for this card, we'll have support for the basic configuration shortly- all we need for full support is the info from ATI on how to interleave the two chips.
  • well, if you can use smp on motherboards, why not with the processors on vid cards?

    then again, to make real use of a good machine, you need a real operating system, and this card can only run with win98?

    it's a start, but it definitely needs work.

  • The message said he "should be obligated," not that he is. The following sentence also states that he can do whatever he wants with his site, thus making your first point unnecessary.
  • NVidia's not given out anything (honestly now, an obfuscated source of a driver layer is nothing compared to the technical specs and register-level information of the chip) compared to Matrox, 3DfX, or ATI. If you don't want to give money to ATI, spend it on Matrox or 3DfX at this point. The GeForce isn't supported for 3D under Linux- the others are right now.
  • Quake runs fine.

    'Fine' is insufficient. If I can't see every individual rendered blood droplet when I blow your head through the back of the screen from all the way across the level without any slow down with ALL of the eye candy turned on then it isn't good enough!
    >:)

    Kintanon
  • Speaking in terms of the review, this one was better than most. I think that is the first review I have read in a long while that does a pretty comprehensive comparison with other video cards, mentioning key things you will want to know when it comes time to buy one. The graphs that showed all of the other video boards were awesome. It is great they showed cards like the G400 and the TNT Ultra. These are the real benchmarks: how fast it can run Quake, not how well a company can rig a program or driver to smoke a 3D benchmark program. Well... if you are interested in video cards, go read it, it's good stuff.
  • I like the Anandtech review [anandtech.com] as well.

    Pablo Nevares, "the freshmaker".
  • I can't seem to find the phrase you quoted anywhere in the review - also, the page you linked doesn't exist within that review (fury-14.html).

    Where was this?

    Pablo Nevares, "the freshmaker".
  • Besides, who cares about the hi-res results

    There goes all your credibility. Hi-res results are what actually demonstrate raw hardware speeds. Low-res scores reveal little about the actual speed of the card, because few chips are fill rate bound at low resolutions.

    Even though high-resolution game scores are a much more effective way to measure a chip's fill rate, they aren't the be-all-and-end-all of the chip's capability. I'd like to see how this ATI handles a 500,000 poly scene typical in the CAD world...
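
    A simplified model of why that is (illustrative numbers, not benchmark results):

        # Pixels the chip must shade per second, with a rough overdraw factor.
        width, height, fps, overdraw = 1280, 1024, 60, 3
        fill_needed = width * height * fps * overdraw   # ~236 Mpixels/s
        # The same math at 640x480 gives ~55 Mpixels/s, well under what these
        # chips can do -- so low-res scores mostly measure the CPU and driver.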


    ________________________________
  • I can't seem to find the phrase you quoted anywhere in the review - also, the page you linked doesn't exist within that review (fury-14.html).

    Where was this?

    It is still there, I just checked it. I would cut and paste the whole paragraph, but for some reason Netscape won't let me select anything on the page. The sentence I quoted is in the first paragraph, about 3/4 of the way down.

  • No Linux support sucks, from a user perspective...

    No NT/Win2000 support sucks from a BUSINESS perspective. So many graphics card makers ignore the NT market despite the fact that game developers almost universally use NT, or now Win2000, to develop in... Who wants to write and debug code under 98? Not me. By not supplying an OpenGL driver for NT4, and not supplying OpenGL and DirectX 7 drivers for Windows 2000, companies are shooting themselves in the foot as far as getting good developer support and optimization for their cards.

    Only NVidia and 3DFX seem to really grasp this concept, which is probably why all the other card makers continue to exist only because of OEM deals where users don't really know what they are getting.

  • Nice flamebait, but since you're (supposedly) pro-Linux, why are you bashing SGI, who is also pro-Linux?

    The IR2 wasn't created for running Quake2. You're comparing apples to oranges... However, it does stand to reason that current and future consumer-level PC cards will start beating the "big iron" for rendering of a few years ago. Yet more Moore's Law in action, this time applied to the graphics processor rather than the CPU.

    As much as I like Linux, Linux doesn't specifically do 3D better than any other operating system. In fact, technically, you could call Windows a better 3D OS, since at least as of Win98 B it has come with a standard 3D API (OpenGL), whereas Linux has no true standard (Mesa is a bit of a de facto standard... XFree4 should fix the whole issue, but I digress...)

    Also, while Linux might get some of the big name 3D packages (Maya, etc), don't hold your breath for them to become Open Source.

    I was under the impression that this article was aimed at gamers. Therefore I was not impressed by all of the talk of the hi-res results, as no gamer would find those frame rates acceptable anyway. When the frame rates at the higher resolutions become usable, they will become a more interesting benchmark. The only purpose of the high-res benchmark is to say that we're not there yet in the low-end 3D graphics cards.

    Sure, it would be interesting to see how these cards work in the CAD world (I don't think you'll be seeing that in Sharky's or Tom's reviews). That is a world that I am not familiar with, but I was under the impression that these cards would be very low-end in that arena. I believe that these cards are aimed at gamers, but please correct me if I'm wrong. I do see the benefit of a cheap card for people trying to do CAD work outside of a workplace, but the demand is obviously lower than in other parts of the market.
  • Well, from what I hear, SGI has some of the same thoughts as you. I thought I read that SGI was going to be moving more over to Linux and in the end phase out IRIX. Of course, I could have just been dreaming that. But I know that they are opening up XFS and some of their fine-grained locking stuff this year.
  • Win98 won't even support SMP, and there are no Linux drivers. This is just an attempt to get into the high-end video card market with a bad chip. It's like building a dual-processor computer out of WinChips while your competitors are making them out of PIIIs/Athlons. Sure, it sounds impressive - dual chips, wow - but it's still old tech masquerading as high tech.
  • G400. The other choice would be a TNT2 - it works, but the drivers are terrible. I have one, and it's probably half as fast in Linux as in Windows. The G400 drivers, on the other hand, are from what I hear as good as or better than the Windows drivers.
  • I wouldn't buy this card at any rate, because ATI is a bit of an "also ran" when it comes to current 3D technology, and this card is no exception.

    However, faulting a company for releasing a graphics card right now without a Linux driver is a bit unfair. Let's wait until XFree4 is widely available (at which point Linux will finally have something like a standard 3D driver system) before we bash anyone for lack of drivers, and this includes NVidia, 3DFX, etc. as well as ATI.

  • My Voodoo2 3000 seems to run it just fine - but if you're looking high-end, the TNT2 Ultras seem to be worthy of hardcore gaming.

    One tip -- make sure you have the processor(s) to support high-intensity gaming. I'm already starting to see the age on my K6-2 450 -- I'd recommend you get something with better FP (K7, P3 (shudder), or dual Celeron (yay!)).

    Just my thoughts.


  • If you try to go directly to the page, it just shows the introduction page for some strange reason.. Looks like it's playing HTTP Referer tricks..

    Fastest way to get there, AFAICS, is to go to the intro, click "Benchmark results" down at the bottom, which puts you on page 8.. then keep clicking to the next page until you get to page 14...

    What an annoying site.

    ---
  • You're confusing the Rage Pro (which was not much :-) and the Rage 128. There aren't any iMacs with the Rage 128, only the Rage Pro (and a few with the Rage II+). The Rage Pro is not much of a card, I admit. But from what I've seen of the 128 (in person), it's decent, at least. Not phenomenal, but if you really want the trimmings (TV in/out, 2D, reasonable 3D), it's not a bad deal. It's the Pro where Carmack was squeezing blood from rocks (to quote you).

    Damn, I have a Rage II+. POS.

  • Ok, there is something wrong when you have more VIDEO RAM than SYSTEM RAM!!! Yeah, I'll just set up my VIDEO CARD as a SWAP FILE!!! Jeez. 80x25 all the way :-)

    A wealthy eccentric who marches to the beat of a different drum. But you may call me "Noodle Noggin."
  • I found the quote. Weird thing to say. It sounds above like he doesn't know what's going on, then he says that he wants to tell but he can't.

    I've been disappointed in Tom's site lately, not really because he's arrogant, or has 10 banner ads on his homepage (though that doesn't help). The site has just dried up. The reviews are too slow, too late, and too few and far between. There are too many other good sites to read now, and his isn't at the cutting edge anymore. I read about the MAXX somewhere else before I read Tom's.

    His bread and butter was the Celeron overclocking stuff that he covered, and now there's not a peep about overclocking the P3-x5x0E's. He did some really good stuff, but he's been slipping lately.
  • Don't knock it until you've tried it. The nVidia GeForce is just 2 TNTs + T&L. While everyone likes a brand-new chip, this is a cost-effective solution that works. Remember that silicon is cheap; have you been to a beach lately?
  • We have reached a sort of plateau in computer game graphics: we can squeeze X triangles on the screen in any one frame (around 10K, given q3a's r_speeds), and render them damn fast. The problem is, the CPU is still doing the T&L in most cases, and that puts two limits on current games:
    1. We can't have any more than X triangles per frame, limiting geometric complexity.
    2. Nearly all (90%) of the computing power is going toward rendering, leaving precious little left for AI, physics, or anything else.

    The future is obviously in cards with T&L, and it will become clear in the next year that games that expect a T&L card will run MUCH faster. With a T&L engine, we can now fit many more tris on the screen (5x? 10x?) at the nearly the same frame rates. We can also have much more complex worlds.

    So while the MAXXXXX might be OK for now, it will lose out to the GeForce. Maybe not today, but it will. While most companies are pushing fill rate beyond the max (1600x1200 at 120 fps? who needs that?), the GeForce is the only card that could run Myst in real time at 60 fps.
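
    The arithmetic behind that, using the parent's figures (a sketch; actual engine budgets vary):

        # Triangle throughput: CPU transform today vs. a hardware T&L part.
        tris_per_frame, fps = 10_000, 60
        cpu_bound = tris_per_frame * fps   # 600K tris/s, eating most of the CPU
        # nVidia quotes the GeForce at around 15M tris/s, which is where the
        # "5x? 10x?" headroom for denser scenes comes from.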
  • The MAXX simply isn't worth it...

    For $270 of *my* dollars, I'd rather spend $20 more and get a GeForce-DDR card - creative, diamond, and guillemot all make fantastic solutions that blow the pants off the performance of ATI's unspectacular MAXX.

    They messed up, basically... they have a hard time beating TNT2 Ultra cards, which run $100 cheaper and are well established (with good OpenGL)...

    Not to mention their lame driver support right now, versus nVidia's solid existing drivers, their commitment to driver optimizations, and their production of stable cards...

    The only thing the GeForce is weak at is its relatively slow SDRAM (only 150MHz) and its slim memory interface (64Mbit).

    An intelligent consumer would skip the ATI Rage Fury MAXX (and the new S3) and go with a good TNT2 Ultra or shell out the big bucks for the GeForce-DDR.

    End of story.

  • $233, eh? I ordered my Creative A-Pro DDR for $246 before Christmas, and it's still not here and won't be till Jan 24th. I bet you'll be able to get a MAXX REAL soon after it's released... just because the test results for it vs. the DDR version are very noticeable at the same price, and it will probably not be on backorder for very long, if it even gets to that state.

    I'm a gamer, I demand performance from my hardware. :)
  • Let's see here:

    OS supported: Win9x, Win9x, and Win9x. Not even NT/Win2K.

    No T&L, and no fancy VSA-100 effects. You're gonna be suffering on slower CPUs. While it doesn't suck, it doesn't offer anything over the GeForce cards. And it costs too much in that respect, too.
  • Don't get me wrong, I could never argue with advice not to get an ATI board!
  • by Anonymous Coward
    any well-programmed OpenGL game (ignoring the steaming pile of shit that is Direct3D) will utilize the transformation hardware. Most FPSes do lighting with lightmaps, so the lighting portion is useless.

    Have you noticed that when a new card comes out, Microsoft scrambles to release a new Direct3D, and game makers scramble to use the new API in their games to take advantage of the card? In contrast, OpenGL has been through all the stages of 3D acceleration that are just now finding their way into cheap consumer cards. Why keep learning new APIs and inferior ways of doing things when OpenGL has been ready for years?
  • For $270 of *my* dollars, I'd go to buy.com and get a Creative Labs Annihilator Pro (the GeForce 256 with DDR RAM), buy it for $250, and go grab myself a pizza to celebrate not buying an ATI card.

    Which I just did. Not like I can really afford it, but that's what credit cards are for, right?
  • by Jamie Zawinski ( 775 ) <jwz@jwz.org> on Wednesday January 05, 2000 @11:35AM (#1401750) Homepage
    What's with these idiotic macho names that video cards have these days? "Rage Fury Max Extreme, D00D!" Are these computer hardware or skateboards? Oh well, I guess their primary target market is the same: 12 year old testosterone-poisoned boys....
  • Ahh, but I am! But I don't use the mouse... I just run and jump like hell and time/lead straight shots well (no camping with the railgun for me!) In my experience, hi-res is less important than high framerate when playing with just the keyboard (obviously ping makes a huge difference too).

    50 fps is noticeable to me, but I can get used to it and get my rail gun shots back on target after about 15mins of play (I often have to lower my max framerate to 50 to cut down network lag.)

    Q3, on the other hand, is entirely different. I've gone from being a reasonably good Q2 player to a crap Q3 player. I think it's all those funny angles, so now I have to relearn with the mouse :(. The framerate's a bit lower, but the rail gun isn't as good, so I'm not fussed right now (a geForce will be my next investment, then a dual MoBo - I work from home too).
  • Yeah, like someone said, "Why do ATI continue to give their cards names that make them sound like they are going to jump out of the machines and rip peoples' throats out?"

    ***************************************************

  • As far as I can remember both articles mentioned that for online-gamers the card would have a slight delay

    Tom's had a "preview" and a full review of the release version. In the full review he goes in looking for this latency and fails to find any. He mentions that his search for latency was subjective, but he withdrew his earlier [statements?][predictions?]. Are you sure you read the latest version of Tom's review?

    (My employer's URL blocker won't let me look at the sharky's site.)
  • Some more of the context is: I feel like a parent that has to take away a marketing tool, I mean benchmark, because no one is playing fair.

    It sounds like the card makers are hacking the benchmark rather than writing better general-purpose drivers/hardware. I seem to recall some issues with Number Nine doing something like this a half decade or so back (obtaining benchmark results 50X better than anybody else, or some such).

    People depend on benchmarks. Hardware companies can force-fit their stuff to match a particular benchmark. I think he was saying this between the lines, and implying that he thinks it is cheating.
  • And you know what, I don't really care how many chips it uses if the overall performance is good. While the single chip is pretty average, the integration of two chips using AFR is a state-of-the-art move and can be counted as "1 generation".

    Matrox could put 2 G400 chips together. Trident can put together 10 8900 chips. You must be thinking that integrating 2 chips is technologically nothing. Go back and smoke some more crack.
  • Oh, are they trying to pull a /Videologic/ on us? That must have been a classic... Don't know if they ever admitted it.. Although when people renamed d3dtunnel.exe (or whatever the D3D file was called) to something else, their card suddenly didn't perform very well.

    And if you renamed a real game to that name, you got nasty graphical errors.

    And on another note, Tom fell from grace a long time ago. Around when he couldn't accept that 3dfx made better cards than nVidia. (Nowadays nVidia is on top, but they weren't in the beginning.)
  • I can't remember the exact benchmark numbers as I got one second-hand, but I remember my ATI Graphics Pro Turbo. It was freakin' fast(TM)(C). It was the one with the Mach64 GX with 4MB VRAM, and at the time, that was a bunch, when the standard was real slow DRAM, 1 or 2 MB.

    Yes, yes, it's a really old card. But it was The-Sh*t in its day!
  • In other words, Direct3D started out as a simplified version of OpenGL for gaming, but it has gradually overlapped more and more with OpenGL?

    Perhaps that's why Microsoft and SGI are folding Direct3D into OpenGL [microsoft.com]? (I make no claims that this will ever actually happen, of course, but that is the plan.)
  • If ATi supports Linux and Open Source why have they routinely told the Video4Linux developers exactly where they can cram their inquiries about programming specs?

    Yeah, they have a free, closed-source app now that lets you watch tv on an all-in-wonder, big deal.

  • The Voodoo 5 also needs to be plugged into an electrical outlet because the motherboard can't supply enough power for its processors. Doesn't that seem like a serious design flaw to you?
