Tom's Hardware: Win, Lose or Ti - 21 GeForce Titan Tests

msolnik writes "Got a huge wad of cash burning a hole in your pocket? Why not spend it on a fancy new video card... Uncle Tom has reviewed 21 different cards so you can make a well educated decision. This is by far the most best Geforce comparison out there. A definate read for all you hardcore graphics guys."
  • by Junta ( 36770 )
    I'd rather have the ATI All-In-Wonder 8500DV. Sure, it might not have the performance of some of the GeForce 3 cards, but for video capture and playback it is great (even under Linux soon, given the track record of the All-In-Wonders of the past). Of course, there isn't really any card I know of with *good*, well-supported TV-out (yeah, there are tricks to use the framebuffer and unhooking the monitor, but that's ugly).

    nVidia isn't the only game in town, particularly not for those of us who do video playback and editing more than play games.
    • Yeah, I'd rather have a video card that reduces video quality in my favorite games on purpose just to get better benchmark info [hardocp.com] too. Oh wait, no I wouldn't.

      I'm sick of ATI's bullshit; they've been pulling this kind of crap with their drivers for years.

      • Yeah, that was quite underhanded, but they retracted it. Though it was very bad, it does make us stop and think about how all these expert sites evaluate cards. It's always "card X achieves Y more FPS than card Z, so it is clearly better." Price and quality are ignored (which is why ATI got away with their trick so long: quality sucked, but no one paid any attention...).

        Also, you could mention how ATI, by omitting the facts, kinda led sites to assume that the good anti-aliasing was a well-implemented multi-sampling solution when it was actually an inefficient super-sampling solution. True, ATI never claimed it wasn't supersampling, but they didn't seem to want to mention it one way or the other until confronted. Of course this strategy of misinformation through omission is common in corporations, but it still seems a little devious to watch all these sites mention SmoothVision in the context of multi-sampling without offering corrections...

        On the other hand, if you are using ATI-written drivers, then you are running a Microsoft OS, and therefore you are already a customer of a company that has pulled some dirty tricks in its day. Pretty much all successful companies pull dirty tricks, and in comparison to other corporate acts, this one isn't that high on the sleaze scale. In this case, they cheated, but released fully functional drivers when called on it. It was really deceptive, but people and review sites need to learn not to judge a card mostly on a single game's achieved framerate, so that companies won't get away with this sort of cheating.

        My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback. I'm not about to sacrifice functionality just so I can give money to a company that has probably pulled similar nasty tricks in its time, but got away with them.
        • My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback

          Umm... hello? What planet do you live on? NVidia has official Linux driver support for all their cards, TwinView and TV-out included. There aren't really any NVidia TV capture cards available anyway, but if you really care about video quality you'd be using a separate video capture card anyway, so this is irrelevant to your argument. NVidia's functionality under Linux far surpasses ATI's, including features such as full-screen anti-aliasing (FSAA). When NVidia has a superior product, I fail to see why you would support a downright dirty company such as ATI.

          • Ummm... hello? What planet do you live on? Apparently, one where all video cards have only one chip on them. NVidia chips have Linux drivers, true, but the other chips used on the same video cards do NOT have Linux drivers.
      • And if you think that nVidia and the other major video chip/card manufacturers (are there any other major ones at the moment?) don't pull this kind of shit too, you are truly naive.

        Dinivin
        • And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tomfoolery ATI has pulled with their mip-mapping in Quake 3 has nothing to do with FSAA, and everything to do with cheating the user.

          • I've read the article..

            And if you honestly think that none of the other major video chip/card manufacturers pull that kind of shit, you are still truly, and foolishly, naive.

            Dinivin
      • That's flamebait, but I'll bite:

        If you read the article you would have seen a reference to this article [tomshardware.com] at THG:

        it resolves the Quake 3 "issue;"

        also

        offers SmoothVision FSAA; and, enables 16-tap anisotropic filtering. On top of that, it improves performance.

        in the conclusion it states:

        Nvidia also has some work to do in regard to FSAA
        • And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tomfoolery ATI has pulled with their mip-mapping in Quake 3 has nothing to do with FSAA, and everything to do with cheating the user.

          • I could just as well read an article about a G200. The FiringSquad article is old news; you could as well have linked to the original article at HardOCP. And FiringSquad later had an interview with ATI about this.

            The new drivers have NOW solved the "Quack 3" issue.

            And did you read my link to Tom's? I did pull the FSAA bit out of context, but it was one thing the ATI card was better at than a GeForce card. It does have plus points.
    • I actually prefer the Kyro 2 cards to the GeForce line. Those benchmarks don't reflect the real gaming performance of the Kyro 2 cards. They are excellent for the price.
    • Actually, even for gaming, the 8500 is awesome. It may not have the same level of speed, but in most games the framerate'll still be above 60, which is all that is needed.

      The 8500 also has much higher image quality than any GeForce card on the market.
  • The article whines a lot about inadequate tv-out capabilities for these cards. Call me crazy, but why would somebody blow the big bucks on something so high-powered as a Titanium, and then hook it up to a crummy TV? Seems like anybody who'd buy these things would rather use a big quality monitor instead. Even if you're going to use one of the nice big plasma flat panels from Pioneer or Sony, they come with VGA inputs anyway. You certainly wouldn't want to use TV outputs. What am I missing here?
    • Sometimes I need 1920x1200 resolution on the desktop, but other times I just want to display video on the TV.
      • Sometimes I need 1920x1200 resolution on the desktop, but other times I just want to display video on the TV.

        Yeah, I've got a TV card for that, though. That way, you don't have to replace it every time you upgrade your main video card. It's not like TV cards are getting more and more functionality with each new version.
        • Well, that's an entirely different point. If you're asking "Why do they bother putting a TV-out there at all?", the answer is simply that the added cost is low compared to the added functionality: people don't need a second board for no other reason than to output onto a TV.

          Same reason we no longer have to have both a 2D and a 3D graphics card.
        • Well, people can't have 10 PCI cards in their computer, and a combined TV-out and graphics board works really well...

          Who uses their 19" screen instead of a 29" TV for DivX ;-) anyway..?
    • You might want to watch movies, especially if you have a DVD player in your computer and don't feel the need for a separate DVD player for your television. It may come as a surprise, but there are quite a few people who prefer to watch movies on their couch rather than at their desk.

      I've gone in the other direction; I've sold my TV and bought a TV card instead. After all, my desk chair is the most comfortable seat in my home and I spend a lot of time there anyway.

      /Janne
    • First, some of us have the money for the card, but not for the Trinitron 22" monitor.

      Second, do you prefer your DVDs on the TV (with armchair and family) or on the monitor (on the bed with a Coke)?

      + Bleem (RIP)
      I know. But Tekken 2-3 are still very good games,
      as are most PS1 games.

      And you can play Tekken at a reasonable speed, with friends. (I mean a dual PIII-1Gig + small Ge2-64DDR IS overkill. But I never played so fast 8)
      • First, some of us have the money for the card, but not for the Trinitron 22" monitor.

        If you don't have the money for a decent monitor, why would you blow $300 on a video card? That's like getting a 2GHz P4 or Athlon, but stifling it with 64MB of RAM. You don't blow your whole wad on a single component; you spread it around so you can get a decent system.

        Second, do you prefer your DVDs on the TV (with armchair and family) or on the monitor (on the bed with a Coke)?

        I prefer them on TV, and that's why I use a DVD player. PCs don't have remotes, and I don't want to have to get up and go to the PC every time I want to pause or jump around to different features. (Then again, I like watching every extra feature on a DVD, and most people probably don't.)
        • "If you don't have the money for a decent monitor, why would you blow $300 on a video card?"

          All in Wonder Radeon goes for $140-150. The cheapest 22" monitor I found was $528. The Sony Trinitrons are upwards of $1K.

          I don't want to have to get up and go to the PC every time I want to pause or jump around to different features.

          I use a wireless mouse and mini keyboard. They stash away in the coffee table when not in use.
          • The cheapest 22" monitor I found was $528. The Sony Trinitrons are upwards of $1K.

            Pricewatch shows 21" Sony Trinitrons for $650 from fly-by-night guys, and CDW has them for $799. I got my used one for $300 from a CAD shop that was switching over to big LCDs.

            But all of this is irrelevant: what I was asking is, why do people want a TV-out on a high-end video card? If you're putting together a machine to play DVDs and Bleem, you certainly don't need a GeForce Ti. Like you said, an AIW Radeon goes for $150, and that's more than good enough. This particular article was talking about $300 cards that don't even do video capture. For those cards, a TV out is almost useless.
              • I agree with you on that, then. I had the TNT Ultra with TV-out and it really wasn't that great quality-wise. I'm not sure I ever even used it. I'm glad you found the Trinitron price. I was looking all over Pricewatch for a 22" and didn't even stop to think that Sony doesn't make one.
              • Ok.

                Thanks Razzlefrog, you just answered the same as I would have 8)
                I (also) have that TNT2 Ultra + TV-out, and I'm still using it to this day... on a PII-350 that does DVD + DivX + MP3 + TV + Net browsing (it was such a fad at the time 8)

                I also put in a Logitech radio KB + mouse, and those are nice remotes (a 104+-key remote! Wow 8)

                But back to the point...
                I took that old PC because turning it into a DVD player cost me less than a standalone; it can do all the things a DVD player does + the rest (DivX, Neorage, browsing, porn on TV 8)

                Also, some people (me at the time) have the budget either for a PC or for a DVD player. This allowed me to get both, with some problems (drivers for the DVD card, W98 stability versus How The F**K Do I Get Linux on TV Out?)...

                Today this is a W98 box (simpler), stable (=> so hacked that MS wouldn't recognize its Registry 8) and Ghosted.

                I have no concerns, it works flawlessly. I play all-zone DVDs and games (BroodWar: old, slow, thousands of players online, and VERY nice on TV), I have Internet and MP3 on the hi-fi, and I'm thinking about a video projector and 5.1 speakers.

                Of course the definition IS terrible, but my TV is the student model (big & cheap) and can accommodate 800x600 without problem. It's even better than regular PAL, so 8))

                Sorry for the 22" Trinitron, I got carried away 8|
            • what I was asking is, why do people want a TV-out on a high-end video card?

              You might be asking, but you're obviously not listening [slashdot.org] ;-)

              For those cards, a TV out is almost useless.
              That's clearly false.

              0.02
            • > why do people want a TV-out on a high-end video card?

              I went that route, but I think you're right.

              Look at a 19" monitor from a couple of feet away.

              Look at a 33" TV from 10 feet away.

              About the same angle (field of view) in your eyes. Hellaciously more pixels on the 19". Better sound with a good pair of headphones and a 19".

              A TV tuner on your video card makes having a TV obsolete. (And the rest of your computer can then obsolete your VCR and DVD-player.)

        • I am planning on using the TV-out on my GeForce 2 MX400 to watch DVDs on my couch. The "computer DVD does not have a remote" thing is just an excuse. I bought a TV card that came with a remote for 30 bucks (as soon as I get my rebate back... :) Pinnacle Studio TV Pro, dbx stereo TV and FM radio, $49.99 at CompUSA and a $20 rebate... no-brainer there! :)). The remote that comes with this card is nice and it will work for the meantime. It works off of the serial port, which means you should be able to hack something together for Linux or any other OS to make it work (execute a keyboard macro when it receives a certain code on the serial port - see the sketch after this comment). I do want to get a wireless (RF ONLY) keyboard for surfing the net on TV from the recliner built into my couch. I plan on using TV-out for visualizations too (xtace on Linux, Winamp on Windows). If you use the Nvidia drivers for Linux, you can get the TV-out to work pretty easily, although I have yet to get the cable I need for it. That said, any self-respecting geek questioning the inclusion of a thing like TV-out on a video card has GOT to be on drugs. It's just cool!

          One other thing: you're DAMN STRAIGHT this crap should work. It ain't hard! At least TV has a standard! Unlike some things on computers, like MUSIC! MP3, OGG, MP3Pro, Real Audio, WMA - which one is THE standard? I know the default is MP3, but it's not a standard, to me, until it's the only thing used or even talked about; then we'll have a new standard for digital music. MP3 is close, but we still hack and work on OGG, right? Computers now have so many so-called standards that, to me, nothing is standard anymore. This is, to me, the main reason some people never buy computers: there are so many frickin' choices that they have no idea if this one will play the game they want or do what they want at an acceptable speed. This is why Macs are good for newbies, cuz there are fewer choices (decent Apple built-in audio, GeForce 2 MX currently the default) and other things Apple does right. I don't own an Apple and I am not saying they are better than PCs. Sometimes they are not. But at least you can buy a Mac and count on it being able to run about any game you buy for the Mac. With PCs it's a friggin' crapshoot.
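          A minimal Python sketch of the serial-remote idea above. It assumes a pyserial-style API and made-up one-byte button codes; the actual codes the Pinnacle remote sends would have to be captured off the serial port first, and xdotool is just one modern stand-in for injecting keystrokes on Linux.

          # Read bytes from the remote's receiver on a serial port and map
          # them to keystrokes for whatever media player has focus.
          import subprocess
          import serial  # pyserial package

          # Hypothetical mapping from remote codes to key names; the real
          # Pinnacle codes would need to be sniffed first.
          BUTTON_MAP = {
              0x10: "space",   # play/pause
              0x11: "Right",   # seek forward
              0x12: "Left",    # seek back
              0x13: "m",       # mute
          }

          def main():
              port = serial.Serial("/dev/ttyS0", baudrate=1200, timeout=1)
              while True:
                  data = port.read(1)          # assume one byte per button press
                  if not data:
                      continue                 # read timed out; no button pressed
                  key = BUTTON_MAP.get(data[0])
                  if key:
                      # Inject the keystroke into the focused window.
                      subprocess.run(["xdotool", "key", key], check=False)

          if __name__ == "__main__":
              main()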
    • I wish more reviews covered the quality of the TV-out. I'm trying to put together a system to run as a jukebox through my TV, and I have yet to find a single card that produces acceptable-quality video on the TV.

      The TV is OK (a Sony Trinitron), and my Dreamcast and PS2 both produce razor-sharp, rock-solid text and graphics on it. Why can't a PC video card do that?

      (Besides, the Ti 200 is priced not much higher than some GeForce 2 MX cards - about 130 UKP. It's not a high-end card. After Christmas it'll be the standard 'okay I guess' card.)
    • Easy: because the "big bucks" for a GeForce are a *lot* less than the "big bucks" for those really big monitors, plasmas, or flat panels. For example, even a 36" traditional CRT TV runs 800 dollars or so, over twice what one would pay for a good GeForce. The really nice, big HDTV-type monitors with VGA or component connectors run *at least* two thousand dollars. Sure, people get nice monitors for their systems, but they're too small to really enjoy with a group of people. That 36" CRT TV with an S-Video connection may work well for you, and TV-out to those is very useful.

      Now if I had all the money I could ever want, I'd be hooking up my computer through composite connectors to a 30,000 dollar front-projection system in a really nice home theater room, but for now I'll be using that S-Video with a large CRT TV.
      • Yeah, but still, why not put a decent TV-out on? Component connectors on a flying lead wouldn't add much to the price but would massively increase the quality of the TV-out. It can't be that expensive; you can get standalone DVD players for $150 with component out these days. All you'd need to do would be overlay the sync signal on the Y channel derived from the RGB (I think - see the sketch below).
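        The parent's hunch is right in spirit: component video is YPbPr, and sync does ride on the Y channel. For what it's worth, here is the standard-definition RGB-to-YPbPr math (ITU-R BT.601) that a TV-out encoder performs, as a small Python sketch:

        def rgb_to_ypbpr(r, g, b):
            """r, g, b in [0.0, 1.0]; returns (Y, Pb, Pr)."""
            y = 0.299 * r + 0.587 * g + 0.114 * b  # luma; green weighted most
            pb = 0.564 * (b - y)                   # blue colour-difference signal
            pr = 0.713 * (r - y)                   # red colour-difference signal
            return y, pb, pr

        print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white -> (1.0, 0.0, 0.0): pure luma
        print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # red   -> (0.299, -0.169, 0.5)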
  • 1/ As usual, see the most powerful, expensive and complete video card (whose specs look slightly like my last computer, btw)

    2/ As my parents don't budget me anymore (alas!), stop daydreaming and get a GeForce 2 MX, which is MORE than enough for now (ok, let's say enough)

    => I mean, the day you have more than 2 programs that can use the GeForce 3, maybe then...

    Until that date, the Ge2 MX is more than enough for Quaking.

    I mean, for the price of a GF3 Ti, I could buy a desktop computer 8| This isn't the rat race, it's just a game race...

    Hoping to Frag you Soon 8)
  • I suppose the Quake 3 numbers are some indication of OpenGL performance for these mass-market cards, but I was curious how they stack up against the traditional high-end OpenGL cards (Oxygen, FireGL, etc., or even a whole SGI system) so that a price/performance comparison could be made. If CPUs are any indication, the market size for these cards could drive their performance to almost acceptable levels in more professional OpenGL applications, and certainly at a lot less cost.

    Any references?

    • nVidia has a popular mid-range line of "professional" 3D chips, the Quadro series (sold by Elsa in its Gloria series). These are basically GeForce chips with a couple of extra features enabled [geocities.com], like hardware anti-aliased lines and two-sided lighting (see the sketch after this comment). They're quite a bit more expensive than a consumer GeForce, but a LOT cheaper than most workstation cards.

      There's a few [spec.org] places [amazoninternational.com] you can look for benchmarks on GeForce, Quadro and mid- to high-end workstation gfx cards. Currently the Wildcat 5110 pretty much rules the roost (at around $3k), with the Quadro2 Pro (under $1k) & FireGL4 (over $1k) competing hotly below that. Lesser cards (FireGL 1 & 2, Quadro, Quadro2 MXR & EX, and the older Oxygen models) can be had for well under $1k. Prices are only from memory, and are probably wildly inaccurate.

      Even a standard GF2/GF3 or Radeon does pretty well, impressively so for the price. Rendering quality has been compared (for the GeForce at least), and is roughly equivalent - no major texture or polygon errors, all cards generating the occasional off pixel.

      Bottom line: The majority of my customers (2D/3D FX) are switching to GeForce or sometimes Quadro cards - sought-after features include decent (not necessarily superlative) 3D app performance, dual monitor support (WITH hardware accel on both monitors!), and bang for the buck. Good DirectX support doesn't hurt either (very few cards from 3Dlabs support DirectDraw well, and some serious apps do need this).
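      The two Quadro-only features named above are exposed through ordinary OpenGL state; on a GeForce the driver reportedly falls back to a slow or approximate path, while a Quadro does them in hardware. A sketch using PyOpenGL as the (assumed) binding - it must be called with a GL context already current:

      from OpenGL.GL import (
          glEnable, glHint, glBlendFunc, glLightModeli,
          GL_LINE_SMOOTH, GL_LINE_SMOOTH_HINT, GL_NICEST,
          GL_BLEND, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
          GL_LIGHTING, GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE,
      )

      def enable_workstation_features():
          # Hardware anti-aliased lines (blending is needed for the
          # coverage values to show up as smooth edges).
          glEnable(GL_LINE_SMOOTH)
          glHint(GL_LINE_SMOOTH_HINT, GL_NICEST)
          glEnable(GL_BLEND)
          glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
          # Two-sided lighting: back faces are lit with flipped normals,
          # so open CAD surfaces shade correctly from either side.
          glEnable(GL_LIGHTING)
          glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE)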

      • The FireGL2 is not a "lesser" card. In the dozens of SPECviewperf tests I've run on every imaginable AGP chipset, it totally kicks the Gloria III's and Gloria DCC's asses. The Linux drivers are a *lot* more stable, too. For pure OpenGL performance and quality, I just don't think anything from NVidia compares. The Wildcat 5110 is another matter altogether. Too bad they won't release a Linux driver, though I see Xi Graphics is trying to fill the gap.


  • I've had fewer problems with my old 512k Trident VL-bus video card than with my GeForce256 DDR. Go figure.

  • So it begins... (Score:3, Informative)

    by Gannoc ( 210256 ) on Wednesday December 19, 2001 @08:31AM (#2725460)
    We tested Gainward's new GF2 Ti bearing the confusing Ti500 moniker, as well as the GeForce 3 Ti500 board carrying the equally inaccurate name Ti550 TV. Obviously, Gainward is trying to create an impression of technological superiority for its products. Nonetheless, these cards carry the same NVIDIA chips as the competition and not some newer version, as the name might imply to less informed buyers.



    Technological superiority? Try fraud. They name their board the "Ti500" when it has the regular Ti, NOT the Ti500 chip, and then call their actual Ti500 board the "Ti550". If I were reviewing that, I'd point it out a little more plainly than as an attempt at "technological superiority".

  • Stereo Glasses (Score:5, Informative)

    by soboroff ( 91667 ) on Wednesday December 19, 2001 @08:46AM (#2725504)
    I find it pretty interesting that some of these cards (according to the review) are being bundled with LCD shutter glasses... the glasses are synchronized with the screen to darken the lens over one eye while your monitor displays the view for the other eye. Refresh that at 120Hz, provide a slightly parallaxed view for each eye (sketched in code after this comment), and presto, it's better than Jaws 3D.

    I used to work with these things a while back... they're OK as long as you don't move much, but if you like to move your head around you'll get headaches pretty quickly, since the view doesn't change based on where you're sitting. We used head-tracking to accomplish this, but there's none of that here. Another problem is screen distortion, which doesn't mean much when you're playing Quake, but if you're thinking of a really nice interface for Blender or Maya, it can make a big difference in being able to actually point the mouse where you think it's pointing.

    Without calibration for your personal interocular distance and eye-to-screen distance, and good correction for screen distortion, you can use these for 30 minutes max before getting eyestrain or just a plain headache. Add poor head-tracking and you can get seasick, too!

    Last thing: there's more to seeing 3D than depth cues: good lighting and shadow effects, _accurate_ perspective views, and use of color all come into play. These glasses are a lot of fun, and if a lot of folks have them then maybe the state of the art will go forward a bit.
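    For the curious, a minimal quad-buffered stereo sketch (Python with PyOpenGL/GLUT, both assumed available): the driver flips the left and right back buffers in sync with the shutter glasses at the monitor's refresh rate, and the program just draws the scene twice per frame with the camera shifted by half the interocular distance. Note that GLUT_STEREO needs a stereo-capable visual, which consumer drivers may not expose.

    import sys
    from OpenGL.GL import *
    from OpenGL.GLU import *
    from OpenGL.GLUT import *

    EYE_SEP = 0.06  # interocular distance in scene units (assumed value)

    def display():
        for buf, eye in ((GL_BACK_LEFT, -EYE_SEP / 2),
                         (GL_BACK_RIGHT, +EYE_SEP / 2)):
            glDrawBuffer(buf)                   # render into one eye's buffer
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
            glMatrixMode(GL_MODELVIEW)
            glLoadIdentity()
            # Shift the camera sideways per eye; the slightly different
            # viewpoints are what give each eye its parallaxed view.
            gluLookAt(eye, 0.0, 2.0,  0.0, 0.0, 0.0,  0.0, 1.0, 0.0)
            glutWireTeapot(0.5)                 # stand-in geometry
        glutSwapBuffers()                       # swaps both back buffers

    def main():
        glutInit(sys.argv)
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO)
        glutInitWindowSize(640, 480)
        glutCreateWindow(b"shutter-glasses stereo sketch")
        glEnable(GL_DEPTH_TEST)
        glMatrixMode(GL_PROJECTION)
        gluPerspective(60.0, 640.0 / 480.0, 0.1, 10.0)
        glutDisplayFunc(display)
        glutMainLoop()

    if __name__ == "__main__":
        main()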
    • I've noticed a lot of people who play games get "sick." When I'm in Fry's, EB, or wherever else, I'll always meet the strange lad who got sick while playing MechWarrior, Quake 3, or whatever other 3D worlds you can throw at him. Each and every time, I ask them if they get sick while playing a side-scrolling game, or an actual pinball machine, and they'll say, "Ya, I get sick of pinball real quick, especially when it costs four quarters for three balls."

      Me, on the other hand - I've only been seasick one time, and it was way back in my childhood... I was about 5 years old and was fishing with my dad about 15 miles out in the ocean, just south of the Catalina Island Channel. I blew chunks simply because I was angry at my dad, didn't want to fish, and was pouting in the cabin. So, I got sick because I didn't focus on fishing, and mainly because I went into the cabin.

      Now let me tell you about the ocean and a boat's cabin, to keep from getting seasick... STAY OUT OF THE CABIN! STAY AS FAR AWAY FROM IT AS POSSIBLE! AND FOR THE LOVE OF GOD, WHEN A SWELL MOVES UNDER YOU, AND THE BOAT BEGINS ROLLING OVER IT, YELL "YEEEHAWWWWW", AND YOU WILL NOT GET SEASICK. I think it has to do with not LOOKING at any one object at any one time. A person who doesn't get seasick and has a great time will be a little dizzy when they get back on shore. That would be a good study... compare the amount of dizziness of a person who doesn't get seasick to that of a person who always gets seasick. It takes a good 30 minutes for your ears to adjust on land, because they no longer have to compensate for the swells rolling and shifting the boat.

      ...That was the only time I ever got seasick, and no video game or Japanese cartoon's flashing lights ever made me sick. I can play those $5-a-pop flight simulators all day, but I started boycotting Disneyland when I discovered they hide porn and evil-inspired messages in the animated movies they make for children.

    • Have any info/links on getting these to work in Maya? I'm a Maya user, and I've never really put any thought into using LCD glasses to get a good 3D interface (or any other method, for that matter). Since I use Maya all day, every day, this is something I'd like to toy with. Has anyone tried getting this working in Maya? Of course, playing AvP2 with these might be fun too...
  • Jesus Christ (Score:1, Flamebait)

    by ellem ( 147712 )
    Uncle Tom
    ...most best...
    definate

    Get a friggin' copy editor.

    Uncle Tom is just wrong on a lot of levels
    ...most best... WTF does that even mean?
    definate - perhaps in several hundred years the word will be spelled the way it is pronounced by dullards, but for now it is definite. The opposite of infinite.

    Guys, use Word... it will fix things like this automatically.
  • by swb ( 14022 ) on Wednesday December 19, 2001 @09:03AM (#2725546)
    I have a 2+ year old (in tech terms) ATI Rage 128 based card (AIW-128) running under XP and with the newest ATI drivers and the games I've played with it (most recently the Medal of Honor demo), performance is just fine by my eyes @ 1024x768 and 16 bit color.

    I've seen nVidia GeForce2 cards going for $100, but I just don't see the point. There was a time when moving from a 2D card to a 3D card like the original Voodoo was really worth the $300 or so it cost -- performance and quality skyrocketed. Similarly for the move from the Voodoo 1 to the Voodoo 2, and from the 2 to that card's next generation (the ATI 128).

    Past that point, unless you have some specific non-gaming application that really needs the 3D performance, it seems like kind of a waste. 3D performance has been pushed beyond the point where it matters, even for gaming, and the features being added seem trivial -- just TV-out?

    All new cards, it seems, should come not only with good 3D, but with video in and out, TV tuners, and the ability to do hardware MPEG-2 compression of full-frame video at zero cost to the CPU. At that point the video card arms race would make more sense.
    • All new cards, it seems, should come not only with good 3D, but with video in and out, TV tuners, and the ability to do hardware MPEG-2 compression of full-frame video at zero cost to the CPU. At that point the video card arms race would make more sense.

      But I don't want to pay for a TV tuner with my video card any more than I want an Instant Messaging app with my OS [microsoft.com] or Browser [netscape.com].

      What I would expect is that if they are going to offer these features, then they should at least be of some reasonable quality - see my other post about quality of picture on TV-outs.

      I'd also expect to be able to trade off features/performance for either price or power consumption (and therefore heat/noise), but I'm apparently the only person who cares about that. Or a PCI version for a second head.
      • You are NOT the only one that cares about that!

        What I care about is: passive cooling (no noise), good 2D image quality (that is what you will be looking at most of the time) and, of course, good drivers (both for Windows and Linux).
        3D performance comes last - I don't play games very much.

    • Yes, your performance in 16-bit color will be very good. But all the newer games, including the aforementioned Medal of Honor, are optimized for 32-bit color AND 32-bit textures (putting even more strain on your setup).
      That is another thing altogether. You cannot expect to run Medal of Honor / Return to Castle Wolfenstein / ... in full 32-bit color AND 32-bit textures at some medium detail setting on your ATI Rage 128 or, for example, a TNT2.
      And since you'll be losing out on most of the graphic details in modern games in 16-bit mode (look at the sky and smoke in RTCW), a GeForce-x upgrade seems imminent ;-)
      I know what I'm talking about, cause my overclocked P3-450@558 and TNT2-125@155MHz can't keep it up for much longer ;(
    • You forgot to mention 64-bit color [theinquirer.net] as one of the final features. (No, I am not making this up.)

      But I think video cards will stop evolving when they reach real-time, real-life quality.

      It is, however, the case that a lot of games (I don't know about Medal of Honor) don't use all the latest features, in order to reach a larger customer base.
      • Aren't human eyes limited to seeing color of no greater quality than 24-bit color? 64 bits seems to be quite a bit of overkill.

        The site you referred to said that 64-bit color helps out in rendering shades and complex images in non-real-time for movies and such. I guess I can accept that - but for video cards in PCs, I feel it is just too much cost and complexity for the minimal gains in quality.

        >But I think video cards will stop evolving when they
        >reach real-time, real-life quality.
        Yeah, I agree with you. They (the card makers) need to work on optimising cards to remove inefficiencies. For example: the addition of hardware shaders and texturers like in the GF3 series is a major step ahead in card evolution. The optimisations in the Kyro II cards that don't draw or texture unseen triangles are another example (see the sketch after this comment).

        Maybe after they have hit the wall on how much they can improve 3D graphics on a 2D surface, they will begin to put more research into a more 3D display mechanism. Comfortable, affordable, easy-to-use, reliable and functional 3D glasses or headsets - with spatial feedback and stereo sound. WOHOO!
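        To make the Kyro II point concrete, here is a toy one-pixel model in Python: an immediate-mode z-buffer shades every fragment that passes the depth test as triangles arrive, while a deferred (tile-based) renderer resolves visibility first and shades exactly once. The fragment depths are made up; "shading" is just a counter.

        def immediate_mode(fragments):
            z_near, shades = float("inf"), 0
            for depth in fragments:      # triangles arrive in submission order
                if depth < z_near:       # fragment passes the depth test...
                    z_near = depth
                    shades += 1          # ...so it gets textured and shaded
            return shades

        def deferred_mode(fragments):
            _ = min(fragments)           # visibility pass: nearest fragment wins
            return 1                     # shading pass: texture that one fragment

        # Five overlapping triangles drawn back to front (worst case):
        frags = [0.9, 0.7, 0.5, 0.3, 0.1]
        print(immediate_mode(frags))     # 5 shading operations, 4 of them wasted
        print(deferred_mode(frags))      # 1 shading operation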
        • 24 is, but they use the other channels for effects... like alpha blending etc. Not sure what new effects 64-bit gets you, though.
        • Aren't human eyes limited to seeing color of no greater quality than 24bit color?


          Not quite. From memory, the number of colours shown (16.7 million) is close to the number of distinguishable colours, but the two sets of colours are not in the same colourspace, so it isn't actually good enough that you can't see the difference between adjacent colours in RGB-8bit space, even though the total number of colours is right.

          Also, the shades within the RGB-8bit space are distributed evenly amongst red, green and blue, whereas the eye is more sensitive to green, then red, then blue.

          Look up 'gamut' in a decent graphics book, like Foley & Van Dam. (A rough numeric illustration follows this comment.)
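          A rough Python illustration of that point, using the BT.601 luma weights as a crude stand-in for the eye's channel sensitivities (an approximation, not a real perceptual model):

          # RGB-8bit spreads its 2**24 codes evenly across R, G and B,
          # but the eye does not weight the channels evenly.
          WEIGHTS = {"R": 0.299, "G": 0.587, "B": 0.114}

          print(f"total colours: {2**24:,}")       # 16,777,216
          for channel, w in WEIGHTS.items():
              # Perceived-brightness change for a single 8-bit step (1/255).
              print(f"one step in {channel}: luma changes by {w / 255:.6f}")
          print(f"green/blue sensitivity ratio: {WEIGHTS['G'] / WEIGHTS['B']:.1f}x")
          # -> a one-code step in green moves perceived brightness about
          #    five times as much as the same step in blue.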
    • Past that point, unless you have some specific non-gaming application that really needs the 3D performance, it seems like kind of a waste.

      You should try some different games.

      I have a GeForce2 and I've been thinking the same thing for a while, but I just bought the EverQuest expansion "Shadows of Luclin" and now I'm looking for a new video card. My GeForce2 (on a 1.3GHz Athlon with 1GB of RAM) can't draw the new 512x512-pixel textures and high-polygon character models quickly when I get into areas with lots of other players or lots of vegetation, even at 1024x768 resolution.

      EQ has never been the most efficient game in terms of power required to render its displays, but the approach EQ takes is what games *should* be able to do: EQ describes its world in terms of polygons, texture maps and light sources and lets the computer/video card do the rest. Not spending a lot of developer time on making nice-looking graphics render quickly on low-end (or even not-so-low-end!) hardware means more developer time that can be spent on enlarging the virtual world (and Norrath/Luclin is *huge*).

      I hear that with some of the $300+ cards, SoL action is smooth at 1600x1200 resolution with all of the bells and whistles turned on... too bad my wife already bought my Christmas presents :(

    • performance is just fine by my eyes @ 1024x768 and 16 bit color.

      You just answered your own question: "by my eyes" isn't good enough for hardcore gamers. There are valid reasons to need 125 fps in Q3A. My girlfriend currently gets 200.9 fps in Q3A with a GeForce 2 GTS and an Athlon 1900+, and she's getting a GeForce 3 Ti500 from me for Xmas (shhhhhhh). At that point I can only guess that her FPS will be in the 300+ range. Definitely more than necessary, but having 90 horsepower in your car is probably more than necessary too. Doesn't mean I wouldn't rather have 300.

      Additionally, does your video card have things like full scene anti-aliasing? That's one of the major selling points of the gf3, as it improves image quality a lot.

      I recently built a computer for my grandmother. I put a geforce 2 mx in for $60. Sure, you can find acceptable video cards for $30, but for another $30 you get one that you really don't have to worry about. Plus, you never know, grandma might decide to play CS or Q3.
  • Speaking of TV-out, which video card should I invest in if I want really good TV-out? I need the video card for games as well. I currently have a Matrox G400, but the drivers don't work too well with Windows XP (the TV-out part), and the 3D performance is lagging a bit. I've been looking at the Kyro II (it seems to perform like a GeForce 2 GTS in most cases), but I haven't been able to find anything on its TV-out capabilities.

    I want a card that is able to output high-quality video to my TV while my normal monitor shows my desktop at another resolution and a higher refresh rate.

  • by BigJimSlade ( 139096 ) on Wednesday December 19, 2001 @09:26AM (#2725655) Homepage
    to get me a most bestest video card for crissmas. Geforce am a very goodest chipset for me to play em my bestest games.

    For Great Justice!
  • This is by far the most best Geforce comparison out there. A definate read for all you hardcore graphics guys.

    For the sake of someone who couldn't pass 3rd grade spelling or grammar, I sure hope you aren't in the market for an expensive new graphics card...

  • by DG ( 989 ) on Wednesday December 19, 2001 @10:48AM (#2726101) Homepage Journal
    My primary system is a Pentium I 233MMX, 64 MB RAM, Linux 2.4.14 box. It's based on a Baby AT format case, so any processor upgrades are a case + motherboard + processor deal, and I've been just too damn lazy & cheap to bother.

    The graphics card built with this system was a Matrox Mill II - so no 3D acceleration to speak of.

    Playing Quake and Quake 2 on this system was Just Fine, but anything more modern was just not possible. I tried playing the Quake 3 demo, but was getting something on the order of 1 FPM, so I've been pretty well shut out of all the 3D stuff.

    Then the other day, I noticed that the price on an XTacy GeForce MX400 PCI card (no AGP!) was like $150 CAN - so what the hell, I bought it.

    It turned out to be DOA (the system would not POST), so I exchanged it for the only other PCI card they had in stock, an XTacy MX200 card (which was like $120 CAN).

    They also happened to have Quake 3 (in the tin box, no less), SoF, and Descent 3 - all the Loki ports - in the bargain bin for like $10 each, so I got those too.

    Stick in the card, grab NVidia's drivers, configure XFree to use them, fire up Q3 - and bam! Playable! Just like that.

    Things get a little choppy if more than about 10 people are in a room shooting at each other, and SoF and Descent 3 (played at 800x600 with full textures) will "skip" once in a while, but for the most part the game experience has been just fine.

    Interestingly enough, when I turned on the frame rate display in Q3, I was getting anywhere from 10 fps to about 27 fps, with an average of about 15 - and the play experience is just fine. Faster framerates would be nice, but this IS old hardware, and really, it'd just be gravy. I don't particularly find myself wishing the framerate were higher than it is - in fact, before I turned on the fps display, I thought I was getting 30 fps. To see that the average was about half that was a real surprise.

    I can't help but wonder whether the processor or bus is the bottleneck, or whether, had the MX400 card worked, the display might have been a touch faster - but it doesn't really matter. The MX200 is "good enough".

    So overall, I'm a happy camper.

    • by Anonymous Coward
      So overall, I'm a happy camper

      I fucking hate campers in Q3 -- douche!
      I can't help but wonder whether the processor or bus is the bottleneck, or whether, had the MX400 card worked, the display might have been a touch faster - but it doesn't really matter. The MX200 is "good enough".

      Your CPU/bus is the bottleneck.
      To be competitive at QIII you should be running at 100 FPS (some say 120 feels even better); 30-40 is the bare minimum. Check out The Upset Chaps Guide [savagehelp.com] to get your framerate up.
      For under US$350 you could have a Duron 850, a mid-range MoBo, case, AGP MX400, and 256MB of PC133 SDRAM. For an extra US$100 or so, you could have a high-end MoBo with an upgrade path and 256MB of PC2100 DDR SDRAM.
      Your kernels will compile a lot faster too :)

  • Has anyone used a VGA-to-HDTV (component video) converter? That seems like the way to go if you want to use your TV as a monitor. (You all have HDTV, don't you?)

    Barjam
  • I like mo' better comparisons.
  • Bah. (Score:1, Insightful)

    by Anonymous Coward
    I think this is a bit sad, really. Once upon a time, the test would have been between 21 different cards from 21 different manufacturers, with 21 different chipsets. Now the vast majority of people just go for an nVidia gfx card.

    A similar thing happened in the computing world - these days, most people just get an x86 PC. Once upon a time, you could choose with relatively equal ease between Amiga, Acorn, Atari, Mac, PC, etc. Each had different advantages and disadvantages. Now we get generic boxes based on the mediocre x86 architecture, differentiated by marketing and hologram badges on the cases...
  • The only real news here is that GeForce 3 technology is available for about half the original price point. All these "titanium", "speed bump", and "overclocked" versions are within 25% of the base GeForce 3.

    There's still not much out there that actually uses the vertex shader capability of the GeForce 3, anyway. NVidia's chameleon demo is beautiful, but that's about the only impressive vertex shader app (a toy illustration of what a vertex shader does follows this comment). So GeForce 2 technology is good enough for most gamers right now.

    NVidia does a great job; their boards work well, the drivers are reasonably solid, and their ELSA business unit, which sells boards, offers a six-year warranty, rare in this industry. And they support OpenGL seriously. Now that they have the price down to a more affordable level, go for it.
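    What "vertex shader capability" means in miniature, as a pure-Python sketch (not any real driver interface): a small program runs once per vertex. The canonical job it replaces is the fixed-function transform below (position times a 4x4 matrix); programmability means you can swap in arbitrary per-vertex math, like the toy ripple effect here.

    import math

    def transform(vertex, matrix):
        # Multiply a 4-component vertex by a 4x4 row-major matrix.
        return [sum(matrix[row][k] * vertex[k] for k in range(4))
                for row in range(4)]

    IDENTITY = [[1, 0, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1]]

    def wave_shader(vertex, time):
        # A toy "programmable" effect: displace y by a ripple before the
        # usual transform - the sort of thing fixed-function can't do.
        x, y, z, w = vertex
        y += 0.1 * math.sin(4.0 * x + time)
        return transform([x, y, z, w], IDENTITY)

    print(transform([1.0, 2.0, 3.0, 1.0], IDENTITY))  # plain pass-through
    print(wave_shader([1.0, 2.0, 3.0, 1.0], 0.0))     # rippled vertex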
