Tom's Hardware: Win, Lose or Ti - 21 GeForce Titan Tests 109
msolnik writes "Got a huge wad of cash burning a hole in your pocket? Why not spend it on a fancy new video card... Uncle Tom has reviewed 21 different cards so you can make a well educated decision. This is by far the most best Geforce comparison out there. A definate read for all you hardcore graphics guys."
Bah... (Score:2)
nVidia isn't the only game in town, particularly not for those of us who do video playback and editing more than play games.
Re:Bah... (Score:2)
Yeah, I'd rather have a video card that reduces video quality in my favorite games on purpose just to get better benchmark info [hardocp.com] too. Oh wait, no I wouldn't.
I'm sick of ATI's bullshit; they've been pulling this kind of crap with their drivers for years.
Re:Bah... (Score:2)
Also, you could mention how ATI, by omitting the facts, kinda led sites to assume that the good anti-aliasing was a well-implemented multi-sampling solution when it was actually an inefficient super-sampling solution. True, ATI never claimed it wasn't supersampling, but they didn't seem to want to mention it one way or the other until confronted. Of course this strategy of misinformation through omission is common in corporations, but it still seems a little devious to watch all these sites mention Smoothvision in the context of multi-sampling without offering corrections...
On the other hand, if you are using ATI-written drivers, then you are running a Microsoft OS, and therefore you are already a customer of a company that has pulled some dirty tricks in its day. Pretty much all successful companies pull dirty tricks, and in comparison to other corporate acts, this one isn't that high on the sleaze scale. In this case, they cheated, but released fully functional drivers when called on it. It was really deceptive, but people and review sites need to learn not to judge a card mostly on a single game's achieved framerate, so that companies won't get away with this sort of cheating.
My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback. I'm not about to sacrifice functionality just so I can give money to a company that has probably pulled similar nasty tricks in its time, but got away with them.
Re:Bah... (Score:2)
My primary interest is Linux, and the GeForce cards use unsupported chips for TV Capture/Playback
Umm... hello? What planet do you live on? NVidia has official Linux driver support for all their cards, TwinView and TV out included. There aren't really any NVidia TV capture cards available anyway, but if you really care about video quality you'd be using a separate video capture card, so this is irrelevant to your argument. NVidia's functionality under Linux far surpasses ATI's, including features such as full-screen anti-aliasing (FSAA). When NVidia has a superior product, I fail to see why you would support a downright dirty company such as ATI.
Re:Bah... (Score:1)
Re:Bah... (Score:1)
Dinivin
Re:Bah... (Score:2)
And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tomfoolery ATI has pulled with their mip-mapping in Quake3 has nothing to do with FSAA, and everything to do with cheating the user.
Re:Bah... (Score:1)
And if you honestly think that none of the other major video chip/card manufacturers pull that kind of shit, you are still truly, and foolishly, naive.
Dinivin
Re:Bah... R8500 (Score:2)
If you read the article you would have seen a reference to this article [tomshardware.com] at THG:
it resolves the Quake 3 "issue;"
also
offers SmoothVision FSAA and enables 16-tap anisotropic filtering. On top of that, it improves performance.
in the conclusion it states:
Nvidia also has some work to do in regard to FSAA
Re:Bah... R8500 (Score:2)
And if you read this article [gamers.com] (read through it now, don't just "skim"), you'll see that the tomfoolery ATI has pulled with their mip-mapping in Quake3 has nothing to do with FSAA, and everything to do with cheating the user.
Re:Bah... R8500 (Score:1)
The new drivers have NOW solved the "quack 3 issue".
And did you read my link to Tom's? I did pull the FSAA out of context, but it was one thing the ATI card was better at than a GeForce card. It does have plus points.
Re:Bah... (Score:1)
Re:Bah... (Score:1)
The 8500 also has much higher image quality than any GeForce card on the market.
TV-out question (Score:2)
Re:TV-out question (Score:2)
Re:TV-out question (Score:2)
Yeah, I've got a TV card for that, though. That way, you don't have to replace it every time you upgrade your main video card. It's not like TV cards are getting more and more functionality with each new version.
Re:TV-out question (Score:2)
Same reason we no longer have to have both a 2D and a 3D graphics card.
Re:TV-out question (Score:1)
Who uses their 19" screen instead of a 29" TV for DivX ;-) anyway..?
Re:TV-out question (Score:1)
I've gone in the other direction; I've sold my tv and bought a tv card instead. After all, my desk chair is the most comfortable seat in my home and I spend a lot of time there anyway.
/Janne
Re:TV-out question (Score:1)
Humm (Score:1)
Second, do you prefer your DVDs on TV (with armchair and family) or on monitor (on the bed with a Coke)?
+ Bleem (RiP)
I know. But Tekken 2-3 are still Very Good Games
As are most PS1 Games.
And you can play Tekken at a reasonable speed, with friends (I mean, a dual PIII-1GHz + small GeForce2 64DDR IS overkill. But I never played so fast 8)
Re:Humm (Score:2)
If you don't have the money for a decent monitor, why would you blow $300 on a video card? That's like getting a 2GHz P4 or Athlon but stifling it with 64MB of RAM. You don't blow your whole wad on a single component; you spread it around so you can get a decent system.
Second, do you prefer your DVDs on TV (with armchair and family) or on monitor (on the bed with a Coke)?
I prefer them on TV, and that's why I use a DVD player. PC's don't have remotes, and I don't want to have to get up and go to the PC every time I want to pause or jump around to different features. (Then again, I like watching every extra feature on a DVD, and most people probably don't.)
Re:Humm (Score:1)
An All-in-Wonder Radeon goes for $140-150. The cheapest 22" monitor I found was $528. The Sony Trinitrons are upwards of $1K.
I don't want to have to get up and go to the PC every time I want to pause or jump around to different features.
I use a wireless mouse and mini keyboard. They stash away in the coffee table when not in use.
Re:Humm (Score:2)
Pricewatch shows 21" Sony Trinitrons for $650 from fly-by-night guys, and CDW has them for $799. I got my used one for $300 from a CAD shop that was switching over to big LCD's.
All of this is irrelevant, though - what I was asking is, why do people want a TV-out on a high-end video card? If you're putting together a machine to play DVDs and Bleem, you certainly don't need a GeForce Ti. Like you said, an AIW Radeon goes for $150, and that's more than good enough. This particular article was talking about $300 cards that don't even do video capture. For those cards, a TV out is almost useless.
Re:Humm (Score:1)
Yes / Yes / No (Score:1)
Thanks Razzlefrog, you just answered the same way I would have 8)
I (also) have that TNT2 Ultra + TV-out, and I'm still using it to this day... on a PII-350 that does DVD + DivX + MP3 + TV + Net browsing (it was such a fad at the time 8)
I also put Logitech Radio Kb + Mouse, and those are nice remotes (104+Keys remote ! Wow 8)
But back to the point...
I took that old PC because turning it into a DVD player cost me less than a standalone, and it can do all the things a DVD player does plus the rest (DivX, NeoRAGE, browsing, porn on TV 8)
Also, some people (me at the time) have the budget either for a PC or for a DVD player. This allowed me to get both, with some problems (drivers for the DVD card, W98 stability versus How The F**K Do I Get Linux on TV Out?)...
Today this is a W98 box (simpler) stable (=> so hacked that MS wouldn't recognize it's Registry 8) and Ghosted.
I have no concerns; it works flawlessly. I play DVDs from all zones and games (BroodWar: old, slow, thousands of players online, and VERY nice on TV), I have Internet and MP3 on the hi-fi, and I'm thinking about a video projector and 5.1 speakers.
Of course the definition IS terrible, but my TV is the student model (big & cheap) and can accommodate 800*600 without problem. It's even better than regular PAL, so 8))
Sorry for the 22" Trinitron, I got carried away 8|
Re:Humm (Score:2)
You might be asking, but you're obviously not listening [slashdot.org]
For those cards, a TV out is almost useless.
That's clearly false.
0.02
Re:Humm (Score:2)
I went that route, but I think you're right.
Look at a 19" monitor from a couple of feet away.
Look at a 33" TV from 10 feet away.
About the same angle (field of view) in your eyes. Hellaciously more pixels on the 19". Better sound with a good pair of headphones and a 19".
A TV tuner on your video card makes having a TV obsolete. (And the rest of your computer can then obsolete your VCR and DVD-player.)
Re:Humm (Score:2)
One other thing: You're DAMN STRAIGHT this crap should work. It ain't hard! At least TV has a standard! Unlike some things on computers, like MUSIC! MP3, OGG, MP3Pro, Real Audio, WMA - which one is THE standard? I know the default is MP3, but it's not a standard, to me, until it's the only thing used or even talked about; then we'll have a new standard for digital music. MP3 is close, but we still hack and work on OGG, right? Computers now have so many so-called standards that, to me, nothing is standard anymore. This is, to me, the main reason some people never buy computers: there are so many frickin' choices that they have no idea if this one will play the game they want or do what they want at an acceptable speed. This is why Macs are good for newbies, cuz there's fewer choices (decent Apple built-in audio, GeForce 2 MX currently the default) and other things Apple does right. I don't own an Apple and I am not saying they are better than PCs. Sometimes they are not. But at least you can buy a Mac and count on it being able to run about any game you buy for the Mac. With PCs it's a friggin' crapshoot.
Re:TV-out question (Score:2)
The TV is OK (Sony Trinitron), and my Dreamcast and PS2 both produce razor-sharp, rock-solid text and graphics on it. Why can't a PC video card do that?
(Besides, the Ti-200 is priced not much higher than some GeForce 2MX - about 130UKP. It's not a high-end card. After christmas it'll be the standard 'okay I guess' card).
Re:TV-out question (Score:2)
Now if I had all the money I could ever want, I'd be hooking up my computer through composite connectors to a 30,000 dollar front-projection system in a really nice home theater room, but for now I'll be using that S-Video with a large CRT TV.
Re:TV-out question (Score:1)
Re:TV-out question (Score:1)
Re:Huh? (Score:1)
Not everybody here uses English as his/her primary language.
My English level is somewhat low, BTW. So I suposse that I may have wrote several grammatical and sintactical errors in this message (5+). Laught at them, quick!
Re:Huh? (Score:1)
Definately!!
Re:Huh? (Score:1)
Gotta love that
I hate to be a grammar/spelling Nazi, but it's a pet peeve of mine when people can't be bothered to check their own spelling; it certainly diminishes the importance of anything they might have to say.
- Z
Re:Huh? (Score:1)
But seriously, if the point of language is to convey a thought, and he conveyed his thought, his language is fine. Perhaps it conveyed to you a little more than his intended thought... like that he's an ass too lazy to check his own spelling. I give the benefit of the doubt... to me it conveyed that he didn't know the dictionary spelling of a few words and is perhaps a bit redundant (most best), but then I've known some friggin' smart people who couldn't speel they're weigh oot uv a payper baag.
BTW, anyone intent on helping others understand language should read The Language Instinct [fatbrain.com] by Steven Pinker. There's a lovely chapter on people who insist on correcting the grammar/spelling of others.
At any rate, while I agree that presentation says a lot about you, I hardly think
I hate being a grammar/spelling Nazi too...so I stopped.
Re:Huh? (Score:1)
Your interpretation is one possible way to read my statement. But it is just as technically correct to speak in the present tense with an implied "have" in front of the word stopped.
As it is also correct with the ellipsis serving as a separator of two statements and indicating something omitted. The two statements stand on their own and do not necessarily require being of the same tense.
Also, while switching tense is often taught as improper, in common language it is done with great regularity and is not in any sense improper. Partly, my point is that
Suggesting to someone that they might be taken more seriously or be viewed as more intelligent if they took the time to spell-check their posts is fine. Deriding someone for not doing so is condescending and only makes the corrector look more of an ass than the correctee.
Re:Huh? (Score:1)
Well, if you ask me, presentation says almost as much as the content of your words. If you can't even be bothered to check something as rudimentary as spelling, I'm sorry, but anything you have to say is moot, because your opinion and the validity of your words come into question.
I'm not saying you need to be perfect, everyone makes the occaisional spelling meestakes
As for English not being the writer's native tongue, that's entirely possible, but if so, then the onus is still on the writer to make that fact clear so that the reader doesn't infer anything. Again, just another laziness/sloppiness factor if you ask me.
- Z
Re:Huh? (Score:1)
However, after reading that book I mentioned, I have to agree with Mr. Pinker's assessment that language is *supposed* to change. If it didn't, and words like "upcoming" (which used to be a pet peeve of mine) or derivative spellings of words were not allowed to become a part of the accepted language, we'd all still be speaking in some ancient tongue.
Trying to force language to adhere to the same set of rules forever is impossible. So complaining about it changing (which people have been doing for centuries) is rather pointless. So I learned to just roll with it. I always try to be as correct (according to the rules I learned in school) with my spelling and grammar as I can, and I don't think less of people who do otherwise in informal settings, but I still take issue with misspellings or grammatical errors in things like resumes.
IMHO, the word "definitely" is mis-spelled SO OFTEN as "definately" that it is only a matter of time before lexicographers add it to the dictionary.
Re:Huh? (Score:1)
I completely agree that language is not forever, nor are (or should be) the rules so inflexible that they require precise usage to garner any merit. A perfect example is just as you pointed out. Another is the induction of the phrase "D'oh" into Oxford's (I think it was Oxford's; I could be mistaken here) next revised dictionary.
What I cannot agree with, nor do I consider it acceptable, is the sloppy or lazy use of written language, when one attempts to convey information to a large number of people. It's one thing to instant message a friend and another entirely to post to
The shift of mentality that it's OK to be sloppy and lazy in presenting information is the problem I have. Change is fine, as long as it's acceptable in my eyes, I guess
Oh, and I definAtely shudder at the thought of *that* day coming
- Z
Why Read the test ? (Score:1)
2 / As my parents don't budget me anymore (alas!), stop daydreaming and get a GeForce 2MX, which is MORE than enough for now (ok, let's say enough)
=> I mean, the day you have more than 2 programs that can use the GeForce 3, maybe then...
Until that date, Ge2Mx is more than enough for Quaking.
I mean, for the price of a GE3Ti, I could buy a Desktop computer 8| This isn't the rat race, it's just a game race...
Hoping to Frag you Soon 8)
Re:Why Read the test ? (Score:1)
SPECviewperf numbers? (Score:2)
I suppose the Quake 3 numbers are some indication of OpenGL performance for these mass-market cards, but I was curious how these stacked up against the traditional high-end OpenGL cards (Oxygen, FireGL, etc., or even a whole SGI system) so that a price/performance comparison could be made. If CPUs are any indication, the market size for these cards could drive their performance to almost acceptable levels in more professional OpenGL applications, and certainly at a lot less cost.
Any references?
Re:SPECviewperf numbers? (Score:1)
Re:SPECviewperf numbers? (Score:2)
There's a few [spec.org] places [amazoninternational.com] you can look for benchmarks on GeForce, Quadro and mid- to high-end workstation gfx cards. Currently the Wildcat 5110 pretty much rules the roost (at around $3k), with the Quadro2 Pro (under $1k) & FireGL4 (over $1k) competing hotly below that. Lesser cards (FireGL 1 & 2, Quadro, Quadro2 MXR & EX, and the older Oxygen models) can be had for well under $1k. Prices are only from memory, and are probably wildly inaccurate.
Even a standard GF2/GF3 or Radeon does pretty well, impressively so for the price. Rendering quality has been compared (for the GeForce at least), and is roughly equivalent - no major texture or polygon errors, all cards generating the occasional off pixel.
Bottom line: The majority of my customers (2D/3D FX) are switching to GeForce or sometimes Quadro cards - sought-after features include decent (not necessarily superlative) 3D app performance, dual monitor support (WITH hardware accel on both monitors!), and bang for the buck. Good DirectX support doesn't hurt either (very few cards from 3Dlabs support DirectDraw well, and some serious apps do need this).
Re:SPECviewperf numbers? (Score:1)
Video Cards (Score:1)
I've had fewer problems with my 512k Trident VL-bus video card than with my GeForce256 DDR. Go figure.
So it begins... (Score:3, Informative)
Technological superiority? Try fraud. They name their board the "Ti500" when it has the regular Ti, NOT the Ti500 chip, then call their actual Ti500 board the "Ti550". If I were reviewing that, I'd certainly point that out a little more plainly than as a "technological superiority" attempt.
Stereo Glasses (Score:5, Informative)
I used to work with these things a while back... it's ok as long as you don't move much, but if you like to move your head around you'll get headaches pretty quick, since the view doesn't change based on where you're sitting. We used head-tracking to accomplish this, but none of that stuff here. Another problem is screen distortion, which doesn't mean much when you're playing Quake, but if you're thinking of a really nice interface for Blender or Maya, this can make a big difference in being able to actually point the mouse where you think it's pointing.
Without calibration to your personal interocular distance and eye-to-screen distance, and good correction for screen distortion, you can use these for 30 minutes at most before getting eyestrain or just a plain headache. Add poor head-tracking and you can get seasick, too!
Last thing: there is more to seeing 3D than depth cues: good lighting and shadow effects, _accurate_ perspective views, and use of color all come into play. These glasses are a lot of fun, and if a lot of folks have them then maybe the state of the art will move forward a bit.
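The calibration point can be made concrete with a similar-triangles sketch: the on-screen parallax between the two eye images depends on both interocular distance and eye-to-screen distance, so if either assumed value is wrong, every depth cue shifts. The 65 mm and 600 mm figures below are assumed typical values, not anything from the post:

```python
def screen_parallax(iod_mm, screen_dist_mm, point_dist_mm):
    """Horizontal separation on the screen between the left-eye and
    right-eye images of a point, by similar triangles.
    Zero when the point lies on the screen plane; approaches the
    full interocular distance as the point recedes to infinity."""
    return iod_mm * (point_dist_mm - screen_dist_mm) / point_dist_mm

print(screen_parallax(65, 600, 600))   # point on the screen plane: 0.0
print(screen_parallax(65, 600, 1200))  # twice as far as the screen: 32.5
print(screen_parallax(65, 600, 1e9))   # effectively at infinity: ~65
```

Get either the 65 or the 600 wrong and every rendered parallax is off by the same proportion, which is where the eyestrain comes from.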
OFFTOPIC: Seasickness (Score:1, Funny)
I, on the other hand, have only been seasick once, and it was way back in my childhood... I was about 5 years old and was fishing with my dad about 15 miles out in the ocean, just south of the Catalina Island Channel. I blew chunks simply because I was angry at my dad, didn't want to fish, and was pouting in the cabin. So I got sick because I didn't focus on fishing, and mainly because I went into the cabin.
Now let me tell you about the ocean and a boat's cabin, to keep from getting seasick... STAY OUT OF THE CABIN! STAY AS FAR AWAY FROM IT AS POSSIBLE! AND FOR THE LOVE OF GOD, WHEN A SWELL MOVES UNDER YOU, AND THE BOAT BEGINS ROLLING OVER IT, YELL "YEEEHAWWWWW", AND YOU WILL NOT GET SEASICK. I think it has to do with not LOOKING at any one object at any one time. A person who doesn't get seasick and has a great time will be a little dizzy when they get back on shore. That would be a good study... compare the amount of dizziness of a person who doesn't get seasick to that of a person who always does. It takes a good 30 minutes for your ears to adjust on land because they no longer have to compensate for the swells rolling and shifting the boat.
Re:OFFTOPIC: Seasickness (Score:1)
thnx
ej
Re:Stereo Glasses (Score:1)
Jesus Christ (Score:1, Flamebait)
...most best...
definate
Get a friggin' copy editor.
Uncle Tom is just wrong on a lot of levels
...most best... WTF does that even mean?
definate perhaps in several hundred years the word will be spelled the way it is pronounced by dullards but for now it is definite. The opposite of infinite.
Guys, use Word... it will fix things like this automatically.
Past the point of v ideo cards mattering? (Score:5, Insightful)
I've seen nVidia GeForce2 cards going for $100 but I just don't see the point. There was a time when moving from a 2D card to a 3D card like the original Voodoo was really worth the $300 or so it cost -- performance and quality skyrocketed. Similarly the move from the Voodoo I to the II, and from the II to that card's next generation (the ATI 128).
Past that point, unless you have some specific non-gaming application that really needs the 3D performance, it seems like kind of a waste. 3D performance has been pushed beyond the point where it matters, even for gaming, and the features being added seem trivial -- just TV out?
All new cards, it seems, should come not only with good 3D, but with video in and out, TV tuners, and the ability to do hardware MPEG2 compression of full-frame video at zero cost to the CPU. At that point the video card arms race would make more sense.
Re:Past the point of v ideo cards mattering? (Score:3, Interesting)
But I don't want to pay for a TV tuner with my video card any more than I want an Instant Messaging app with my OS [microsoft.com] or Browser [netscape.com].
What I would expect is that if they are going to offer these features, then they should at least be of some reasonable quality - see my other post about quality of picture on TV-outs.
I'd also expect to be able to trade off features/performance for either price or power consumption (and therefore heat/noise), but I'm apparently the only person who cares about that. Or PCI for a second-head.
Re:Past the point of v ideo cards mattering? (Score:2)
You are NOT the only one that cares about that!
What I care about is: Passive cooling (no noise), good 2D image quality (that is what you will be using most of the time) and good drivers of course (both for Windows and Linux).
3d Performance is last - I don't play games very much.
Re:Past the point of v ideo cards mattering? (Score:2, Funny)
That is another thing altogether. You cannot expect to run Medal Of Honour / Return To Castle Wolfenstein /
And since you'll be losing out on most of the graphic details in modern games in 16-bit mode (look at the sky and smoke in RTCW), a GeForce-x upgrade seems imminent
I know what I'm talking about, 'cause my overclocked P3-450@558 and TNT2-125@155MHz can't keep it up for much longer
Re:Past the point of v ideo cards mattering? (Score:2)
But I think video cards will stop evolving when they reach realtime real-life quality.
It is however the case that a lot of games (I don't know about medal of honour) don't use all the latest features, to reach a larger customer base.
Re:Past the point of v ideo cards mattering? (Score:2)
The site you referred to said that 64bit color helps out in rendering of shades and complex images in non-realtime for movies and such. I guess I can accept that - but for video cards on PC's, I feel it is just too much cost and complexity for the minimal gains in quality.
>But I think video card stop evolving when they
>reach realtime reallife quality.
Yeah, I agree with you. They (the card makers) need to work on optimising cards to remove inefficiencies. For example: the addition of hardware shaders and texture units like in the GF3 series is a major step ahead in card evolution. The optimisation in the Kyro II cards that avoids drawing or texturing unseen triangles is another example.
Maybe after they have hit the wall on how much they can improve 3D graphics on a 2D surface, they will begin to put more research into a more 3D display mechanism. Comfortable, affordable, easy-to-use, reliable and functional 3D glasses or headsets - with spatial feedback and stereo sound. WOHOO!
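The "don't draw what can't be seen" optimisation comes in several flavours. The Kyro II's tile-based deferred rendering is the fancy one; the simplest, back-face culling, just rejects triangles wound away from the viewer before any pixel work happens. A toy sketch of that test (a generic illustration, not the Kyro's actual method):

```python
def signed_area(p0, p1, p2):
    """Twice the signed area of a screen-space triangle.
    The sign encodes winding order; here counter-clockwise
    is taken as front-facing."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)

def is_front_facing(tri):
    return signed_area(*tri) > 0

# A CCW triangle is kept; its mirror image (CW winding) is
# culled before any texturing or shading is done.
front = [(0, 0), (1, 0), (0, 1)]
back  = [(0, 0), (0, 1), (1, 0)]
print(is_front_facing(front))  # True
print(is_front_facing(back))   # False
```

For a closed object roughly half the triangles face away from you, so even this trivial test can halve the per-pixel work.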
Re:Past the point of v ideo cards mattering? (Score:1)
Re:Past the point of v ideo cards mattering? (Score:2)
Not quite. From memory, the number of colours shown (16.7 million) is close to the number of distinguishable colours, but the two sets of colours are not in the same colourspace, so it isn't actually good enough that you can't see the difference between adjacent colours in RGB-8bit space, even though the number of colours is right.
Also, the shades within the RGB-8bit space are distributed evenly amongst red, green and blue, whereas the eye is more sensitive to green, then red, then blue.
Look up 'gamut' in a decent graphics book, like Foley & Van Dam.
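The uneven sensitivity shows up in the standard Rec. 601 luma weights, which approximate perceived brightness and weight green far more heavily than blue (a sketch; the exact coefficients vary by colour standard):

```python
# Total number of colours representable in 8-bit-per-channel RGB.
total_colours = 2 ** 24  # the "16.7 million" figure

# Rec. 601 luma weights: perceived brightness is dominated by green,
# then red, then blue, even though RGB-8 spends its bits evenly.
def luma(r, g, b):
    """Approximate perceived brightness of an 8-bit RGB triple."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(total_colours)    # 16777216
print(luma(0, 255, 0))  # pure green: ~149.7
print(luma(255, 0, 0))  # pure red:   ~76.2
print(luma(0, 0, 255))  # pure blue:  ~29.1
```

So equal numeric steps in the blue channel are perceptually much smaller than the same steps in green, which is one reason "16.7 million colours" and "all distinguishable colours" are not the same claim.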
Re:Past the point of v ideo cards mattering? (Score:3, Interesting)
Past that point, unless you have some specific non-gaming application that really needs the 3D performance it seems like kind of a waste.
You should try some different games.
I have a GeForce2 and I've been thinking the same thing for a while, but I just bought the EverQuest expansion "Shadows of Luclin" and now I'm looking for a new video card. My GeForce2 (on a 1.3GHz Athlon with 1GB of RAM) can't draw the new 512x512 pixel textures and high-polygon character models quickly when I get into areas with lots of other players or lots of vegetation, even at 1024x768 resolution.
EQ has never been the most efficient game in terms of power required to render its displays, but the approach EQ takes is what games *should* be able to do: EQ describes its world in terms of polygons, texture maps and light sources and lets the computer/video card do the rest. Not spending a lot of developer time on making nice-looking graphics render quickly on low-end (or even not-so-low-end!) hardware means more developer time that can be spent on enlarging the virtual world (and Norrath/Luclin is *huge*).
I hear that with some of the $300+ cards, SoL action is smooth at 1600x1200 resolution with all of the bells and whistles turned on... too bad my wife already bought my Christmas presents :(
Re:Past the point of v ideo cards mattering? (Score:3, Funny)
Re:Past the point of v ideo cards mattering? (Score:1)
You just answered your own question: "by my eyes" isn't good enough for hardcore gamers. There are valid reasons to need 125 fps in Q3A. My girlfriend currently gets 200.9 fps in Q3A with a GeForce 2 GTS and an Athlon 1900+, and she's getting a GeForce 3 Ti500 from me for xmas (shhhhhhh). At that point I can only guess that her FPS will be in the 300+ range. Definitely more than necessary, but having 90 horsepower in your car is probably more than necessary too. Doesn't mean I wouldn't rather have 300.
Additionally, does your video card have things like full scene anti-aliasing? That's one of the major selling points of the gf3, as it improves image quality a lot.
I recently built a computer for my grandmother. I put a geforce 2 mx in for $60. Sure, you can find acceptable video cards for $30, but for another $30 you get one that you really don't have to worry about. Plus, you never know, grandma might decide to play CS or Q3.
TV-out on non-nvidia cards (Score:1)
I want a card that is able to output high-quality video to my tv while my normal monitor is showing my desktop in another resolution and higher refresh rate.
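For what it's worth, nvidia's Linux driver can do roughly this with TwinView metamodes, which allow a different mode (and refresh behaviour) on each head. A sketch of the relevant XF86Config Device-section options, from memory of the driver README; the option names, mode choices, and sync ranges here are assumptions, so check the README shipped with your driver version:

```
Section "Device"
    Identifier  "GeForce"
    Driver      "nvidia"
    Option      "TwinView"
    Option      "TwinViewOrientation"      "Clone"
    Option      "ConnectedMonitor"         "CRT,TV"
    Option      "TVStandard"               "PAL-B"
    # One mode per head: 1280x1024 on the monitor, 800x600 on the TV.
    Option      "MetaModes"                "1280x1024,800x600"
    Option      "SecondMonitorHorizSync"   "30-50"
    Option      "SecondMonitorVertRefresh" "60"
EndSection
```

Whether the TV-out picture quality is any good is a separate question, as the rest of this thread shows.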
Re:TV-out on non-nvidia cards (Score:1)
Me am can't wait... (Score:4, Funny)
For Great Justice!
Funny you would mention "well educated"... (Score:1)
For the sake of someone who couldn't pass 3rd grade spelling or grammar, I sure hope you aren't in the market for an expensive new grapics card...
Observations on an Old System + GeForce MX200 PCI (Score:3, Interesting)
The graphics card built with this system was a Matrox Mill II - so no 3D acceleration to speak of.
Playing Quake and Quake 2 on this system was Just Fine, but anything more modern was just not possible. I tried playing the Quake 3 demo, but was getting something on the order of 1 FPM, so I've been pretty well shut out of all the 3D stuff.
Then the other day, I noticed that the price on an XTacy GeForce MX400 PCI card (no AGP!) was like $150 CAN - so what the hell, I bought it.
It turned out to be DOA (system would not POST), so I exchanged it for the only other PCI card they had in stock, an XTacy MX200 card (which was like $120 CAN).
They also happened to have Quake3 (in the tin box, no less), SoF, and Descent3 - all the Loki ports - in the bargain bin for like $10 each, so I got those too.
Stick in the card, grab NVidia's drivers, configure XFree to use them, fire up Q3 - and bam! Playable! Just like that.
Things get a little choppy if more than about 10 people are in a room shooting at each other, and SoF and Descent3 (played in 800x600 with full textures) will "skip" once in a while, but for the most part, the game experience has been just fine.
Interestingly enough, when I turned on the frame rate display in Q3, I was getting anywhere from 10 fps to about 27 fps, with an average of about 15 - and the play experience is just fine. Faster framerates would be nice, but this IS old hardware, and really, it'd just be gravy. I don't particularly find myself wishing that the framerate were higher than it is - in fact, before I turned on the fps display, I thought I was making 30 fps. To see the average was about half that was a real surprise.
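One possible reason a counter reads lower than the game feels: the average is dragged down by occasional hitches even when most frames render at a steady rate. A toy illustration with made-up frame times:

```python
# Hypothetical frame times in milliseconds: steady 40 ms (25 fps)
# frames, plus one 200 ms hitch.
frame_times_ms = [40, 40, 40, 200, 40, 40, 40, 40]

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s

typical_fps = 1000 / 40   # what most frames actually run at: 25.0

print(round(avg_fps, 1))  # 16.7 -- well below the typical 25
```

A single hitchy frame per second is enough to pull the reported average toward 15 while the moment-to-moment experience still feels like 25-30.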
I can't help but wonder if the processor or bus is the bottleneck, or if the display might have been a touch faster had the MX400 card worked - but it doesn't really matter. The MX200 is "good enough".
So overall, I'm a happy camper.
Re:Observations on an Old System + GeForce MX200 P (Score:1, Funny)
I fucking hate campers in Q3 -- douche!
Re:Observations on an Old System + GeForce MX200 P (Score:1)
Your CPU/bus is the bottleneck. :)
To be competitive @ QIII you should be running at 100 FPS (some say 120 feels even better); 30-40 is the bare minimum. Check out The Upset Chaps Guide [savagehelp.com] to get your framerate up.
For under US$ 350 you could have a Duron 850, mid-range MoBo, case, AGP MX 400, & 256MB of PC133 SDRAM. For an extra US$ 100 or so, you could have a high-end MoBo with an upgrade path & 256MB of PC2100 DDR SDRAM.
Your kernels will compile a lot faster too
HDTV TV-Out converter. (Score:1)
Barjam
Most Best, Eh? (Score:1)
Re:Most Best, Eh? (Score:1)
Bah. (Score:1, Insightful)
A similar thing happened in the computing world - these days, most people just get an x86 PC - once upon a time, you could choose with relatively equal ease between Amiga, Acorn, Atari, Mac, PC, etc. Each had different advantages and disadvantages. Now we get generic boxes based on the mediocre x86 architecture that are differentiated by marketing and hologram badges on the cases...
We did the GeForce 3 a few weeks ago (Score:2)
There's still not much out there that actually uses the vertex shader capability in the GeForce 3, anyway. NVidia's chameleon demo is beautiful, but that's about the only impressive vertex shader app. So the GeForce 2 technology is good enough for most gamers right now.
NVidia does a great job; their boards work well, the drivers are reasonably solid, and their ELSA business unit, which sells boards, offers a six-year warranty, rare in this industry. And they support OpenGL seriously. Now that they have the price down to a more affordable level, go for it.