Graphics Software

GeForce4 Ti 4200 Preview

Mike Chambers writes "Hi All, I've completed a preview of NVIDIA's GeForce4 Ti 4200 graphics chipset. Although the preview contains your typical benchmarks, it's centered around gameplay and antialiasing image quality. Here's a list of the games involved: Quake 3 & Team Arena, IL-2 Sturmovik, Nascar Racing 2002 Demo, Jedi Knight 2, Serious Sam 2, Max Payne Demo, Comanche 4 Demo, Dungeon Siege and Tiger Woods PGA Tour 2002 Demo. Since antialiasing image quality, especially Quincunx and 4XS, was an important aspect of the preview, all of the screen shots were saved in high quality PNG format. For those Slashdot readers who are avid gamers, you might want to check this out."
  • Good stuff (Score:3, Informative)

    by MoiTominator ( 75986 ) on Monday April 22, 2002 @01:34AM (#3385922)
    Good review. Detailed and uses several pretty new games to benchmark, instead of relying on the old Q3 tests.
  • by Anonymous Coward on Monday April 22, 2002 @01:35AM (#3385930)
    I was looking forward to uncovering these features of the GeForce4 Ti 4200 myself. Now you've spoiled everything.
  • by bleeeeck ( 190906 )
    AnandTech had a good sub-$200 video card review [anandtech.com] that includes the GeForce4 Ti 4200 (it also covers ATI's 128MB Radeon 8500LE).
  • I'd just like to let everyone know that the screenshots are GORGEOUS, and while I'm going to bed and don't have time to read everything, skimming the pictures was a review in and of itself.
  • by HaggiZ ( 68526 ) on Monday April 22, 2002 @01:46AM (#3385971) Homepage
    Time and time again these fantastic new sound/graphics/whatever cards are released, and almost always targeted towards gamers. Is it just me? Am I the only one happy with the quality I get out of my current card and the games available for it? The graphics are done well in most games to offer a fantastic and believable escape into the games. And in the end it all comes down to the gameplay anyway.

    That being said, I'm not against the new developments. It certainly does look like an awesome card; it just seems to me that this particular market segment could almost be bled dry, and these cards may have to find something else they're useful for to continue to survive. I don't have a deep enough understanding of the market or those in it to make a serious call on it, though.

    I remember reading a long long time ago about developments that were looking at moving cycles across to other processors (i.e., big nasty graphics cards) that could be used to offset workloads when they weren't being fully utilised (99% of the time you aren't game playing). Anybody know what happened?
    • by MisterBlister ( 539957 ) on Monday April 22, 2002 @01:54AM (#3385994) Homepage
      Time and time again these fantastic new sound/graphics/whatever cards are released, and almost always targeted towards gamers. Is it just me? Am I the only one happy with the quality I get out of my current card and the games available for it?

      Well, there's no reason to get a GeForce 4 now unless you're a software developer or really need those extra 5 FPS in Quake3 (305 FPS instead of 300 FPS)... GF4 (and GF3) do offer significant advantages over older cards, but since the development cycle for high-quality games is about 3-4 years, compared to the development cycle of new graphics cards (6 months - 1 year), the game engines are always lagging behind. Nothing out yet even really takes advantage of what the GeForce 3 has to offer -- until Doom3 is released, anyway.

      The best thing to do is just ignore these new card releases, and let other fools buy them just to be 'l33t'. In a year or more when you can actually buy games that will use GF4 features, the cards will be much cheaper than they are now...

      • If you read the article, you'll notice that all the benchmarks are run with antialiasing. So in fact, these cards do breathe new life into old games - the author remarks that he would never go back to running games without antialiasing turned on. The performance of these cards is high enough to get good framerates with antialiasing.
      • That's just not true. Anti-aliasing, in particular, depends entirely on the card's performance. I have a GF3 Ti200 and a GF4 Ti4600. At any given resolution, I get pretty much the same frame rates on the latter with anti-aliasing on as I do on the former with anti-aliasing off. The difference in visual quality is significant; texture crawl and edge jags pretty much disappear.

        Sure, I'll be glad when games specifically target my card, but for now, I'm enjoying some particularly clean looking software. It's worth the extra money to me, and it has nothing to do with being l33t.
      • Well, there's no reason to get a GeForce 4 now unless you're a software developer or really need those extra 5 FPS in Quake3 (305 FPS instead of 300 FPS)


        The words of a true non-gamer willing to expound their wisdom for all to see. I play the game Urban Terror (urbanterror.net), which is a Quake 3 mod, on a mid-range system with a GeForce 3 Ti200: I have to turn a significant number of features down to run smoothly at 1024x768 32-bit, and even then certain parts of certain maps slow to a relative crawl (crawl being 20fps or so: it feels sluggish and throws your timing off, not to mention that it ruins any immersion). Don't even get me started on AA, because in real applications (i.e., not a stock Quake 3, which virtually no one plays anymore) it's an absolute frame-rate killer.

    • While I certainly do agree with you that the rate of developments in this field is dizzying (for the pocket as well) there is definitely something to be said for HOW FUCKING AWESOME all the new games are.

      It may even feel a bit like a conspiracy between the game developers and the 3D card makers. You may even think the differences between last year's games and this year's are real subtle. Just compare Quake I, Quake II, Quake III and maybe Soldier of Fortune 2 (released a day or two ago) to get a better appreciation of how far we've come.

      CySurflex
      (Unhappy with my old 16MB card)
      • I'm suffering mightily with a PII-400 and Matrox G200 - holding out until I know I'll have a system good enough for Doom3.
        • Any word on what kind of system that is going to be? I have a pretty similar set-up - a PII-450 with a Matrox G400-TV. Whatever card I get will have to have TV/Video IN+OUT just like the G400-TV - and I'm trying to decide between just upgrading the card and getting a new system.
          • Question: what model motherboard are you running?

            If it's an Abit, Asus, or other motherboard that allows you to seriously overclock, you might want to see if the motherboard will support a Socket 370 to Slot 1 adapter. That will allow you to run the Powerleap PL-370/T CPU upgrade (which should be released very soon); this will bump up the speed of your system from 450 MHz to 1,200 MHz! :-)

            At 1,200 MHz CPU speed, even the Matrox G400-TV should be fast enough to run most modern games.
    • I hear you. My outlook on most new hardware is that its immediate usefulness is in driving down the prices of currently average hardware. For the time being, I'm still running a 3dfx Voodoo 3 2000, and it can handle most games just fine. Some of the newer ones cause it to choke at times (Jedi Outcast, Yavin), but it has served me quite faithfully.

      That said, I'm upgrading to a low-end GeForce4 tomorrow. The Voodoo just doesn't cut it for more heavy-duty stuff, and it has no native MPEG decompression abilities, rendering DVDs a bit choppy at times. Also, it has no TV out - a feature "standardized" in cards only recently, and IMO a necessary feature when used in conjunction with a DVD drive. Hence, an upgrade. There will always be people who have a) the money, and b) the desire to have the "latest and greatest", and they are the ones who fuel these new products, although at $200 it's not a bad price. I'm sure companies take into account the fact that second and third generation hardware sells better... but it's not second or third generation if there isn't new hardware to succeed it.

      Pfeh...I need sleep.
      • A low-end GF4 is _not_ the same as a GF4 Ti 4200 or 4600 (I forget the low-end designations). Anyway, IIRC the low-end GF4 is based on the same chip as the Xbox, only with fewer shader pipelines, while the high-end GF4 uses a new chip, more similar to the GF3 than to the Xbox (obviously they all have similar features, but there are significant performance differences). As Carmack said recently, you don't want to buy a low-end GF4 to play future advanced shader-enabled games, because it doesn't have the hardware for that. Instead buy a high-end GF4, a GF3, or a Radeon 8500+.
    • I remember reading a long long time ago about developments that were looking at moving cycles across to other processors (i.e., big nasty graphics cards) that could be used to offset workloads when they weren't being fully utilised (99% of the time you aren't game playing). Anybody know what happened?

      This rings a bell. The phenomenon is nothing new (note the date below!) and is known as the wheel of reincarnation. Quoting the Jargon File 4.3.1 [tuxedo.org]:

      wheel of reincarnation

      [coined in a paper by T.H. Myer and I.E. Sutherland "On the Design of Display Processors", Comm. ACM, Vol. 11, no. 6 (June 1968)] Term used to refer to a well-known effect whereby function in a computing system family is migrated out to special-purpose peripheral hardware for speed, then the peripheral evolves toward more computing power as it does its job, then somebody notices that it is inefficient to support two asymmetrical processors in the architecture and folds the function back into the main CPU, at which point the cycle begins again.
    • My opinion is that we are at a kind of plateau where improvements aren't making that much of a difference for end users.

      However, I hope progress isn't slowed, as the holy grail is to achieve real photo quality in real time, and that would be an awesome sight.


    • I want to create 3D animation - the choice of animation software isn't yet set. The ones I am looking at are Lightwave, Maya, 3DS Max, Blender and POVRay.

      Which one do you recommend ?

      On the hardware side, which graphics card do you recommend?

      I am sticking with the X86 platform, OS can be Windoze, Linux, BSD, or BeOS.

      All suggestions will be very much appreciated !

      Thanks in advance !!

    • I think a decent graphics card that uses the GeForce4 Ti 4200 will end up being extremely successful in the marketplace.

      There are two reasons for this:

      1) It is less expensive to implement, so OEMs will be far more interested in installing this card instead of the much more expensive cards that use the Ti4400 or Ti4600 chipsets. Besides, the performance drop is not significant, so most users won't see any performance hit on even the latest games. This is why I expect many system builders to incorporate graphics cards that use the GeForce4 Ti4200 chipset into new systems on a large scale by July 2002.

      2) Because it uses the NV25 chipset, the card will also sport higher-level MPEG-2 decoding support. That means hardware assistance for playing back DVDs as good as what ATI has done with their Rage 128 and Radeon chipset series.

      I think you must like the Matrox G400/G450/G550 cards. Yes, they have excellent 2D display quality, but the GeForce4 Ti4200 has vastly surpassed them in 3D graphics and, with the right manufacturer, achieves almost as good 2D display quality.
  • I've got a dual-proc P3/800 on my desk right now, a half-gig of RAM on an Apollo mobo. It has a single PCI card (a 3Com 905B Cyclone) and a GeForce 4 on the AGP slot. Now, what's my problem?

    Everything about the damn GeForce.

    First, it was having constant conflicts with Something-Or-Other during POST--I'd get a really annoying system beep and no video output, period. Yanked my SoundBlaster AWE32 and presto, it boots. Weird. Why was the GeForce 4 conflicting with my SB?

    Now it works reasonably well, except that I'm forced to use my on-board AC97 audio (which sounds like ass, and esd really doesn't like it). Reasonably well, except for the occasional spontaneous reboot... which occurs for reasons I haven't been able to track down yet.

    In Win2000 it's the same story--except that when I connect to the 'Net via my external modem (COM1), I'll randomly get a BSOD or a spontaneous reboot.

    Why in the billion names of JR "Bob" Dobbs the GeForce 4 causes so many hardware conflicts, I have absolutely no idea.

    When it's running, though, it's a pretty sweet board.

    By comparison, my last card was a Voodoo3. Nice, simple AGP card; I plugged it in, it worked, never conflicted with anything.

    • I have two dual machines: one dual PIII 600 on an AOpen mobo, and one dual PIII 733 on an MSI mobo. I have been experiencing lockups and sudden reboots on both machines when using various GeForce cards, everything from a GeForce 256 (SDR) to a GeForce 3 64MB. It is very unstable, and it mostly started after NVIDIA's release of the 6.xx drivers. So I think they have done something that makes SMP systems unstable, to get more juice out of the cards, since the 6.xx driver offered some 20% more fps in certain games.

      I have tried to get some info from NVIDIA or other sources on this subject, but I have been unable to find any. I first thought it was my particular hardware setup that was the problem, but now I have tried two different mobos and some 5-6 different GeForce cards (from different manufacturers too).

      It's very annoying in the long run, since I also work with 3D computer graphics, and if I'm working on a project I don't like a sudden lockup.

      -P
    • Comment removed based on user account deletion
    • I'm using a Visiontek Ti 4600 and haven't had any problems with it. I have an Athlon XP 1800+ on a Soyo Dragon+ (with the VIA chipset that everyone complains is buggy), and installation was a snap, replacing my Creative Labs 32MB Savage 4 Pro card. And I'm using Win2k and the onboard 6.1 audio and 100BT, "to boot"...

      If it is the GeForce4, it's probably your particular card.
      Did you try taking it back for a replacement, before telling us all how bad it is?

      More likely you have some weird BIOS issues or power problems... you should check those, too.
      • More likely you have some weird BIOS issues or power problems... you should check those, too.

        I think that could be part of the troubles.

        I wonder whether clearing the CMOS NVRAM and getting a decent 300-watt ATX power supply would help things along. Believe me, I've seen where clearing the CMOS NVRAM on the motherboard fixes a LOT of Windows 9x/ME/2000/XP Plug and Play setup issues.
          • I wonder whether clearing the CMOS NVRAM and getting a decent 300-watt ATX power supply would help things along.


          Yah... I keep my BIOS as barren as possible, and I use a 400-watt in my tower =) Although from reading other comments, there may be a real issue with some SMP systems that people have...
      • Yes, I did take it back for a replacement. Replacement does the exact same things. It's not a "weird BIOS issue", given that in the pursuit of solving this problem I've disabled just about everything that could be doing it--disabled AGP 4x, gone from 256M window to 16M, etc. Problem still persists.
  • With all the conjecture regarding XBox, PS2, GC of late, a new (high priced) release seems an ideal time to question the future of the PC stronghold(?) in the gaming market.

    From memory, the new NVIDIA card was listed at around $350, and it can be noted that the tests were performed on a high-end processor with a healthy serving of RAM. Although this concoction transparently serves as a powerful PC for all your non-gaming needs, does this serve as a warning about building 'game boxes'?

    Even against Sony's impressive software library, I would argue that the PC offers the best range of gaming (quaking etc.), but with M$ entering the console market, will this be the case in times to come, and is it possible we are approaching a separation of mainstream PC use and gaming?

    Food for thought anyway.

    • You are referring to recent articles where M$ seems to be pushing games from the PC to a console so the PC can be the entertainment (har har) center of the house?

      Well, fashions come and go every decade... is it time for games to move back to the console (only to have them come back to the PC in 5 years' time)?

  • The GeForce 3/4 line has been stuck at 64MB for a while now, and as a result, all the boards with GeForce 3/4 parts have roughly the same performance, within 25% or so. For marketing reasons, there are about half a dozen models, but not much difference between them (ignoring the GeForce 4 MX, which is a GeForce 2 engine, without the vertex or pixel shader hardware.)

    Now, finally, a memory upgrade and a visible performance improvement.

    • The performance difference across the series of cards is due to memory speed, not size. The 128MB allows antialiasing at very high resolutions.
    • 128MB on GF3 doesn't make much of a difference compared with 64MB on today's games. http://www6.tomshardware.com/graphic/02q2/020418/vgacharts-01.html
    • There are 128MB GeForce 3 Ti200s (no clue on the Ti500). It's just not a very popular configuration. Pity, too, because you can really see the difference 128MB makes on heavily textured games, like Jedi Knight 2 and EverQuest.

      Hopefully the GF4s will break that trend, since 128MB is the rule there, and not the exception.

      Geo
    • by Anonymous Coward
      yeah my 128meg geforce4 goes twice as fast as my 64meg geforce2 mx! That extra memory really makes it fly!
    • Someone give this guy the "clueless" sticker.

      128MB vs. 64MB makes maybe a 2-3% difference. They've benchmarked the same card (Ti4200) in 128MB and 64MB models, and the 64MB model was FASTER due to more expensive RAM.

      The BIGGEST difference among all the GF3/GF4 cards is the memory speed.

      Graphics are still limited by fill rate in 90% of games, so if you have 10% faster memory, you get a 10% faster framerate.

      Problem is, 10% faster memory costs 20% more, and so on, due to yield concerns.

      All 128MB vs. 64MB will let you do is:
      - run at a higher res anti-aliased... but this doesn't matter if you don't have the speed
      - use more textures... but just about all games compress their textures now, so they're not even filling up 64MB
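
      That 10% claim is easy to sanity-check with rough bandwidth arithmetic. A quick Python sketch (the 128-bit bus is my assumption, typical of GF3/GF4 boards; the clocks are illustrative):

          # Peak memory bandwidth = effective (DDR) clock x bus width.
          BUS_BYTES = 16  # assumed 128-bit bus = 16 bytes per transfer

          def bandwidth_gb_s(effective_mhz):
              return effective_mhz * 1e6 * BUS_BYTES / 1e9

          base = bandwidth_gb_s(500)  # e.g. a Ti4200 64MB at 500MHz effective
          fast = bandwidth_gb_s(550)  # hypothetical memory clocked 10% higher
          print(f"{base:.1f} -> {fast:.1f} GB/s, {fast / base - 1:.0%} more")
          # In a fill-rate-limited game, framerate scales roughly with this number.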
  • Isn't it pointless to use current games to benchmark future video cards? I much prefer the idea of using the latest build of the engines of future games (i.e., Unreal Tourney 2003), because it pushes the card harder than the final game will, plus it allows developers to fix bugs that arise before both the game and the card are released.
  • ...make me wanna do just one thing. reboot into win98 and play quake :)

    games are evil, and the largest threat to widespread use of opensource software...
  • Maybe a little offtopic, but:
    I have made 3 empirical observations of the game industry:
    1. Games are about 5 years behind cutting-edge graphics research (academia - mostly SIGGRAPH).
    2. The graphics/engine programmers generally have the best hardware on the graphics team, to let them test out the latest hardware advancements.
    3. Games nowadays take around 2-3 years to make, so the cutting-edge hardware they use at first has become mid-range by the time the game is released.
  • I don't get it... (Score:1, Flamebait)

    by ferrocene ( 203243 )
    The reviewer's setup:

    "The following is a list of the hardware and software used in this preview.

    AMD Athlon XP 1800+ @ 1.53GHz
    NVIDIA Reference Motherboard (nForce Chipset)
    256MB Corsair PC2400 DDR RAM
    21-Inch Sony Multiscan E500 Monitor
    NVIDIA Reference GeForce4 Ti 4600 (300MHz/650MHz) - 128MB
    NVIDIA Reference GeForce4 Ti 4200 (250MHz/500MHz) - 64MB
    NVIDIA Detonator XP Driver Version 28.32
    32-Bit Color / Sound Disabled * / Vsync Disabled / 75Hz Refresh Rate
    Windows XP Professional / DirectX 8.1"

    Ok, you're reviewing a card with 128MB of video memory, yet your main system memory is only 256MB? On WinXP? Dude, just shell out the extra $$ for at least 512. Unless using 2 DIMMs somehow cuts your performance. Who's using 256? Compaq?
    • Comment removed based on user account deletion
      • *boggle* How do you figure that? DDR (Double Data Rate) refers to the rate at which the system can pull data from the chips; it has nothing to do with the size of the memory (which was the original poster's point - 256MB with WinXP is swap city once you start running moderately memory-hungry apps, like games!).
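
        Incidentally, the "PC2400" on the Corsair sticks in the reviewer's setup is pure rate arithmetic, and size never enters into it - a one-line sketch in Python:

            # PC2400 = 150MHz base clock, transfers on both clock edges
            # ("Double Data Rate"), 64-bit (8-byte) module width.
            print(150 * 2 * 8)  # 2400 MB/s peak, hence the name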
  • Good stuff... (Score:2, Insightful)

    by Moonshadow ( 84117 )
    nVidia continues to impress me. They continue to raise the bar for hardware, and they are enabling programmers to beef up their poly counts, particle systems, etc.

    Yummy. I want one.
  • But why do you need antialiasing at 1600x1200? Can anyone honestly see the pixels at that res?

    The average user doesn't need his screen blurred; the monitor does that well enough for him/her

    "omg timmy! did you see those jaggies!"
    "dude, don't be a magnafying glass hog!"
    • by Anonymous Coward
      But why do you need antialiasing at 1600x1200? Can anyone honestly see the pixels at that res?

      The human eye's resolution is about 1/60 of a degree. So your screen resolution will match your eye's resolution (i.e., you'll see the jaggies) if your distance to the screen is X times the width of your screen (or closer), where

      X = 360*60/(2*PI*1600) = 2.15

      So I'm afraid you'd have to stand more than about twice the width of your screen away to blur the jaggies with your eyes at 1600x1200 resolution (assuming you have good vision).

      (Sorry, I just feel like calculating stuff tonight.)
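
      Here's the same calculation in Python, if anyone wants to play with the numbers (one arcminute of eye resolution and 1600 horizontal pixels, as above):

          import math

          eye_res_rad = (1 / 60) * math.pi / 180  # one arcminute, in radians
          pixels_across = 1600

          # Distance, in screen widths, beyond which a single pixel subtends
          # less than one arcminute - i.e. the jaggies blur out.
          x = 1 / (pixels_across * eye_res_rad)
          print(f"{x:.2f} screen widths")  # ~2.15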
    • Antialiasing isn't just blurring the image; it is actually putting more than one "sample" of information into every pixel.

      1600x1200 4xFSAA is like a 3200x2400 pixel screen, slightly blurred.
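
      A minimal sketch of the idea in Python (a 2x2 box filter standing in for the card's downsampling; real hardware uses fancier sample patterns):

          # 4x supersampling, conceptually: render at twice the width and height,
          # then average each 2x2 block of samples down to one output pixel.
          def downsample_2x2(img):
              out = []
              for y in range(0, len(img), 2):
                  out.append([(img[y][x] + img[y][x + 1] +
                               img[y + 1][x] + img[y + 1][x + 1]) / 4
                              for x in range(0, len(img[y]), 2)])
              return out

          # A hard black/white edge picks up in-between values - the "slight blur".
          hi_res = [[0, 0, 255, 255],
                    [0, 255, 255, 255],
                    [0, 0, 255, 255],
                    [0, 255, 255, 255]]
          print(downsample_2x2(hi_res))  # [[63.75, 255.0], [63.75, 255.0]]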
    • You wouldn't think it'd make a difference, but it does. At that resolution, it's not so much jaggies as it is texture crawling and shimmering. Anti-aliasing fixes both.

      Geo
  • i'm a bit confused as to why this particular preview was deemed better than all the others with the same information. hardocp.com posted their preview [hardocp.com] on the 8th and added another segment [hardocp.com] on the 10th (other sites reviewed it in the same timeframe, but hardocp is the one i read). so this particular ti4200 preview is old news. slashdot keeps wandering into the hardware news arena, but doesn't seem to pay quite enough attention to do it well.
  • Is this so special? (Score:2, Informative)

    by Gerb ( 88657 )
    Tom's Hardware already reviewed this card on April 9th. You can find it here [tomshardware.com].

    Gerb
  • Seems that Nvidia's counter to ATI's cheap 128MB Radeon card went over rather well with reviewers.

    If you want some more information, here are some good reviews/articles I saw today during my daily browsing:

    Compare these numbers against Nvidia's previous attempt at the budget arena, the MX 440, here [sharkyextreme.com]. A much-needed improvement!
  • Super nice!!! (Score:1, Interesting)

    by Anonymous Coward
    Very good preview. The GF4 generation GPU is mighty strong, and I disagree with those people who say they don't need a GF4. I bought a GF3 Ti200 recently and don't plan to upgrade anytime soon, but I am already envious of those guys who can afford a GF4 and a cool 34" plasma screen.

    Get a grip, dudes. Even if you don't like it, the GF4 IS progress. What I am worried about, though, is the competition. Will ATI have the power to respond with an equally good video card? If it doesn't, and NVIDIA stays the only player in the graphics card market, we are doomed - open source OSes especially.
  • It's Spring, and the need for a new video card presents itself. Why? Because the one you bought 6 months ago is "outdated," meaning it doesn't get the highest FPS on some benchmark site like the one whored in this article.

    So is it time to drop the $400? To rely on buggy drivers rushed out by ATi or nVidia? To snarl at DirectX's mysterious problems, which may or may not be related to some of your older hardware not agreeing with your new card?

    You've stared at the numbers on the site, and you don't see any reason why not. Did you know some sites exist (and make money) just by getting new video cards and "benchmarking" (aka "playing") them? Is this fair? Are you going to contribute to this universally unfair practice? Of course, you clicked through to buy from the first vendor listed on the site. You can hardly wait for the UPS man to come tomorrow (you can afford expedited shipping, you only paid 95% of what you'd pay at a retail store anyway).

    As a savvy PC gamer, you've already downloaded the latest crack off Usenet. You never pay for software - why should you? The hardware costs enough as it is; besides, each game on the PC is just an iteration of Doom or Command and Conquer. Brainless blowing away, or boring resource management? You love 'em both. Or at least, they're available, and you play them.

    You laugh at your buddies with an Xbox, because "I can build a more powerful system than that for half the cost!" You've scorned the Gamecube because "The Gamepurse is for kiddies!" Your Playstation 2, purchased for Final Fantasy X, lies collecting dust next to your DVD player (which sucks compared to the one on your computer - NATCH!)

    You pause a bit to think about your computer purchases over the last year:

    • Athlon T-bird and motherboard-$250.

    • Athlon XP and motherboard-$400.

    • "L337" Custom Water Cooled Case-$300

    • 1 Gig RAM (purchased 256MB at a time)-$400.

    • SB Audigy-$95.

    • GeForce 3-$350.

    Now this GeForce 4 will be about $400, but it's worth it! Buy a Mac? Never! They don't have games, and besides, they're too expensive.

    Buyer's remorse never seizes your temples with its steely vice grip. You'll never lose your job at the helpdesk, and even if you do, Mom and Dad will be there to help you out. You're a sharp guy, and you're surely going places. Right after this game of Return to Castle Wolfenstein, that is...

    • GeForce4 Ti 4200: $200
    • This reminds me: Fry's Electronics had an ad in the paper today that I was really tempted by. Their website, Outpost.com [outpost.com], has the same product for about twice the price. Anyway, the deal is you get a Duron 950 CPU with a motherboard and a case with a 300W PSU for $99. Add a fan, drives, a video card, and RAM, and you have yourself a pretty killer machine for around $300. That's less than a GF3 Ti500, and it would require some top-of-the-line games to notice much of a difference. Anyway, check it out [outpost.com]; Outpost has it listed for $179.00, but I can pick it up at Fry's for $99.
    • The GeForce4 being previewed is $179-199 USD. While I find your post amusing and your point valid, I despise people who comment on articles before reading them. Well, I hope you did not read it, anyway.
  • by pointwood ( 14018 ) <jramskov AT gmail DOT com> on Monday April 22, 2002 @04:58AM (#3386365) Homepage

    As with almost all graphics card reviews, the only tests/benchmarks this review has are games. I don't know about the rest of you, but I actually don't play games the majority of the time I'm using my PC, and therefore this review is sadly almost useless to me.

    I would like to see a review that actually had a serious focus on 2D performance and quality.

    No matter what, I'll not buy a GeForce4 card - AFAIK they have and need active cooling, and I don't need that - I want a card with passive cooling! A GeForce3 Ti200 should actually be able to run with only a nice large heatsink, and that is what I believe I'll be buying soon. It is much cheaper too, and its 3D performance is still excellent.

    • Every GF3 Ti200 I've seen has a fan, and needs it. It's a very quiet fan (on my Siluro)... probably about 25dB, and a relatively low pitch... but it is a fan. You could run it in a quiet PC without a problem, I think, but if you expect noiseless, you're out of luck. I run a quiet PC, and I'm OK with it... I can only barely hear it among my other voltage-reduced fans... but you might be pickier than I am.

      You'll have to go down to a GF2 MX or TNT2 before you get a card that doesn't need a fan at all. Your 2D performance will be roughly the same as the GF3 (though the 2D quality is noticeably higher on the later nVidia cards), but 3D performance will be significantly worse.

      Geo
      • I'm pretty sure I'm correct since the Nvidia Geforce3 TI200 reference card came with passive cooling only ;)

        A lot of cards only come with a fan because it "looks cool" - not because they need it. It's the same thing with motherboards and chipsets. A lot of motherboard makers put a small heatsink and a fan on the chipset, even though a larger heatsink would be enough. Take the latest boards with the VIA KT333 chipset - a lot of them have a fan on the chipset; the Asus boards don't.

      • Hm. You're right, re: reference. I'd forgotten about that. Still, none of the manufacturers went with it. It may be market pressure, and it may be stability. I guess you'll find out. :)

        FWIW, I was able to get one hell of an overclock on my card, with a pretty craptastic heatsink and a thin fan. I guess given that, you probably -will- be able to run at stock speeds with a heatsink and decent airflow.

        Still, don't pull off the fan until you've tried it. Unless you're running a fanless system, I'll lay odds you won't hear the GF3 fan. Assuming your design doesn't diverge much from the one I have, it's really quiet.

        Geo
  • Is it just me (Score:5, Insightful)

    by CyberDruid ( 201684 ) on Monday April 22, 2002 @05:36AM (#3386437) Homepage
    ...or did 3D gaming get old several years ago? Granted, Doom was damn cool. Ultima Underworld was nice too. The zillionth FPS was just a yawn.

    In the mid-90s, for some reason, something happened. Suddenly the mainstream opinion was that a game without 3D was somehow inferior to the 3D ones, so *everything* had to be 3D. Face it - 3D is just a gimmick like anything else. For most games, 3D is just wrong. It makes the interface bad and worsens gameplay. We humans are by nature not fully 3D-compliant (e.g., see Rubik's Cube for proof). Imagine what a pain in the ass a 3D window manager would be (yeah, I know some people research it, but that is their problem, isn't it?).

    IMHO games are now in the childish state of "the more real it looks, the better". Now, I am certainly not opposed to the idea of beautiful games. I want stunning, great looking games. But where would art be today if it had stopped at the rather primitive notion that the painting that most resembles reality is the most beautiful?

    I don't know about you, but when Heroes of Might & Magic III came out (New World Computing makes arguably the most beautiful 2D-graphics in the world), I was far more impressed by the beautiful details and the general mood that they managed to generate, than by the graphics of Quake III (or whatever FPS-clone was the current rave then).

    Don't get me wrong, there are games that benefit from 3D (Tekken comes to mind), but not *all*. Is there even a non-3D game available for the Xbox?
    Damn the lemming mentality of the game publishers... Will I ever see stunning artwork again?
    • One of the coolest games out for the Mac is EV Nova [ambrosiasw.com]. It's a simple 2D game that features very rich gameplay. That's rare these days. Sad.
    • Don't worry - the same thing happened to the art world when making realistic paintings first became possible. Then the camera made it irrelevant, and art reacted by moving away from photorealism. The same will happen in computer graphics and games. Once we have perfected photorealism it will become passé and new areas of creativity will be explored.
      My guess is when 'Final Fantasy' (the movie) level graphics appear in games we will see a reaction against photo-realism and see some really creative ideas start popping up (most really terrible, but some great).
    • Yeah, I guess it's just you.

      Personally, I have recently been stunned by the quality of the graphics in Serious Sam 2E. I thought I wouldn't see anything more beautiful than Unreal/UT before U2 came out. I was wrong.

      And what's wrong with playing the old games? I have wasted some major time recently on replaying Crusader and X-COM. Kickass games are worth keeping and replaying.

      Keep in mind that good games come out rarely. That doesn't mean they don't ever come out.
    • Whether or not you're impressed by 3D games is not really relevant to a review of a 3D accelerated video card. Your comment is nice, but it's a bit like discussing the merits of driving vs. walking in response to a review of a car. If you don't like driving, that's nice, but it has nothing to do with the relative merits of the car vs. other cars.

      This is a review of a 3D accelerated video card. It is designed to render 3D games, so reviewing it with respect to how well it does that job is really the only useful way to discuss it.

      I have no comment on your ideas about the merits of 3D gaming. I happen to enjoy 3D games a great deal. I also like chocolate and I don't like cheese. What of it?
  • A better preview, done weeks ago, by a more reputable site:

    [H]ard|OCP's first Ti4200 preview [hardocp.com]

    [H]ard|OCP's second Ti4200 preview [hardocp.com]
  • by IAmBlakeM ( 469721 )
    What's missing is a comparison between the 64MB and the 128MB version. He tests the 64MB version time and time again, but then tosses in a recommendation for the 128MB card at the end? A little explanation would be nice.

    The 64MB card, at the stock memory clock of 500MHz, outperforms the 128MB card at 444MHz in almost every single test, obviously because of the large difference in memory bandwidth available between memory and core. The HardOCP review of the same card shows the 64MB card beating the 128MB by a few FPS in almost every test. The 128MB card should still be the one sought after, but only because the memory on the 128MB card can be overclocked to exceed the 500MHz memory spec of the 64MB card. You can always overclock the 128MB card, but you can't add more memory to the 64MB one.

    Wish reviewers did a little better job of explaining why they recommend things.
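
    The bandwidth gap is easy to put numbers on (a rough sketch; the 128-bit bus is an assumption, the clocks are the ones above):

        # Both cards assumed to use a 128-bit (16-byte) memory bus.
        bus_bytes = 16
        bw_64 = 500e6 * bus_bytes / 1e9   # 64MB card at 500MHz effective: 8.0 GB/s
        bw_128 = 444e6 * bus_bytes / 1e9  # 128MB card at 444MHz: ~7.1 GB/s
        print(f"{bw_64:.1f} vs {bw_128:.1f} GB/s ({1 - bw_128 / bw_64:.0%} deficit)")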
  • I've been looking to find good video cards for high resolution flat panel monitors but want them to be driven digitally instead of with an analog signal (even one sneaking in through the analog connectors in the DVI-I connector).

    But many graphics cards only support resolutions up to 1280x1024 or 1600x1200, which makes really high-resolution displays useless with them.

    I had hoped that the recent nVidia chipsets would have some good TMDS hardware.

    Do they?
