SMP-Oriented Video Card Round-up 151

Jason Mitchell writes "I just noticed that 2CPU.com has posted a rather large video card round-up. They ran game and application benchmarks on a dual Athlon MP and Xeon workstation and also did some unique qualitative testing pertaining to s-video output quality. It's a good read."
  • Why? (Score:2, Informative)

    by floamy ( 608691 )
    Why only the older video cards, and no 9700 Pro or GeForce FX?
    • Re:Why? (Score:3, Insightful)

      We're not all rich momma's boys, you know. I'm still running a Ti4400, and I consider it to be very good :)
      • Re:Why? (Score:1, Insightful)

        by Anonymous Coward
        GeForce4 Ti4400! Luxury, luxury! And you call him a rich momma's boy?

        I've only got a $50 GeForce2 MX for crap's sake!
        • I have a Creative Labs 3D Blaster Banshee (16MB AGP). Bugger off with your fancy-ass video card.

          kthxbye ;-)
          • Re:Why? (Score:3, Funny)

            by teamhasnoi ( 554944 )
            I have a 4 meg Voodoo, and a 4 meg S3.

            Ha! My cards SUXXORS the most!

            • I have a mobo sitting on my dresser that features an intel 810e. 4mb too. It's easily the shittiest motherboard I've ever seen, even the guy who built my current box had to ask "what the hell is this shit?". It's a really fucked-up board. It has a Socket 370 and a Slot 1, and if you plug in a USB hub, it'll keep the CMOS memory alive, in case the battery dies, which happened to mine. The computer it came in had a 145watt PSU.

              Compared to my current box (Asus P4B533, P4-1.8, 256MB, Sapphire Radeon 7000 64MB, 420watt Vantec Stealth Aluminum PSU, etc), it's amazing I lived with that P.O.S. for so long.

              • My computer outputs all display info to COM1 which is attached to an automated asskicking machine. That machine proceeds to kick my ass right in the head until I refresh my whiteboard with the next frame using nothing but dry-erase markers. And the black pen is shit out of ink.
              • The computer it came in had a 145watt PSU.

                So does the new professional line of Gateways. The ones in the Oakland Raiders color scheme, with the plastic foreskin that flops down over the drive bays.

                It might even be a 120.

                Jesus wept.

                --saint
        • hey, at least that ought to outpace my voodoo3.

          but the old clunker's plenty fast enough for me - it can outrun my 233MHz pentium MMX pretty easily. all i really need it for is bzflag, anyway, and the way i play, i'd be getting my ass whupped no matter *what* video card i owned...

    • What I want to know is why they didn't include a review of my Riva 128 based card?

      I mean, I've built myself an awesome little dual-processor Xeon machine, but I really want to know how it stands up to those Voodoo2 SLI configurations I keep hearing about.

    • If you want to see the warts come out on an immature video driver, try to use it on a high end dualie. Hard to run benchmarks when the driver keeps barfing.

      Even when the benchmark isn't multithreaded, having a second CPU means one CPU is free from all of the other chores to simply run the benchmark. Example: I get a decent frame rate with Quake II running under WINE in a window on my dualie. Try that on a single CPU box.
    • I have a Linux server with some ISA slots. It is a dual Pentium 133 running Redhat 8.0. It uses a MONO MONITOR CARD, 8-bit ISA bus, and a monochrome monitor. For those that do not know what I am talking about, this video connection is a DB-9, not the VGA-style HD-15. The card is implemented in relatively discrete logic and is a full-length card (meaning it engages the support slots at the front of the case).

      From my experience, RedHat has not spent too much effort on supporting this level of technology. Upgrading was a bitch; some of the choices could not be seen on screen.

  • by blurfus ( 606535 ) on Thursday February 13, 2003 @09:33PM (#5299165) Journal
    From the site
    If you're bored and feel like sifting through 20 pages of analysis, click here
    However, if you are really bored or at work, click here [slashdot.org]
  • by teamhasnoi ( 554944 ) <teamhasnoi@yahoo. c o m> on Thursday February 13, 2003 @09:35PM (#5299175) Journal
    Also known for his tremendously anal and picky nature...

    That's a bad combo if you're going to sit down and review graphics cards...

  • by Acidic_Diarrhea ( 641390 ) on Thursday February 13, 2003 @09:37PM (#5299187) Homepage Journal
    Hmm, my good ol' Voodoo 3 isn't listed. Maybe it's time to upgrade?

    Although, as I've gotten older I've lost my interest in the computer games market and thus, my video card isn't quite so important. I just like my consoles, where I can just pop the disc in and start playing (after significant load time.) Having to worry about and, for that matter, consider if I have the right drivers is something I just don't have time for these days.

    • My STB 128 isn't there either...
    • Yah, I just "upgraded" from my Voodoo3 to a $50 GeForce2 card. Bought the V3 'cause at the time, 3dfx was the only company truly supporting 3d (Quake/Quake2, my only priorities at the time) in Linux. Upgraded not for performance but for ease of configuration and such in Linux.
    • Take, say, UT2k3, which is incredibly fun and addictive. And I would never think of playing it without a mouse; nor zooming in w/ a rifle on TV resolution.

      How about Warcraft III? And if we want an inarguably "good" strategy game: Civ? The due-out MOO3?

      I mean, I tried Max Payne and MDK2 on a PS, and I am sorry, but it's just not the same. Max Payne is downright unplayable. MDK2 is bearable but suffers a great deal. A far cry from their computer counterparts.

      But on the other hand, I agree that video cards have been less of a concern as of late. I run UT2k3 on my laptop with a Mobile Radeon 7500. Not the highest resolution (1024x768), and "normal" features - so while the framerate is not tops and the picture quality is not super-fidelity (and I have to admit that I do have some texture problems every now and then), it's playable and I deal with it. In all likelihood I will not touch desktops again, unless some serious disposable income comes my way.
    • Games may seem unnecessary but there are other things to do with a video card. Like run a 21" monitor at 1600x1200@85Hz. My Geforce 2 GTS, not an incredibly old card, can't do that. Or run dual-head. Or tune TV and capture video.

      I don't think it will be that long before a 3D accelerator will be required to run the default settings for new OS releases. I believe OS X already does. There are several non-default settings for Windows that are much better with hardware acceleration. And I'll bet you dollars to doughnuts there's a bunch of open-source developers coding up a new 3D desktop paradigm as we speak.
  • This puts my Banshee to shame.

    If I had some actual money, I might grab a Radeon 7500 VIVO ... of course, what I really want is some Matrox eTV card ... :)
  • Yawn (Score:1, Funny)

    by stratjakt ( 596332 )
    another video card roundup.

    Anything we haven't heard a bajillion times?

    and the ATI vs nVidia fanboy flamewar rages on...
  • ...a dual Athlon MP and Xeon system...

    One of each, or what?
  • Good to see (Score:4, Interesting)

    by amigaluvr ( 644269 ) on Thursday February 13, 2003 @09:48PM (#5299240) Journal
    Great to see some comparison that's more than just framerates in quake

    Quality of video output to devices other than a standard monitor is important. Television is a lesser technology than, say, a Trinitron or LCD monitor, but there is still a great difference between a good card and a bad one.

    Getting the most out of hardware is sometimes difficult when you don't fit the standard gamer user profile. I hope to see more reviews like this.

    note: slashdot user 'danamania' is a transexual. be careful talking to him
    • Re:Good to see (Score:2, Offtopic)

      by Mononoke ( 88668 )
      note: slashdot user 'danamania' is a transexual. be careful talking to him
      Why all the trolling about this? Who really cares? Is there something about his/her conversational patterns that might affect us in some way?

      Or were you embarrassed by the answer you got to the lame old 'A/S/L?' question?

    • Getting the most out of hardware is sometimes difficult when you don't fit the standard gamer user profile. I hope to see more reviews like this

      Amen to that. My home system is quite similar to the review system (dual athlon MP), and I don't play games. Thus, looking at reviews of video cards is usually (for me) pointless. They give framerates in all the latest 3d games, but don't really give usability indications. Maybe if I'd seen this review before I bought my computer I'd have bought a different video card instead of the GF4 I decided on.


    • Television is a lesser technology than say a trinitron or LCD monitor,

      I'm sorry, what was that? I couldn't hear you over my Trinitron TV exploding into a puff of logic.
    • No real point in testing LCDs since, barring a fuckup, all cards should look the same. If you are using a DVI connection (and if you have an LCD, you should), then it's all digital. So the quality is entirely dependent on the LCD.
  • by Dunark ( 621237 ) on Thursday February 13, 2003 @09:55PM (#5299268)
    Too bad they didn't mention one of the bummers about the Matrox G550: It only supports video playback to the S-Video output when you set your whole desktop to 1024 x 768 16-bit color. This is a major disappointment if you're used to running your display at 1600 x 1200 24-bit.
  • s-video (Score:4, Informative)

    by Mononoke ( 88668 ) on Thursday February 13, 2003 @09:56PM (#5299270) Homepage Journal
    The resolution of the s-video output was limited to 800x600...
    I'll bet the s-video output was limited to 400 lines of NTSC video. There is no XXXbyXXX measurement of NTSC video.

    Yeah, I know what he meant (ie: The highest resolution that could be downconverted to NTSC was 800x600.) but most people won't, and that's the whole point of a review.

    Next time you want to compare s-video outputs, use the proper tools and terms.

    • Re:s-video (Score:3, Interesting)

      by Jeff DeMaagd ( 2015 )
      Worse yet, a lot of cards by default won't let you use the overscan area on the s-video output, so if you have a small overscan margin, the picture is surrounded by a huge black border, and the image has some downscale blurring even on a 640x480 setting that I don't appreciate.

      I understand that it's to make sure the entire desktop fits a typical screen, but I would like easy access to how it is set. I actually try to see as much of the actual video signal as I can, so I've adjusted my overscan to about 1%.
  • Freak (Score:2, Funny)

    by vandel405 ( 609163 )
    Anyone else notice that they link to slashdot with the word "Freak" in the svideo round up?

    Repeal the story!
    • Goatsx links...

      Penis Birds...

      Natalie Portman dancing, covered in Hot Grits...

      Soviet Russia...

      3. ????
      4. Profit!!

      Don't get me wrong, I love Slashdot... but what about the above makes the moniker "freak" seem terribly unreasonable? (OK... the Natalie Portman thing could be chalked up to adolescent testosterone poisoning, but Goatsex? Cmon!)
  • "Not all of them are the newest and swankiest on the block (some are actually quite dated), but we wanted to at least include analysis of cards from all the major players in this industry"

    Good to see this...I hate only seeing reviews of the latest $300+ cards, since I'm not THAT rich

    "Furthermore, we will hit you with a smorgasbord of benchmarks on both a dual Athlon MP system and a dual Xeon workstation."

    And here, they lost me. How about some AVERAGE systems to go along with the average cards... I don't know how much of a difference 2x processing will make in most games, and I'm certainly not likely to even consider that route for a gaming system.

    • I don't know how much of a difference 2x processing will make in most games, and I'm certainly not likely to even consider that route for a gaming system.

      Errr, that was the whole point - to look at video cards from a non-gamer's point of view. Not all of us use our systems for gaming, you know.

      If you want gaming reviews, go to a gaming site...

    • Re:Interesting mix (Score:4, Interesting)

      by WiPEOUT ( 20036 ) on Thursday February 13, 2003 @10:15PM (#5299355)

      As the site's name implies, the review is oriented towards examining the video cards on current SMP (two-CPU) systems.

      There are hundreds of non-SMP reviews out there, but here's one that's useful for those among us who have duals. You know, to actually do things as well as play games, to be able to really multitask, and to develop for SMP (read: server) environments. Add improved stability, and you've got a case for improved productivity despite the increased cost.

      It's just a shame they didn't include the high-end cards.

    • When I'm not geeking for work on my desktop machine (dual Xeon 2.6's), I do enjoy shooters like quake and unreal tournament.

      Quake3 is unusual in that it actually does use SMP and runs flawlessly at 1600x1200x32. Other games don't use the second processor... but the nice thing about it is if you leave a dozen apps running, you won't suffer death by lag when you log onto UT2003.

      Of course, there is a down side.. I have one less thing to blame when I get my butt kicked by a couple of 12-year-olds using worn-out P3 machines with dial-up connections. Ow.
    • Comment removed based on user account deletion
      • Wait, don't you still need to unlock the XP chips? And while you're unlocking, you should be able to OC the 1600s to something a little hotter, right?

        Someone who knows, please clue me in. I've gotten as far as pricing dual athlon systems, but I'm still putting together an ideal (but inexpensive) set of specs...
  • by McQuaid ( 524757 ) on Thursday February 13, 2003 @10:13PM (#5299345)
    My experience, going from my GeForce 4 to my friend's ATI Radeon, is that the Radeon's s-video out is much better than GeForce's offerings, but neither is that great. I also have an external scan converter (iMicro avermedia) which probably beats them both, but it still has issues with filling the screen properly and vsync issues. Are the manufacturers just being cheap on s-video out, or is there some technical hurdle that makes it impossible to have a video out that can rival a DVD player?
    • A standard NTSC TV only has 525 lines or so (I think it's 625 for PAL), but that includes the VBI (where closed caption stuff is sent, among other things, IIRC). So if you're watching DVDs using some software DVD player, then what's going on is this:
      1. DVD video is usually about 480 lines high (I think)
      2. This gets converted to 600 or 768 or some other resolution for display on the monitor
      3. The TV-out hardware has to take that number and turn it back into the resolution of the TV, whatever that may exactly be
      So basically the problem is that you're blowing it up, then shrinking it back down, and you lose a little quality each time.

      Now if your card is good (or if you have a hardware DVD only card like a ReelMagic (great cards)) then things are different. When you use the TV out on such cards (or if a normal card can put the DVD out directly to the TV skipping the middle) then you only have one scaling, the same scaling that a DVD player would do. This gives you a much better picture. My ReelMagic Hollywood+ rivals most standard DVD players (up to about $150 maybe?) in my eyes.

      So basically the problem is that it's much easier (I think) to simply decode the DVD onto the framebuffer and output that (the first method) than to bypass the framebuffer and output directly to the TV (the second method). And let's face it, while it was great to watch DVDs on your PC and hook it to your TV years ago, you can now get a decent new DVD player for under $80, so it's not a feature in high demand.
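      To put rough numbers on the two paths described above, here's a trivial sketch. The figures are illustrative assumptions (a ~720x480 NTSC DVD source, a 1024x768 desktop, NTSC-ish TV out), not measurements from any particular card:

      #!/bin/bash
      # Illustrative only: the two scaling paths described in the comment above.
      # Assumed figures: DVD source ~480 lines, desktop 768 lines, TV out ~480 lines.
      src=480; desktop=768; tv=480
      echo "framebuffer path: ${src} -> ${desktop} -> ${tv} lines (two resampling passes)"
      echo "direct path:      ${src} -> ${tv} lines (one resampling pass, as in a standalone player)"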

      • NTSC DVD resolution is 720x480. However, it's not always quite that simple. Many NTSC titles are stored in a 16:9 anamorphic format. If you have a 16:9 display they are just outputted as is, but if you have a 4:3 display then they have to be stretched horizontally to produce the proper aspect ratio.

        Some video cards (such as the GeForce 4) support a 720x480 resolution.
  • Video Card Reviews (Score:3, Informative)

    by argmanah ( 616458 ) <argmanah AT yahoo DOT com> on Thursday February 13, 2003 @10:14PM (#5299348)
    My personal preference as far as hardware review sites is Tom's Hardware Guide [tomshardware.com] (formerly http://sysdoc.pair.com). He gives much more insight into testing methodology and has access to a greater variety of hardware than the article linked to in the story. He also does more testing than game framerates, like Solidedge and 3D Studio Max benchmarks.

    In addition, Tom sorts his results! The results in the story's article aren't sorted by performance, so if I want to find the card that performed the best in any specific benchmark, I have to scroll up and down the chart to see which number is highest.

    Admittedly, your mileage may vary on a system with multiple processors, but in the end, this is a video card test, isn't it?
  • ... or minesweep perhaps. Maybe for chatting in a REALLY fancy font.
  • Dual Head on Linux (Score:5, Interesting)

    by Soko ( 17987 ) on Thursday February 13, 2003 @10:25PM (#5299391) Homepage
    I recently (well, 2 months ago) upgraded my workstation to a P4, and had the pleasure of trying to set up a dual head system under RedHat 8.0. I tried the following cards, in order:

    Matrox G450 DualHead (Cost: Rescuing it from the trashbin at work):

    I loved Matrox cards under Windows, and they had a good rep with the Linux crowd, so I gave this one a whirl. I got the dual head working with the Matrox drivers without too much fuss. However, artifacts from one screen would just appear on the other screen, borking my display. For example, any time I used a pull-down menu on the second screen, the fly-down would appear on both screens. Couldn't fix that for love nor money, so I decided to part with some $.

    ATI Radeon 9000Pro (Cost: $229 CDN):

    Bleah. This card worked OK on single screen, but even there it just "felt" a little shaky for some reason. Dual head just would not work at all - X would panic each and every time. After 4 nights of mucking about with it, I gave up and exchanged it.

    Pine XFX GeForce Ti4200 128Mb (Cost: $349CDN):

    I had this card in, running X, and set up in dual head in under 2 hours. 2D is crisp and fast, and the dual head works as you'd expect. It's a keeper (especially after trying out the UT2K3 demo). Updating the kernel causes a re-compile of the drivers, but I wrote a script to do that so it's no hassle now. OK, they're closed source drivers in reality, but I don't care - my card works as I want.

    In the end, the drivers that a video card uses are just as important (see ATI) as the hardware itself. Think about that before you buy that dual head card for your workstation.

    Soko
    • by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Thursday February 13, 2003 @11:00PM (#5299558) Homepage
      I just spent the last two weeks getting the TV out on an ATI All-In-Wonder Pro 8MB AGP (old :) to work under Linux (which incidentally is pretty bad, though it was great under Windows), and it was a major pain. To get it to work I have to run a driver from the GATOS project that is from their CVS, patch it heavily so it supports things like XV, and run it on a bleeding-edge copy of XFree just to get the semi-cruddy TV out to work. It's the worst time I've had getting X to work correctly in years and years. ATI seems to think they support Linux, but they don't really do anything, as far as I can tell. They won't even tell people how their cards from 5 years ago work so those can be made to work flawlessly, even though they make no money off them at all. nVidia may not have open-source drivers, but their closed-source driver works great. I've never had problems getting nVidia cards to work under Linux. It even works quite well with the "unsupported" development (i.e. 2.5) kernels, with just a tiny patch that they actually seem to promote. A lot of companies could learn from nVidia, IMHO.
      • To get it to work I have to run a driver from the GATOS project that is from their CVS, patch it heavily so it supports things like XV, and run it on a bleeding-edge copy of XFree just to get the semi-cruddy TV out to work. It's the worst time I've had getting X to work correctly in years and years.

        But remember how video codecs were like that a couple of years ago; now they're faster, more reliable, and better quality than on Windows, in spite of the intentional obstacles in the way.

        ATI seems to think they support Linux, but they don't really do anything, as far as I can tell.

        Yes, it smacks of evil. Well, continuous gentle pressure on ATI is the best medicine, and do like you did, buy from the competition. They'll get the message. The only wrong thing to do is stay silent.
    • Updating the kernel causes a re-compile of the drivers, but I wrote a script to do that so it's no hassle now. OK, they're closed source drivers in reality, but I don't care - my card works as I want.

      Please clarify (how does one re-compile a closed source driver?). Not a big deal, I'm just curious what's needed to get this setup running.
      • As I understand it, NVidia's drivers basically use the assembler in gcc to build the "binary-only core", and source to link that core into the current kernel. IOW, their package includes a binary image of their driver that they insert into kernel space with a source code stub.

        Full installation instructions are on the NVidia website.

        Oh - the script. I use the source tarball from the site with the script below. The RPMs aren't updated quickly enough. 2 caveats:

        This must be run as root, and you need the kernel headers for it to work.

        #!/bin/bash
        # Rebuild and install the NVIDIA kernel module against the currently running kernel
        cd /untar/NVidia/NVIDIA_kernel-xx.x-xxxx
        make install
        # Reinstall the matching GLX libraries
        cd /untar/NVidia/NVIDIA_GLX-xx.x-xxxx
        make install

        If you've done this once and modified /etc/X11/XF86Config already, you're good to go. If not, Read The Fine Manual on setting that up, available on the NVidia site.

        It could be prettier (use sudo, check for errors, etc.), but I'm a bash noob really - works for me. Use at your own risk!

        Soko
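
        For what it's worth, a minimal sketch of the "prettier" version hinted at above - same idea, with a couple of sanity checks bolted on. The /untar/NVidia paths and xx.x-xxxx version placeholders are copied from the script above and would need to match whatever tarballs you actually unpacked; treat it as a starting point, not a tested installer.

        #!/bin/bash
        # Sketch: rebuild the NVIDIA kernel module and GLX libs after a kernel update.
        # Assumes the same unpacked-tarball layout as the script above.
        set -e                                   # stop on the first error instead of ploughing on
        [ "$(id -u)" -eq 0 ] || { echo "run me as root" >&2; exit 1; }
        for dir in /untar/NVidia/NVIDIA_kernel-xx.x-xxxx /untar/NVidia/NVIDIA_GLX-xx.x-xxxx; do
            [ -d "$dir" ] || { echo "missing $dir" >&2; exit 1; }
            make -C "$dir" install               # -C: run make inside that directory
        done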
    • Thanks for the report. It is interesting to learn what doesn't work so I can avoid it. For those of us who want to run with only Free Software loaded on our systems, any suggestions on which card(s) to buy? I'm chiefly interested in running a dual-head system at 1600x1200@85Hz with a pair of Mitsubishi 2040U monitors (if the monitor make and model matters).

    • Matrox G450 DualHead (Cost: Rescuing it from the trashbin at work):

      I loved Matrox cards under Windows, and they had a good rep with the Linux crowd, so I gave this one a whirl. I got the dual head working with the Matrox drivers without too much fuss. However, artifacts from one screen would just appear on the other screen, borking my display.


      I think this may well have been a problem with your window manager - I run Windowmaker under X with a dualhead setup on this card, and I've never had the problems you describe. The only issue is that Windowmaker's Xinerama support is occasionally a little flaky (ie dialog boxes in the middle, spanning both monitors).

      --saint
  • MX? (Score:3, Informative)

    by dolo666 ( 195584 ) on Thursday February 13, 2003 @10:25PM (#5299393) Journal
    I'm fairly happy with my Geforce 4 MX, which is a big step up from my old Geforce 1 ddr. I've been utilizing the video capture feature of it and you can download a movie of me outrunning cops in Vice City, HERE [fileplanet.com].

    I'm just curious... what is so bad about MX that it only cost me $112 Canadian dollars to get the card? I find that it gives me pretty good fps in Quake 3.

    But Doom 3 will be another story, methinks. :)

    • Re:MX? (Score:3, Interesting)

      by Edgewize ( 262271 )
      The GeForce 4 MX isn't really a GeForce 4 at all. In fact, it isn't even a GeForce 3. It has a core very similar to the GeForce 2 series, with only fixed-function pipelines. In other words, it is a DirectX 7-class part and cannot run programs that rely on vertex programs or pixel shaders. It can't do any of the really cool things that programmable-pipeline cards can, like per-pixel environment reflection mapping, motion blur, or special lighting effects (saturation/desaturation, color warping, etc.).

      It's reasonably fast at what it does, so it will run Doom 3 at a decent framerate. But it doesn't support the expected features for a card of its generation, so it will be running at low detail with no special effects.
      • Re:MX? (Score:5, Informative)

        by htmlboy ( 31265 ) on Thursday February 13, 2003 @11:25PM (#5299677)
        It's reasonably fast at what it does, so it will run Doom 3 at a decent framerate.

        in a .plan update from a few months ago, john carmack said this:

        Do not buy a GeForce4-MX for Doom.


        Nvidia has really made a mess of the naming conventions here. I always thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had significant architectural improvements over GF2. I expected GF4 to be the speed bumped GF3, but calling the NV17 GF4-MX really sucks.


        so a gf4mx will run doom, but it won't be pretty.
    • That's why, if you are looking for hardcore gaming performance, the best bang for the buck right now is one of the GeForce3-class cards. You can pick up a Ti200 for about 70 bucks at pricewatch.com. You can then overclock it using the Detonator drivers on Windows, or nvclock on Linux. If you can find the original GeForce3, it's slightly faster than the Ti200. If you luck out and find a Ti500 like I did (65 bucks on eBay), get one of those. They are only about 4 percent slower than the GeForce4 Ti4200, and a lot cheaper if you can find one.
  • While the author qualifies his conclusion with "Unfortunately, due to the rather long, and, I'm afraid, unavoidable delays in publishing this article...The video card market has simply advanced too far", he then goes on to say "On the other end of the spectrum, there's the erratic performance of the ATi cards. Whether they were limping along behind even the G550 in Solidworks, or taking 75% performance hits with dualhead enabled in many of the SPECviewperf tests, we were left shaking our heads. There's no denying that ATi's drivers have come a long way in the past few years, but it seems as if they have yet to reach nVidia's level."

    Well, newsflash: they have skyrocketed past them with the R9500. Anyone who has been following the release of the GeForce FX knows that the seven-month-old ATI card holds its own against the NV30, which Nvidia has decided to stop making before it even hits the shelves, as the performance gap is so stunning.

    • But he said their drivers need work, and they do, for the reasons he states. The hardware is pretty damn impressive. It's a shame the drivers aren't. And how can the performance gap be stunning if the ATI card holds its own? From what I've read they perform about the same when you average out what each card does better. I'm willing to bet cash you own one of these :)
  • by Anonymous Coward
    Here's what to do:
    • Steal some pictures from Penny Arcade
    • Quote some specs from the web page for a video card
    • Take a picture of your tv set
    • Use lots of 'hip' things like bloodshot eyeballs in graphs.
    • Don't worry about including useful information, it'll get posted to slashdot.
  • zerg (Score:3, Insightful)

    by Lord Omlette ( 124579 ) on Thursday February 13, 2003 @10:50PM (#5299504) Homepage
    You know how I know these guys are keeping it real? Look at the author's video card: Matrox G550! (my card, w00t!) Reason he won't get a new card? He has no cash! No one's paying him to do reviews! He has no conflict of interest!

    And that's why I might be inclined to take this seriously, if I could actually afford hardware made after 1999 :(
    • Re:zerg (Score:2, Interesting)

      by Jim_2CPU ( 650362 )
      Heh. I could afford something faster. I just don't feel the need because I game about once a year at this point in my life. If I feel like gaming, I use the Mobile Radeon in my laptop. (I wish I was joking, but I'm not)
      • Using a dual celeron 466, a Matrox G550, and an Audigy Gamer, I can run Age of Mythology and Warcraft III. So in my silly mind, I'm ok for gaming. It's not Quake, but it's still gaming.
  • That sucked! (Score:2, Insightful)

    by Lurgen ( 563428 )
    It's not a "Good read", it's a bloody lousy read!

    Nothing new in there, the hardware was either old or uncommon, and I didn't see a single detail that was unique to them.

    What the f&#k were they thinking, including an antique Matrox in the list? And that Radeon 7500...? OK, they were nice a year ago, but who cares! I mean really, if you are going to invest in a dual-CPU machine you obviously have a clue about performance. Why the hell would you read a review of crappy old cards?

    They skim over dual-head results, which was the thing I was really interested in, since despite having a dual-monitor setup at home I have yet to find a game that makes use of it in a nice way (except FlightSim 2003, which really benefits from it).

    Come on editors, wake up and post something relevant! (or at least have the decency to read the review before putting it on the front page, duh!)
    • Re:That sucked! (Score:4, Informative)

      by Arandir ( 19206 ) on Friday February 14, 2003 @02:15AM (#5300157) Homepage Journal
      What the f&#k were they thinking, including an antique Matrox in the list? And that Radeon 7500...? OK, they were nice a year ago, but who cares!

      Okay, time to burn some of my limitless karma...

      Since when is one year ago ancient? Just because something is older than the last time you changed your underwear doesn't make it ancient. I'm getting pretty sick and tired of you munchkins running the video market. It makes it tough for the rest of us who want a solid stable video card instead of whatever the prepube crowd wants this week.

      These aren't "crappy old cards", they're superb modern cards that have been around long enough to prove their merit. Maybe you should go read some reviews about last months cards. Or are those still too archaic for you? Maybe you need to wait till next week to read about this week's cards.

      Some of us have better things to do than to buy a new video card every time the industry says "jump". Some of us have dropped out of the constant upgrade rat race that you kids insist on playing.
  • From the article (Score:4, Insightful)

    by sawilson ( 317999 ) on Thursday February 13, 2003 @10:58PM (#5299541) Homepage
    "I would imagine a sizeable portion of the readership of 2CPU.com simply don't have the time or the desire to constantly engross themselves in games."

    Up until recently, I would only run dual-proc systems. Part of it was geek pride and bragging rights. I eventually got absolutely sick of dealing with the hidden hassles involved with SMP. I took my dual 1 GHz Pentium III system apart, along with my RAID and enormous server case, and sold the whole thing as parts on eBay. With the money I made, I put together a freaking screamer of a system based on an overclocked Tbred 1700+. Know what I miss? Being able to run xmms while playing Quake 3 or Unreal. I can't do that now. Sure, my fps is 4 times faster at higher resolutions, and I can play UT2003 and it's really pretty. But the fastest video card in the world isn't going to make me able to play Quake 3 or any other CPU-intensive game if I have xmms running, or even Kazaa Lite open using WINE. I really liked killing a few hours waiting for music to finish downloading (I'm on dialup) by playing games. The next motherboard I get will be dual proc.
    • Hmmm...perhaps you should renice xmms to, say, -20 (if you trust it that much), and Kazaa Lite to 19.

      XMMS takes next to no processing power on an XP 1700+. I can run XMMS and Q3 at the same time and still get FPS in the 100s. Q3 isn't that CPU intensive. Nor is Unreal these days. UT2003 I could understand, but that's because it canes my system without XMMS running (XP1600+, GF3).
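
      A quick sketch of the renice suggestion above, assuming xmms is the process to favour and that Kazaa Lite shows up under a WINE process name - the process name below is a guess, so check with ps or pidof on your own box first. Negative nice values need root:

      #!/bin/bash
      # Sketch: favour the audio player, deprioritize the download client.
      renice -n -20 -p $(pidof xmms)        # boost xmms (negative nice requires root)
      renice -n 19 -p $(pidof kazaa.exe)    # hypothetical WINE process name - verify with `ps ax`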
  • by -tji ( 139690 ) on Thursday February 13, 2003 @11:31PM (#5299708) Journal
    They really missed the boat on this one. They need some real SMP video action... Our friends at 3Dfx pioneered this with the Voodoo2, operating in SLI (Scan-Line Interleave) mode. Two PCI cards, connected via a jumper cable, each handling half the scan lines for the display. SMP at its best!

    http://www.hwupgrade.com/skvideo/voodoo2_sli.html
  • by AntiBasic ( 83586 ) on Thursday February 13, 2003 @11:49PM (#5299769)
    Two cpus are better than one! The box is in the middle of the slashdot-effect and is handling like a champ.
    • Here [2cpu.com] is a link to the (very short) discussion thread on 2cpu.com regarding being slashdotted. To quote one post, "Which just goes to prove: the Slashdot effect is negated by competent administration (and SMP of course :-))". Here [2cpu.com] is a description of their web-server.
  • What about FireGL? (Score:2, Insightful)

    by qa'lth ( 216840 )
    Makes you wonder why they weren't testing ATI's FireGL series cards - the 8800, and the newer line based on the R300 chips. Doing all those GeForce-based cards doesn't really give any valuable insight into what card might possibly be best - just which geforce.
  • Just found out about it today via my gf: the Matrox Parhelia [matrox.com], which is being showcased at the Softimage XSI Roadshow [softimage.com] and has a TRIPLE-HEAD OUTPUT...among other things

    If only they could afford the $600 to benchmark it against all the others there, eh?

    Later
    Josh
  • I have a dual-P3 setup here for my main rig. Though it's starting to show its age -- a mere 1GHz with PC100 RAM -- I still use it for gaming. Unfortunately, while SMP seems to be a consideration among graphics card vendors (especially since Carmack made mention of threading in Q3A), sound card vendors don't appear to be quite as clueful.

    I bought a Hercules Game Theater XP. I expected -- reasonably, I thought -- that such a high-end accessory would work solidly on an SMP system. Nope. Despite two major driver revisions since I bought the card, it is still horribly unreliable when all the HW acceleration features are turned on.

    If I launch HalfLife with EAX/Sensaura enabled, the game will eventually crash. Leading up to the crash, the echo effects are completely botched. The echo sounds can be heard before the main sound. Sound effects are abbreviated -- the sound will stop before the sample has played out completely (especially true of footsteps). This suggests that buffers are being retired too early, which further suggests that the driver writer isn't locking access to the buffer queues correctly.

    If I use the Audio Properties panel to back off HW acceleration one notch, then the card behaves reliably. Of course, I lose 90% of the cool sound effects...

    It's vaguely possible that my motherboard may be twitchy (Asus P2B-D with ACPI fixes), but since it's never given trouble in Linux or BeOS, I'm not inclined to think so. So far, Hercules hasn't been very responsive on this issue. (Of course, I haven't pressed them very hard on this, either.)

    So, yeah, having a sound card roundup for SMP systems would be a nice thing.

    Schwab

  • Why is Gabe from Penny-Arcade [penny-arcade.com] appearing as 'Jim' in the review? Compare:

    'Jim' [2cpu.com]

    Gabe [penny-arcade.com]

    Very odd.
  • I don't need no stinking high-spec display card in bash. This is not flame-bait, but:
    I use a computer for what it is intended to do. I use my PS2, on the other hand, for what it is intended to do, and besides, if I want to play games, I want them to run in a stable environment like the PS2.

  • There's a question I asked myself at Christmas but didn't find the answer to, and given that we are talking about S-Video, it seems not too inappropriate to discuss it here.

    Friends of mine recently bought a laptop with a DVD drive and an S-Video output. Given that they don't have a home DVD player, I tried to hook the laptop up to the TV (which has an S-Video input) and got as far as getting the desktop displayed on the TV.

    However, the video displayed black. Since then I have come across information indicating that this is because of overlay (the video is overlaid directly on the output to the VGA port and not on the S-Video), but I am not sure how to turn it off. The information I have says that I have to turn off hardware acceleration, and I understand the setting to be:

    Display Properties->Settings->Advanced->Troubleshoot->Hardware Acceleration->None

    Unfortunately, being on holiday, I haven't been able to test it yet, and, given the topic, I thought I would ask the Windows-using /.ers for their advice.

    Note: I am also interested in how to display on the S-video port with Linux.

    Anyway, even if I cannot help them here at least I managed to switch them from an unregistered Office with 50 tries left to Open Office for Windows and so far they seem happy with it.
    • I have a Dell laptop with S-Video out capabilities. I can get it to display on the LCD, and the TV using the S-Video cable. However, if both displays are left on, DVD video does not display on the TV. The trick for me is to disable the laptop display after booting up and before playing any video. I do this through my ATI driver utility.
  • by adolf ( 21054 ) <flodadolf@gmail.com> on Friday February 14, 2003 @08:56AM (#5301291) Journal
    What troubles me about video card reviews in recent years is that they harp on at length about the ins-and-outs of antialiasing, and framerates, and memory bus bandwidth, but apparently nobody bothers to look at the picture on the fucking monitor.

    It used to be different. In the early-to-mid 90s, PC rags far and wide would rate video cards primarily on how good they looked. This is mostly dependent on the analog signal path of a specific card, and not tied to a given chipset - things would (and still do) vary widely between different implementations of the same chip. I'm talking about horizontal sharpness (limited bandwidth), image distortion (bad topology), contrast compression (shitty amps) and ghosting (poor termination), to name a few.

    The physics haven't changed since then, and indeed have become more difficult. Resolutions and refresh rates keep pushing upward, and this makes the analog stage proportionately trickier to design properly. Designing an analog circuit for signals ranging anywhere from DC to 400MHz (a pretty common RAMDAC spec, lately) is quite non-trivial.
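
    For a rough sense of the numbers involved, here's a back-of-the-envelope pixel clock estimate for the kind of mode mentioned earlier (1600x1200@85Hz), assuming roughly 40% blanking overhead. The overhead figure is an assumption, not a measurement, though it lands close to the standard VESA timing for that mode (about 229.5 MHz):

    #!/bin/bash
    # Rough pixel clock estimate: active pixels * refresh rate * ~1.40 blanking overhead.
    # Divisions are interleaved to keep the arithmetic within 32-bit range.
    w=1600; h=1200; hz=85
    echo "approx pixel clock: $(( w * h / 1000 * hz * 140 / 100 / 1000 )) MHz"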

    Despite this growing problem, even Tom's Hardware doesn't bother to tell you (subjectively, or otherwise) just how good, or bad the picture is on a given card/monitor combination. The closest they come is a note at the end of a Ti4600 review which states that all of the tested cards looked a bit fuzzy on their Eizo monitor, relative to whatever it is that they normally use with it (which they unhelpfully do not identify).

    This German page [tecchannel.de] has some very nice multichannel 'scope plots generated by the RGB output of a plethora of different cards, but offers no subjective interpretation of what they look like on-screen, as far as my English-trained eyes can see.

    Even the most hardcore of gamers probably spend most of their time in front of the PC reading text and looking at porn. Are there any reviewers left in the world who actually make a point of evaluating image quality?

    Here's my stab at it:

    I've got a Voodoo3 3500TV. Works great in X, all features except vidcap working perfectly. Image quality at 1600x1200x75Hz is remarkably good, free of ghosting and pretty sharp on a 4-year-old 19" CTX VL950, though it could be slightly sharper. In terms of speed, it's about as fast with X as it is with XP, and handles all but the latest shoot-em-ups quite playably. The included 5/8"-thick, 6' snake makes for handy connections to the card's well-stocked array of inputs and outputs.

    Its 3.3-volt AGP interface presents an insurmountable hurdle for modern use, however, when one is looking to buy an nForce2-based motherboard (none of which have 3.3V AGP sockets).

    Thus, it needs replaced.

    If anyone has any anecdotes on the fidelity of a current video card, please submit them below. Specifically, I'm looking at ATI-branded Radeon 9000 Pro or Radeon 8500, or who-knows-what-brand GF4 Ti4200. Preferably, the reviews will be more from the perspective of a graphic artist, instead of a gamer, and be based on what things look like at high resolution and refresh rates.

    But at this point, I'll gladly listen to anyone's opinion about visual quality, even if it involves a Happy Mountain Computing Xabre400, plugged into a 15-year-old, fixed-frequency Sun display, and is written by a twitching 9-year-old crackhead who once lost eight teeth to an unfortunate hockey incident.

    Anyone have some light to shed on the subject?

    [I'll leave my tirade about the absolute dearth of modern CRT monitor reviews for another day.]

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...