Vista vs. XP Game Stability and Performance (114 comments)

boyko.at.netqos writes "HardOCP does a side-by side comparison with a battery of games to check stability and framerates in Windows XP and Windows Vista. In addition to the lowered framerates in Vista, they had stability issues in Need for Speed: Carbon and Prey. From the article: 'For some titles, especially Company of Heroes and Need for Speed, we saw dramatic framerate discrepancies. What's more, both of these titles have recently released patches! Other titles showed a slight, but essentially negligible difference, such as BF2142, World of Warcraft, and Prey. Really, there was only one instance where Vista was able to pick up a few more frames than XP — World of Warcraft at greater than 90fps, where the human eye can't even see the difference. To see this overall trend against Vista is very interesting and makes us wonder as to the cause.'"
This discussion has been archived. No new comments can be posted.

  • An old adage: (Score:5, Insightful)

    by Knight Thrasher ( 766792 ) on Tuesday May 08, 2007 @07:34AM (#19034587) Journal
    'Newer' doesn't necessarily mean 'better.'
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      I'd like to know how much worse Vista is for older games designed for Windows 95/98 (almost everything past that I use DosBox for). XP breaks compatibility on several old ones, I'm wondering how much different Vista is. You'd think if they were smart, they'd drop as much legacy support as they could from the OS itself, but enable some emulation options for older software.
      • There are quite a few games made for win98/95 that I sometimes like to pull out and play. And there are quite a few of them that fail to run well under win2k/XP.

        X-Com Interceptor is one of them. There are lots of others too.

        --Coder
    • by C_Kode ( 102755 )
      That usually depends if you are looking at a x.0 or a x.1 release. :)
    • by Anonymous Coward
      Games designed for XP work best on XP? I'm amazed!
    • Re:An old adage: (Score:4, Interesting)

      by Opportunist ( 166417 ) on Tuesday May 08, 2007 @10:13AM (#19037245)
      Actually, quite a while ago 'newer' started to mean 'worse'. Remember those old floppy drives? And how they lasted for an eternity and longer?

      Have you ever bought one in the last, say, 5 years? And if so, do they still work? Mine don't. But the one that came with my 486 is still doing its job.

      Same applies to CD-ROMs and a lot of other hardware. If I've learned anything from my purchases during the last few years, it's that newer actually means worse. Not better.
      • by blueZ3 ( 744446 )
        Back in the day, "hi tech" devices like floppy drives and CD-ROMs were manufactured by companies that had a reputation to keep and some notion of quality control. With the massive move of manufacturing to China and the victory of the generic device (and price over all other considerations), that's no longer the case.

        If you buy a CD drive from Frys made by NewCoTech and it fails, it's pretty unlikely you're going to remember NewCoTech when you're out buying a replacement. Even if you do, chances are that by
        • by hany ( 3601 )

          For something like a plastic toy that my daughter is going to play with for a year, it's fine.

          There are a lot of toys from my childhood which are still usable and can withstand another round of playing by my daughter now, in essentially the same condition - and good condition (with the exception of toys meant for older children than she currently is - that is understandable).

          But the stuff we buy now usually does not survive even the first round of playing without major damage.

          And I do not think my daughter is playi

      • by TheLink ( 130905 )
        Yeah my old and slow Lite-on CD-RW is still working whereas my newer LG CD-RW died, then the Benq DVD writer I replaced with it stopped working (after not much longer than a year), now I'm with a LG DVD writer. Maybe I should stick to Lite-On ;).

        Or maybe the early batches are sometimes overspeced - because they don't know which corners they can cut yet. Then once they figure it out, the later models die not too long after warranty ;).
        • Or maybe the early batches are sometimes overspeced - because they don't know which corners they can cut yet. Then once they figure it out, the later models die not too long after warranty ;).

          Meanwhile, Lite-on seems to have figured it out. A friend of mine had a few Lite-on CD drives die in the last few years. Now, he avoids anything from that company ;-)
          • by TheLink ( 130905 )
            Looks like all of them "figured it out" then :(.

            My first Lite-On CD writer cost 4 x more than my 2nd CD writer and 1st DVD writer, and about 5+ x more than my 2nd DVD writer :). I believe it had the 1st gen Sanyo "Burnproof" mechanism.

            I can't afford a high end Plextor (if my current drive dies, I'll just buy another one), and I heard nowadays the low end Plextors have the same insides as "the others".
            • I have a somewhat older Plextor (CD-Rom Plexwriter from 2003) and it is a very good drive. Later, however, they started to relabel drives they bought elsewhere and sell them for twice the price.
              Today, I like to buy Samsung:
              Cheap enough and so far quite reliable for me and my friends. But take it with a grain of salt, because we don't have the number of computers to give you meaningful statistics.
      • by LKM ( 227954 )
        I bought a Yamaha CD burner about 15 years ago. Still works. The modern CD/DVD burners I've seen (both external and internal in computers I own) generally stop burning reliably after about a year. Which pisses me off to no end.
  • Useless comparison (Score:4, Informative)

    by GreenEnvy22 ( 1046790 ) on Tuesday May 08, 2007 @07:39AM (#19034649)
    As discussed in the actual article, this review is useless. All it shows is that Nvidia systems perform much slower on Vista than XP. They then go on to conclude that Vista must be slower than XP. It's quite well known that Nvidia's drivers for Vista have been absolute trash, while ATI has been on the ball. While Vista will be slower for most games even with ATI hardware, the difference is far, far smaller.
    • by psikys ( 1098367 )
      You are wrong, good friend. nVidia drivers for their 8xxx series cards have had problems; however, the previous generation of cards has performed just fine with little or no problems in Vista - which is why they compared the 7600 to the 8800. Now, I agree that they should have tossed an ATi card in there for good measure, but regardless - the article is not, nor is it presented in the article as, useless. It very much does show that Vista does in fact run slower. Stop being a fanboy and look at the facts.
      • ATI fanboy? I'm no ATI fanboy, not by any stretch of the imagination. I have a mix of ATI and Nvidia hardware in my PCs. It's just a simple fact that Nvidia screwed up their drivers for the Vista launch. The new 158 series are better, but still missing functionality. I reread my post; I meant to say "as discussed in the article's feedback section" - this is where the posters discussed the lack of ATI hardware.
      • by bynary ( 827120 )
        Define previous generation. I have an NVidia FX 5700 card and am running Vista. Yes, I have 2GB Micron PC4200 RAM, Athlon 64 3000+, blah blah blah. It's not bleeding edge, but it's hardly a slow machine so I think it's safe to rule out the rest of my components. I just updated to the latest drivers. All my games perform horribly under Vista; they ran fine under XP. Guild Wars runs at ~25 fps. Rogue Spear won't even run. I stopped trying to run anything that relies heavily on 3D acceleration under Vi
    • by sbate ( 916441 ) on Tuesday May 08, 2007 @07:50AM (#19034833)
      It is not useless if you have a 2GHz P4 Nvidia system and you wonder what would happen if you "upgraded". The article was interesting, had a wide range of games, and had Christmas tree graphs that showed how lame Vista is in game performance and how little you get for your money by "upgrading". It is fair to say that on newer computers you get about the same performance as your old computer, so you will not be losing anything. I cannot for the life of me think of anything worth upgrading to Vista for on a home PC yet. It is not any safer, is more annoying, and has the same print driver and scanner support as Ubuntu. When I upgrade my work computer it will be to Ubuntu, and my gaming rig to a DS and PSP duct taped together back to back.
      • True, it's not useless if all you are interested in is how Nvidia hardware works in Vista. I guess it would just be better for them to advertise it as an article about Nvidia hardware on Vista/XP.
      • by itchy92 ( 533370 )

        When I upgrade my work computer it will be to Ubuntu and my gaming rig to a DS and PSP duct taped together back to back.

        For whatever reason, that last part made me think of Sidetalkin' [sidetalkin.com]... specifically, this image [imageshack.us] (harmless link, just re-hosted out of respect to the Sidetalkin' site).

    • by aafiske ( 243836 )
      Except that ATI as far as I know doesn't have a dx10 card out, which is the main allure of upgrading to vista, as far as I'm concerned.
      • Once DX10 games come out, yes - if ATI doesn't have their card out by then they will be at a disadvantage. But until that time, DX10 cards are not needed.
  • reason (Score:3, Insightful)

    by spykemail ( 983593 ) on Tuesday May 08, 2007 @07:47AM (#19034763) Homepage
    I just don't see a compelling reason to upgrade to Vista. I already have Mac OS X and Windows XP, why should I buy a new version of Windows when I can already play games on XP and work on OS X? I realize that at some point I'm going to need to upgrade because Windows-only developers will leave XP behind, but still. That won't be for a while.

    Most of the games I play are classics at this point anyway, unless Blizzard's new game requires Vista I think I'll be ok :).
    • DirectX 10?

      All DirectX versions have been adopted by game devs very shortly after they were released.

      That's why... Then again, I'm not a gamer and thus do not care.

      • by miro f ( 944325 )
        I think this will be the exception. I very much doubt DirectX 10 will be adopted as quickly as previous versions, as it would alienate a very large portion of the potential userbase.

        Of course, it probably will eventually get used, otherwise MS will probably just release it for XP.
        • Hmmmm, well, the problem is that the people that know about DX10 are the "hardcore gamers". Those are the kind of people that do not mind spending 3000€ on a new machine just because the game they want to play doesn't run at least at 60fps during action sequences. Sure, there are savvy gamers, but the bulk of them are clueless lusers that only want to play games and couldn't install an operating system if their life depended on it. To them it is "DX10 is better than DX9, so I need DX10!".

          Most "gamers"

        • by LocoMan ( 744414 )
          Not if they do like previous DirectX generations, where the games ran in both new and older version of DirectX, but looked prettier or had more effects in the newer one.
        • by Flodis ( 998453 )

          I very much doubt DirectX 10 will be adopted as quickly as previous versions, as it would alienate a very large portion of the potential userbase.

          Of course, it probably will eventually get used, otherwise MS will probably just release it for XP.

          Since games are one of the very few things that may lure people over to Vista, I seriously doubt MS has any interest whatsoever in porting DX10 to XP. Unless - of course - their port runs horribly slow, just hinting at the marvelous graphical effects possible in

      • by Tarlus ( 1000874 )
        While more and more devs may begin to take advantage of DX10, most of them aren't doing it exclusively. Which means, for a while, new games will run on DX9, but can take advantage of DX10's enhancements if available.

        Making games DX10 exclusive for right now would knock a major dent in their potential sales.
  • by Cathoderoytube ( 1088737 ) on Tuesday May 08, 2007 @07:47AM (#19034777)
    I think the people doing this test should really have a look at the Vista manual. If they did, they'd know the OS was designed to block suspicious-looking frames from the games we play to keep your computer secure. This generally means a slight performance hit. But hey, if you're willing to shell out hundreds of dollars on upgrades for an OS, what's a few hundred more to buy an even newer video card that'll allow you to play games with performance comparable to a machine running XP with less powerful hardware?
  • by Mprx ( 82435 ) on Tuesday May 08, 2007 @07:53AM (#19034883)
    The human eye is an analogue device, and does not see in frames. Because computer games generally do not feature realistic motion blur, we can see a benefit from increased frame rates well above the 72fps which would be sufficient with perfect motion blur. Accurate motion blur can be thought of as "temporal antialiasing", analogous to the spatial antialiasing supported by modern graphics cards.
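    The "temporal antialiasing" idea is easy to sketch (a toy illustration only - the one-pixel render() scene below is made up, and no real engine works this literally): instead of displaying one instantaneous sample per frame, render several sub-frames across the frame's exposure interval and average them.

```python
# Toy sketch of motion blur as "temporal antialiasing": average several
# sub-frames across the frame's exposure interval instead of showing a
# single instantaneous sample. The render() scene is hypothetical.

def render(t):
    """Stand-in renderer: brightness of one pixel at time t.
    A fast-moving object covers the pixel only during [0.2, 0.4)."""
    return 1.0 if 0.2 <= t < 0.4 else 0.0

def blurred_frame(frame_start, frame_len, subsamples=10):
    """Average sub-frames taken at evenly spaced times within the frame."""
    total = 0.0
    for i in range(subsamples):
        t = frame_start + frame_len * (i + 0.5) / subsamples
        total += render(t)
    return total / subsamples

# A single instantaneous sample at t=0 misses the object entirely,
# while the temporally antialiased frame records a partial "smear"
# proportional to how long the object covered the pixel.
print(render(0.0))              # 0.0
print(blurred_frame(0.0, 1.0))  # 0.2
```

    Spatial antialiasing averages samples across a pixel's area; this does the same across its slice of time, which is why it reads as "temporal antialiasing".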
    • The human eye is an analogue device, and does not see in frames.

      Now that we have done away with CRTs in both the camera and monitor, do we need to have frames at all? For video we should be able to transmit pixel changes directly from camera to monitor. For games, update the monitor whenever you like.

      • by linzeal ( 197905 )
        I'm sorry, but CRTs are still around, even if the more popular Trinitron tubes are no longer being manufactured. My girlfriend and I get migraines, and we prefer a cheap used 17" monitor [amazon.com] for long sessions at the computer over our 20" monitors [amazon.com]. For one, while the CRT has hummed along without complaint through moves around the house and the country, both of the LCD monitors have had to be returned under warranty for problems with holding a specific resolution or just turning on. When LCDs are as rock solid as my
        • While Trinny's are no longer being made, these other companies might still be producing their versions:
                  * Diamondtron (NEC/Mitsubishi)
                  * SonicTron (ViewSonic)
                  * Technitron (MAG Innovision)
      • Now that we have done away with CRTs in both the camera and monitor, do we need to have frames at all?

        Yes. For a few reasons:

        1. If you don't transmit an entire frame of information at once, you're likely to get odd rendering artifacts such as tearing.

        2. If you want to fool the eye, consistency is the key. Having a fixed framerate (whatever that may be) will always give the smoothest results. Games today do tend to render their backbuffers much faster than the monitor can update, but that's overcome because

    • by MotherMGA ( 777300 ) on Tuesday May 08, 2007 @08:07AM (#19035165)
      What the article does not state is that the reason for such slowness is that behind the game, Vista is popping up the message:

      "You are attempting to refresh the screen. Cancel or Allow?"
    • The human eye does not at all experience motion blur. Motion blur is purely an artifact of recording devices. Why anyone would want to make computer games look like you watch a recording is beyond me.

      • by SighKoPath ( 956085 ) on Tuesday May 08, 2007 @08:39AM (#19035651)

        Motion blur is purely an artifact of recording devices.
        That is why recording devices can get away with recording at under 30 frames per second. For example, movie projectors display at a mere 24 frames per second, with no perceived problems! Good luck playing any 3D game at that frame rate without noticing. However, if the game had motion blur, it would look just fine at 24-30 FPS.

        The big question is, is this even practical? To me, it seems that running at the higher frame rates is easier than correctly rendering motion blur.
        • by Mprx ( 82435 )
          Obviously spoken as somebody who's never watched a movie at a higher framerate. Go watch some 60fps Showscan or something, and the deficiencies of 24fps movies become very obvious.
          • Play any of the Metal Gear Solid games and you'll see excellent use of motion blur in video games.
            • by Mprx ( 82435 )
              It's not realistic motion blur, it's just fading out frames slowly instead of immediately replacing them, and it doesn't provide any additional motion information. Compare with true motion blur as demonstrated in the site Floritard linked in post 19035607 [slashdot.org].
        • The big question is, is this even practical? To me, it seems that running at the higher frame rates is easier than correctly rendering motion blur.
          It's interesting. I wonder if this is the same kind of trade-off as single-core vs. multicore. Will GPU engineers eventually implement motion blur, given the possible performance gain and lower ROI in other areas? Or does properly computing motion blur require more rendering than the extra frames it saves?
        • For example, movie projectors display at a mere 24 frames per second, with no perceived problems!

          You perceive no problem because most film makers take that into account when filming. Which is why you very rarely see horizontal pans with stuff like people in them. Some movies do them (Matrix 2, I think, has some awful horizontal pans with Smiths in them), and the issues become very obvious.

      • by Jaysyn ( 203771 )
        Oh. Well I guess I *did* do too much acid in high school.
      • by Mprx ( 82435 )
        The human eye does experience motion blur. Simple test: hold your hand out away from your computer screen (otherwise you'll get strobing), and wave it about as fast as possible. We don't experience as much motion blur as we see in 24fps movies, but that is because 24fps is the *minimum* needed to produce smooth motion. See the research of Douglas Trumbull (Showscan) for details.
    • Re: (Score:3, Informative)

      by Floritard ( 1058660 )
      Hugo Elias has an excellent demo of this effect on his site. Check it out and tell me this spinning cube [virgin.net] doesn't look more real with the motion blur. It's a little eerie. I've seen this effect in some footage for that new game Little Big World, among others. It's a framebuffer effect, I believe. I wonder if its inclusion in more games will have any effect on traditional framerate requirements for believable motion. Might get by with less, as you say. Then again, to do it correctly I believe you have to render e
    • Because computer games generally do not feature realistic motion blur, we can see a benefit from increased frame rates well above the 72fps which would be sufficient with perfect motion blur.

      Human eyes may not think of motion in terms of discrete frames, but computer display devices do.

      If your display has a 75Hz refresh rate, it doesn't matter if the game engine is generating 75 frames per second or 175 frames per second; the same number are going to reach your eye.

      (However, higher frame rates can be used t
    • Don't forget the monitor's refresh rate will limit the frame rate even with vertical sync options off (so might as well keep it on to prevent "tearing"). If your refresh rate is at 60Hz it doesn't matter how fast the game goes... you'll only see 60 individual frames (with tearing if it doesn't happen to be an even multiple of 60). So don't forget about this, since you can probably up it to 75 or something and squeeze a few frames out (some older monitors might... um... break if you accidentally set them t
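      The refresh-rate ceiling described above can be put as a back-of-the-envelope model (a simplification - real swap chains buffer frames and timing jitters, so treat the numbers as illustrative, not as how any particular driver behaves):

```python
import math

def displayed_fps(render_fps, refresh_hz, vsync=True):
    """Roughly how many distinct frames per second actually reach the
    eye - a simplified model ignoring buffering and timing jitter."""
    if not vsync:
        # Without vsync the screen still only updates refresh_hz times
        # per second; extra rendered frames just show up as torn slices.
        return min(render_fps, refresh_hz)
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    # With vsync, each finished frame waits for the next refresh tick,
    # so the effective rate snaps down to a divisor of the refresh rate.
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(displayed_fps(175, 60))              # 60.0 -- the extra 115fps never reach the eye
print(displayed_fps(50, 60))               # 30.0 -- vsync snaps 50fps down to 60/2
print(displayed_fps(90, 75, vsync=False))  # 75   -- bumping the refresh to 75Hz helps a bit
```

      The second case is the classic vsync gotcha: rendering slightly below the refresh rate costs far more displayed frames than the raw numbers suggest.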
    • The human eye is an analogue device, and does not see in frames.

      That's something I'm actually wondering about. But if it is the case, then explain to me why I sometimes see car wheels going backwards IRL (not on TV)?

      Here's an interesting article: [livescience.com]

      (...)
      One proposes that the visual cortex, much like a movie camera, processes perceptual input in temporal packets, taking a series of snapshots and then creating a continuous scene. Perhaps our brain processes these still images as it does frames in a movie

      • But if it is the case, then explain to me why I sometimes see car wheels going backwards IRL (not on TV)?

        I've seen this phenomenon too, but only ever under stroboscopic illumination such as street lamps. The article you linked claims reports of similar observations under 'continuous light' - I've never heard of any such thing, but it's important to note that a small stroboscopic component will cause this effect even in the presence of continuous illumination, because at high enough speed, the subject will blur sufficiently under the continuous light that only the instants illuminated by the strobe will contai

        • by LKM ( 227954 )

          I've seen this phenomenon too, but only ever under stroboscopic illumination such as street lamps.

          Nope, I've seen it in broad daylight. Last week I was in Cuba, and they had these old horse carriages with the huge wheels. I've seen it for each one of them if they moved, in different locations.

          And I don't think they have any lights on during the day in Cuba :-)

  • by tygerstripes ( 832644 ) on Tuesday May 08, 2007 @07:57AM (#19034945)
    First, you compile Wine to run in Vista...
  • I'm playing all my games with specs quite similar to the ones in the article (slightly better processor, exact same 8800 GTS card). Between XP and Vista, I've honestly noticed very little, if any, difference in my game performance. I run the settings quite high.. the only real issue I've had is that Sim City 4 will simply not work under Vista.

    I'm not trying to be a troll here, but when you're playing a game in fullscreen, isn't it basically getting your machine's full attention? What's Vista doing that make
    • slightly better processor, exact same 8800 GTS card

      You have one of the best cards on the market right now. 50fps or 70fps makes very little difference to the human eye. If you run a midrange box and upgrade to Vista, then you will probably run into some more noticeable slowdown.
  • by Anonymous Coward
    TFA says that Vista is not a very good gaming OS, which may be true compared to XP. Are Windows XP and Windows Vista in competition with each other? Maybe, but they will not be for long unless MS is stupid. So, the fair comparison would be to compare these games running on Vista, Ubuntu and OS X!
  • As I've said [slashdot.org] before [mdlug.org], Vista has been horrible for games. And these aren't new, flashy, supreme games. These are games from a few years back, that should fit comfortably on the hardware, and I'm not cranking up the resolution or the detail or anything. The hardware I refer to: AMD Sempron 1.8GHz (allegedly equivalent to a 3GHz+ processor), 1GB RAM, 80GB disk, Geforce 6150 integrated graphics. Not (at all) a speed demon, I know, but I'm not asking for miracles. Look at the games I'm trying to run:

    • Aliens versus Predator 2: Runs slow, audio is skippy. 90% of the time fails to launch properly.
    • Tron 2.0: Slow, skippy audio. Seems to always launch into the game menu, but firing up a save game crashes the program much more often than not. Vista doesn't crash at this point, but it takes about five minutes for it to recover.
    • No One Lives Forever 2: Actually runs okay, much of the time. But about 20% of the time it won't launch a saved game, it instead crashes to the desktop. At least it's faster than Tron 2.0 at crashing, and a relaunch usually (~90% of the time) is successful.
    • Freedom Force: "This program is not compatible with Vista."
    • Freedom Force vs. The Third Reich: Seems to run as well as NOLF2.
    • Half-Life, Half-Life 2: Worked for the limited testing I did. HL2 is quite slow and jittery on this system, though. (Not totally surprising, but still...)

    So, really, only two games actually run well enough to bother with: NOLF2 and FFvTTR. (Oh, okay, HL2, Blue Shift, Opposing Force work all right.) Obviously I'm not a huge gamer, and I know this is a low-end machine, but oy. My previous experience was with XP on a dual Athlon MP 2600+ system (2GHz real clock), 1GB RAM, GF5700LE card. A better system (and a lot more expensive when I got it four years ago) but not that much better.

    • by Rycross ( 836649 )
      I'm guessing it's the graphics card. On an AMD X2 4200, 7800 GT, and 2 GB of RAM, my games have worked fine, with the exception of GalCiv 2 (this was back in January). The developers for that game stated that the problem was with nVidia's drivers in particular, not with Vista. Newer drivers may have rectified the situation, but I haven't tested yet.

      Half Life 2 and FEAR both worked perfectly, and I also got Psychonauts off of Steam (which also worked).
      • by Osty ( 16825 )

        I'm guessing it's the graphics card. On an AMD X2 4200, 7800 GT, and 2 GB of RAM, my games have worked fine, with the exception of GalCiv 2 (this was back in January). The developers for that game stated that the problem was with nVidia's drivers in particular, not with Vista. Newer drivers may have rectified the situation, but I haven't tested yet.

        I can confirm that (the nVidia driver problem, not whether or not it's been fixed). GalCiv2 and GC2:DA run flawlessly on my laptop's ATI x300 GPU under Vista.

    • I agree, I have an IBM PC-XT and it ran fine with DOS2.0 and then I installed Vista and it doesn't even work and the game framerates are awful!

      It's not just M$, either: I have a 286 running at 12MHz, and I installed Edgy Eft and turned up all the graphics details and it runs incredibly slowly. It's just unacceptable.

      Jokes at the expense of the parent aside: he's running Vista on hardware that is a couple of years old, and he has the audacity to complain about performance on legacy equipment.

      Vista was des
      • He's running Vista on hardware that is a couple of years old, and he has the audacity to complain about performance on legacy equipment.

        Um, actually, it's a Dell C521 purchased two months ago [dell.com]. And I upgraded the RAM for it, too. Imagine what it would be like with only 512MB...

        Here's the system requirements for Aliens vs. Predator 2: "Pentium 3 or Athlon 450 MHz or higher, 128MB RAM or higher, 16MB DirectX 8 compatible 3-D video card, 1.3 GB hard disk space, 4X CD-ROM drive or greater,16 Bit DirectX 8 co

        • I read your post. The specifications for the machine you bought are at least a year old; at the performance end, it's contemporary with machines from 18 months ago.

          So, you bought old Dell stock, they saw you coming and they took your cash. I guess that's why Mikey Dell is such a happy guy. If Dell made a system with an Intel 8086 CPU running at 4.7MHz today, it would be brand new, but it wouldn't run Vista so great. If you're not sure about PC hardware specifications, there are a lot of websites out there, HardOCP, Tom
          • Maybe (I think I said this earlier), you would find that running an OS that is designed for the hardware you have instead of the hardware you don't, you'll get application performance on par with the quality of your legacy hardware.

            Ah, I see. I suppose it was presumptuous of me to relay my experiences regarding the difference between Vista and XP for gaming stability and performance in a topic titled "Vista vs. XP Game Stability and Performance". I'll try to be more on-topic in the future. Please correct

            • We presume we are testing Vista and XP gaming performance on hardware capable of running both OS's correctly, not on legacy hardware. Your assertion that Vista performance is bad simply because you don't own hardware that Vista was designed for is questionable (and barely scraping in with minimum requirements will not win you any performance races, either).

              Otherwise, embedded linux beats all other OS's because Vista doesn't run well off a wristwatch, whereas the embedded linux OS runs fine.

              No, the validity
  • Immature Drivers (Score:3, Insightful)

    by SpryGuy ( 206254 ) on Tuesday May 08, 2007 @08:37AM (#19035603)
    I'm certain the cause is the immaturity of the video drivers.

    I was forced to upgrade to Vista at work, and I've experienced all sorts of driver-related problems, from inability to recover if the monitor is unplugged and plugged back in (or KVM'd away and back), to repainting issues in several apps (most notably, Visual Studio 2005). In addition, I've seen some very poor performance in many instances, including the much-"Wow"-ed feature of 3D task switching.

    I'm sure most of these issues will be ironed out over the next year or so as the drivers become more optimized and stable.
  • by Minwee ( 522556 ) <dcr@neverwhen.org> on Tuesday May 08, 2007 @09:02AM (#19036017) Homepage

    Wow. Games which were designed, tested and optimized for XP run better on XP.

    What exactly is there to wonder about?

    • by jools33 ( 252092 )
      Actually Company of Heroes and Flight SimX (a microsoft game) were amongst the first "Games for Windows" - supposedly coded for Vista compatibility - and COH in the article is highlighted as having Vista performance issues - probably Nvidia driver related.
  • Once upon a time, it was us Mac users making comments like the emphasised part from the article summary (my emphasis):

    "Really, there was only one instance where Vista was able to pick up a few more frames than XP -- World of Warcraft at greater than 90fps, where the human eye can't even see the difference. "

    Ah, the good old days, when it was all so simple.
  • by Opportunist ( 166417 ) on Tuesday May 08, 2007 @10:19AM (#19037311)
    1. Download some free theme that looks like Aero. Watch out for malware.
    2. Remove half your ram.
    3. Clock the CPU down a few notches.
  • I really wish they had posted the driver version(s). Now I have to assume that they are using the May 8800 drivers and not the earlier releases or betas.
    • Actually, we have a built-in anti-blur mechanism whereby we stop processing visual information while our eyes are moving rapidly. It's called Saccadic Masking [wikipedia.org] and it's responsible for the 'rainbow effect' on cheap DLP projectors. With this in mind, I wonder if we could artificially increase our visual response times in high-speed-motion situations (for instance race driving, racquet sports, etc.) with some form of shutter system? At the very least, it would allow us to see details where otherwise we'd just
  • The old model for video drivers has been in use since Win98, and essentially has not changed since then. During all that time, the video card companies have been optimizing their drivers to run faster in that architecture. Significant speed boosts have occurred in the past simply due to driver updates, even as much as 20-30% for some types of games.

    In Vista, however, the driver model is completely different. As a result, many of the optimizations that had been done in the past are no longer valid and have
  • That the Nvidia logo and slogan "the way it's meant to be played" will have a disclaimer on the bottom
    like most car commercials, or tacked on to the splash screen?

    I can see it now:

    The way it's meant to be played*

    *may be slow, buggy, prone to BSODs, catch fire, lock up, eat power supplies for lunch, cause
    your computer room to be hot, supper to be cold, hate Vista and long for XP/AMD/ATI and stability
    is not guaranteed until a week before the next OS is out.
    So there, THUPBPBPBPB!
    • > That the Nvidia logo and slogan "the way it's meant to be played" will
      > have a disclaimer on the bottom like most car commercials, or tacked on to the splash screen?

      I've tried complaining to nVidia about a bug in their drivers. They're unreachable. The support forums have no employees, so it's users trying to help users. They have no e-mail and their feedback form is still 'under construction'. They refer you to your OEM for all questions, which of course your OEM (someone who slaps chips on a PCB)
  • World of Warcraft at greater than 90fps, where the human eye can't even see the difference.

    Actually, provided that your screen has a refresh rate at least equal to the game's fps, you can, because of the motion blur it creates. That's why a game at 25 FPS doesn't look quite as smooth as a game at 60 FPS, while nothing looks smoother than a movie at 25 FPS.

    One day, maybe, true motion blur will be in every next-gen game, and we'll all have our games running at 25 FPS and think it's perfectly fine.

    • My solution years ago was to search online for a utility called "reforce". This scans your video card and monitor and allows you to select the scan/refresh rate for every resolution possible.

      There is also an option to make everything - all 40 modes or so - 60Hz. This is highly recommended, as it cuts down DirectX crashes and issues by at least a factor of two. All you have to do then is go to the video card and turn the V-sync on. The games will all only work at 60Hz at this point as well, which increases
  • There are many reasons for Vista performing badly with many games:

    1) Vista is not actually a new operating system. Microsoft dropped the Longhorn kernel from Vista in favour of a slightly tweaked version of Server 2003's kernel. Server 2003 runs games pretty well (better than XP in some cases).

    2) All the extra rubbish that is in Vista interferes with performance. All the DRM, graphical gimmicks and security stuff packaged with Vista, which you can't turn off, adversely affects the performance of the games
