XP SP2 Can Slow Down Business Apps

An anonymous reader submits "Mobile PC magazine installed XP SP2 on a bunch of notebooks and benchmarked them, finding that SP2 caused a 9-percent performance reduction in business productivity apps. While a couple of notebooks performed better, the majority took a 3- to 22-percent performance hit." For now, the story is just at the top of the Mobile PC website, but they promise more details in an upcoming issue.
  • Buffer checks (Score:5, Interesting)

    by JanusFury ( 452699 ) <kevin...gadd@@@gmail...com> on Saturday September 18, 2004 @01:14AM (#10283191) Homepage Journal
    This is probably due to their recompiling a large number of libraries and system components with the buffer checking and other security features added in recent versions of Visual C++. If you ask me, it's worth it, just to know that my Windows box has a few fewer wide-open holes to be exploited.

    It definitely has proven its worth so far - I may be wrong, but I'm pretty sure the reason SP2 isn't vulnerable to that GDI+ JPEG exploit is that they recompiled GDI+ with buffer checks.
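    To make this concrete, here is a minimal, hypothetical sketch (my own illustration, not anything from the actual Windows source) of the class of bug those recompiled checks are aimed at:

        #include <cstdio>
        #include <cstring>

        // Classic unchecked copy: any input longer than 63 characters overruns
        // 'name' and tramples whatever sits above it on the stack.
        void greet_unsafe(const char *input) {
            char name[64];
            strcpy(name, input);                    // no length check at all
            printf("Hello, %s\n", name);
        }

        // The same routine with an explicit bound. Compiler-inserted buffer
        // checks are a second line of defence for the places where a bound
        // like this was forgotten.
        void greet_safer(const char *input) {
            char name[64];
            strncpy(name, input, sizeof(name) - 1);
            name[sizeof(name) - 1] = '\0';
            printf("Hello, %s\n", name);
        }

        int main() {
            greet_safer("world");
            return 0;
        }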
    • Anybody know what the slowdown is for libsafe?
    • Re:Buffer checks (Score:5, Insightful)

      by metlin ( 258108 ) * on Saturday September 18, 2004 @01:23AM (#10283245) Journal
      ...but I'm pretty sure the reason SP2 isn't vulnerable to that GDI+ JPEG exploit is that they recompiled GDI+ with buffer checks.

      Correct me if I'm wrong, but shouldn't this have been done right from the beginning?

      If I were writing any commercial grade code, especially stuff that I know that people would take advantage of, I would sure as hell make sure that I had all my buffer checks in place.

      I've heard so much about the programming practices at Microsoft and what not - and yet, ironically, these things keep cropping up so damn bloody often while some operating systems [openbsd.org] coded by a bunch of loosely connected hackers are way more robust and stable.
      Hmm, makes one wonder.

      (Heh, funnily enough, the OpenBSD site says "Only one remote hole in the default install, in more than 8 years!" - I guess that does say a lot.)

      I do not understand it. I would have thought that, despite all the shit that MS gets for writing bad code, they would make sure their code was largely buffer-checked. When you have to release stuff from outside later to patch that up, you obviously waste a lot more cycles than if you had done it right in the beginning.

      Sheesh. First they do a poor job of making the software, which causes you inconvenience; then they release something to make up for it, and that causes you even more inconvenience.

      Hah.
      • Re:Buffer checks (Score:5, Insightful)

        by JanusFury ( 452699 ) <kevin...gadd@@@gmail...com> on Saturday September 18, 2004 @01:25AM (#10283251) Homepage Journal
        99 buffer checks don't do you any good if one buffer is missing a check, and that one gets exploited.

        That's what their compiler modifications are intended to help with, and from my experience, they help. I do agree that it should have been done sooner, though.
        • Re:Buffer checks (Score:5, Interesting)

          by metlin ( 258108 ) * on Saturday September 18, 2004 @01:32AM (#10283273) Journal
          The reason I brought that up was because I was interviewed by Microsoft last summer in Seattle, and one of the groups that interviewed me was the systems group.

          (Funnily enough, systems wasn't even my area, but they interviewed me anyway - that's another story...)

          They were of the opinion that since MS is a favourite target of hackers and the like, any MS programmer ought to go to extraordinary lengths to put in any and all buffer checks and so on. I was asked to write some piece of code for compiler design and memory management, and the guy kept harping on buffer checks.

          I would imagine that with ALL those checks, such things would not be common - but lo and behold, there they are.

          Either they are not doing a good job of the whole buffer-check thing that the guy harped to me about, and it was all hogwash to impress upon you how "important" and "hard" coding at MS is, or there is something seriously wrong with a codebase out of which SO many exploits turn up every day.

          I can only guess which one it is.
          • Re:Buffer checks (Score:2, Insightful)

            by Anonymous Coward
            there is something seriously wrong with a codebase out of which SO many exploits turn up every day.

            There probably is, and it's the same problem found with virtually all C/C++ code of a certain vintage, Microsoft or not.

            Do you think Dennis Ritchie ever gave a shit about checking buffers? How about the millions of coders that copied his style? How about the people that wrote most of UNIX? How about the people that wrote most of Linux distros? Go check the Linux security sites, and you'll get the picture that this wa
            • Re:Buffer checks (Score:5, Insightful)

              by metlin ( 258108 ) * on Saturday September 18, 2004 @02:08AM (#10283407) Journal
              Uh hmm, your argument is flawed for the simple reason that Linux having buggy code does not excuse Microsoft from having to write good code.

              And comparing Dennis Ritchie's code with today's code is again flawed - hell, with just the knowledge of Physics and Mathematics I had learnt by my twelfth grade, I would have been the most intelligent man alive 400 years ago.

              You do not compare yourself with what Dennis did or might have done; you do a reality check against how things are today - there is a fair section of crackers who want to exploit systems, and if you are in the business of writing commercial code, you'd better be darned good at making sure your code is good, because customers are *paying* you for it.

              I have another issue with MS - they concentrate more on releasing things early than on checking the code fully before releasing. If this were an isolated issue, I would not have a problem - it is not. And MS has had so many years in the market, so many top-notch programmers AND the resources. If you want to compare, look at OpenBSD - that's an example of open-source code done right, with one remote exploit in 8 years.

              Linux is still in its infancy, and for all that it's capable of, it's quite unfair to compare it with the products of a 20-year-old behemoth. If you ask me, Linux is doing a fantastic job of becoming a top-notch enterprise system in such a short time, compared to Microsoft. And very few of the people behind it actually make any money off it. Does that not say a lot?
              • Re:Buffer checks (Score:3, Insightful)

                by Anonymous Coward
                First off, you seem ignorant of the point that many people did know better than Ritchie, which is why OpenVMS and OS/400 have infinitely better security records than UNIX does.

                Second, you're right that (in retrospect) MS probably should have hired those guys instead of the C/UNIX crowd that the unis were producing. Fact is that they didn't though, and irrespective of their monopoly status they got all the same kinds of people and kinds of problems as everyone else.

                Finally, it's true they were slower to fi
                • Re:Buffer checks (Score:5, Insightful)

                  by metlin ( 258108 ) * on Saturday September 18, 2004 @03:23AM (#10283628) Journal
                  I was not trying to flame MS for their past actions - however Microsoft started out with a fairly clean codebase for both Win2k and WinXP. Given that, it seems bad that such vulnerabilities keep coming up.

                  I do agree that both Win2k and WinXP are a lot more stable than their predecessors. However, you would think that when you are doing something the second time, you would double-check to make sure that you do not make the same mistakes as you did the first time.

                  I just feel that this is not happening - and any number of factors could be contributing to it (market, economics, manpower, complexity, what not) - but that does not mean you do not take the pains to do it well. I'm sure Microsoft is trying to take as much care as they can to ensure that this does not happen.

                  However, despite that, these still seem to be happening. Which is what I find quite baffling - there seems to be a fundamental flaw somewhere in there, and that needs to be taken care of. Which is what I mentioned in my initial posting, too.
                • Re:Buffer checks (Score:3, Interesting)

                  by sg_oneill ( 159032 )
                  Actually, the better record has to do with the fact that fk-all people use OS/400 or OpenVMS.

                  Yeah, Unix had some silly bugs, but that's partly because it was written by a really small team in their spare time and became uber-popular despite never really being intended to, and in an age when hackers were guys who logged in and FIXED your shit.
              • Re:Buffer checks (Score:3, Informative)

                by dirk ( 87083 )
                While you are right that it is not fair to compare coding from 20 years ago with that of today, it is also unfair to compare OpenBSD with MS. They are aiming at 2 completely separate goals, so of course they will be different. OpenBSD has the goal of being as secure as possible. They are extremely good at this. They also do not support many of the newest and greatest things, and their usability is pretty bad. MS has the opposite goal. They want to have an incredibly usable OS which supports all the latest
          • Re:Buffer checks (Score:5, Interesting)

            by omicronish ( 750174 ) on Saturday September 18, 2004 @02:30AM (#10283475)

            Either they are not doing a good job of the whole buffer-check thing that the guy harped to me about, and it was all hogwash to impress upon you how "important" and "hard" coding at MS is, or there is something seriously wrong with a codebase out of which SO many exploits turn up every day.

            I was an intern at Microsoft this past summer, and I believe it's the sheer quantity and perhaps complexity of the software being written that's resulting in these bugs. They really do emphasize writing secure code now (I don't know what it was like before). I shared an office with two other interns, and during several code reviews another intern was involved with, there would be "did you check parameters here? potential buffer overflow? what if this is NULL?" And it wasn't even important code he was working on.

            • during several code reviews another intern was involved with, there would be "did you check parameters here? potential buffer overflow? what if this is NULL?"

              Well, what if it was a potential buffer overflow? In such a piecemeal approach to programming, can the intern realistically be expected to know if there's a potential buffer overflow? How should he know how a NULL is handled? Isn't there an operating system that's supposed to do that stuff?

              Where's that damn garbage collector???

              Disclaimer: I am
              • Re:Buffer checks (Score:3, Informative)

                by Tim C ( 15259 )
                How should he know how a NULL is handled?

                Well, if he wrote the code that needs to handle the null, then he needs to know how it should be handled. If he's inexperienced/junior enough not to be able to decide himself, he should speak to someone who can make that decision for him.
            • by glitch23 ( 557124 ) on Saturday September 18, 2004 @07:25PM (#10287521)

              "did you check parameters here? potential buffer overflow? what if this is NULL?" And it wasn't even important code he was working on.

              Clippy can still be dangerous if he goes unchecked.

      • by bmajik ( 96670 ) <matt@mattevans.org> on Saturday September 18, 2004 @01:29AM (#10283261) Homepage Journal
        specifically, the /GS flag to the VC++ compiler.

        The compiler was modified to support automatic stack overflow checking (i.e. canaries). Server 2003 was compiled with this (and as a result, MANY things that are shared-code problems resulting in exploits on other NT based OSes are either ineffective or DoS attacks on Server 2003).

        The idea is that /GS-compiled binaries will cause the OS to terminate the app rather than letting attacker code execute. The source code generally doesn't need changes.

        So, it's a defense-in-depth tactic. Ideally, there'd be no BOs in code. But there are. Terminating the program with an explanation as to why is better than letting people run code on your box. :)
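        Roughly speaking, a /GS-protected function behaves as if it were written like the hand-made sketch below. This is only an illustration of the idea, not what the compiler emits - the real cookie is a per-process random value, and the compiler controls the stack layout so the cookie actually sits between the local buffers and the return address:

            #include <cstdio>
            #include <cstdlib>
            #include <cstring>

            // Stand-in for the per-process random cookie the real scheme uses.
            static unsigned long security_cookie = 0xDEADBEEFUL;

            // Hypothetical function name; any routine with a stack buffer gets
            // the same treatment.
            void parse_request(const char *input) {
                unsigned long cookie = security_cookie; // "prologue": drop a canary on the stack
                char buf[32];
                strcpy(buf, input);                     // an overflow here is meant to smash the
                                                        // canary before the return address
                printf("%s\n", buf);
                if (cookie != security_cookie)          // "epilogue": canary changed?
                    abort();                            // terminate instead of returning into
                                                        // attacker-controlled data
            }

            int main() {
                parse_request("short and harmless");
                return 0;
            }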

        • Ah! I wasn't aware of that.

          So would they be recompiling, with the new compiler, everything in which an exploit shows up, to ensure that BOs don't happen? :)

          Given the frequency of exploits that turn up, if MS kept releasing such patches every so often, that would quite terribly slow down the whole system. On the other hand, like you rightly pointed out, better a slower-running program that terminates with an explanation than an exploit.

          Oh well, the price of slavery ;)
      • Re:Buffer checks (Score:2, Insightful)

        by NanoGator ( 522640 )
        "Correct me if I'm wrong, but shouldn't this have been done right in the beginning itself?"

        Depends on which question you're asking.

        "I want Windows to run faster, should we be performing buffer checks?"

        "I want Windows to be more secure, should we be performing buffer checks?"

        This is not a rebuttal to your post, simply pointing out that it's not as black and white as that. Security is important, but usability is what made Microsoft a success.
          I merely meant that patching buffer checks into code later on makes the application a lot slower than having put those checks in at the very beginning - I know this because I've written code for which patches have had to be released (hey, we all learn).

          (this may not always be the case, but it is often the case)

          So, all that I meant was that while I do want Windows to run faster, it should not be at the expense of security - if it had been taken care of in the beginning, it would have been faster than taking c
        • go buy yourself a copy of windows 2000...

          Unless you have a CPU that needs multithreading, there is no reason - besides ..... - that you need XP. And ESPECIALLY if you are reading Slashdot, you should be able to work with Win2k.

      • Re:Buffer checks (Score:5, Interesting)

        by IronChef ( 164482 ) on Saturday September 18, 2004 @01:52AM (#10283348)
        I've heard so much about the programming practices at Microsoft and what not - and yet, ironically, these things keep cropping up so damn bloody often while some operating systems coded by a bunch of loosely connected hackers are way more robust and stable.
        Hmm, makes one wonder.


        the openbsd people are united by an ideology. Microsoft employees are largely, though not exclusively, united simply by the desire for a paycheck.

        I work in a Microsoft facility and let me tell ya, they aren't all smoking what Steve Ballmer is.

        Is it any wonder that quality suffers when compared to a project that is a labor of love?

        Or maybe my bad attitude is why I am a contractor and not full time there. :)
    • Re:Buffer checks (Score:2, Insightful)

      by aws4y ( 648874 )
      Buffer checking is one way to solve the problem.
      Another, non-intrusive way of doing it is to include kernel-level memory protection. On top of that you could add Users, Groups and privileges and not allow every program to have the run of the system.
      Buffer overruns are as old as C, and UNIX has built mechanisms to cope with them that do not put the onus on the programmer. Since the memory monitoring is done in the kernel, this is also safer in the long run, because it means that a program must break memory p
      • Re:Buffer checks (Score:4, Insightful)

        by TheLink ( 130905 ) on Saturday September 18, 2004 @02:20AM (#10283444) Journal
        Uh what are you talking about?

        Windows XP has users, groups and privileges, and not every program has the run of the system.

        And UNIX is just as vulnerable to buffer overflows as Windows XP. They both are programmed in languages that are prone to such problems.
        • Re:Buffer checks (Score:2, Interesting)

          by Anonymous Coward
          Uh what are you talking about?

          Windows XP has users, groups and privileges, and not every program has the run of the system.


          Uh, what are you talking about?

          As a Windows NT programmer for 10 years, I am fully aware - as are many others - that the Windows security model is more comprehensive and flexible than the POSIX model. This is due to the power of NTFS and the uniform interface of the NT object manager.

          However, while this is all dandy, the Windows environment is architected, and third party apps are deli
        Hey, you! Wake up! This is the 21st century now, not 1988! Those kinds of protections have been built into the NT kernel since the very beginning (1993 or so), which means Win2k and XP have them.

        Regards,

        Do you understand anything about NT? NT already has user separation, privilege separation, memory compartmentalization, etc. etc.

        It's not like buffer overruns on NT are happening in random unprivileged code and then magically running in ring 0.

        There are two big issues that make BO's problematic on windows:

        1) traditionally, many system processes have run as something equivalent to unix root (Local System, etc). These already have root privs, so any exploit against these that allows code execution is code r
      • The real way to avoid these sorts of problems is to have a memory architecture that prevents writable pages from being executable, and vice versa. I read somewhere that the IA32 architecture makes this very hard - anybody know the details? Back in the old days we used to use separate instruction and data spaces (i.e. on the PDP-11) ...
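        As I understand it, classic IA32 page-table entries have no per-page execute bit (readable effectively meant executable), which is why doing this properly needed the NX/XD bit that arrived with the Athlon 64 era chips, plus the DEP support added in SP2. Here is a sketch (a made-up JIT-style snippet, not code from any real product) of the resulting "writable or executable, never both at once" discipline using the Win32 API:

            #include <windows.h>
            #include <cstdio>
            #include <cstring>

            int main() {
                // x86 machine code for: mov eax, 42 ; ret
                unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

                // Allocate a writable (but not executable) page and copy the code in.
                void *page = VirtualAlloc(NULL, 4096, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
                if (!page) return 1;
                memcpy(page, code, sizeof(code));

                // With DEP enforced, jumping to 'page' now would fault: it is W but not X.
                // Flip it to executable (and no longer writable) before running it.
                DWORD old;
                if (!VirtualProtect(page, 4096, PAGE_EXECUTE_READ, &old)) return 1;

                int result = ((int (*)())page)();
                printf("%d\n", result);

                VirtualFree(page, 0, MEM_RELEASE);
                return 0;
            }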
  • by lesterchakyn ( 235922 ) <{moc.liamg} {ta} {etergen.rasec}> on Saturday September 18, 2004 @01:14AM (#10283192)
    You can't install a really big bunch of fixes and expect Windows to run faster!

    It has always been this way.
    • Just as we, the lowly programmers, should know: when building functional blocks on top of lower functional blocks (think pyramid, not spaghetti), we take out the range checking from the lower functional blocks as we progress in our development cycle (at least the good programmers do).

      Just as my subject line is treated as a rock-steady axiom of Computer Science, particularly in Software Quality and Software Engineering electives, fixes that result in slower code are usually a BIG sign that range checking h
        • It's not always about performance. If you are removing range checks from all your code you are making it worse (imho). By your logic we shouldn't bother with std::vector or ArrayList or whatever and should just stick with plain old arrays because they are faster. There is a reason that range checking is in there - so that you catch the unexpected case.
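          As a small illustration of that trade-off (general C++ behaviour, nothing specific to the SP2 changes):

              #include <cstdio>
              #include <stdexcept>
              #include <vector>

              int main() {
                  std::vector<int> v(10, 0);

                  int a = v[3];          // unchecked access: fastest, but an out-of-range
                                         // index is silent undefined behaviour
                  try {
                      int b = v.at(20);  // checked access: a bounds test on every call,
                      (void)b;           // but a bad index becomes a catchable error
                  } catch (const std::out_of_range &) {
                      std::puts("caught out-of-range access");
                  }
                  (void)a;
                  return 0;
              }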
          • For non-public APIs, there is no good reason for range checking in the lower-tier modules of a carefully integrated module stack. It is superfluous coding.

          That is, the 2nd programmer's module calling the 1st programmer's function without consulting the 1st programmer's documentation, whatever form that may take: verbal, written, gesture or even smoke signals.

          Tight cohesive teamwork is the key... Get it together.
    • by EtherAlchemist ( 789180 ) on Saturday September 18, 2004 @01:59AM (#10283383)

      I found one instance where a fix actually allows you to pirate OTHER software (or at the very least violate otherwise restrictive "one machine at a time" clauses in the EULA).

      I installed SP2 and didn't notice any problems at all. Then I fired up Fireworks, which has a little util that checks whether other copies using the same license are running on the network (who, me?), and Windows prompted me, telling me that the service had been blocked and asking did I want to Continue Blocking, Unblock, or should it Ask Me Later.

      Well, so far, choosing Ask Me Later has enabled (for testing, of course) running multiple copies of single license software when we would not have been able to previously.

      Neat! Thanx Bill!
  • by Anonymous Coward
    This has to do with a buggy CPU "driver" in SP2; rolling back that driver to the pre-SP1 version should correct the slowdown.
  • by kannibal_klown ( 531544 ) on Saturday September 18, 2004 @01:19AM (#10283222)
    I just installed SP2 on my personal laptop that I use for work. I reformatted it yesterday, and I had a CD with SP2 on it. I figured I would rather just install it off the CD than worry about downloading all of those frigging security updates and what-not.

    Anyway, I could have sworn the laptop ran faster before I put SP2 on there. I never bothered to benchmark it, but it seems sluggish now. And it's not a weak machine (as far as laptops go): 2.4GHz with 1GB RAM.

    I'm not about to undo everything I've done. I've installed way too much, and don't want to worry about breaking those apps by removing the patch.

    Oh well. I'll just live with it. It's not my main machine anyway, just something to do some DB work with.
      Laptops and SP2 don't mix. The CPU frequency throttling driver is for some reason pushed back to a version prior to SP1 and works horribly. Your computer may be running at 600-700MHz despite what it's telling you. It may not feel 4 times slower, though, because I doubt you often need to go above 800MHz in typical usage, despite what the marketing departments will tell you (this may be different in your case if you develop or run a DB engine on it, but I'm referencing the typical home user). But in short, yes it ju
  • by Adam9 ( 93947 ) on Saturday September 18, 2004 @01:20AM (#10283225) Journal
    Here is another article [hardcoreware.net] where they ran different benchmarks on SP2 and SP1. The office productivity test was the one with the biggest difference. The article puts the blame on the new firewall.

    They should compare a PC with SP2 against one with SP1 plus a third-party firewall.
  • Why- (Score:5, Interesting)

    by thewldisntenuff ( 778302 ) on Saturday September 18, 2004 @01:20AM (#10283226) Homepage
    was this even posted at all?

    This wasn't even a readable story - just a small synopsis of a story that will be featured in Mobile PC mag next month. There could have been plenty more info, but instead we got two paragraphs.....

    OTOH, is an average 9% drop in performance even an issue? I mean, 9% in office apps is nothing....Who needs high performance when typing, making spreadsheets, or even a PowerPoint presentation?

    This (once again) illustrates the MS push towards security over performance/compatibility

    -thewldisntenuff
    • Re:Why- (Score:3, Insightful)

      by eqkivaro ( 721746 )

      I agree. Who gives a shit? When was the last time someone actually upgraded their computer because Word was too slow? Please!

      Unless you're playing new games there's no reason to be running anything newer than a Pentium II.

    • Re:Why- (Score:3, Interesting)

      by metlin ( 258108 ) *
      Although I agree with most of what you said, I have a bone to pick with this statement -

      OTOH, is an average 9% drop in performance even an issue? I mean, 9% in office apps is nothing....Who needs high performance when typing, making spreadsheets, or even a PowerPoint presentation?

      Hmmm, I guess you've never been in a corporate business office, where Excel sheets running into hundreds of pages get opened, or business plans and product specs that run into hundreds of pages.

      Why go that far, you've ap
      • "and my system is so terribly slow that it's unbelievable."

        What are your system specs?
        How much RAM have you got in your system?
          Enough and more (512 MB) - but what I meant to point out was that my system *has* been visibly slower after SP2. For example, when you have multiple instances of Word open, it becomes exponentially slower.

          I just meant that a 9% slowdown in office applications is not insignificant, and it sure as hell hurts some of us.
        If you had a latency of 10% in all your networks, you would know what I am talking about.

        A latency compared to what? Like "oh no, my ethernet latency is now 330us rather than 300us. Better call the network guys so they can fix it"?

  • Coral Cache Link (Score:3, Informative)

    by Anonymous Coward on Saturday September 18, 2004 @01:22AM (#10283236)
    Cached link [nyud.net] in case it gets Slashdotted.
  • by Bill_Royle ( 639563 ) on Saturday September 18, 2004 @01:22AM (#10283238)
    I've seen some drag on my system since putting SP2 on, but it's really a double-edged sword.

    However, in my experience it's harder now for sites to push ActiveX controls and executables to your PC, unless you do a bit of tweaking or visit a deliberately malicious site.

    Considering the system drag that occurs when the average user inadvertently installs spyware, I'd say the SP2 drag ought to be cancelled out for the time being, as it's a bit harder for spyware to propagate under it.
  • by realdpk ( 116490 ) on Saturday September 18, 2004 @01:24AM (#10283247) Homepage Journal
    Has anyone noticed an increase in how long it takes Putty to start up post-SP2? I thought it was the firewall at first, but I disabled that. It still takes about 5 seconds to launch, where before it was instant.
  • by coupland ( 160334 ) * <dchase@hotmailCHEETAH.com minus cat> on Saturday September 18, 2004 @01:34AM (#10283284) Journal
    If you thought SP2 would be a speed upgrade, then you also bought the previous lines that Win98, ME, NT4, W2K, and XP would each make Windows faster than the previous version. Of course, these fallacies are based on the assumption that you would install the upgrade on a *newer* PC than the original sample set. No Windows update has ever been faster than its predecessors.

    Period.
  • I wonder what this guy has to say about this. [ebaumsworld.com]
  • by corsair2112 ( 813278 ) on Saturday September 18, 2004 @01:36AM (#10283293)
    If I post an "article" on my 5 megs of webspace provided to me by my ISP denouncing Windows XP saying that installing SP2 will steal my first born and rape my cats, then "create" some benchmarks to prove my point, then submit the article to slashdot, will it make it on the frontpage?

    I'll even conclude in the article that running linux will solve world hunger and even do my laundry.
  • Reality check (Score:3, Interesting)

    by Card ( 30431 ) on Saturday September 18, 2004 @01:37AM (#10283295) Homepage
    Correct me if I'm wrong, but given today's hardware, is a 10-20% slowdown even noticeable to the average user running, say, Word? IIRC, the threshold for a user to notice anything meaningful is around 30% in day-to-day operations.

    Games are a different beast, but does the user even care if loading a spreadsheet takes an extra second or two?
  • 2 things (Score:3, Insightful)

    by slobber ( 685169 ) on Saturday September 18, 2004 @01:45AM (#10283328)
    9% on average on "Business Apps" is too vague to draw any conclusions from. Was the slowdown in disk, network, or memory performance? All of the above?

    The slowdown could mean that MS cut some corners and traded speed for security in XP's pre-SP2 version. While fixing security problems they had to perform some extra checks, and that dragged performance down. Or they could have discovered some serious architectural issues while fixing new holes, so they had to do it in a slow and inefficient way because their architecture wasn't designed with those checks in mind.

    On a side note, I experienced a significant slowdown when running Norton AV, which supposedly does a bunch of extra security checks. File and network performance became unbearable at times. It got so bad that I had to ditch NAV, so now I am reverting my Windows system every day (I run it under VMware; Linux is the host system). I found this setup + Zone Alarm to be a better answer to endless Windows security issues.
  • by Gary Destruction ( 683101 ) * on Saturday September 18, 2004 @01:58AM (#10283375) Journal
    Maybe Microsoft needs to determine what the most common software installed on Windows PCs is and even work with software manufacturers directly to ensure the greatest compatibility.
  • by Anonymous Coward on Saturday September 18, 2004 @01:59AM (#10283379)
    http://support.microsoft.com/default.aspx?kbid=875352&product=windowsxpsp2 [microsoft.com]

    Note the /NoExecute=AlwaysOff option in the article.

    A well-known cause of much of the slowdown some people find with SP2. Of course, this opens you up to morphic/purposefully overwritten code exploits, but such is life.
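    For reference, the switch described in that KB article goes on the OS entry in boot.ini; roughly like this (the ARC path is only an example - yours will differ):

        [boot loader]
        timeout=30
        default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
        [operating systems]
        multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /noexecute=AlwaysOff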

    • Hmm.... from the KB article

      Currently, the only x86 processors that support No-Execute functionality are the AMD 32/64-bit Opteron and Athlon-64.

      Since this doesn't affect Intel, this can't account for all the slow-downs people are experiencing.

      (Not that I bothered to read the article in Mobile PC.)
  • by JVert ( 578547 ) <corganbilly@hotmai[ ]om ['l.c' in gap]> on Saturday September 18, 2004 @02:02AM (#10283389) Journal
    This should actually be posted in the politics corner. I gotta admit /. is doing a lot better job at playing politics than certain US candidates. Seriously, a service pack to perform maintenance and add some very useful features - what is the general response? "SP2 broke my edonkey and made my girlfriend (online) break up with me." OH OH! Now it's slower with certain programs because they switched some compile flags that they should have enabled years ago!
  • by Anonymous Coward on Saturday September 18, 2004 @02:28AM (#10283469)
    You can either get your ass kicked by gamers for having a slow machine, or by hackers for having an insecure one.
  • by pavera ( 320634 ) on Saturday September 18, 2004 @02:32AM (#10283480) Homepage Journal
    I've attempted to install SP2 on three machines now and I'm not trying any more. After the 1st install, the system blue-screened and could not be recovered; I had to reinstall from scratch.

    The second attempted install got about 2/3rds of the way done and then crashed, resulting in an unstable system. The partial install could not be completely removed, and the machine would crash often - another reinstall from scratch.

    The third attempted install died in the early stages repeatedly (about 15 seconds after starting the install) and never got past that point.

    These were three completely different systems with different software installed, but all ended up with the same result: no SP2 without a complete clean installation of XP first. I'm so disgusted with MS's QA right now that I never plan to install SP2 again, because my time is too valuable to spend entire days rebuilding systems just because they can't write updates to their own software.

    Hell, in Gentoo and Debian I update the entire system with a single command and download hundreds of software packages totalling hundreds of MBs, and it all goes smooth as silk. Can't MS figure out how to copy files from an update package into the system without blowing it all to hell?
  • News Flash! (Score:5, Funny)

    by Phat_Tony ( 661117 ) on Saturday September 18, 2004 @02:55AM (#10283538)
    XP SP II Can Slow Down Business Apps!

    Similar problems have been found with XP SP I, the original XP, along with Windows 2000, 98, ME, CE, 95, and 3.1.

  • by YE ( 23647 ) on Saturday September 18, 2004 @03:29AM (#10283640)
    ...some business apps like Gator even refuse to run!
  • by Cyberllama ( 113628 ) on Saturday September 18, 2004 @03:50AM (#10283695)
    Insecure software runs faster. All that extra checking to make sure things are valid and so forth requires processing power. I mean, a login script that just accepts any password entered would require less processing than one that actually checks the entered data against some stored data.
  • Defragment C:! (Score:4, Interesting)

    by prandal ( 87280 ) on Saturday September 18, 2004 @04:11AM (#10283758)
    After installing SP2, defragment your hard drive - so many core files are replaced that the system's performance will be even more sub-optimal than usual until you do this.
  • by Linker3000 ( 626634 ) on Saturday September 18, 2004 @04:27AM (#10283812) Journal
    Since installing SP2 on a laptop, the printouts from Treeview Pro (a directory listing program) have had every printed character flipped on its vertical axis - all the letters are in the right place but the wrong way round, so, for example, all 'b's look like 'd's - it's readable but makes your brain hurt!!

    Does anyone have a weirder SP2 effect?
