How Microsoft Degrades Their Users (In a Good Cause)

blackbearnh writes "We all know that slow Web pages drive users crazy, but where is the boundary between too slow and too simple? As Microsoft's Eric Schurman points out, the fastest-loading page of all is a blank one, but it's also the most useless. In an interview with O'Reilly Radar leading up to his appearance at the Velocity Conference, Schurman talks about his experiences working on some of Microsoft's highest-volume sites, including the Microsoft home page and Live Search. In particular, he discusses how Microsoft will selectively degrade the performance of pages for small sets of users so that they can see how various amounts of delay at different times and places affect user behavior. 'In cases where we were giving what was a significantly degraded experience, the data moved to significance extremely quickly. We were able to tell when we delayed people's pages by more than half a second, and it was very obvious that this had a significant impact on users very quickly. We were able to turn off that experiment. The reasoning... was it helps us make a strong argument for how we can prioritize work on performance against work on other aspects of the site.' He also talks about what it's like to run some of the most frequently DDoS-targeted sites on the planet."
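For a concrete picture of the methodology, here is a minimal sketch (TypeScript on Node) of the kind of server-side delay experiment Schurman describes. The 2% bucket, the 500ms delay, and the X-Experiment-Bucket header are illustrative assumptions, not Microsoft's actual values or code:

    import * as http from "http";

    const EXPERIMENT_FRACTION = 0.02; // assumed share of requests to degrade
    const INJECTED_DELAY_MS = 500;    // assumed artificial delay for that bucket

    const sleep = (ms: number) =>
      new Promise<void>((resolve) => setTimeout(resolve, ms));

    const server = http.createServer(async (req, res) => {
      // Randomly assign this request to the delayed treatment or the control.
      const degraded = Math.random() < EXPERIMENT_FRACTION;
      if (degraded) {
        await sleep(INJECTED_DELAY_MS); // hold the response back artificially
      }
      // Tag the response so abandonment and click-through metrics can be
      // segmented by treatment group downstream.
      res.setHeader("X-Experiment-Bucket", degraded ? "delayed" : "control");
      res.end("<html><body>search results here</body></html>");
    });

    server.listen(8080);

A production experiment would bucket by a hash of a user or session ID rather than per request, so each visitor sees a consistent experience for the lifetime of the test.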
  • by fahrbot-bot ( 874524 ) on Wednesday May 20, 2009 @12:50AM (#28022153)

    In cases where we were giving what was a significantly degraded experience ...

    ... the normally degraded experience.
    (Ba da BOOM! Don't forget to tip your waitress.)

    • Re: (Score:2, Funny)

      That's sort of the same thing I was thinking. I mean, maybe they should have users opt in to such an experiment before they start degrading their experience.

      • by zxjio ( 1475207 ) on Wednesday May 20, 2009 @01:26AM (#28022315)
        Experimenting by delaying a pageload for 500ms is worthy of ethical considerations? Would you like to sue Microsoft for emotional damage? Too many people are afraid of doing anything these days.
        • by Jurily ( 900488 ) <jurily AT gmail DOT com> on Wednesday May 20, 2009 @02:17AM (#28022531)

          Experimenting by delaying a pageload for 500ms is worthy of ethical considerations?

          No, they should be shot on sight.

        • by Jesus_666 ( 702802 ) on Wednesday May 20, 2009 @03:38AM (#28022825)
          That's half a second! Let's do the numbers:

          We assume that Live search gets ten billion hits a day. We also assume that Microsoft degraded 5% of all hits. Thus Microsoft has wasted 10,000,000,000 * 0.05 * 0.5s = 250 million seconds! Microsoft wastes more than 7.9 years' worth of productive time per day. Now, assuming that the computers of the Live search users consume 800W on average, we find that Microsoft wastes a whopping 6.3 watt-millennia per day. Assuming that 80% of that is turned into waste heat, it's obvious that this has a non-negligible impact on Earth.

          Gentlemen, I think we have found the root cause of both the energy crisis and global warming (and, because our bitching about the oil price annoys the Arab world, also Islamic terrorism). Now all we need to do is keep Microsoft from doing these experiments and everything's dandy again.
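          The arithmetic is easy to check in a few lines of TypeScript; every input below is the post's made-up assumption, not a real traffic figure:

              // All inputs are the jokey assumptions above, not real figures.
              const hitsPerDay = 1e10;       // assumed Live search hits per day
              const degradedFraction = 0.05; // assumed share of degraded hits
              const delaySeconds = 0.5;      // injected delay per degraded hit
              const wattsPerUser = 800;      // assumed average PC power draw

              const secondsPerYear = 365.25 * 24 * 3600;
              const wastedSeconds = hitsPerDay * degradedFraction * delaySeconds;
              console.log(wastedSeconds);                  // 2.5e8 seconds per day
              console.log(wastedSeconds / secondsPerYear); // ~7.9 years per day
              console.log((wattsPerUser * wastedSeconds) / (1000 * secondsPerYear));
              // ~6.3 watt-millennia per day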
          • by tannsi ( 1460623 ) on Wednesday May 20, 2009 @05:06AM (#28023213)

            I think this would only actually be a problem if anyone used LIVE search.

          • by OakDragon ( 885217 ) on Wednesday May 20, 2009 @08:17AM (#28024345) Journal

            Modded as Funny (100%), but you've just laid out the environmentalist rationale for controlling everything everyone does, everywhere.

            • Right, as opposed to the other group that purposely prevents green technologies from being available to anybody who's willing to pay?

              I know that people like to pretend that there's some sort of massive conspiracy, but the fact of the matter is that if people made reasonable choices, forcing them wouldn't be necessary. I personally see no reason why emissions should vary so much between a car in California and one that's supposedly exactly the same in Washington or Illinois.

          • by PitaBred ( 632671 ) <slashdot@pitabre ... org minus distro> on Wednesday May 20, 2009 @10:15AM (#28025835) Homepage

            We assume that Live search gets ten billion hits a day

            While we're at it, can we assume that I have ten million bucks?

            • Re: (Score:3, Informative)

              by Thinboy00 ( 1190815 )

              Statistics [alexa.com]. Also, scroll down and note the % of people who actually proceed to search.live.com, and then look at this [alexa.com]. (Note that the last two statistics probably overlap gratuitously, so if you want to do any math, ignore the third, because the second is more precise.) And if you want some laughs, put Google and Yahoo into the compare boxes.

          • Microsoft wastes more than 7.9 years' worth of productive time per day.

            Hah, that's nothing compared to the time spent talking about Microsoft here on slashdot.

        • I agree. I don't think users are entitled to access to your page, or even "fair" access to your page. If they don't like it they can go elsewhere. It would be similar to a brick-and-mortar store giving crappy service: if you don't like it, go somewhere else next time. That was the whole point of the experiment anyway; they were trying to answer the question "How slow is too slow?".

          Hopefully this will get them to speed up their pages. Hotmail, for instance, is an absolute dog. I have a symmetric 155Mb connection...

          • by jargon82 ( 996613 ) on Wednesday May 20, 2009 @05:25AM (#28023285)
            Exactly. It's their site, and they're certainly allowed to do with it what they want :). They could do "market research" and ask people how slow it could be, but instead they are collecting real-world technical data and gaining insight into how the performance impacts real people. Hopefully they then use this to decide where to spend time on performance.
            • by Jurily ( 900488 )

              gaining insight into how the performance impacts real people.

              You have about one second between them hitting Enter and them reading the results. Any faster won't matter because the brain needs a context switch too; any slower and it'll be annoying. That'll be 4 million dollars, please.

        • Re: (Score:3, Interesting)

          by BikeHelmet ( 1437881 )

          If we can sue them, then we also have to sue Comcast!

          They frequently slow down my browsing with their cruddy filtering, to the point where some hops take seconds.

          This isn't right (since I'm Canadian), but tracert doesn't lie!

          It's horrible when a game's servers have Comcast lines between them and me. Rather than 50-150 ping, I face 700+. :(

        • Won't anyone think of the children?

        • by syousef ( 465911 )

          Experimenting by delaying a pageload for 500ms is worthy of ethical considerations? Would you like to sue Microsoft for emotional damage? Too many people are afraid of doing anything these days.

          Behaving ethically isn't something you do part time, or only under certain circumstances. You either behave ethically or you don't. A 500ms delay may be at the very top of that slippery slope - it ain't going to kill anyone - but it's still part of it.

          The other thing you're failing to consider is that you should always...

      • Re:As opposed to ... (Score:5, Informative)

        by Hal_Porter ( 817932 ) on Wednesday May 20, 2009 @01:28AM (#28022331)

        Gmail does "selective degradation" really well. E.g. if you load gmail over a slow VPN over wireless connection it says "This site is taking longer to load than normal, would you like to try the Basic Html version or wait longer". Also you can choose basic html (i.e. less ajax and css) as your default view.

        Basic HTML is quite usable these days - it even does email address autocompletion on Opera. So it can use ajax but it presumably doesn't depend of it. In a way it's a bit like a well written application which can use new features if they are present but run without them on downlevel systems.
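        A minimal sketch of that timeout-and-fallback pattern in browser-side TypeScript; the 10-second budget and the ?ui=html link are assumptions for illustration, not Gmail's actual mechanism:

            const SLOW_LOAD_BUDGET_MS = 10_000; // assumed patience budget

            let appReady = false;

            const slowLoadTimer = setTimeout(() => {
              if (!appReady) {
                // The rich client is taking too long; offer a plain-HTML escape hatch.
                const banner = document.createElement("div");
                banner.textContent = "This is taking longer than usual. ";
                const link = document.createElement("a");
                link.href = "/?ui=html"; // hypothetical basic-HTML endpoint
                link.textContent = "Load basic HTML version";
                banner.appendChild(link);
                document.body.prepend(banner);
              }
            }, SLOW_LOAD_BUDGET_MS);

            // Call this once the heavy scripts and styles have initialized.
            function markAppReady(): void {
              appReady = true;
              clearTimeout(slowLoadTimer);
            }

        The rich client is treated as an enhancement: the plain-HTML version is always reachable, and the timeout just surfaces it when the enhancement is too slow.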

      • by EvanED ( 569694 ) <evaned@g[ ]l.com ['mai' in gap]> on Wednesday May 20, 2009 @02:49AM (#28022669)

        That would probably completely invalidate the results though, for two reasons. First, the sorts of people who would opt into that wouldn't at all be representative. (It would take an unusual person to even find the opt-in, let alone volunteer for a degraded experience knowingly.) Second, knowing about it would be way too likely to affect how the people behaved.

        You could get halfway by saying "would you like to help us do research" or something like that, without saying in what way, which would reduce these problems, but not completely.

  • by mysidia ( 191772 ) on Wednesday May 20, 2009 @12:54AM (#28022177)

    selectively degrade the performance of pages for small sets of users

    In other words, Firefox, Opera, XP, and Linux users. And the experiment will get turned off, once they switch back to IE8 on Vista.

  • Agile and all that (Score:3, Insightful)

    by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Wednesday May 20, 2009 @01:03AM (#28022213)

    What is cool about the Web is that it is the most Agile of all release environments. Unlike shrinkwrap software, web software can be changed very easily and universally for all users. It brings a raw edge to the development of the software.

    In this, there is also the possibility of becoming complacent and ill-tuned to the needs of your users. Taking Google as an example, they keep their services in a perpetual state of beta, always in testing, never reaching a final v1. This type of reliance on constant feedback from customers may work for a short while, but unless the product reaches a state of relative stability (in terms of both not crashing and also not changing) the users will typically find some other software to use.

    So when Microsoft decides to impact a few customers with degraded QoS, they may be setting themselves up for a bigger fall later. By introducing the possibility that MS may actively sabotage your user experience in the name of experimentation and testing, they degrade their own reputation (as much as it can be degraded from its current levels) and needlessly increase FUD regarding their proffered services.

    It may be for a good cause, but customers should not be the ones testing Microsoft's software. As a professional software house, they should provide good quality control before software hits the servers. It doesn't matter if this is the age of Agile or not.

    • by Anonymous Coward on Wednesday May 20, 2009 @01:17AM (#28022265)

      In this, there is also the possibility of becoming complacent and ill-tuned to the needs of your users. Taking Google as an example, they keep their services in a perpetual state of beta, always in testing, never reaching a final v1. This type of reliance on constant feedback from customers may work for a short while, but unless the product reaches a state of relative stability (in terms of both not crashing and also not changing) the users will typically find some other software to use.

      You just disproved your own point.

    • by Volante3192 ( 953645 ) on Wednesday May 20, 2009 @01:20AM (#28022285)

      Yep. A half-second delay is the end of the world. Age of Linux as SaaS... or whatever buzzwords are in play here.

      I'm up for a good Microsoft rant any time, any place, but if a small batch of users has to take a performance hit to improve the experience in the end for all users, isn't that a positive thing? You can't really beta test this stuff. You can try running simulations, but nothing beats real-world numbers.

      Would we view this any differently if Apple tried it? Google?

      (Disclaimer: This is a logic exercise. In reality, I doubt there's actually much MS could and would do to their site to improve my experience using it.)

      • Re: (Score:3, Interesting)

        by perryizgr8 ( 1370173 )
        I think Google already does this. I read about a UI designer who left Google; he said that Google relied too much on experimental data for their colors and UI, rather than his advice. For example, when they had to choose the color of the search button on YouTube (dark blue or light blue), each color would be tried for a day. Due to the sheer volume of clicks, they would be able to see patterns and then decide which color users are more likely to click on.
        But the key difference here is that changing UI is nowhere...
      • Re: (Score:3, Funny)

        by Hal_Porter ( 817932 )

        if a small batch of users has to take a performance hit to improve the experience in the end for all users, isn't that a positive thing?

        Didn't Jesus say "the needs of the many outweigh the needs of the few"?

    • by Fallen Seraph ( 808728 ) on Wednesday May 20, 2009 @01:55AM (#28022439)

      In this, there is also the possibility of becoming complacent and ill-tuned to the needs of your users. Taking Google as an example, they keep their services in a perpetual state of beta, always in testing, never reaching a final v1. This type of reliance on constant feedback from customers may work for a short while, but unless the product reaches a state of relative stability (in terms of both not crashing and also not changing) the users will typically find some other software to use.

      Yeah! I mean, take IE6 for example. That didn't change in a REALLY long time, and lots of people use it! That makes it good, right? [/sarcasm]

      Your statement neglects quality. Yes, people want sites that're stable and don't crash, and yes, changing the design every week is bad and confusing, but improving on the design and function of a site is always a good thing, so long as you do so at intervals large enough for users to adjust to. The design of Gmail has only changed drastically two or three times in its history, but they still consider it a Beta (depending on what you consider a drastic change, of course).

      The issue is that Google, once simply a search engine, is now in the Web Services industry. The fact is, no matter what the label says, Gmail and many of their other apps are not in Beta, and haven't been in a long time. They're just hesitant to call it "v1" or something because that has a sense of finality, like customers shouldn't expect it to change very often. With the web, and Web Apps in particular, that's no longer really the case. They are often redesigned and redone to improve their performance, effectiveness, ease-of-use, and even aesthetics. You even point out yourself how agile the web is as an environment for releasing software. You neglect, though, that this keeps it interesting for the users as well, because they like the feeling that their product is continually being improved at no extra cost to them (unlike much shrink-wrapped software). (Note: When I say "extra cost," I mean in addition to any subscriptions they already have to the service, if any.)

      The "Beta" in Google's case is very much a marketing issue as much as it is a technical issue.

      • by x2A ( 858210 ) on Wednesday May 20, 2009 @04:31AM (#28023033)

        "The "Beta" in Google's case is very much a marketing issue as much as it is a technical issue"

        And perhaps a commitment issue, like people who stay engaged forever but never actually get married...

      • The "Beta" in Google's case is very much a marketing issue as much as it is a technical issue.

        I thought about this recently, and it occurred to me that the "perpetual beta" for the free Gmail service is probably also in part due to Google needing a large pool of beta users, so that when they sell their Gmail product to organizations that want an outsourced email solution, they can deliver a better product.

        Google gives away free email to everyone; Google gets free beta testers in return, so Google can sell a better product to corporations. Makes a lot of sense when you look at it this way. I do...

  • It also bears considering that part of the experiment was to observe users' responses to the degraded service. A professional software house can control the quality of their software to an arbitrary degree, for a cost. Understanding the marginal benefit of an additional "unit" of quality, however, requires them to characterize the users' response to software experiences of varying quality.

    • So when Microsoft decides to impact a few customers with degraded QoS, they may be setting themselves up for a bigger fall later.

      I bet most users will just think it is their internet connection. So they will stop and try to reload the page if it takes too long to load. But that gives Microsoft the information they need without really pissing off the users, as they will blame themselves, their computer, their wireless connection, their internet connection... They may blame Microsoft's web service, however...

    • Rather than cool, I'd call this the worst thing about the "web". If you buy a brand new X (car, TV, toaster, etc.) and half its features are missing or broken, aren't you pissed? But it's okay for software. This mentality enables companies to save money on engineering and QA and pushes these costs onto their customers. It's BS...

    • "What is cool about the Web is that it is the most Agile of all release environments. Unlike shrinkwrap software, web software can be changed very easily and universally for all users. It brings a raw edge to the development of the software."

      Sure, it's much easier to change a web application using JavaScript and the browser DOM than it is to modify a native application.

    • What is cool about the Web is that it is the most Agile of all release environments. Unlike shrinkwrap software, web software can be changed very easily and universally for all users.

      Which means that I as a user can't opt out of an update that breaks functionality I depend on (or just plain old like and want).

      [however, how often that happens and how important that is to you... is for you to judge]

  • by Anonymous Coward on Wednesday May 20, 2009 @01:14AM (#28022253)

    Thanks for the reminder, it's already been a couple of hours since my last flood ping! Now if you'll excuse me...

    The woods are lovely, dark and deep,
    But I have promises to keep,
    And pings to send before I sleep,
    And pings to send before you sleep.

  • ...compared to Google.
    But the home page of Live Search is great. So I open it every day and just watch the picture.
    • by x2A ( 858210 )

      Off topic comment re your sig:

      For some reason, whenever I see the initials RMS, my brain translates it as "Root Me Silly".

  • by syousef ( 465911 ) on Wednesday May 20, 2009 @01:35AM (#28022365) Journal

    If I were running a fast food restaurant, one of the first things it would make sense to do is pick groups of customers to punch in the face instead of giving them their order. It's all for a good cause. We want to know just how much abuse they'll take before they go down the road to the competition. That will help us figure out how good our food is. Now, did you want fries with that burger? *PUNCH* How about a *PUNCH* drink?

    See how absurd it sounds?

    • by lxs ( 131946 ) on Wednesday May 20, 2009 @01:39AM (#28022391)

      Don't be so negative. They're simply migrating the Vista experience to the Cloud.

    • by totally bogus dude ( 1040246 ) on Wednesday May 20, 2009 @02:36AM (#28022609)

      It sounds absurd because what you're saying is absurd.

      If you ran an experiment where some customers had their orders delayed by a few minutes more than was necessary and had some kind of metric to determine their enjoyment of their dining experience, it wouldn't be so absurd. Perhaps you provide free internet access in your store, and the extra delay results in a greater chance of people making use of it. And once they've started using it, there's a greater chance they'll decide to order a coffee after their meal and stick around for a bit longer.

      Or maybe you find they're less likely to return to the store. That might be hard to track, but the point stands. There are some things which are interesting and which may or may not give unexpected results when tried in real life. If an experiment like this shows that a few minutes' delay significantly upsets customers, then it becomes clear that spending extra money to have more staff on is probably worth the expense. On the other hand, if you can show that most people don't notice, then it makes sense to risk having a shortage of staff at peak periods if you can save a bit of money.

      You might even find unexpected results; for example, maybe a lot of people, after waiting a few minutes with nothing to look at but the menu, end up ordering more than they initially would, so it's actually profitable to make people wait longer. Who knows? The only way to find out is to experiment.

      • by nEoN nOoDlE ( 27594 ) on Wednesday May 20, 2009 @03:32AM (#28022805)

        Hey, who invited the logic guy to this Microsoft bashing thread?

      • When have you ever been to a restaurant and ordered food but wanted to wait longer than you should for it? Same question for the internet: do you ever click on a link and hope that it takes longer than you expect?

        No

        That's totally bogus dude

        • Your post is 'totally bogus' because no one is suggesting the people being tested 'hope' for something in this regard.

          The question is: how much of a delay causes the bulk of your customer base to walk away, and what mitigating factors can you put in place to keep them?
        • by x2A ( 858210 )

          Half a second. You've spent more time complaining about it than those people waited for their page to load.

          Just to put things into some kind of perspective.

      • by stephanruby ( 542433 ) on Wednesday May 20, 2009 @04:26AM (#28023007)

        If you ran an experiment where some customers had their orders delayed by a few minutes more than was necessary and had some kind of metric to determine their enjoyment of their dining experience, it wouldn't be so absurd.

        Sure, it wouldn't be so absurd, because we all know that a Microsoft Live results page is just like a nice burger, or a nice frothy Guinness being poured ever so slowly. The longer it takes, the better it usually is.

        In fact, that should be Microsoft's new marketing campaign: "At Microsoft Live, we make all our results from scratch and we don't pre-index anything. It does take a little bit longer, and we may not be the biggest search engine around, but that's just a sign we're focusing on delivering quality results -- not fast results."

        • There is a place in Japan where you can pay $100 for a cup of tea.
          Point is, additional value can be attributed to something if it is harder to get.

      • by syousef ( 465911 )

        It sounds absurd because what you're saying is absurd.

        Perhaps you think it's fine and ethical to run experiments on your customers. To me THAT sounds absurd, so I highlighted this in a way that someone without a sense of humour can't possibly appreciate.

        • A few people seem to be implying some kind of ethical issue with this practice, but try as I might I don't see the problem. It's not as if they're subjecting customers to some kind of degrading experience (despite the Slashdot headline). So the site loads slightly slow, or the page is a little less optimised, or the waitstaff are a bit slower. How is this at all unethical? These are all perfectly normal things that can and do happen "by accident" all the time. I don't see how artificially causing them to occur...

          • by syousef ( 465911 )

            A few people seem to be implying some kind of ethical issue with this practice, but try as I might I don't see the problem.

            Try harder.

            It's not as if they're subjecting customers to some kind of degrading experience (despite the slashdot headline). So the site loads slightly slow, or the page is a little less optimised, or the waitstaff are a bit slower. How is this at all unethical?

            Instead of serving the customer as best you can, you are using them as guinea pigs. You're slightly inconveniencing them to further...

        • Re: (Score:3, Insightful)

          I hate to break it to you, but customers don't know what they need. They can tell you what they WANT, but often, that's not what they NEED, and their feedback tends to be anecdotal or garbage data. That's why when new products, software, and sites are designed, they often go through a usability test, where potential customers *gasp* are brought in to use the product. Their feedback, though, is secondary to actual physical metrics that can be obtained by either watching them use the product or through automation.
          • by syousef ( 465911 )

            I hate to break it to you, but customers don't know what they need. They can tell you what they WANT, but often, that's not what they NEED, and their feedback tends to be anecdotal or garbage data.

            I hate to break it to you, but deciding what your customer needs based on your own rather biased point of view and your own tainted motivations is usually not in their best interest.

            Their feedback, though, is secondary to actual physical metrics that can be obtained by either watching them use the product or through automation.

            • I hate to break it to you, but deciding what your customer needs based on your own rather biased point of view and your own tainted motivations is usually not in their best interest.

              You don't seem to understand. It has nothing to do with bias. It's about gathering real, measurable metrics of performance (such as time taken, mistakes made, etc.) and usability, comparing them to expected metrics, making changes, and measuring the result once again. It has nothing to do with opinion and a lot to do with statistics.

              • by syousef ( 465911 )

                You don't seem to understand. It has nothing to do with bias. It's about gathering real, measurable metrics of performance (such as time taken, mistakes made, etc.) and usability, comparing them to expected metrics, making changes, and measuring the result once again

                You can't take a measurement without some bias. If you're trying to be scientific about things, you conduct a double-blind trial. If you don't understand that, you're the wrong person to be doing the measurement and metrics.

                It has nothing to do...

      • by mh1997 ( 1065630 ) on Wednesday May 20, 2009 @08:01AM (#28024205)

        If you ran an experiment where some customers had their orders delayed by a few minutes more than was necessary and had some kind of metric to determine their enjoyment of their dining experience, it wouldn't be so absurd.

        When I was a teenager, I worked at McDonald's. One day, some corporate people came into the restaurant with stop watches and notebooks. They had people pulled from the cash registers, then had extra people put at the registers. It appeared that they were doing something along the lines of what you are saying and what Microsoft did.

    • Re: (Score:2, Troll)

      by Facegarden ( 967477 )

      If I were running a fast food restaurant, one of the first things it would make sense to do is pick groups of customers to punch in the face instead of giving them their order. It's all for a good cause. We want to know just how much abuse they'll take before they go down the road to the competition. That will help us figure out how good our food is. Now, did you want fries with that burger? *PUNCH* How about a *PUNCH* drink?

      See how absurd it sounds?

      That's just fucking ridiculous. Do you really feel as violated when a page loads 500ms slower as when someone punches you in the face?

      If so, then wake the fuck up. It's a pretty interesting problem to determine how fast is "fast enough" for a page to load, and I don't blame them.

      Imagine google.com could load ten times faster than it currently does, but would increase their operating costs by ten times. I would bet that no one suggests that it would be worth it, so why is it so unreasonable to investigate...

      • Re: (Score:3, Insightful)

        by x2A ( 858210 )

        "You may be modded funny, but to me, that's only because there isn't a "douchebag" mod"

        I think the funny mod can actually be used defensively... when somebody says something, and they might be completely serious, but it's blatantly obvious that it shouldn't be taken seriously, a +1 funny mod can help to set the context in which it is read :-)

    • Just last night I got a phone survey that was obviously commissioned by my cable TV/Internet company. Among the questions asked was, "if your monthly bill went up by $3, how likely would you be to cancel your service?" They asked a variation of that question about a competitor's telephone service.

      Your example is funny, but don't you think McDonald's has done research on how long they can let a drive-through line get before customers go elsewhere?

  • Akamai? (Score:3, Interesting)

    by CAIMLAS ( 41445 ) on Wednesday May 20, 2009 @01:53AM (#28022431)

    Uh, don't all of MS's servers get fronted by Akamai systems (running Linux) to distribute the load and help mitigate DDoS attacks?

  • by chrylis ( 262281 ) on Wednesday May 20, 2009 @02:17AM (#28022527)

    Passing on the low-hanging fruit, it seems to me that this is pretty much exactly the kind of thing that happens all the time at the packet layer. WRED, for example, selectively drops packets even when buffers aren't full and the network is still theoretically operating under capacity so that the next TCP connection doesn't bring down the uplink. How is the Microsoft strategy qualitatively different?
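    For comparison, the core of the (W)RED idea fits in a few lines; this is a simplified sketch with invented thresholds, where "weighted" just means each traffic class carries its own drop profile:

        interface WredClass {
          minThreshold: number; // below this average depth, never drop
          maxThreshold: number; // above this average depth, always drop
          maxDropProb: number;  // drop probability as the average nears maxThreshold
        }

        const WEIGHT = 0.002; // EWMA weight; keeps bursts from whipsawing the average
        let avgQueueDepth = 0;

        function shouldDrop(currentDepth: number, cls: WredClass): boolean {
          // Track an exponentially weighted moving average of queue depth.
          avgQueueDepth = (1 - WEIGHT) * avgQueueDepth + WEIGHT * currentDepth;

          if (avgQueueDepth <= cls.minThreshold) return false; // plenty of room
          if (avgQueueDepth >= cls.maxThreshold) return true;  // effectively full

          // In between, the drop probability ramps up linearly, so individual
          // TCP flows back off early instead of everyone stalling at once.
          const fraction =
            (avgQueueDepth - cls.minThreshold) /
            (cls.maxThreshold - cls.minThreshold);
          return Math.random() < fraction * cls.maxDropProb;
        }

    Like the page-delay experiments, it trades a small, deliberate degradation for a better aggregate outcome.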

  • by ghjm ( 8918 ) on Wednesday May 20, 2009 @03:48AM (#28022851) Homepage

    A web page more useless than a blank page:

    http://havenworks.com/ [havenworks.com]

    Thank you, and good night.

    • Gaaaaaaaaah!

      I think I just caught colour blindness.
      • Re: (Score:2, Funny)

        by Anonymous Coward

        We had to evaluate that website for one of our IT units. In a room full of computers, all with that hideous monstrosity on the screen... where do you turn to?!

        I turned to alcohol. I can still see the site, I just don't care any more. :)

    • Dr. Bronner [blogs.com] makes sites as well as soap?

    • What is this I don't even
    • Damn, it looks like someone stretched out their ass and took a big dump between a bunch of <html> tags.

      I can see the slogans: "Havenworks: a work-safe and more subtle goatse"... Or maybe "Two girls one site"

  • Isn't this type of study best suited for a properly designed and executed "focus group"? That's surely the more appropriate way to do user testing.

    Experimenting with web site delays on live users is akin to inappropriately releasing an operating system before it's ready for prime time, and letting the users suffer by finding and reporting the bugs. Oh, wait...

    (Also, I'm sure MS has enough sections to their web properties, and enough traffic, and enough existing delays, that they could analyze their existing data...)

  • ...software... in ten years' time computers will be ten times faster and programs will be ten times larger and it will take the user ten times longer to do anything...

    What's the article really about, if not how to boil a frog and how to determine the rate of turning up the heat?

  • This isn't news. This is how it's done. Ignoring the fact that it's about degrading performance, split testing is designed to optimize one variable, and sometimes it's difficult to isolate said variable. In this case, Microsoft spends inordinate amounts of time and money to keep a high-volume site snappy and responsive. The question is: are they spending *too* much money? So they are attempting to answer that question using ye-olde-standard split testing methods.

    Nothing to see here. Move along.
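    As a rough illustration of what "the data moved to significance extremely quickly" means in split-testing terms, here is a two-proportion z-test sketch; the click-through metric and all the numbers are invented for the example:

        // Compare a success rate (e.g. click-through) between the control and
        // delayed buckets. |z| > 1.96 is roughly significant at the 5% level.
        function twoProportionZ(
          successesA: number, totalA: number,
          successesB: number, totalB: number,
        ): number {
          const pA = successesA / totalA;
          const pB = successesB / totalB;
          const pooled = (successesA + successesB) / (totalA + totalB);
          const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
          return (pA - pB) / se;
        }

        // Invented example: 52,000/100,000 clicks in the control bucket versus
        // 50,500/100,000 in the delayed bucket gives z ~ 6.7 -- far past the
        // threshold, which is why such an experiment can be stopped quickly.
        console.log(twoProportionZ(52_000, 100_000, 50_500, 100_000));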

  • the fastest-loading page of all is a blank one, but it's also the most useless

    You've got to be kidding! Have you ever looked at the internet?

  • Absurd (Score:3, Funny)

    by frovingslosh ( 582462 ) on Wednesday May 20, 2009 @12:52PM (#28028407)

    Microsoft will selectively degrade the performance of pages for small sets of users so that they can see how various amounts of delay at different times and places affect user behavior.

    Why, this is completely absurd. It would be like some moron deciding that people at Slashdot only read the top of the page and, rather than simply making a smaller page with a link to the rest of the information, only loading the top of the page until you try to scroll down and read more. Then suddenly things would jump around and muck up your sense of where you were on the page. The only thing that would be worse is if they put something cute or interesting at the bottom of the page to encourage you to scroll down to see it, triggering this very undesirable behavior frequently.
