
A Look At Microsoft's 'Mini Internet' For Testing IE

MrSeb writes "With the grandiose bluster that only an aging juggernaut can pull off, Microsoft has detailed the Internet Explorer Performance Lab and its extraordinary efforts to ensure IE9 is competitive and IE10 is the fastest browser in the world. Here are a few bullet points: 128 test computers, 20,000 tests per day, over 850 metrics analyzed, 480GB of runtime data per day, and a granularity of just 100 nanoseconds. The data is reported to 11 server-class (16-core, 16GB of RAM) computers and stored on a 24-core, 64GB SQL server. The 'mini internet' has content servers, DNS servers, and network emulators (to model different latencies, throughputs, and packet-loss rates)."
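For scale, here is some quick arithmetic on the figures quoted above; the inputs come from the summary, while the derived per-machine and per-test numbers are my own and assume the daily totals are spread evenly:

```typescript
// Back-of-the-envelope numbers derived from the summary's figures.
// Assumes tests and data are spread evenly across machines and runs.
const testsPerDay = 20_000;
const machines = 128;
const runtimeDataGBPerDay = 480;

const testsPerMachinePerDay = testsPerDay / machines;                // ~156
const megabytesPerTest = (runtimeDataGBPerDay * 1024) / testsPerDay; // ~24.6

console.log(`${testsPerMachinePerDay.toFixed(0)} tests per machine per day`);
console.log(`${megabytesPerTest.toFixed(1)} MB of runtime data per test`);
```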
  • by Anonymous Coward on Friday February 17, 2012 @12:22PM (#39076193)

    Why not just use the real internet?

    • by weszz ( 710261 ) on Friday February 17, 2012 @12:25PM (#39076257)

      They wanted to account for any kind of lag, so by having it all in house and disconnected from even their internal network, they have control over all variables so everything is equal.

      They did this post on their blog yesterday http://blogs.msdn.com/b/b8/ [msdn.com]

      • by ackthpt ( 218170 ) on Friday February 17, 2012 @12:39PM (#39076437) Homepage Journal

        They wanted to account for any kind of lag, so by having it all in house and disconnected from even their internal network, they have control over all variables so everything is equal.

        They did this post on their blog yesterday http://blogs.msdn.com/b/b8/ [msdn.com]

        They care about it so they created a genuine imitation of the real thing.

        Honestly, I'd point them at some of the pages I have to load each day, which are ludicrous in their use of content and scripting - web developers just pick up and drop widgets all over the place, with never a look toward what impact they have on the page being interpreted or rendered on the receiving end. I know I've got a bad one when I hear the processor fan kick in for a stinkin' web page!!!

        • by weszz ( 710261 ) on Friday February 17, 2012 @12:43PM (#39076493)

          They never do say what kind of sites, just that they are sanitized, real-world web pages... but if you are comparing IE to itself, or to Chrome or FF, whatever site you use should give comparable speeds, except for sites that detect the client and serve different things...

          "Content servers are web servers that stand in for the millions of web hosts on the Internet. Each content server hosts real world web pages that have been captured locally. The captured pages go through a process we refer to as sanitization, where we tweak portions of the web content to ensure reproducible determinism. For example, JavaScript Date functions or Math.Random() calls will be replaced with a static value. Additionally, the dynamic URLs created by ad frameworks are locked to the URL that was first used by the framework.

          After sanitization, content is served similarly to static content through an ISAPI filter that maps a hash of the URL to the content, allowing instantaneous lookup. Each web server is a 16-core machine with 16GB of RAM to minimize variability and ensure that content is in memory (no disk access required).

          Content servers can also host dynamic web apps like Outlook Web Access or Office Web Apps. In these cases, the application server and any multi-tier dependencies are hosted on dedicated servers in the lab, just like real world environments."
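          For what it's worth, a toy sketch of what such a sanitization pass could look like; this is purely illustrative (the file paths, regexes, pinned values and ad host are my own assumptions, not Microsoft's actual ISAPI tooling):

```typescript
// Toy sanitizer for a captured page: pin the non-deterministic bits so every
// replay serves identical content. Illustrative only; not Microsoft's tooling.
import { readFileSync, writeFileSync } from "fs";

function sanitize(html: string): string {
  return html
    // Replace Date and Math.random() calls with static values, as described.
    .replace(/new Date\(\)/g, "new Date(1329480000000)")
    .replace(/Date\.now\(\)/g, "1329480000000")
    .replace(/Math\.random\(\)/g, "0.5")
    // Lock a (hypothetical) ad framework's cache-busting query string to the
    // value seen at capture time, so the same URL is requested every run.
    .replace(/(ads\.example\.com\/serve\/[^"?]*)\?[^"]*/g, "$1?capture=1");
}

const captured = readFileSync("captured/online.wsj.com/index.html", "utf8");
writeFileSync("sanitized/online.wsj.com/index.html", sanitize(captured));
```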

          • One of the screenshots showed load times of URLs. I saw one was online.wsj.com. If you ask me, the Wall Street Journal site is one of the worst sites to load due to the amount of crap they try to shove onto it.
            • The solution is simple. Open-source IE10 and let us non-M$ folks make useful suggestions. Another idea is to let LibreOffice/Chrome/Firefox/Safari/Opera/Amaya/MySQL/Inkscape/Linux keep right on marching. Which leads to an interesting segue: "What's a trillion dollars worth, if you can't spend it?"
          • So, how many of those "content servers" with captured real-world web pages are dedicated to porn? (Hey, not joking - that -does- make up a huge percentage of traffic.)
        • by smooth wombat ( 796938 ) on Friday February 17, 2012 @02:16PM (#39077757) Journal
          web developers just pick up and drop widgets all over the place,

          Rule #2 of IT that should never be broken [earthlink.net]: Never let a web designer design your web page.

          Giving free rein to a web designer to design a web site is like giving a two-year-old a Faberge egg.
    • by greichert ( 464285 ) on Friday February 17, 2012 @12:31PM (#39076343)
      Because you cannot have reproducible results on the real Internet. Only a fake one, where everything is controlled and reproducible, can be used for testing and for making sure some settings do not make the browser slower.
      • by Bill, Shooter of Bul ( 629286 ) on Friday February 17, 2012 @01:01PM (#39076745) Journal

        You have to figure out which variables give you problems in real-world usage before you can start optimising your product to account for them.
        There have to be iterative cycles of real-world and then fake-internet testing to really make it work well.
        It would also help if you were able to test your competition along the same lines.

        I additionally wonder if they are accounting for all of the different behaviours of the various web servers out there. If they are only testing against IIS, well, that's not very good.

        • by Phlow ( 2488880 )
          If you wonder about that, you must be wondering if their entire team that put this together is chock full of morons. Seriously, I think a little more credit is due here.
        • by gutnor ( 872759 )

          Well duh! That is one of the major reasons you create that kind of lab in the first place: you find something in the real world, then you craft a similar scenario in the lab and make it part of either your test case or your benchmark. Obviously, part of crafting the scenario is making sure it behaves the same as in the real world. Other uses include debugging and analysis of edge cases (i.e. stuff that's difficult to find in the real world, or stuff that does not yet exist).

          Considering the renewed competition on t

        • by Bengie ( 1121981 ) on Friday February 17, 2012 @03:50PM (#39079003)

          Probably an evolving tick-tock setup between real and test.

          Step 1. Performance-test the app against the real world
          Step 2. Document real-world issues
          Step 3. Create a test environment to run your issue cases against
          Step 4. Optimize the app against the test environment
          Step 5. Go to Step 1, adding any new cases

          Steps 1 and 2 can happen independently of 3 and 4. Step 5 is just there to make the logic seem serial.
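          In code form, that loop might look roughly like this; every name below is a hypothetical stand-in, just to show the tick-tock shape, not anything from the article:

```typescript
// Sketch of the tick-tock loop described in the steps above. Each phase is a
// stub; a real setup would drive actual measurement and lab infrastructure.
type Issue = { page: string; metric: string; regressionMs: number };

const profileAgainstRealWorld = async (): Promise<Issue[]> => [];         // steps 1-2
const buildLabScenarios = async (_issues: Issue[]): Promise<void> => {};  // step 3
const optimizeAgainstLab = async (_issues: Issue[]): Promise<void> => {}; // step 4

async function tickTock(rounds: number): Promise<void> {
  const knownIssues: Issue[] = [];
  for (let round = 0; round < rounds; round++) {
    knownIssues.push(...(await profileAgainstRealWorld())); // find real-world issues
    await buildLabScenarios(knownIssues);                   // reproduce them in the lab
    await optimizeAgainstLab(knownIssues);                  // optimize, then loop (step 5)
  }
}

tickTock(3);
```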

    • Re: (Score:2, Insightful)

      by liquidsin ( 398151 )

      they don't want all those fancy test clients to be picking up the latest in drive-by syphilis; even microsoft knows better than to go on the real internet with explorer.

    • by thatskinnyguy ( 1129515 ) on Friday February 17, 2012 @01:01PM (#39076753)
      Because that would be somewhat unscientific. A lab setting is controllable, so you can pinpoint where latency is coming from and work out how to correct certain behaviors in the software. With the real internet, it's anyone's guess where lag is coming from.
  • And still... (Score:4, Insightful)

    by jcreus ( 2547928 ) on Friday February 17, 2012 @12:22PM (#39076203)
    Beaten by Chrome and Firefox.
    • by Anonymous Coward on Friday February 17, 2012 @12:24PM (#39076227)

      The only thing Firefox does fast anymore is update.

      • by gorzek ( 647352 )

        Oh snap!

        Yeah, that's why I ditched Firefox years ago for Chrome. Got sick of FF freezing/crashing all the time, as well as its performance just getting worse and worse over time.

        • And Chrome has built-in syncing of *everything*. And it natively supports Greasemonkey scripts without an extension. And it auto-updates in a cleaner fashion than Firefox. And it is blazing fast at rendering.

          • by gorzek ( 647352 )

            All true. Although I have a friend who uses Opera, and she was aghast at the way Chrome renders a white screen before it starts rendering the full page. She said under Opera, the full page just shows up all at once with no weird white screen first. I'm not sure but I think Opera's renderer might be a little faster than Chrome's.

            • Funny, I'm aghast at requiring everything to be downloaded and rendered before it's displayed. That gives the illusion of nothing being done, and subtly irks me.

              • by gorzek ( 647352 )

                I feel the same way, myself. I'd rather the browser render whatever it has available at that moment rather than wait for the whole page (unless it has to because the page is compressed or something.)

      • Re:And still... (Score:4, Insightful)

        by mcrbids ( 148650 ) on Friday February 17, 2012 @01:11PM (#39076881) Journal

        Firefox hasn't become any slower. What's happened is that everything else has become so much faster. There used to be a "pregnant pause" when entering a domain; I remember when 30 seconds to load a page was the norm over 56k.

        Now, I expect the results from a google search to appear as I type, interactively. This isn't just an improvement, it's a whole 'nother animal at that level of performance.

        • by sakdoctor ( 1087155 ) on Friday February 17, 2012 @01:48PM (#39077385) Homepage

          While Slashdot mocks computer-industry marketing for describing computers using a single metric, you seem to be quite happy with that when it comes to browser performance.

          An example: Chrome (V8 engine) has this reputation for amazing speed, but IE9 absolutely grinds Chrome into the dust when it comes to simply repositioning elements on screen, something today's web apps spend a lot of their time doing (see the sketch at the end of this comment). You can feel it too if you know what you're looking for. I don't follow IE's development as closely as Chrome's or Firefox's, but IE must be hardware-accelerating these translations.

          I fully expect Google to focus on performance cases which help their specific apps. Again, a conflict of interest, akin to Microsoft pre-caching masses of junk, so that Office can appear to start up much faster than the competition.
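          A toy browser-side micro-benchmark of what "repositioning elements" boils down to; illustrative only, not a fair cross-browser test, and the iteration count is arbitrary:

```typescript
// Move an absolutely positioned element and force a synchronous layout each
// iteration, so the style writes can't all be batched into one reflow.
const box = document.createElement("div");
box.style.position = "absolute";
box.style.width = "100px";
box.style.height = "100px";
document.body.appendChild(box);

const start = performance.now();
for (let i = 0; i < 50_000; i++) {
  box.style.left = `${i % 500}px`; // reposition the element
  void box.offsetLeft;             // reading offsetLeft forces layout
}
console.log(`repositioning: ${(performance.now() - start).toFixed(1)} ms`);
```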

          • by ifrag ( 984323 )

            so that Office can appear to start up much faster than the competition

            Ah yes, "the competition"... hmm, who exactly is that these days? Yeah, please don't say Libre/OpenOffice.

            I'm not at all saying MSOffice is good, because in a lot of ways it's terrible, but honestly thinking there is real competition now is a bit outrageous.

            • Still can't print with the latest release of OpenOffice on OS X. Forums online tell us it's a feature, or a bad driver (it's not the driver; all other apps print flawlessly), but definitely not a bug.

              I guess the feature they are referring to is that you can export to PDF and print.

    • Re:And still... (Score:5, Insightful)

      by thedonger ( 1317951 ) on Friday February 17, 2012 @12:29PM (#39076315)

      And when all we care about is the fastest browser - in nanoseconds! - will we begin to forget the truly important criteria for choosing a browser?

      Or better still, by the time IE is on par with Chrome, the actual browser will be irrelevant because mobile platforms - in which IE has little share - will do to traditional computers what Cro-Magnons did to Neanderthals. The next generation will use integrated devices, unaware they were using a browser, and with little or no need for even a choice.

      • by n5vb ( 587569 )

        The next generation will use integrated devices, unaware they were using a browser, and with little or no need for even a choice.

        And little or no understanding of how it works or how to use it as anything other than yet another few-to-many information channel they can listen to or watch, but can't talk back to in any real sense. And you're right that that's the direction it's going, but some of us aren't thrilled about that.

      • Milliseconds (maybe not so much nanoseconds) DO matter inside an animation loop.

        Oh, unless you were happy with Flash?

        • Did you ever see South End Park (a South Park parody from 1999 or thereabouts)? If that was all Flash was ever used for, I'd think it was worthwhile.
  • "As you can see, my young apprentice, your friends have failed. Now witness the firepower of this fully ARMED and OPERATIONAL battle station! "

  • by Anonymous Coward

    until they add some zombied computers and malware control servers.

    oh, wait.. these are windows test systems. never mind. some kind soul probably already added them

  • by erroneus ( 253617 ) on Friday February 17, 2012 @12:28PM (#39076291) Homepage

    I couldn't resist. But with all the work and effort and resources going into this, how is it that operations a tiny fraction of this size can generate fast, reliable and standards-compliant browsers better than MSIE?

    Microsoft, the problem isn't that you're not spending enough money. It's that you're not doing it right.

    • by roc97007 ( 608802 ) on Friday February 17, 2012 @12:39PM (#39076435) Journal

      I couldn't resist. But with all the work and effort and resources going into this, how is it that operations a tiny fraction of this size can generate fast, reliable and standards-compliant browsers better than MSIE?

      Microsoft, the problem isn't that you're not spending enough money. It's that you're not doing it right.

      I'm not familiar with IE 8 and 9, but in the past the issue was that many years and revisions of code reuse, accumulated cruft, an insane amount of backwards compatibility and some poor initial design choices combined to make each new version bigger, slower and buggier. I imagine this combines to make developing and especially testing any new release a massive undertaking.

      To be fair, I would argue that Firefox is starting that downward spiral now. Each new version is slower and has a bigger footprint. Personally, I've switched to Chrome, but when Chrome inevitably starts lagging, I'll be on the lookout for the next completely new browser. Not merely because it's new, but because it's less likely to have years of bad decisions weighing it down.

      The problem in Microsoft's case is that they seem incapable of dumping what they have and doing a complete rewrite. There may be marketing reasons for this, granted, but if they made a clean break it'd be better for them in the long run.

      • by Mr 44 ( 180750 )

        The problem in Microsoft's case is that they seem incapable of dumping what they have and doing a complete rewrite.

        It's really tempting to think that way, but actually doing a clean rewrite is usually a complete mistake. I'd say Mac OS X is one of the only successful examples of this (and that largely because their previous versions were so horribly outdated it was unbelievable).

        Joel has a great article on this from a while back:
        http://www.joelonsoftware.com/articles/fog0000000069.html [joelonsoftware.com]

        • Right, but back in 2000, I don't think we had the appreciation of accumulated browser cruft that we have today.

          I'd argue that Internet Explorer has fallen into the class of "so horribly outdated it's unbelievable".

      • To be fair, I would argue that Firefox is starting that downward spiral now. Each new version is slower and has a bigger footprint.

        As opposed to Chrome, which (on my machine, at least) has an even larger footprint?

        • I think the difference is that Chrome spawns separate processes instead of one big-ol-thang like Firefox. My computers seem to like multiple relatively large processes more than one gigantic process, even if the sum of the parts is greater than the whole.

      • It's just awes... Waiting on cache...
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      yes
      http://bellard.org/jslinux/

    • by Teckla ( 630646 )

      But with all the work and effort and resources going into this, how is it that operations a tiny fraction of this size can generate fast, reliable and standards-compliant browsers better than MSIE?

      And with more features, too!

      It's 2012 and IE9 still doesn't have a built-in spellchecker for text areas! Among many other must-have features that are suspiciously absent.

      • Re: (Score:3, Interesting)

        by jader3rd ( 2222716 )

        It's 2012 and IE9 still doesn't have a built-in spellchecker for text areas!

        If IE had a spell checker you'd call it bloat. And when, later in 2012, browsers running on Windows 8 won't need to worry about it because it'll be built into the OS, would that relieve your frustration?

      • To be fair, IE10 includes spell checking (and auto correct), though only preview versions have been released. The final version of IE9 was released in March of last year. I'm curious, what other "must-have features" are absent from IE10 (or IE9, for that matter)?

        I'm used to people disliking IE for how it was years ago and not giving it a fair chance today--and of course Microsoft hate is popular here (TFS opens with the "grandiose bluster" of an "aging juggernaut"--thanks so much for excellent and fair repo

  • These massive pages are a real benchmark for any browser. Or Google... they seem to be logging every page I go to now. Or eBay, with all that horrible bloat. Or Facebook, which is seriously clunky with too many competing scripts...

    If it's their own little internet, they should be using some of the most bloated, unresponsive web sites on the internet to test against. When IE10 comes out, I don't think I'll be surfing their own tiny little internet.

  • Microsoft, Howard Hughes is calling. Yes, he's read the MSDN article on IEPL and he'd really like his Spruce Goose [wikipedia.org] back.
  • Whatever the result, I hope Microsoft does well; that in turn will push competitors, and we the users should hopefully benefit. Though I feel IE has a long way to go...
  • HHGttG (Score:4, Funny)

    by Verdatum ( 1257828 ) on Friday February 17, 2012 @12:34PM (#39076371)
    Wasn't this a plot-point in the Hitch-Hiker's Guide to the Galaxy series? Having an artificial universe on-site so they could go exploring but still be able to come back for long lunches...
  • To mention the in-house installs of SWEN, TDSS, Melissa, and ILOVEYOU.
  • "And the Chevy Vega was thoroughly tested for millions of miles before being released to the public." GM has spoken!!
  • ...and they still can't see why users hate their software.

  • by Smask ( 665604 ) on Friday February 17, 2012 @12:48PM (#39076575)
    Does that mean they have only porn sites with midget porn? And a mini 4chan, populated with toddlers?
  • Then technically 40% of its traffic is pure Pr0n.

  • ... why we need IPv6. ;-)

  • One thing I'm wondering: given how, in Windows 7 and beyond, MS seems to be making IPv6 the default home networking protocol, any idea whether this mini internet they are experimenting with is an IPv6 internet or an IPv4 intranet (likely with private addresses)?

    Seems to me that that's what they should be doing. That way, they don't have to duplicate their efforts later to verify IPv6 compatibility.

  • ... servers do they have popping up "Your PC is infected!" messages offering bogus AV software to download?

  • The 'mini internet' has content servers, DNS servers, and network emulators (to model various different latencies, throughputs, packet loss)."

    I am all for this testing, but in practice I think it will lack one very important metric that Microsoft cannot measure: how well IE10 works with a site riddled with poorly written code.

    • oh don't worry, they use pages captured from the web after all.

      my worry is, will they also test it against the standards? you know, like they should be doing :O

      and personally, I'd rather have browsers that support the standards and that's it, than browser vendors having to maintain (and browsers having to contain) huge amounts of special case code, which in turn prolongs sloppy website authorship. I know very well it won't happen, but I would love to have such a browser. you know, a hardcore strict edition
