A Look At Microsoft's 'Mini Internet' For Testing IE
MrSeb writes "With the grandiose bluster that only an aging juggernaut can pull off, Microsoft has detailed the Internet Explorer Performance Lab and its extraordinary efforts to ensure IE9 is competitive and IE10 is the fastest browser in the world. Here are a few bullet points: 128 test computers, 20,000 tests per day, over 850 metrics analyzed, 480GB of runtime data per day, and a timing granularity of just 100 nanoseconds. The data is reported to 11 server-class computers (16 cores and 16GB of RAM each) and stored on a 24-core, 64GB SQL server. The 'mini internet' has content servers, DNS servers, and network emulators (to model varying latencies, throughputs, and packet loss rates)."
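The summary doesn't say how the network emulators work, but the basic idea of shaping a link with latency and packet loss can be sketched in a few lines. This is a hypothetical illustration, not Microsoft's implementation; the `latency_ms` and `loss_rate` parameters are assumptions for the example.

```python
import random
import time
from typing import Optional

def emulate_link(payload: bytes, latency_ms: float, loss_rate: float) -> Optional[bytes]:
    """Deliver payload after a simulated one-way delay, or drop it entirely.

    Real network emulators also shape bandwidth and jitter; this sketch
    models only fixed latency and random packet loss.
    """
    if random.random() < loss_rate:
        return None  # packet dropped
    time.sleep(latency_ms / 1000.0)
    return payload

random.seed(42)  # deterministic run, in the spirit of the lab's reproducibility goal
delivered = sum(
    1 for _ in range(1000)
    if emulate_link(b"GET / HTTP/1.1", latency_ms=0, loss_rate=0.05) is not None
)
print(delivered)  # roughly 950 of 1000 packets survive a 5% loss link
```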
not a true real-world test (Score:2, Informative)
until they add some zombied computers and malware control servers.
oh, wait.. these are windows test systems. never mind. some kind soul probably already added them
IE Crap (Score:1, Informative)
And still unable to correctly implement CSS3 and HTML5
Re:Could use the real internet eh! (Score:5, Informative)
They wanted to account for every source of lag, so by keeping it all in-house and disconnected even from their internal network, they control all the variables and every run is equal.
They posted this on their blog yesterday: http://blogs.msdn.com/b/b8/ [msdn.com]
Re:Could use the real internet eh! (Score:4, Informative)
They never say what kind of sites, just that they're sanitized, real-world web pages... but if you're comparing IE to itself, or to Chrome or FF, whatever site you use should perform similarly, except for ones that detect clients and serve different content...
"Content servers are web servers that stand in for the millions of web hosts on the Internet. Each content server hosts real world web pages that have been captured locally. The captured pages go through a process we refer to as sanitization, where we tweak portions of the web content to ensure reproducible determinism. For example, JavaScript Date functions or Math.Random() calls will be replaced with a static value. Additionally, the dynamic URLs created by ad frameworks are locked to the URL that was first used by the framework.
After sanitization, content is served similarly to static content through an ISAPI filter that maps a hash of the URL to the content, allowing instantaneous lookup. Each web server is a 16-core machine with 16GB of RAM to minimize variability and ensure that content is in memory (no disk access required).
Content servers can also host dynamic web apps like Outlook Web Access or Office Web Apps. In these cases, the application server and any multi-tier dependencies are hosted on dedicated servers in the lab, just like real world environments."
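The "sanitization" step the quote describes boils down to rewriting the captured pages so that every source of run-to-run variation is pinned to a constant. Here's a minimal sketch of that idea; Microsoft's actual pipeline isn't public, so the patterns and constants below are illustrative assumptions.

```python
import re

# Hypothetical sanitizer rules: pin non-deterministic JavaScript to static
# values so the same page renders identically on every test run.
REPLACEMENTS = [
    (re.compile(r"Math\.random\(\)"), "0.5"),
    (re.compile(r"Date\.now\(\)"), "1325376000000"),               # fixed epoch ms
    (re.compile(r"new Date\(\)"), "new Date(1325376000000)"),      # same fixed instant
]

def sanitize(page: str) -> str:
    """Replace sources of run-to-run variation with constants."""
    for pattern, constant in REPLACEMENTS:
        page = pattern.sub(constant, page)
    return page

html = "<script>var id = Math.random(); var t = Date.now();</script>"
print(sanitize(html))
# <script>var id = 0.5; var t = 1325376000000;</script>
```

A real sanitizer would also lock ad-framework URLs, as the quote mentions, but the principle is the same: substitute anything dynamic with a fixed value before serving the capture.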
Re:Granularity of 100 nanoseconds (Score:5, Informative)
No, it means that Windows has a 100ns granularity on its timestamps.
http://msdn.microsoft.com/en-us/library/windows/desktop/ms724284(v=vs.85).aspx [microsoft.com]
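That link is the FILETIME documentation: Windows FILETIME counts 100-nanosecond intervals since January 1, 1601 (UTC), which is where the 100ns figure comes from. A quick sketch of converting those ticks to a readable timestamp:

```python
from datetime import datetime, timedelta, timezone

# FILETIME epoch: 1601-01-01 00:00:00 UTC; one tick = 100 nanoseconds,
# so there are 10,000,000 ticks per second.
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(ticks: int) -> datetime:
    """Convert a FILETIME tick count (100 ns units) to a UTC datetime."""
    # Python datetimes resolve to microseconds, so 10 ticks -> 1 microsecond.
    return FILETIME_EPOCH + timedelta(microseconds=ticks // 10)

# One day = 86,400 s * 10,000,000 ticks/s = 864,000,000,000 ticks.
print(filetime_to_datetime(864_000_000_000))  # 1601-01-02 00:00:00+00:00
```

Note that 100ns is the unit of the timestamp, not necessarily the accuracy of the underlying clock; the hardware timer interrupt is typically far coarser.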