The Qube writes
"As a followup to the in-depth story posted back in February regarding the history and the development of Windows NT, part 3 of the series of articles is now online.
It discusses the software testing procedures inside Microsoft."
Microsoft has testing procedures? (Score:5, Funny)
Re:Microsoft has testing procedures? (Score:2, Funny)
*pink screen of death*
Damn, even that's buggy.
Microsoft's testing: (Score:4, Funny)
QA Engineer 2: "Yup."
QA Engineer 1: "Okay, I'm declaring it GM and releasing it to manufacturing."
QA Engineer 2: "It's 'Miller Time'!"
Here's another one (Score:5, Funny)
Re:Here's another one (Score:2)
Re:Microsoft's testing: (Score:4, Funny)
Re:Microsoft's testing: (Score:5, Funny)
Re:Microsoft's testing: (Score:2)
Why couldn't it be Pabst Blue Ribbon or Milwaukee's Best ("Microsoft's Beast?") or something?
I don't want to drink Borg Beer, man.... ;-)
Re:Microsoft's testing: (Score:2)
Obligatory (Score:2, Funny)
I'm definitely sticking with gnome (Score:5, Funny)
Originally, the KDE worked to upgrade its infrastructure to Windows 2000 on its own. But after spending 18 months evaluating and testing, Cornett realized they'd need some help. The department contacted Microsoft Consulting Services (MCS) to ask about architectural guidance, and hired a full-time technical account manager from Microsoft's Enterprise Services group. Eventually, the KDE joined Windows Server 2003 Rapid Adoption Program (RAP), which allowed them to begin working with the product early in its development process.
(ok, so when you look into it you're likely to realize that it's the Kentucky Dept of Education, but when skimming the article it caught my eye and I was really confused!)
Re:I'm definitely sticking with gnome (Score:2, Insightful)
Besides, as far as I can tell, Windows 2003 offers very few benefits over the finally-acceptably-stable Windows 2000.
Re:I'm definitely sticking with gnome (Score:2)
The video of this is out... (Score:3, Funny)
The camera flies down, zooming in & out, between dozens of the ten million monkeys at ten million PCs, and back up to a control desk manned by straw-chewing yokels.
A screen flashes red
"Sir! Monkey number Y435A23J has come up with something that boots!"
The camera pans around to Bill Gates' face
"I call it... Windows 2008. Release it"
or something
Re:The video of this is out... (Score:2)
flying_boy.rm [rm2novellc...lyingboyrm]
All jokes aside... (Score:5, Insightful)
Rather than take 'miller time' pot shots at Microsoft, the real takeaway is the understanding that, no matter how rigorous the testing and build process, there is a complexity limit where a unified one-organization nightly fix-build-test model simply can't provide a product of suitable quality.
Better to acknowledge the best-of-breed methodology Microsoft uses to test their OSes, and conclude that while this breed works okay for applications, a world-class operating system needs peer review and distributed open source development to create a quality, secure product.
Re:All jokes aside... (Score:5, Insightful)
Except that this article says nothing about the testing methodology that Microsoft uses. It describes how Microsoft helps certain customers test deployment. Deployment testing has little or nothing to do with software testing.
This is an article about how Microsoft has the budget to help "special" customers with a free "service" (not software) and frankly, the bits about offering cash-strapped school systems free consulting and test deployments sounds a lot more like a Microsoft press release than a software testing case study.
I was genuinely hoping to read about their software QA process. What a waste of 5 minutes.
--Asa
Re:All jokes aside... (Score:2)
Anyone know where that article is?
Ah, yes, the obligatory Linux advocacy (Score:2, Insightful)
And that would be Linux, I suppose? Because no bugs ever creep into Linux, and there's never been a security flaw found. Except if you read Bugtraq, of course.
This wasn't even the point of the article -- though it might have been
Re:Ah, yes, the obligatory Linux advocacy (Score:2)
I suppose I didn't state it clearly enough: I think Microsoft has GOOD QA, but that even with the BEST QA, there exists a product complexity level which, when exceeded, means that a distributed, ongoing, proactive QA system, such as that afforded by open source (or even Apple's bug/enhancement submission procedure) is a much better way to ensure a more consistently stable
Re:Ah, yes, the obligatory Linux advocacy (Score:2)
Sorry, my previous post sounds harsher than intended on re-reading it. My point was that I don't actually buy the mass development argument that open source advocates put forward. Please hop up a level and see my reply to jareth-0205 for more.
Re:Ah, yes, the obligatory Linux advocacy (Score:3, Informative)
That's the point! Bugs are much more likely to be found in an open system such as Linux because of the nature of open source development - everyone using the software can report and fix bugs, not just the limited few inside a company. The parent poster is actually complimenting MS testing, just saying that it can never be as good as open source because of the numbers involved.
Re:Ah, yes, the obligatory Linux advocacy (Score:2)
From my perspective (i use MS/95-XP and Linux) at least with the open source model, I
The myth of open source mass development (Score:2)
The problem is that I disagree with the practical value of this idea.
It's great in theory. Thousands of people all around the world can look over the source code to Linux, and submit their patches and so on. The same goes for any other popular open source project, be it Mozilla, Open
Re:Ah, yes, the obligatory Linux advocacy (Score:2)
Sure, but their track record on many other things doesn't help closed source, either. Look at many other vendors, Apple for example, and they're much better; sometimes they've been known to publish patches for security flaws faster than the open source community.
Re:All jokes aside... (Score:2)
I just have this vivid image of a code version of "Jurassic Park".
(Bill Gates playing the part of Hammond.) Oh no, no program escapes from Redmond Park. We do have several undocumented releases per year though...
Testing??? Not at all. (Score:5, Insightful)
Where is the description of the test methodologies used? The bug escalation and change control systems? What sort of configuration control is used?
Re:Testing??? Not at all. (Score:2)
Testing is more than making sure it works and is stable under load.
If they wanted to impress me, they should set up a separate lab full of programmers emulating script kiddies, and trying to hack into the servers to get at their data. Kiddies trying to take advantage of IE holes to plant trojans and own the servers.
Just like the real world.
Re:Testing??? Not at all. (Score:2)
Development costs (Score:4, Interesting)
Re:Development costs (Score:2, Insightful)
It's also a lot more popular.
Re:Development costs (Score:3, Insightful)
I don't know what you are talking about. I have been using OS X as a server OS for some time now, and it has got to be the easiest server OS to manage. It is more stable than W2003 Server, easier to manage, less expensive, etc. I am running it here [utah.edu] and in several other places, in addition to my primary workstation, which also hosts a couple of small-bandwidth websites.
Re:Development costs (Score:2)
Nonetheless, the user base of MacOS as a server OS is negligible. There simply are no deployments of the type talked about in the article, with hundreds of domain servers needing to be migrated. These guys don't mess around - they expect to have industrial-strength support during the upgrade, and they expect there to be no regressions.
Apple is in an entirely different league - they can ship a trivial OS
Re:Development costs (Score:3, Informative)
That may be, but until recently Apple has not had an OS capable of large scale serving. I have used IRIX, Solaris and Windows in the past, but I find OS X to be the best of breed in terms of a do it all OS.
There simply are no deployments of the type talked about in the article, with hundreds of domain servers needing to be migrated. These guys don't mess around - they expect to have industrial strength support during the upgrade, and they expe
apple.com is the #1 hardware site on the Web (Score:3, Insightful)
I am not so sure about that. In a recent survey of hardware sites, apple.com is the #1 site with about 3.5 million unique visitors, while hp.com is a distant second with 2.7 million. The Apple Store is probably the best online store, with annual sales in the billions of dollars; Apple also hosts the most popular QuickTime movie trailers, and the iTunes Music Store sells over a million songs a week.
I
Re:Development costs (Score:3, Informative)
There is a thing called Mac OS X Server, you Windows idiot.
>> The only parts of MacOS that are used for serving stuff is the open source code, which effectively is built and tested by the community.
You are talking pure shit through your fat ass. What about WebObjects, NetInfo, Apple Remote Desktop, NetBoot and a host of other Apple sysadmin tools?
Re:Development costs (Score:2)
The main reason for Apple's small market share in the server space is due to hardware, not software. But things have changed since the introduction of Xserve and Xserve RAID last year, and Apple's server market share in Dec 2002 had grown nearly 300% - not too shabby for a new entry. With IBM PPC 970 just around the corner, the performance issue
Re:Development costs (Score:2)
1- The cool hardware [apple.com]
2- The nice software product [apple.com]
3- The independent support site [macosxserver.com]
Mac OS X is quite a good server which is not encumbered by a stupid GUI when you don't need it.
Re:Development costs (Score:3, Funny)
Re:Development costs (Score:2)
Nah, couldn't be that. Must be because MS sucks and Apple doesn't!
Re:Development costs (Score:2)
That they support probably two or three orders of magnitude more hardware is reason enough, but on top of that they don't have the luxury of a significant chunk of their development being done for free by the OSS community.
Maybe if Apple had spent similar amounts of money on OS X, you wouldn't have to have the fastest Mac available just to be able to run OS X
Re:Development costs (Score:3, Informative)
What the fuck are you talking about?
Other than the different CPU architecture, Mac OS X and Windows both support the same sorts of hardware: ATI and nVidia GPUs, Ethernet, USB, FireWire, 802.11b, and SCSI and ATA drives. Apple is usually years ahead of MS in adopting new technology: USB, FireWire, FireWire 800, Bluetooth, 802.11b, 802.11g, gigabit Ethernet, Rendezvous.
Apple is 60 times smaller than MS, but actually ma
Re:Development costs (Score:2)
Do you have any concept of just how many different pieces of hardware that covers? "Ethernet" on its own would encompass *thousands* of different types of network cards, all of which require different drivers. Similarly for things like "SCSI". Heck, XP probably supports near a hundred different *motherboard chipsets*. Also, it may
Re:Development costs (Score:2)
Similarly, Apple provide free programming tools and documents for companies to write device drivers, but ultimately the manufacturers have to take the main responsibility to support their own products.
In any case, this really has very little to do with the quality of the OS.
Re:Development costs (Score:2)
But what _actually_ happens is that Microsoft write multitudes of hardware drivers to give basic functionality for a wide range of hardware to their customers. Just like Apple have drivers that support some of the hardware they don't ship (eg: wheel mice).
In any case, this really has very little to do with the quality of the OS.
Re:Development costs (Score:2)
But not for "hundreds of thousands components" as the other guy claimed earlier.
>> Just like Apple have drivers that support some of the hardware they don't ship (eg: wheel mice).
Exactly. Apple writes a generic USB mouse driver that also supports mice with wheels and two buttons, which doesn't mean they have to write drivers for every p
Re:Development costs (Score:2)
Have you ever tried Mac OS X?
My 400 MHz iMac bought 4 years ago runs faster and smoother than Win XP on my 800 MHz PC, and it does much more than XP and works 24 hours a day for weeks and months without getting shut down. In contrast, the PC has to be shut down by the end of each day because it's too noisy, and it still crashes on
Re:Development costs (Score:2)
I use it all day, every day.
I've used OS X on nearly every Mac from a 233MHz Beige G3 all the way up to a Dual 1.25GHz G4 as well, so I've got a rough idea how fast it runs on each of them - and I've yet to see one that runs even close to as smoothly as my ~4 year old dual P3/700, let alone some of the dual 3GHz monsters you can buy today. Heck, my ~7 year old dual Pentium 200 runs XP about as fast as your 400MHz iMac would - you can't even run OS X on a Mac that old.
This is
Re:Development costs (Score:2)
What exactly are you doing with OS X machines? I do lots of programming and graphics on an iBook, and just don't feel it's slow at all, unless I watch QuickTime video and play iTunes at the same time as well
Re:Development costs (Score:2)
That's because the two OSes exhibit different behaviours under excessive load conditions. OS X's superior (in terms of features) graphics system performs double-buffering, so you never get the half-drawn wi
Re:Development costs (Score:2)
Apple got to throw away all their mistakes when they started making OS X. They don't need to support nearly so many hardware experiments - ISA, VLB, MCA, assorted stupid methods of getting to "high" memory, fifty different ways of using large hard drives etc. etc. They also don't need to support a wide assortment of "good idea at the time" legacy technologies, DCOM and others of their
MS mad house (Score:2)
Despite the author's infinite admiration for MS, his description of the War Room in part 2 is a clear indication that the Redmond Beast lives in a madhouse.
I feel particularly sorry for the poor developers who suddenly were asked to fix the tens of thousands of "branding bugs" after MS had decided to drop th
Re:Development costs (Score:2)
The core Windows team is a drop in the bucket compared to all of Microsoft's other projects. Go to Microsoft's website and try to select a product. The selection is huge!
Most of the developers for Windows are actually only partial contributors. They work on the
A
Re:Development costs (Score:2)
According to the article, there are 8000 to 10000 programmers working on Win 2k3. I don't think Apple has that many employees worldwide.
MS is about 60 times bigger than Apple and has more than $40 billion in cash. Apple is primarily a hardware company, so lots of its resources are devoted to hardware innovations. But the irony is that Apple's software portfolio is actually bigger and better than that of MS. You may find that
Re:Development costs (Score:2)
Apple isn't spending money testing scalability, although their GUIs are certainly pretty.
Re:Development costs (Score:2)
Re:Development costs (Score:2)
Pure rubbish!
Tell me how much code in Windows is due to differences between PCs made by IBM, HP, Dell, or other box makers. They all use Intel CPUs and ATI or nVidia GPUs, etc.
I don't know where you get your hundreds of thousands of components from, but the device drivers are all written by the device makers themselves, not MS.
MS just hypes and hypes (Score:2)
Mac OS X provides low-level support for industry standards like USB, FireWire, Bluetooth and so on. It's totally up to the device makers to write the device-specific stuff and do most of the testing. It's my experience that plug-n-play works much better on Mac than on PCs. M
Re:MS just hypes and hypes (Score:2)
Secondly, you really have to get yourself a Mac to experience the power and elegance of Mac OS X. You obviously have lots of theory about OS architectures, but OS X on my 400 MHz iMac bought 4 years ago just runs faster and is much more stable than Win XP on my 800 MHz PC. The only thing that XP is good at is faster booting, but that's a moot point because I do
Re:MS just hypes and hypes (Score:2)
Project Builder is a very powerful IDE for building large Cocoa/Carbon apps or Framework/Bundle/Kernel Extension/Plug-in/Unix tool targets in C/C++, Objective-C/C++, Java, and AppleScript, with highly sophisticated dependency management for multiple targets, build styles, and build phases.
Interface Builder is the only GUI builder that I have ever used that is capable of making a sleek GUI with virtually no cod
Testing (Score:5, Interesting)
Re:Testing (Score:4, Insightful)
As part of the Microsoft culture, it appears that you've missed the point.
The problem is the 50 million lines of code itself.
I would have "managed" NT's testing by "not managing it" at all, and instead would have clipped out all those bells and whistles to make a much more trim and modular OS. The code base is unnecessarily large, from a functional point of view.
But just like the current SUV problem in America, it appears that Microsoft is dancing a tango with the consumers. Microsoft produces shitty code that looks good on the screen, and the consumers say "ohh" and "ahh" while not minding the crashes and restrictions, and then Microsoft gets encouraged to produce more "pretty code". I don't consider this problem to be fixable
Re:Testing (Score:2)
People tend to want options, all of those 'bells and whistles' are just the options that people keep asking for.
Re:Testing (Score:2)
If enough people ask for it, the laws of economics demand that someone will provide it...
Re:Testing (Score:2)
When you install a Microsoft OS on a new machine, what do you get? You get the underlying OS, a pretty but braindead GUI, IIS, and a few toy apps (like Wordpad and MS Paint).
When you install a Linux distro, you get the OS (kernel and GNU tools), a less pretty but less braindead GUI (only if you want it), Apache (again, only if you want it), and a boatload of useful programs (GCC, CVS, The GIMP, OpenOffice, etc., and
Re:Testing (Score:5, Informative)
It's a fine line, and MS has done a fairly good job given the size of their code base and the pressure from consumers to get new products out in a timely way.
Whoa, there. Since when is it the consumer who is pressuring MS for new products? It seems to me that it's MS who has been rushing new "features" into production and pressuring consumers to upgrade. I don't know of anyone who had a burning desire to upgrade to Word 2K or Windows XP. The fact that others were upgrading and causing compatibility problems was the compelling reason.
Re:Testing (Score:2)
By the way, Windows NT is one of the best, if not the best, pieces of software created. The kernel architecture is vastly complex, far more complex than Unix/Linux. They have do
testing procedures? (Score:2)
Kentucky section on NT4-AD migration (Score:3, Interesting)
Any NT admins out there care to shed a bit more light on this? All I remember is that domain management under NT4 wasn't all that swift, but it's been several years since I've had to do anything with it.
Resources (Score:3, Insightful)
"They asked us how much space we needed, so we did the math, and it came to about 1 terabyte (TB)," Cornett told me. "On the first day, we left the EEC at 1:00 am on Tuesday, and were back in the lab at 8:30 am that morning. They had run fibre to our room, and given us access to a SAN with two 500 GB drives. We had a need and immediately, it was solved. They said, 'We were
THANKS (Score:2)
It's just another Microsoft White Paper (Score:3, Insightful)
"Hey buddy, c'mere, I'm going to give you the inside skinny of how testing really works inside Microsoft. First, we read slashdot and realized that nobody installs our products until SP1. So we developed this whiz bang testing center filled with every kind of PC you'd find in the real world. Y'know, Dells, IBMs and HPs (which stands for Hewlett Packard), we got 'em all. This center is FREE (as in beer) for our customers to use to test their real life products. First off, they have to give us all of their custom in-house software, y'know, so we can test. We keep that, btw. Secondly, it helps if they hire our Certified Microsoft Advisory board to assist them through the process. Lastly, we customize our software to work with those environments. And that's how we prove that our software is ready to go from day 1 and you don't need to wait for a service pack so buy it NOW!"
All kidding aside, it's interesting to note that the CTO of Jet Blue has been giving lots of interviews to tech magazines explaining how "off the shelf" Windows software [we customize our software to work with those environments] not only works better than Unix/Linux but because he uses only Windows, he has been able to reduce his development staff greatly because he can do more with fewer people. That set off red flags in my mind... now that Jet Blue appears in this article as a testament to MS testing... well... I smell a PR campaign...
Seriously.... (Score:5, Insightful)
I used to do driver development for NT4.0. As such, I had a "victim" machine and a "debugging" machine, linked via a serial cable. The victim runs my driver, and I do my development and debug using the debugging machine to access the kernel debugger on the victim.
A normal cycle of development went something like this:
Note: this was a fresh install of NT4.0 debugging, with SP4. No third party apps (other than my own) installed. This was using Microsoft's WinDBG.
Now, I don't know about Microsoft's developers, but I regard an assertion failure as a failure - i.e. a bug to be fixed. Having HUNDREDS of them in released code is just unacceptable. Using an ASSERT() as a debugging printf() is wrong.
So either a) the MS developers have a different view of things than I do, or b) the MS developers were allowing hundreds of easily identified problems to go into release.
Now, EVERY non-trivial software project's lead engineer must make a decision at every release - "Do I fix these bugs and slip the release, or fix these bugs in the next release?" And EVERY lead will allow some bugs to slip. Usually, those bugs are deemed minor - spelin mestaches (sic), layout errors, things like that.
But to have a) hundreds of assertion failures, which give you file and line number of the error, and b) a memory leak in your debugger bad enough that you can WATCH it leak away hundreds of megabytes of memory each time, and to allow that to go out? Ugh.
Now I am sure that MS Q/A found those errors - if not, they are far more incompetent than I am willing to assume they are. So clearly Q/A was overruled by management - "We don't care, ship it anyway!"
And that is the central problem to ANY Q/A department - if management overrules them, and forces a shipment anyway, then how do you blame Q/A?
I've said this before, and I shall say it again now: this is one of the places a real ISO-9000 standard can be useful. If the spec sayth "Lo, and the release candidate code shall have no bugs open against it in the bug tracking system, and any bugs that exist shall be clearly targeted to later revisions, and Q/A shall findth no undocumented bugs in the code, or the release shall be slipped, and the bugs corrected, AMEN!" then Q/A can say "OK, if you want to throw our ISO-9000 cert out the door, then by all means override us and ship."
(Yes, that won't prevent management from simply targeting all bugs to a later revision and shipping, but it at least forces some consideration of the consequences to be made.)
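The release gate described above can be sketched as a simple query against a bug tracker. Everything here - the record fields, the `release_gate` function - is invented for illustration; no real tracking system is implied. The point is just that "no open bugs against the release candidate" is mechanically checkable, while retargeting a bug to a later revision (the loophole the poster concedes) still passes the gate.

```python
def release_gate(bugs, candidate_version):
    """Return (ok, blockers): ok is False if any open bug still
    targets the candidate version. Bugs retargeted to a later
    revision do not block, per the (imperfect) policy above."""
    blockers = [b for b in bugs
                if b["status"] == "open" and b["target"] == candidate_version]
    return (len(blockers) == 0, blockers)

# Hypothetical tracker contents at release time:
bugs = [
    {"id": 101, "status": "open",   "target": "5.1"},  # retargeted: allowed through
    {"id": 102, "status": "closed", "target": "5.0"},  # fixed: fine
    {"id": 103, "status": "open",   "target": "5.0"},  # open against the RC: blocks
]

ok, blockers = release_gate(bugs, "5.0")
print(ok, [b["id"] for b in blockers])  # -> False [103]
```

Under such a gate, management can still ship over Q/A's objection, but only by visibly editing bug 103's target field - which is the audit trail the poster is really asking for.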
Re:Seriously.... (Score:2, Insightful)
Worked at one place where they mandated that on us. One dude documented how to format a floppy disk in DOS. Then another doc how to put a sticker on it.
Unless it's taken seriously, it's neat to have, but other than that...
NT4 was basically NT3.51 with the graphical shell. NT3.51 was 2 years old by the time NT4 shipped... Few were serious about using it at the
Re:Seriously.... (Score:3)
You don't want your development machine to be the machine your driver runs on. If your driver breaks, it could crash the machine, corrupt the file system, or any number of nasty things.
You also need to be able to inspect OS level objects, which you usually cannot do on a running system. So, most OS's provide a debug capability that "freezes" the system, and through a simple interface (usually serial) allows you to inspect the state of the whole
Re:Seriously.... (Score:2)
The phrases "mental image", "hysterical", "imagine" and "LOL" should have been a clue that I was not exactly writing a technical post. Here's a nickel, go buy an #include "humor.h"
I was just saying that your "victim machine" phrase triggered an amusing mental image while reading the post. If you re-read your post with that spin it's pretty funny. I can't imagine how you interprete
I read the article (Score:2)
But perhaps there is one already, and the output from stuff like this is the ASCII cow art. Or, Heaven forbid, goatse.
NT (Score:3, Insightful)
Those years of public beta testing certainly paid off.
Check this PPT to. (Score:3, Informative)
Re:Check this PPT to. (Score:2)
Just a couple of highlights:
The complete source code was 50 Gigabytes.
Build time was 8 hours.
The source code control system tracked over 411,000 files.
There were a lot of challenges trying to keep 5000 people working on the same operating system at once, they learned from problems and improved the process for Windows 2000.
It is high-level data, but it is still quite interesting.
Quote from the Ky. Dept. of Ed. customer. (Score:5, Insightful)
These guys obviously aren't students of "Licensing 6.0". [com.com]
what they shoulda done... (Score:5, Funny)
Hmmph. (Score:4, Insightful)
I think Microsoft would do well to test more and make less. Each incarnation of Windows seems to have brought disproportionately large improvements (or hindrances if you like) in the user interface, features, and resource consumption. Whilst a gradual accumulation of features and a slow increase in resource use is inevitable for any operating system I think Microsoft has been making their systems grow too much too quickly.
Microsoft seems to be running out of new features to add to each new version of Windows to entice consumers, and is resorting to making its own features (notably,
As such I feel that MS would benefit from focusing on testing instead of adding new things. Consolidation is often just as helpful as (if not better than) augmentation, particularly for larger systems. I feel that sales would remain high if Windows had no new features or UI but could genuinely be considered as stable as alternatives.
Comparing with the UNIX model (Score:5, Interesting)
That's why the Win32 system spirals into complexity, no matter how much money is pumped into development or testing. Of course, one of the best things about Windows is also one of the worst: vendors developing their own drivers for their hardware might make incompatible or bad drivers, or ones that step on the feet of other installed drivers in the system. In the Linux kernel, all the drivers are present before the testers and are considered while any major change takes place, such as the VM or switching to 64-bit CPUs. This is true for most other UNIXen, where drivers are sent to the Unix vendor for testing as well, but that's not as efficient as the Linux model.
And then the number of eyeballs testing Linux and FreeBSD is a phenomenon Microsoft can't copy. The free software community does not work for a paycheck, but there's more sincerity toward the software than there would be for proprietary software. Free software can be a matter of ego and gives a sense of competition with Microsoft. You can't buy that. This, I believe, is the biggest reason why colossal man-hours are poured into free software development, while some of these developers work the rest of their days as data entry or office clerks, or even at McDonald's.
Re:Comparing with the UNIX model (Score:3, Insightful)
Re:Comparing with the UNIX model (Score:2)
Re:Comparing with the UNIX model (Score:2)
"Many eyes" philosophy of open source: agreed, in theory this should render Microsoft obsolete, in practice it has merely spurred Microsoft to create a better product
It is rendering Microsoft obsolete. Our company buys laptops for employees and promptly replaces the Windows XP with 2000 before handing them out.
And that renders Microsoft obsolete how, exactly?
Both in performance and stability, Microsoft remains defeated
Every study that has directly compared
Re:Comparing with the UNIX model (Score:2)
Linux: from the people who ask "would you like fries with that?" for a living.
Re:Comparing with the UNIX model (Score:3, Informative)
Interesting article, but not really about testing. (Score:4, Informative)
The lifecycle of a software bug in the Windows Division
1) The bug is found, reported in a bug tracking system, and assigned to the developer.
2) The bug is evaluated by managers for severity and triaged in a daily "war" meeting. At this point, the bug may be postponed until the next cycle, or marked to be addressed in the current cycle.
3) For all open bugs in the current cycle, the developer investigates and creates a fix, frequently running a few tests before marking it as ready for test.
4) Testers make sure the bug is fixed, look for any additional problems, look for related issues, and frequently even run a regression test pass to make sure that the developer didn't accidentally break something else while making the fix. If there are additional problems, the bug goes back to the developer to make a better fix, otherwise the bug is marked as okay to check in.
5) The developer then code reviews the changes with another developer, builds the changes for all platforms to catch any possible compile breaks, and then checks in the changes.
6) The build lab picks up the changes for the day and starts to compile.
7) If a compile break occurs, usually because someone was in a hurry and didn't follow the rules, an on-call developer triages and fixes so that the compile can continue.
8) When the build finishes, it is installed on a set of machines, and a series of build verification tests are run to ensure that the build is at least good enough to run some tests.
9) When the build verification tests finish, then the testers install that build and double check that the bug is still fixed, and mark the bug as such.
10) Finally, the tester adds a regression test to their test plan, and automates that test so that it will at least be run before the end of every major cycle, sometimes every minor cycle, every week, every build, or for some issues even as part of the build verification tests.
Major cycles are for betas, and final releases, minor cycles are for releases to be deployed internally, builds tend to come out daily. At the start of a cycle, and in early cycles, the bar is fairly low, almost any bug can be fixed and added to the build. Near the end of each cycle, and at later cycles, the requirements are increased so that only changes that are absolutely necessary are taken, reducing the risk of introducing new problems that won't be discovered until after the product is released. At some point in every major cycle, the bugs and test plans are reviewed to find areas that need improvement.
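The ten steps above read naturally as a state machine. The sketch below is a hypothetical rendering of that list - the state names and transition table are my reading of the steps, not Microsoft's actual tracking schema - but it makes the allowed paths (postpone at triage, bounce from test back to the developer) explicit.

```python
# Hypothetical bug-lifecycle state machine modeled on the ten steps above.
ALLOWED = {
    "reported":      {"triaged"},                    # step 1: filed and assigned
    "triaged":       {"postponed", "fix-ready"},     # step 2: war-meeting triage
    "fix-ready":     {"in-test"},                    # step 3: developer fix
    "in-test":       {"fix-ready", "ok-to-checkin"}, # step 4: bounce back, or approve
    "ok-to-checkin": {"checked-in"},                 # step 5: code review + check-in
    "checked-in":    {"verified"},                   # steps 6-9: build, BVTs, re-verify
    "verified":      {"regression-automated"},       # step 10: regression test added
}

class Bug:
    def __init__(self, bug_id):
        self.bug_id = bug_id
        self.state = "reported"

    def advance(self, new_state):
        """Move to new_state, rejecting transitions the process forbids."""
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

bug = Bug(4711)
for s in ["triaged", "fix-ready", "in-test", "ok-to-checkin",
          "checked-in", "verified", "regression-automated"]:
    bug.advance(s)
print(bug.state)  # -> regression-automated
```

Encoding the process this way is what lets the bar-raising described above work: late in a cycle, tightening the gate is just removing transitions from the table.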
Additionally, code is instrumented to measure test coverage; quality standards in a number of areas - accessibility, reliability, scalability, globalization, localization, integration, interoperability - are measured for improvement; usability studies are performed; code profiling tools are used; code scanning tools look for execution paths that could result in problems and automatically file bugs; testers bash on other components; and anything else anyone can think of is done to find problems early.
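As one small illustration of the coverage instrumentation mentioned above, here is a minimal Python sketch using `sys.settrace` to record which lines of a function a test actually exercised. The function and tracer are invented for illustration; real coverage tools (gcov, coverage.py) are far more sophisticated, but the principle - instrument, run the tests, diff executed lines against all lines - is the same.

```python
import sys

def classify(n):
    if n < 0:
        return "negative"
    return "non-negative"

executed = set()

def tracer(frame, event, arg):
    # Record each line executed inside classify(); ignore other frames.
    if event == "line" and frame.f_code.co_name == "classify":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
classify(5)            # the "test suite": exercises only one branch
sys.settrace(None)

# Body lines of classify, relative to its def line:
first = classify.__code__.co_firstlineno
all_lines = {first + 1, first + 2, first + 3}
missed = all_lines - executed
print(len(missed))  # -> 1: the "negative" branch was never tested
```

A coverage report like `missed` is exactly the kind of signal that tells a test team which paths their regression suite never touches.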
However, the pace is incredible and problems can come from anywhere. Imagine testing an Xwindows application to configure networking while the kernel is changing, the networking core is changing, Xwindows is changing, the shell is changing, the compiler is changing, your application is changing, and the tools you use to test with are changing. It is a challenging job.
If you want to bash Microsoft, that's fine, I used to...hence my handle, but now that I have seen inside the "beast", it's just a business, most of the rumors are very off base, and most of the people there are just normal people who want to do the right thing.
Public Relations for Microsoft (Score:2)
Apple.com #1 hardware site - 50% more hits than HP (Score:2)
http://www.internetretailer.com/dailyNews.asp?id=9361
"Apple Computer Inc.'s Apple.com led all computer hardware sites in number of shoppers for the week that ended May 11, according to Nielsen/NetRatings' AdRelevance report. Apple.com logged 3.75 million unique visitors, 73.7% of all visitors to hardware sites, which hosted 5.09 million shoppers for the week.
Next behind Apple was Hewlett-Packard Co.'s HP.com at 2.47 million visitors; Dell Computer Corp., at 1.94 million; Gate
Re:Text of the article (fixed formatting) (Score:5, Funny)
Windows Server 2003: The Road To Gold
Part Three: Testing Windows
and ends
Can anyone recommend a way to get my cat off heroin? It would be much appreciated.
Re:Inside MS... (Score:2)
Re:Let's see- (Score:3, Informative)
Pentium III @ 1GHz
256MB RAM
GeForce2 MX
RedHat 8, upgraded to the 2.4.21-pre5-ac2 kernel
KDE 3.0.3-8 RedHat
And I almost never have problems with KDE. I use Kate for almost all my programming, and I can count the number of times it's crashed on one hand with fingers left over.
You know that you can adjust how much CPU time KDE uses, right? I don't know about other distros, but for RedHat 8 it's under Extras - Preferences - Desktop Settings Wizard.
Re:Let's see- (Score:2, Insightful)
Pentium III @ 1GHz
256MB RAM
GeForce2 MX
RedHat 8, upgraded to the 2.4.21-pre5-ac2 kernel
KDE 3.0.3-8 RedHat
And I almost never have problems with KDE
Key words 'almost never'...
How about a 200 MHz laptop with 80 MB of RAM, running Windows 2000 since Beta 1 back in the fall of 1997, up to currently running Windows XP.
And to this day it has only crashed once, when the user popped the IDE Hard Drive out accidentally while the system was running.
However, with a nice journalled file system, putting back i
Re:Let's see- (Score:3, Interesting)
Obviously all OSes are going to have bugs. The question is, how severe are those bugs? How frequent do they manifest themselves? KDE hasn't crashed on me in a very long time, and Linux hasn't crashed since I upgraded to 2.4.21-pre5-ac2 (and prior to "upgrad
Re:Let's see- (Score:2)
How long do you leave your laptop on?
Re:Let's see- (Score:2)
In my experience, people who turn their system on, use it for a while, and then turn it off immediately have fewer crashes than people who (like me) leave their system on 24/7. Of course, the people who turn their system on and off a lot wear their HDs out faster >:)
Re:ignorance of linux users (Score:2)