Inside The Development of Windows NT: Testing

The Qube writes "As a follow-up to the in-depth story posted back in February regarding the history and development of Windows NT, part 3 of the series of articles is now online. It discusses the software testing procedures inside Microsoft."
  • by Anonymous Coward on Sunday May 25, 2003 @12:09PM (#6035219)
    Well, I guess you have to make sure those "features" work correctly
  • by Anonymous Coward on Sunday May 25, 2003 @12:11PM (#6035226)
    QA Engineer 1: "Does it compile?"
    QA Engineer 2: "Yup."
    QA Engineer 1: "Okay, I'm declaring it GM and releasing it to manufacturing."
    QA Engineer 2: "It's 'Miller Time'!"
  • Obligatory (Score:2, Funny)

    by MacroRex ( 548024 )
    But wasn't it supposed to be Not Tested?
  • by 1nv4d3r ( 642775 ) on Sunday May 25, 2003 @12:20PM (#6035280)
    From the article (not kidding!):

    Originally, the KDE worked to upgrade its infrastructure to Windows 2000 on its own. But after spending 18 months evaluating and testing, Cornett realized they'd need some help. The department contacted Microsoft Consulting Services (MCS) to ask about architectural guidance, and hired a full-time technical account manager from Microsoft's Enterprise Services group. Eventually, the KDE joined Windows Server 2003 Rapid Adoption Program (RAP), which allowed them to begin working with the product early in its development process.

    (ok, so when you look into it you're likely to realize that it's the Kentucky Dept of Education, but when skimming the article it caught my eye and I was really confused!)
    • I think it's funny that it took them 18 months to figure out they needed help. Obviously Cornett is not all that gifted in the computer realm; given 18 months, I'm pretty sure I could come up with something better than installing an MS beta on all my servers.

      That's besides the fact that, as far as I can tell, Windows 2003 offers very few benefits over the finally acceptably stable Windows 2000.

  • by Anonymous Coward on Sunday May 25, 2003 @12:21PM (#6035285)
    It starts off with a room. A huge room... the largest one you've ever seen...

    The camera flies down, zooming in & out, between dozens of the ten million monkeys at ten million PCs, and back up to a control desk manned by straw-chewing yokels.

    A screen flashes red

    "Sir! Monkey number Y435A23J has come up with something that boots!"

    The camera pans around to Bill Gates' face

    "I call it... Windows 2008. Release it"

    or something
  • All jokes aside... (Score:5, Insightful)

    by KFury ( 19522 ) * on Sunday May 25, 2003 @12:25PM (#6035305) Homepage
    All jokes aside, the testing and build process for NT is one of the most tightly organized and comprehensive testing methodologies in existence.

    Rather than take 'miller time' pot shots at Microsoft, the real takeaway is the understanding that, no matter how rigorous the testing and build process, there is a complexity limit where a unified one-organization nightly fix-build-test model simply can't provide a product of suitable quality.

    Better to acknowledge the best-of-breed methodology Microsoft uses to test their OSes, and conclude that while this breed works okay for applications, a world-class operating system needs peer review and distributed open source development to create a quality, secure product.
    • by asa ( 33102 ) <asa@mozilla.com> on Sunday May 25, 2003 @12:55PM (#6035467) Homepage
      "Better to acknowledge the best-of-breed methodology Microsoft uses to test their OSes...."

      Except that this article says nothing about the testing methodology that Microsoft uses. It describes how Microsoft helps certain customers test deployment. Deployment testing has little or nothing to do with software testing.

      This is an article about how Microsoft has the budget to help "special" customers with a free "service" (not software) and frankly, the bits about offering cash-strapped school systems free consulting and test deployments sounds a lot more like a Microsoft press release than a software testing case study.

      I was genuinely hoping to read about their software QA process. What a waste of 5 minutes.

      --Asa
      • My bad. I read a really interesting comprehensive article on Microsoft's build and QA process for Windows NT a few months ago, and I just assumed that that was what the link pointed to. Now I wish I could find the article because it's something /.ers would love (if /. isn't where I found it in the first place).

        Anyone know where that article is?
    • Better to acknowledge the best-of-breed methodology Microsoft uses to test their OSes, and conclude that while this breed works okay for applications, a world-class operating system needs peer review and distributed open source development to create a quality, secure product.

      And that would be Linux, I suppose? Because no bugs ever creep into Linux, and there's never been a security flaw found. Except if you read Bugtraq, of course.

      This wasn't even the point of the article -- though it might have been

      • yet still someone comes in here and starts making out that Microsoft have bad QA and the open source model is vastly superior. How sad.

        I suppose I didn't state it clearly enough: I think Microsoft has GOOD QA, but that even with the BEST QA, there exists a product complexity level which, when exceeded, means that a distributed, ongoing, proactive QA system, such as that afforded by open source (or even Apple's bug/enhancement submission procedure) is a much better way to ensure a more consistently stable
      • And that would be Linux, I suppose? Because no bugs ever creep into Linux, and there's never been a security flaw found.

        That's the point! Bugs are much more likely to be found in an open system such as Linux because of the nature of open source development - all people using the software can report and fix bugs, not just the limited few inside a company. The parent poster is actually complimenting MS testing, just saying that it can never be as good as open source because of the numbers involved.

        • adding to your comment, with open source software, the security problems are ADMITTED so even IF a fix isn't issued quickly (which they usually are) YOU can decide to pull those machines/daemons/services down until a fix is issued. Anyone can see the bugs right after they are posted with OSS, and decide for themselves what to do until a fix is issued. With MS, sometimes they are very good about fixes, and sometimes they are not.

          From my perspective (i use MS/95-XP and Linux) at least with the open source model, I
        • Bugs are much more likely to be found in an open system such as Linux because of the nature of open source development - all people using the software can report and fix bugs, not just the limited few inside a company.

          The problem is that I disagree with the practical value of this idea.

          It's great in theory. Thousands of people all around the world can look over the source code to Linux, and submit their patches and so on. The same goes for any other popular open source project, be it Mozilla, Open

    • "Best of breed" testing sounds an awful lot like "survival of the fittest."

      I just have this vivid image of a code version of "Jurassic Park".

      (Bill Gates playing the part of Hammond.) Oh no, no program escapes from Redmond Park. We do have several undocumented releases per year though...

  • by the eric conspiracy ( 20178 ) on Sunday May 25, 2003 @12:27PM (#6035313)
    No way this article is about software testing. This is about an evaluation lab where customers bring in their applications to show to Microsoft. It's a marketing puff-piece, that's all.

    Where is the description of the test methodologies used? The bug escalation and change control systems? What sort of configuration control is used?

    • Exactly.

      Testing is more than making sure it works and is stable under load.

      If they wanted to impress me, they should set up a separate lab full of programmers emulating script kiddies, and trying to hack into the servers to get at their data. Kiddies trying to take advantage of IE holes to plant trojans and own the servers.

      Just like the real world.

    • Microsoft employs a very thorough testing procedure for every software product, going back to DOS. The problem is that it's generally the customers doing the testing.
  • Development costs (Score:4, Interesting)

    by BWJones ( 18351 ) on Sunday May 25, 2003 @12:34PM (#6035345) Homepage Journal
    O.K., can anyone here tell me why Microsoft is spending an order of magnitude more $$'s to develop Windows than Apple is spending on developing OS X? It can't be testing, because the Apple products appear so much more refined.

    • 2003 is a server OS. MacOS X is not, despite Apple's best attempts. The only parts of MacOS that are used for serving stuff are the open source code, which effectively is built and tested by the community. MS include things like IIS/Active Directory as part of the Windows product, so more testing is needed.

      It's also a lot more popular.

      • by BWJones ( 18351 )
        2003 is a server OS. MacOS X is not, despite Apple's best attempts.

        I don't know what you are talking about. I have been using OS X as a server OS for some time now and it has got to be the easiest server OS to manage. It is more stable than W2003 server, easier to manage, less expensive, etc. I am running it here [utah.edu] and in several other places, in addition to my primary workstation, which also hosts a couple of small-bandwidth websites.
        • I don't know what you are talking about. I have been using OS X as a server OS for some time now.......

          Nonetheless, the user base of MacOS as a server OS is trace. There simply are no deployments of the type talked about in the article, with hundreds of domain servers needing to be migrated. These guys don't mess around - they expect to have industrial strength support during the upgrade, and they expect there to be no regressions.

          Apple is in an entirely different league - they can ship a trivial OS

          • Re:Development costs (Score:3, Informative)

            by BWJones ( 18351 )
            Nonetheless, the user base of MacOS as a server OS is trace.

            That may be, but until recently Apple has not had an OS capable of large-scale serving. I have used IRIX, Solaris and Windows in the past, but I find OS X to be the best of breed in terms of a do-it-all OS.

            There simply are no deployments of the type talked about in the article, with hundreds of domain servers needing to be migrated. These guys don't mess around - they expect to have industrial strength support during the upgrade, and they expe
          • >> There simply are no deployments of the type talked about in the article, with hundreds of domain servers needing to be migrated.

            I am not so sure about that. In a recent survey of hardware sites, apple.com is the #1 site with about 3.5 million unique visitors, while hp.com is a distant second with 2.7 million. The Apple Store is probably the best online store, with annual sales in the billions of dollars; Apple also hosts the most popular QuickTime movie trailers, and the iTunes Music Store sells over a million songs a week.

            I
      • Re:Development costs (Score:3, Informative)

        by afantee ( 562443 )
        >> 2003 is a server OS. MacOS X is not, despite Apple's best attempts.

        There is a thing called Mac OS X Server, you Windows idiot.

        >> The only parts of MacOS that are used for serving stuff are the open source code, which effectively is built and tested by the community.

        You are talking pure shit through your fat ass. What about WebObjects, NetInfo, Apple Remote Desktop, NetBoot and a host of other Apple sysadmin tools?
      • I make that mistake a lot myself, but please do some more research before you post:

        1- The cool hardware [apple.com]

        2- The nice software product [apple.com]

        3- The independent support site [macosxserver.com]

        Mac OS X is quite a good server which is not encumbered by a stupid GUI when you don't need it.
    • So you didn't use OS X prior to version 10.1.5?
    • Hmm, maybe because they don't control the hardware?

      Nah, couldn't be that. Must be because MS sucks and Apple doesn't!
    • O.K., can anyone here tell me why Microsoft is spending an order of magnitude more $$'s to develop Windows than Apple is spending on developing OS X?

      That they support probably two or three orders of magnitude more hardware is reason enough, but on top of that they don't have the luxury of a significant chunk of their development being done for free by the OSS community.

      Maybe if Apple had spent similar amounts of money on OS X, you wouldn't have to have the fastest Mac available just to be able to run OS X

      • Re:Development costs (Score:3, Informative)

        by afantee ( 562443 )
        >> That they support probably two or three orders of magnitude more hardware is reason enough

        What the fuck are you talking about?

        Other than different CPU architecture, Mac OS X and Windows both support the same sort of hardware: ATI and nVidia GPU, Ethernet, USB, FireWire, 802.11b, SCSI and ATA Drive. Apple usually is years ahead of MS in adopting new technology: USB, FireWire, FireWire 800, BlueTooth, 802.11b, 802.11g, gigabit Ethernet, Rendezvous.

        Apple is 60 times smaller than MS, but actually ma
        • Other than different CPU architecture, Mac OS X and Windows both support the same sort of hardware: ATI and nVidia GPU, Ethernet, USB, FireWire, 802.11b, SCSI and ATA Drive.

          Do you have any concept of just how many different pieces of hardware that covers? "Ethernet" on its own would encompass *thousands* of different types of network cards, all of which require different drivers. Similarly for things like "SCSI". Heck, XP probably supports nearly a hundred different *motherboard chipsets*. Also, it may

          • MS only has to write code to support standard protocols like USB or FireWire. The device makers will write the device drivers, do the testing, and pay MS to get certifications.

            Similarly, Apple provide free programming tools and documents for companies to write device drivers, but ultimately the manufacturers have to take the main responsibility to support their own products.

            In any case, this really has very little to do with the quality of the OS.
            • MS only has to write code to support standard protocols like USB or FireWire. The device makers will write the device drivers, do the testing, and pay MS to get certifications.

              But what _actually_ happens is that Microsoft write multitudes of hardware drivers to give basic functionality for a wide range of hardware to their customers. Just like Apple have drivers that support some of the hardware they don't ship (eg: wheel mice).

              In any case, this really has very little to do with the quality of the OS.

              • >> But what _actually_ happens is that Microsoft write multitudes of hardware drivers to give basic functionality for a wide range of hardware to their customers.

                But not for "hundreds of thousands of components" as the other guy claimed earlier.

                >> Just like Apple have drivers that support some of the hardware they don't ship (eg: wheel mice).

                Exactly. Apple write code for a generic USB mouse that also supports mice with wheels and two buttons, which doesn't mean they have to write drivers for every p
      • >> Maybe if Apple had spent similar amounts of money on OS X, you wouldn't have to have the fastest Mac available just to be able to run OS X at a barely acceptable speed ?

        Have you ever tried Mac OS X?

        My 400 MHz iMac bought 4 years ago runs faster and smoother than Win XP on my 800 MHz PC, and it does much more than XP and works 24 hours a day for weeks and months without getting shut down. In contrast, the PC has to be shut down by the end of each day because it's too noisy, and it still crashes on
        • Have you ever tried Mac OS X?

          I use it all day, every day.

          I've used OS X on nearly every Mac from a 233Mhz Beige G3 all the way up to a Dual 1.25GHz G4 as well, so I've got a rough idea how fast it runs on each of them - and I've yet to see one that runs even close to as smoothly as my ~4 year old dual P3/700, let alone some of the dual 3Ghz monsters you can buy today. Heck, my ~7 year old dual Pentium 200 runs XP about as fast as your 400Mhz iMac would - you can't even run OS X on a Mac that old.

          This is

          • OK, maybe there is something wrong with my own PC, but I have used XP on a 2 GHz Sony Vaio - its video performance is much slower than my 700 MHz iBook's. On the Vaio, dragging a window quickly would leave a long trail of very ugly broken window frames - something that never ever happens to me on any Mac.

            What exactly are you doing with OS X machines? I do lots of programming and graphics on an iBook, and just don't feel it's slow at all, unless I watch QuickTime video and play iTunes at the same time as well
            • OK, maybe there is something wrong with my own PC, but I have used XP on a 2 GHz Sony Vaio - its video performance is much slower than my 700 MHz iBook's. On the Vaio, dragging a window quickly would leave a long trail of very ugly broken window frames - something that never ever happens to me on any Mac.

              That's because the two OSes exhibit different behaviours under excessive load conditions. OS X's superior (in terms of features) graphics system performs double-buffering, so you never get the half-drawn wi

    • Because they have a horrifying amount of legacy code to maintain, on what is basically a legacy architecture - i.e. the PC.

      Apple got to throw away all their mistakes when they started making OS X. They don't need to support nearly so many hardware experiments - ISA, VLB, MCA, assorted stupid methods of getting to "high" memory, fifty different ways of using large hard drives etc. etc. They also don't need to support a wide assortment of "good idea at the time" legacy technologies, DCOM and others of their
    • It's probably because MS keep copying and stealing ideas from Apple and other companies without real insight or understanding, which makes it hard for them to integrate with their own typically bad design.

      Despite the author's infinite admiration for MS, his description of the War Room in part 2 is a clear indication that the Redmond Beast lives in a madhouse.

      I feel particularly sorry for the poor developers who suddenly were asked to fix the tens of thousands of "branding bugs" after MS had decided to drop th
    • Who says Apple is spending less than Microsoft in terms of OS development?

      The core Windows team is a drop in the bucket compared to all of microsoft's other projects. Go to Microsoft's website and try to select a product. The selection is huge!

      Most of the developers for Windows are actually only partial contributors. They work on the .NET and Visual Studio teams. They only provide code to run .NET, COM, OLE, etc. on Windows. The core team that actually writes the kernel, GUI, and IE is probably less than 1,000.

      A
      • >> Who says Apple is spending less than Microsoft in terms of OS development?

        According to the article, there are 8,000 to 10,000 programmers working on Win 2k3. I don't think Apple has that many employees worldwide.

        MS is about 60 times bigger than Apple and has more than $40 billion in cash. Apple is primarily a hardware company, so lots of its resources are devoted to hardware innovation. But the irony is that Apple's software portfolio is actually bigger and better than that of MS. You may find that
    • Hmm, obviously you don't use OS X to manage more than a couple hundred users... OS X simply doesn't scale, or at least the management UI doesn't.

      Apple isn't spending money testing scalability, although their GUIs are certainly pretty.

  • Testing (Score:5, Interesting)

    by ayf6 ( 112477 ) on Sunday May 25, 2003 @12:38PM (#6035363) Homepage
    Testing is a vital part of programming. Tests should always be written PRIOR to the programming. This allows you to think of problems before they arise. In some sense it seems as if MS is avoiding this by having someone "come over to fix the problem within 20 minutes." However, given the diverse environments, there does not seem to be a direct solution for them. The EEC seems to be a huge step forward toward finding where code breaks for a given customer, but it doesn't solve any security holes (which should have been addressed pre-coding, when you come up with tests for your software).

    As for all the joking about MS programmers in this forum so far, I find it kind of ridiculous that people do that. People that laugh at the products MS produces really do have to look hard at how THEY would manage and TEST 50 MILLION lines of code. With 50 million lines of code you're looking at a virtually infinite number of tests to run, which is obviously impossible to do. Thus you either have to roll out a product that hasn't been 100% tested because of its size, or keep testing and never make money. Since it's all about the money, you obviously roll out the product and try to patch it as fast as you can when someone does find a bug that got by QA and the testers. You need to find a balance between testing a product completely and releasing a product to make money. It's a fine line, and MS has done a fairly good job given the size of their code base and the pressure on them from the consumer to get new products out in a timely way.
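
    To make the "tests before programming" point concrete, here's a minimal sketch in C (clamp() and its test are made-up examples for illustration, nothing to do with any actual MS code): the test is written first, against a function that doesn't exist yet, and the implementation is then written to make it pass.

        #include <assert.h>

        int clamp(int value, int lo, int hi);   /* declared before it exists: the test comes first */

        static void test_clamp(void)
        {
            assert(clamp(5, 0, 10) == 5);    /* in range: unchanged */
            assert(clamp(-3, 0, 10) == 0);   /* below range: clamped to the lower bound */
            assert(clamp(99, 0, 10) == 10);  /* above range: clamped to the upper bound */
        }

        /* The implementation is written afterwards, to make the test above pass. */
        int clamp(int value, int lo, int hi)
        {
            if (value < lo) return lo;
            if (value > hi) return hi;
            return value;
        }

        int main(void)
        {
            test_clamp();
            return 0;    /* silence means every assertion held */
        }

    Writing the assert cases first forces you to pin down the edge cases (what happens below and above the range) before a single line of the implementation exists.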
    • Re:Testing (Score:4, Insightful)

      by LaCosaNostradamus ( 630659 ) <[moc.liam] [ta] [sumadartsoNasoCaL]> on Sunday May 25, 2003 @02:18PM (#6035916) Journal
      People that laugh at the products MS produces really do have to look hard at how THEY would manage and TEST 50 MILLION lines of code. With 50 million lines of code you're looking at a virtually infinite number of tests to run, which is obviously impossible to do. Thus you either have to roll out a product that hasn't been 100% tested because of its size, or keep testing and never make money.

      As part of the Microsoft culture, it appears that you've missed the point.

      The problem is the 50 million lines of code itself.

      I would have "managed" NT's testing by "not managing it" at all, and instead would have clipped out all those bells and whistles to make a much more trim and modular OS. The code base is unnecessarily large, from a functional point of view.

      But just like the current SUV problem in America, it appears that Microsoft is dancing a tango with the consumers. Microsoft produces shitty code that looks good on the screen, and the consumers say "ohh" and "ahh" while not minding the crashes and restrictions, and then Microsoft gets encouraged to produce more "pretty code". I don't consider this problem to be fixable ... we who know better and are less mediocre simply have to fend for ourselves and rely on the influence of our leadership to promote the Better Way. This is the slow method of providing a good example for others to follow, which is the only leadership that matters. Microsoft's billions are just a facade; consumer mediocrity is another facade; what will matter in the long run is bullet-proof code that serves public needs more than software-industry investors.
    • Re:Testing (Score:5, Informative)

      by vsprintf ( 579676 ) on Sunday May 25, 2003 @02:59PM (#6036099)

      It's a fine line, and MS has done a fairly good job given the size of their code base and the pressure on them from the consumer to get new products out in a timely way.

      Whoa, there. Since when is it the consumer who is pressuring MS for new products? It seems to me that it's MS who has been rushing new "features" into production and pressuring consumers to upgrade. I don't know of anyone who had a burning desire to upgrade to Word 2K or Windows XP. The fact that others were upgrading and causing compatibility problems was the compelling reason.

    • I don't think that what you are saying is correct. It may be 50 million lines of code, but it is not one application. There are several tens of apps inside Windows NT, which is highly componentized. Therefore, testing applies to components and the integration between those components, limiting the number of tests to a down-to-earth number.

      By the way, Windows NT is one of the best, if not the best, pieces of software created. The kernel architecture is vastly complex, far more complex than Unix/Linux. They have do
  • and all this time I just thought Microsoft 'winged' it through testing.
  • by mgkimsal2 ( 200677 ) on Sunday May 25, 2003 @12:46PM (#6035402) Homepage
    The Kentucky Dept of Education had 400+ separate NT domains, and by visiting the MS EEC, they devised a way to migrate to one domain under Active Directory. I am in no way an NT4 domain expert, but this seems more like poor NT4 planning than anything fantastic which AD delivers over and above what they could have done with better planning. However, perhaps the issue isn't just 'let's go back and redo the NT4 scenario' (which I don't think you can easily do) because NT4 is being phased out altogether.

    Any NT admins out there care to shed a bit more light on this? All I remember is that domain management under NT4 wasn't all that swift, but it's been several years since I've had to do anything with it.
    • Resources (Score:3, Insightful)

      by mgkimsal2 ( 200677 )
      Of course this is a bit of a marketing piece - many of the Kentucky comments about the EEC are flattering, but *should* be expected...

      "They asked us how much space we needed, so we did the math, and it came to about 1 terabyte (TB)," Cornett told me. "On the first day, we left the EEC at 1:00 am on Tuesday, and were back in the lab at 8:30 am that morning. They had run fibre to our room, and given us access to a SAN with two 500 GB drives. We had a need and immediately, it was solved. They said, 'We were
    • Thanks to all the people who replied below - I didn't want to have to say 'thanks' to each one separately, but you each did help to clarify some of my faulty memory re: NT4 domains. :)
  • by the_skywise ( 189793 ) on Sunday May 25, 2003 @12:46PM (#6035406)
    Parading around as "information".

    "Hey buddy, c'mere, I'm going to give you the inside skinny of how testing really works inside Microsoft. First, we read slashdot and realized that nobody installs our products until SP1. So we developed this whiz bang testing center filled with every kind of PC you'd find in the real world. Y'know, Dells, IBMs and HPs (which stands for Hewlett Packard), we got 'em all. This center is FREE (as in beer) for our customers to use to test their real life products. First off, they have to give us all of their custom in-house software, y'know, so we can test. We keep that, btw. Secondly, it helps if they hire our Certified Microsoft Advisory board to assist them through the process. Lastly, we customize our software to work with those environments. And that's how we prove that our software is ready to go from day 1 and you don't need to wait for a service pack so buy it NOW!"

    All kidding aside, it's interesting to note that the CTO of JetBlue has been giving lots of interviews to tech magazines explaining how "off the shelf" Windows software [we customize our software to work with those environments] not only works better than Unix/Linux, but that because he uses only Windows, he has been able to reduce his development staff greatly, since he can do more with fewer people. That set off red flags in my mind... now that JetBlue appears in this article as a testament to MS testing... well... I smell a PR campaign...
  • Seriously.... (Score:5, Insightful)

    by wowbagger ( 69688 ) * on Sunday May 25, 2003 @12:56PM (#6035476) Homepage Journal
    All the Q/A in the world does you no good if you don't act upon what Q/A has found.

    I used to do driver development for NT4.0. As such, I had a "victim" machine and a "debugging" machine, linked via a serial cable. The victim runs my driver, and I do my development and debug using the debugging machine to access the kernel debugger on the victim.

    A normal cycle of development went something like this:
    1. { {{edit,compile} while errors} link } while errors
    2. Download to victim
    3. Boot victim. Watch hundreds of assertion failures from the OS scroll by on the debugging console.
    4. Shut down debugger as it has leaked all available memory
    5. Start debugger again
    6. Load my driver and test.
    7. Locate bugs in my driver, begin again


    Note: this was a fresh install of the NT 4.0 debug build, with SP4. No third party apps (other than my own) installed. This was using Microsoft's WinDBG.

    Now, I don't know about Microsoft's developers, but I regard an assertion failure as a failure - i.e. a bug to be fixed. Having HUNDREDS of them in released code is just unacceptable. Using an ASSERT() as a debugging printf() is wrong.
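
    A concrete illustration of that distinction, in C (hypothetical driver-ish code, not anything from the NT source): an assertion should state an invariant that must hold, not trace that a function was entered.

        #include <assert.h>
        #include <stddef.h>

        struct request { size_t length; unsigned char *buffer; };

        void process_request(struct request *req)
        {
            /* Wrong: a trace message wearing an assert costume - it "fails" every
               time the function runs and tells you nothing about correctness:
               assert(!"process_request entered");                                  */

            /* Right: preconditions that, if violated, are genuine bugs. */
            assert(req != NULL);
            assert(req->buffer != NULL || req->length == 0);

            /* ... actual request processing would go here ... */
        }

        int main(void)
        {
            unsigned char buf[16];
            struct request ok = { sizeof buf, buf };
            process_request(&ok);    /* invariants hold, so the assertions stay silent */
            return 0;
        }

    If an assertion like those fires, something is genuinely broken and should be fixed before release, not scrolled past on the debugging console.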

    So either a) the MS developers have a different view of things than I do, or b) the MS developers were allowing hundreds of easily identified problems to go into release.

    Now, EVERY non-trivial software project's lead engineer must make a decision at every release - "Do I fix these bugs and slip the release, or fix these bugs in the next release?" And EVERY lead will allow some bugs to slip. Usually, those bugs are deemed minor - spelin mestaches (sic), layout errors, things like that.

    But to have a) hundreds of assertion failures, which give you file and line number of the error, and b) a memory leak in your debugger bad enough that you can WATCH it leak away hundreds of megabytes of memory each time, and to allow that to go out? Ugh.

    Now I am sure that MS Q/A found those errors - if not, they are far more incompetent than I am willing to assume they are. So clearly Q/A was overruled by management - "We don't care, ship it anyway!"

    And that is the central problem to ANY Q/A department - if management overrules them, and forces a shipment anyway, then how do you blame Q/A?

    I've said this before, and I shall say it again now: this is one of the places a real ISO-9000 standard can be useful. If the spec sayth "Lo, and the release candidate code shall have no bugs open against it in the bug tracking system, and any bugs that exist shall be clearly targeted to later revisions, and Q/A shall findth no undocumented bugs in the code, or the release shall be slipped, and the bugs corrected, AMEN!" then Q/A can say "OK, if you want to throw our ISO-9000 cert out the door, then by all means override us and ship."

    (Yes, that won't prevent management from simply targeting all bugs to a later revision and shipping, but it at least forces some consideration of the consequences to be made.)
    • Re:Seriously.... (Score:2, Insightful)

      by Anonymous Coward
      ISO-9000 cert merely states that you HAVE a process, and have it documented. Oh, and the process can change easily enough; it is, after all, just a doc file.

      Worked at one place where they mandated that on us. One dude documented how to format a floppy disk in DOS. Then another doc described how to put a sticker on it.

      Unless it's taken seriously, it's neato to have, but other than that...

      NT4 was basically NT3.51 with the graphical shell. NT3.51 was 2 years old by the time NT4 shipped... Few were serious about using it at the
  • My IQ just dropped by 10 points as a result of trying to digest the meaningless marketspeak. /. urgently needs a real lameness filter that will take stuff like this and reduce it to the bits that actually have meaning.

    But perhaps there is one already, and the output from stuff like this is the ASCII cow art. Or, Heaven forbid, goatse.

  • NT (Score:3, Insightful)

    by bobm17ch ( 643515 ) on Sunday May 25, 2003 @01:05PM (#6035538)
    The annoying thing about NT (4.0) is that, now that it is almost rock-solid, MS no longer support it.

    Those years of public beta testing certainly paid off. :)

  • Check this PPT too. (Score:3, Informative)

    by Karpe ( 1147 ) on Sunday May 25, 2003 @01:29PM (#6035660) Homepage
    There is a very interesting presentation on the design and development of Windows NT/2000 presented at USENIX here [usenix.org] (Google HTML rendition here [google.com.br]). I love to bash Microsoft too, but reading it, there are at least some decisions that I think they got right.
    • If you are interested in what Microsoft did for NT and Windows 2000, look at the links in the parent post.

      Just a couple of highlights:
      The complete source code was 50 Gigabytes.
      Build time was 8 hours.
      The source code control system tracked over 411,000 files.

      There were a lot of challenges in trying to keep 5000 people working on the same operating system at once; they learned from problems and improved the process for Windows 2000.

      It is high-level data, but it is still quite interesting.
  • by hndrcks ( 39873 ) on Sunday May 25, 2003 @01:53PM (#6035777) Homepage
    "We're hoping to get a long-term lifespan out of Windows Server 2003 without having to do major upgrading."

    These guys obviously aren't students of "Licensing 6.0". [com.com]

  • by Enrico Pulatzo ( 536675 ) on Sunday May 25, 2003 @01:58PM (#6035810)
    1. write article about testing windows 2003 server.
    2. host article on a windows 2003 server
    3. post article to slashdot
    4. take notes
    5. fix bugs/adjust settings/add features
    6. goto 1.
  • Hmmph. (Score:4, Insightful)

    by Dthoma ( 593797 ) on Sunday May 25, 2003 @02:28PM (#6035976) Journal
    I'll try posting something original as opposed to the MS-bashing and the MS-bashing-bashing, whilst remaining at least a little on-topic.

    I think Microsoft would do well to test more and make less. Each incarnation of Windows seems to have brought disproportionately large changes (improvements or hindrances, if you like) in the user interface, features, and resource consumption. Whilst a gradual accumulation of features and a slow increase in resource use is inevitable for any operating system, I think Microsoft has been making their systems grow too much too quickly.

    Microsoft seems to be running out of new features to add to each new version of Windows to entice consumers, and is resorting to making their own features (notably .NET and the like) in order to keep sales high. Unfortunately I think that in terms of features and UI they can't push the boundary too much further for the next few years (though obviously beyond that there will no doubt be new ideas).

    As such I feel that MS would benefit from focusing on testing instead of adding new things. Consolidation is often just as helpful as (if not better than) augmentation, particularly for larger systems. I feel that sales would remain high if Windows had no new features or UI changes but could genuinely be considered as stable as the alternatives.
  • by mnmn ( 145599 ) on Sunday May 25, 2003 @03:47PM (#6036326) Homepage
    Having a monolithic kernel + garbage, a monolithic registry, no way to control the revisions of DLLs, worse-than-RPM package management, etc. is what NT is all about. The UNIX model is making small, highly usable tools that you understand perfectly, and making them work together. In Win32, if you can't work with an API, you make a new API, and we now have several generations of APIs to deal with just to access the screen, resources, widgets, etc. in Windows. You have a standard API that has worked for over a decade for most UNIXen, and the ones that do change their API purge the last version and replace it with a similar system... think SysV vs BSD, Solaris sysctls vs Linux sysctls.

    That's why the Win32 system spirals into complexity, no matter how much money is pumped into development or testing. Of course one of the best things about Windows is also one of the worst: vendors developing their own drivers for their hardware might make incompatible or bad drivers, or ones that step on the toes of other installed drivers in the system. In the Linux kernel, all the drivers are present before the testers and are considered while any major change takes place, such as the VM or switching to 64-bit CPUs. This is true for most other UNIXen, where drivers are sent to the Unix vendor for testing as well, but that's not as efficient as the Linux model.

    And then the number of eyeballs testing Linux and FreeBSD is a phenomenon Microsoft can't copy. The free software community does not work for a paycheck, but there's more sincerity towards the software than there would be for proprietary software. Free software can be a matter of ego and gives a sense of competition with Microsoft. You can't buy that. This I believe is the biggest reason why colossal man-hours are poured into free software development, while some of these developers work the rest of their days as data entry or office clerks, even at McDonald's.
    • Interesting "comparison" you make.
      • "Monolithic" kernel: NT kernel is object based ("everything is an object" [file, ACL, semaphore, process, etc.] VS Unix "everything is a file"), mach-style modular and (as of W2K) fully reentrant. Allows for multiple independent subsystem operation (e.g. Win32 and OS2/1.0 and POSIX); significantly more advanced than (e.g.) Linux and BSD.
      • "Monolithic" registry: while I do not particularly agree with the implementation of the Windows Registry I do find some merit in the co
      • I stand corrected in some facts noted above, but some clarifications: "Monolithic" kernel: NT kernel is object based ("everything is an object" [file, ACL, semaphore, process, etc.] VS Unix "everything is a file"), mach-style modular and (as of W2K) fully reentrant. Allows for multiple independent subsystem operation (e.g. Win32 and OS2/1.0 and POSIX); significantly more advanced than (e.g.) Linux and BSD. The NT kernel is certainly modular and not monolithic in the strict sense of the word, but far from t
        • Good reply. But I query this statement:

          "Many eyes" philosophy of open source: agreed, in theory this should render Microsoft obsolete, in practice it has merely spurred Microsoft to create a better product

          It is rendering Microsoft obsolete. Our company buys laptops for employees and promptly replaces the Windows XP with 2000 before handing them out.


          And that renders Microsoft obsolete how, exactly?

          Both in performance and stability, Microsoft remains defeated

          Every study that has directly compared
    • ...while some of these developers work the rest of their days as data entry or office clerks, even mcdonalds.

      Linux: from the people who ask "would you like fries with that?" for a living.
  • by deranged unix nut ( 20524 ) on Sunday May 25, 2003 @04:30PM (#6036513) Homepage
    For a small slice of how Microsoft actually tests software:

    The lifecycle of a software bug in the Windows Division

    1) The bug is found, reported in a bug tracking system, and assigned to the developer.
    2) The bug is evaluated by managers for severity and triaged in a daily "war" meeting. At this point, the bug may be postponed until the next cycle, or marked to be addressed in the current cycle.
    3) For all open bugs in the current cycle, the developer investigates and creates a fix, frequently running a few tests before marking it as ready for test.
    4) Testers make sure the bug is fixed, look for any additional problems, look for related issues, and frequently even run a regression test pass to make sure that the developer didn't accidentally break something else while making the fix. If there are additional problems, the bug goes back to the developer to make a better fix, otherwise the bug is marked as okay to check in.
    5) The developer then code reviews the changes with another developer, builds the changes for all platforms to catch any possible compile breaks, and then checks in the changes.
    6) The build lab picks up the changes for the day and starts to compile.
    7) If a compile break occurs, usually because someone was in a hurry and didn't follow the rules, an on-call developer triages and fixes so that the compile can continue.
    8) When the build finishes, it is installed on a set of machines, and a series of build verification tests are run to ensure that the build is at least good enough to run some tests.
    9) When the build verification tests finish, then the testers install that build and double check that the bug is still fixed, and mark the bug as such.
    10) Finally, the tester adds a regression test to their test plan, and automates that test so that it will at least be run before the end of every major cycle, sometimes every minor cycle, every week, every build, or for some issues even as part of the build verification tests.
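
    As a hypothetical illustration of step 10 (made-up names like bug_1234_regression and parse_config_line, not actual Windows test code), an automated regression test just replays the original failing scenario and fails loudly if the old behaviour ever comes back:

        #include <stdio.h>
        #include <string.h>

        /* Hypothetical function that once crashed on an empty line (the original bug). */
        static int parse_config_line(const char *line, char *key, char *value)
        {
            if (line == NULL || *line == '\0')
                return -1;                     /* the fix: reject empty input instead of crashing */
            return sscanf(line, "%63[^=]=%63s", key, value) == 2 ? 0 : -1;
        }

        /* The regression test replays the exact scenario from the bug report,
           plus a sanity check that the fix didn't break normal input. */
        static int bug_1234_regression(void)
        {
            char key[64], value[64];

            if (parse_config_line("", key, value) != -1)
                return 1;                                          /* regressed */

            if (parse_config_line("timeout=30", key, value) != 0 ||
                strcmp(key, "timeout") != 0 || strcmp(value, "30") != 0)
                return 1;                                          /* the fix broke the normal case */

            return 0;
        }

        int main(void)
        {
            int failed = bug_1234_regression();
            printf("bug_1234_regression: %s\n", failed ? "FAIL" : "PASS");
            return failed;    /* non-zero exit makes it easy to wire into a nightly test pass */
        }

    Accumulate enough of these and the build verification tests in step 8 start catching yesterday's bugs before anyone has to re-find them by hand.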

    Major cycles are for betas and final releases; minor cycles are for releases to be deployed internally; builds tend to come out daily. At the start of a cycle, and in early cycles, the bar is fairly low: almost any bug can be fixed and added to the build. Near the end of each cycle, and in later cycles, the requirements are increased so that only changes that are absolutely necessary are taken, reducing the risk of introducing new problems that won't be discovered until after the product is released. At some point in every major cycle, the bugs and test plans are reviewed to find areas that need improvement.

    Additionally, code is instrumented to measure test coverage; quality standards in a number of areas (accessibility, reliability, scalability, globalization, localization, integration, interoperability) are measured for improvement; usability studies are performed; code profiling tools are used; code scanning tools look for execution paths that could result in problems and automatically file bugs; testers bash on other components; and anything else anyone can think of is done to find the problems early.

    However, the pace is incredible and problems can come from anywhere. Imagine testing an Xwindows application to configure networking while the kernel is changing, the networking core is changing, Xwindows is changing, the shell is changing, the compiler is changing, your application is changing, and the tools you use to test with are changing. It is a challenging job.

    If you want to bash Microsoft, that's fine, I used to...hence my handle, but now that I have seen inside the "beast", it's just a business, most of the rumors are very off base, and most of the people there are just normal people who want to do the right thing.
  • "We've tapped into that 40 billion dollar cash reserve we have, and decided that spending a couple percent on QA testing might be a good marketing investment"
  • According to a recent survey

    http://www.internetretailer.com/dailyNews.asp?id=9361

    "Apple Computer Inc.?s Apple.com led all computer hardware sites in number of shoppers for the week that ended May 11, according to Nielsen/NetRatings? AdRelevance report. Apple.com logged 3.75 million unique visitors, 73.7% of all visitors to hardware sites, which hosted 5.09 million shoppers for the week.

    Next behind Apple was Hewlett-Packard Co.'s HP.com at 2.47 million visitors; Dell Computer Corp., at 1.94 million; Gate

"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs

Working...