
WGA Meltdown Blamed On Human Error

Erris writes "As commentators like Ars Technica slam WGA as deeply flawed, Microsoft is blaming human error and swears it won't happen again. 'Alex Kochis, Microsoft's senior WGA product manager, wrote in a blog posting that the troubles began after preproduction code was installed on live servers. ... rollback fixed the problem on the product-activation servers within 30 minutes ... but it didn't reset the validation servers. ... "we didn't have the right monitoring in place to be sure the fixes had the intended effect"' Critics were not impressed. 'A system that's not totally reliable really should not be so punitive,' said Gartner Inc. analyst Michael Silver. Michael Cherry, an analyst at Directions on Microsoft in Kirkland, Wash., said he was surprised that it was even possible to accidentally load the wrong code onto live servers ... [and asks], "what other things have they not done?"' This is not the first time this has happened, either."
This discussion has been archived. No new comments can be posted.

  • by G4from128k ( 686170 ) on Monday September 03, 2007 @08:59AM (#20450949)
    One of the articles I read (http://www.betanews.com/article/Microsoft_WGA_Outage_Not_an_Outage/1188405961) suggested that if the server had actually gone down, then this would not have been a problem. The article, based on comments from Microsoft, suggested that WGA defaults to "genuine" if it can't reach the WGA server. So why didn't MSFT just kill the server to let people's software default to "genuine" instead of leaving the server connected with faulty software?
  • It's a fair point (Score:5, Interesting)

    by Joe Jay Bee ( 1151309 ) * <jbsouthsea@@@gmail...com> on Monday September 03, 2007 @09:01AM (#20450961)
    Critics were not impressed. 'A system that's not totally reliable really should not be so punitive,' said Gartner Inc. analyst Michael Silver. Michael Cherry, an analyst at Directions on Microsoft in Kirkland, Wash.,

    WGA is a natural, if not perfect (or even good), business response to the problem of piracy (leaving out all the debate over whether it's a good or bad thing for Microsoft as a whole). But the technical implementation leaves a lot to be desired; if anything, the response to a WGA server failure should be an automatic pass (fail safe) instead of an automatic fail (fail deadly).

    Sure, for a 24 hour window pirates would have a free-for-all in getting perfectly valid WGA results, but at the same time legitimate customers would not be inconvenienced. As far as I can see, that's the only way to keep WGA while minimising the backlash against it.
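
    A fail-open check of that sort is easy to express. Here is a minimal sketch in Python of the behaviour described above; the endpoint, the response format, and the timeout are hypothetical stand-ins, not Microsoft's actual WGA protocol.

    ```python
    import urllib.request
    import urllib.error

    # Hypothetical validation endpoint; the real WGA protocol is not public.
    VALIDATION_URL = "https://validation.example.com/check"

    def is_genuine(install_id: str, timeout: float = 5.0) -> bool:
        """Ask the validation server about this install.

        Fail safe: any network error, timeout, or server fault is treated
        as 'genuine', so an outage never punishes legitimate users.
        """
        try:
            req = urllib.request.Request(f"{VALIDATION_URL}?id={install_id}")
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                # Only an explicit, well-formed denial counts as a failure.
                return resp.read().decode("utf-8").strip() != "NOT_GENUINE"
        except (urllib.error.URLError, OSError):
            # Server unreachable or broken: default to genuine (fail safe)
            # rather than flagging the copy as pirated (fail deadly).
            return True
    ```

    The trade-off described above is explicit in the except branch: during an outage pirated copies pass too, which is the price of never inconveniencing paying customers.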
  • by PeeAitchPee ( 712652 ) on Monday September 03, 2007 @09:21AM (#20451107)

    Strictly speaking, there are no tasks I do today that I couldn't do in 1997.

    Speak for yourself. Just because *you personally* don't use the extra processing power, memory, and storage that are available doesn't mean that lots of others don't. For example, I'm in the middle of digitizing and OCRing 110 years of local newspapers from microfilm into archival-quality PDFs for an historical society. Quite simply, you *cannot* have too much processing power when doing OCR -- I'm running multiple instances of ABBYY FineReader Corporate on a 2x Quad Core Xeon that has been pegged for two weeks now. It's quick, multithreads across all 8 cores and does a great job, but there's simply too much data. Note that this project would have been completely impossible in 1997 -- there simply wasn't enough processing power, memory or storage available to do it on anything less than a supercomputer. And that's not even considering truly bandwidth- and processor-intensive tasks related to video, weather modeling, etc.

  • by Anonymous Coward on Monday September 03, 2007 @09:32AM (#20451195)
    What I find interesting is the switch from version numbers to years for a lot of apps, which started with the switch from Windows 3.1 to Windows 95. When you're dealing with a "year" number, there's added pressure to put out updates more regularly--someone with Windows 95 in 1997 is painfully aware that they have software that's 2 years "out of date". Even if a number of Service Packs have come out since then, there's the "emotional" feeling that the product is out of date.

    While for OS'es, they mercifully abandoned year-based versioning with XP, they still do year-versions of Office, etc. And even then, they're under huge pressure to get out updates for OS'es on a regular basis--there was a long wait for Vista.

    Now, putting out releases every few years isn't a bad thing per se. However, Microsoft suffers from the same problem that people who version by year do, be they auto makers or video game manufacturers. You need to put your "stamp" on the new version. New features. New ideas. New things to make people go "ooh, shiny!" To give an example, EA's Madden 2008 can't just be Madden 2007 with new player stats. They need to add features, even if they're "change for change's sake." With the death of version numbers, EVERY version is a Major Version, because there's no other point of reference. Every new release needs to be a revolution in software.

    And, obviously, the need to have "this year's model" more sparkly than last year's model leads to bloatware. Features you just don't need, or will never use, are now "built in" instead of add-ons for the small number of folks who need them. Design is more important than ease of use. Stuffing in features is more important than efficiency. That's the new game.
  • by gelfling ( 6534 ) on Monday September 03, 2007 @09:35AM (#20451225) Homepage Journal
    Some division head inside Redmond is crafting his internal proposal to convert the update realm from a cost center to a revenue center. The rationale will be to collect the funding to staff up that function appropriately, so that mistakes such as this don't harm MS.

    The ironic thing is that few people will pay - and while the level of installed patches will go down, the overall level of security will not materially change, given the overall poor security stance in the first place. What will happen is that interoperability will begin to fail badly.
  • Re:It's a fair point (Score:3, Interesting)

    by Geekbot ( 641878 ) on Monday September 03, 2007 @09:41AM (#20451267)
    A system designed to spy on customers which, out of disregard for those customers, can cost those users their computer, files, and productivity? Microsoft doesn't have customers. It has victims.

    It all goes to trust and loyalty. How could a company with such widespread use take all of that potential customer loyalty and fanbase and turn it into seething hatred? I really don't see why Microsoft can't make even small gestures to win users over to its side. Even the people who use their products seem to hate Microsoft for its apparent disregard for the consumer.

    If I had 20 customers I'd be working at making them my most loyal fans. If I had millions of customers I'd be looking at starting my own country. How Microsoft can squeeze its customers like lemons and sit enjoying the lemonade while being hated, I just don't understand. Microsoft is a bad example of how a company should do business, regardless of the profit margin.
  • by Generic Guy ( 678542 ) on Monday September 03, 2007 @09:42AM (#20451275)

    I think you're more on-topic than you think. I feel compelled to respond to your observations with my own:

    the OS/360 operating system...the machine had 2MB of memory and the operating system cost 400Kb of the memory.

    Keep in mind that 400K is about 20% of the machine's available resources, which doesn't seem too different from today. Although today we have a lot more choice in how many 'resources' to put into a workstation or server type system.

    There is also the difference between hosting old world text terminal interfaces and the modern high color depth, fancy windowing systems we have today.

    They charged something like $9.50 a month for 1Kb of system memory. That meant that every kilobyte of memory saved was worth hundreds or even thousands of dollars over the lifetime of the machine.

    Now this is the interesting point, IMO. In the past, you would often lease your 'mainframe' software and need to renew it every year. Often you would have to contact your sales rep, get a new key, and 'activate' the software for another year. With a computer on every desktop, people were sold on the idea that you 'buy' your OS and software from the store and it's yours -- forever. While 'Activation' and WGA are ostensibly anti-piracy measures, in my eyes Microsoft is trying to steer the desktop PC market back to the old mainframe model of paying a yearly (or perhaps monthly) tithe to keep your computer working. Get the market used to phone-home features, and slowly close the net. They've been interested in subscription models for quite a while now.

    The problem for Microsoft is that, unlike mainframe vendors, they suck at reliability. So while Microsoft is eager for a lease-type model, they don't have the corporate culture or experience to make a robust system; they still have a lot of design issues with the tracking and activation back end, which is of course necessary for a 'rental' paradigm.

  • by Technician ( 215283 ) on Monday September 03, 2007 @10:11AM (#20451497)
    What do they gain? Was WGA supposed to convince people using illegitimate versions of Windows to turn to the light? Fuck that, they'll just download the latest cracked WGA .DLL and get on with it, while the legit users will get boned because their serial key wasn't recognized or whatever.

    Avoid the rush of stormtroopers at the door (BSA) and go legit. Try Ubuntu. It works out of the box. It will connect to your existing LAN with the ability to log into your existing NFS and SMB workgroup shares. It will use your IPP net attached printers without difficult Vista configuration problems.

    A new Vista machine on my LAN took over 4 hours to figure out how to log into my existing SMB shares and connect to my IPP net attached printers.

    The first Ubuntu machine only took 30 minutes to learn and complete both tasks. IPP and networking both worked out of the box without tweaks or tricks.

    They said Windows is easy to use... until you need to learn a new version and its set of bugs.
  • by jbreckman ( 917963 ) on Monday September 03, 2007 @10:14AM (#20451515)

    Then perhaps you could have used an example that SHOULD be more efficient on today's computers.

    Simply put, Word has never required the full power of a PC (once multi-threading came into play anyway). So who cares if Vista isn't doing anything to help? Or if it is eating more resources? If Word is all you are using, then you shouldn't really notice a difference.

    However, if you used a different example - like graphic design, development, 3d modeling, etc., we are doing things today that would have been impossible 10 years ago. True, the OS is taking up a bigger chunk of the pie, but the pie has grown enough that it doesn't REALLY bother me.

    This is even more true with more and more multi-core processors coming out. If I have 2 cores at my disposal, I'm going to be even more inclined to let the OS do some extra stuff on one of them.

  • by PeeAitchPee ( 712652 ) on Monday September 03, 2007 @10:25AM (#20451599)

    As for your task, it may not have been doable on a single machine in a reasonable timeframe, and certainly not in a point and click fashion. However, you could have easily integrated the ABBYY engine into a networked batch OCR solution and then hired the capacity to run it (e.g. a render farm).

    Ahhh, spoken like someone who's never done a project like this before. So easy to plan in your head on Slashdot in 30 seconds, isn't it?

    Even if the integration work required to hook ABBYY's OCR engine into some sort of distributed processing farm weren't cost-prohibitive (which it is -- historical societies aren't exactly made of money), how would you suggest I upload over a terabyte of raw image data to said render farm in a timely fashion? And then download it again once completed (not as big a problem, but still an issue)?

    The bigger question is whether or not to take on OCR in-house at all. If you want to sub-out OCR, then you have to wait until the scanning is complete (weeks) -- sending partial jobs via hard drive is more expensive than sending everything at once at the end. It's still too much money at the end of the day -- much, much cheaper to keep it in-house, and the QA process is better. The cheapest option is to buy the fastest server your budget permits and run it 24x7 in parallel with scanning and final PDF assembly / burning. ABBYY FineReader multithreads on recognition, but NOT on opening batches or writing out PDFs. That is the real bottleneck, and the reason it's necessary to run multiple instances.
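
    Since FineReader multithreads recognition but not batch opening or PDF output, the "multiple instances" workaround boils down to running independent batches side by side. Below is a rough sketch of that pattern in Python; the `run_ocr` command line is purely hypothetical (ABBYY's real invocation isn't shown in this thread), so treat it as a placeholder for however the engine is actually driven.

    ```python
    import subprocess
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    # Hypothetical command; substitute however your OCR engine is actually
    # invoked (hot folders, COM automation, or a CLI if you have one).
    def ocr_batch(batch_dir: Path) -> int:
        """Run one OCR instance over one batch of scanned page images."""
        result = subprocess.run(
            ["run_ocr", str(batch_dir), "--out", str(batch_dir / "pdf")],
            capture_output=True,
        )
        return result.returncode

    def main() -> None:
        batches = sorted(p for p in Path("scans").iterdir() if p.is_dir())
        # One worker per batch keeps the single-threaded open/write stages
        # busy in parallel while recognition saturates the remaining cores.
        with ProcessPoolExecutor(max_workers=4) as pool:
            for batch, rc in zip(batches, pool.map(ocr_batch, batches)):
                print(batch.name, "ok" if rc == 0 else f"failed ({rc})")

    if __name__ == "__main__":
        main()
    ```

    Running several independent instances this way is a coarse but effective form of parallelism when the tool itself only threads one stage of its pipeline.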

  • by ScrewMaster ( 602015 ) on Monday September 03, 2007 @10:34AM (#20451691)
    If I have 2 cores at my disposal, I'm going to be even more inclined to let the OS do some extra stuff on one of them.

    Yes, but you paid for those cores, the OS vendor did not. The problem is this: what is that extra stuff, and why should your operating system be doing anything that isn't of benefit to you?

    Take Vista for example. It is a resource hog. Some of that piggishness is the user interface, but there's a lot of other "extra stuff" in Vista that has no right to be there. Hopefully, someone will figure out a way to strip most of it out at some point: maybe then it will actually be usable. Until then, I'm personally going to stick with XP and Linux. There's less extra stuff.
  • Re:Monitoring (Score:3, Interesting)

    by Anonymous Coward on Monday September 03, 2007 @10:56AM (#20451855)
    Seriously, Microsoft is great at monitoring YOUR computer, but they can't monitor their own?

    Better than most people think.

    Once a week, the Internet Time feature of Windows notifies MS that you run Windows.
    Every time you search your hard drive, Windows notifies MS and tells them what you just searched for.

    As an experiment, I tried setting the ZoneAlarm and Comodo firewalls to deny all network traffic on a fresh Windows install. Packets were still getting past the firewall. MS knows that you run their software.
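
    For what it's worth, claims like this are hard to verify from the host itself; the usual check is to watch the wire from a second machine (a gateway box or a mirrored switch port), where the Windows host's own firewall can't hide anything. A minimal sketch using scapy, with the addresses purely illustrative rather than a verified list of Microsoft endpoints:

    ```python
    # Requires scapy and capture privileges on the machine doing the sniffing.
    from scapy.all import sniff

    WINDOWS_HOST = "192.168.1.50"  # example address of the machine under test

    def log_packet(pkt):
        # Print a one-line summary; inspect the destinations by hand afterwards.
        print(pkt.summary())

    # Capture everything the Windows box sends off the local network for
    # five minutes, then look at where those packets are actually going.
    sniff(
        filter=f"src host {WINDOWS_HOST} and not dst net 192.168.0.0/16",
        prn=log_packet,
        store=False,
        timeout=300,
    )
    ```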
  • Seriously... (Score:3, Interesting)

    by ObsessiveMathsFreak ( 773371 ) <obsessivemathsfreak AT eircom DOT net> on Monday September 03, 2007 @11:21AM (#20452051) Homepage Journal
    ...Who the hell is twitter? I'm beginning to think this little spat is itself some kind of astroturfing.

    Is this a Roland Piquepaille repeat incident, or a Beatles-Beatles one? Is this something new? Is this a bunch of rejected posters crying sour grapes, or actually something we should give a damn about? Is this whole thing an elaborate troll?

    I read this site a lot, and this is the first I've heard of "The Great twitter Affair". Explain yourselves sirs.
  • by Belial6 ( 794905 ) on Monday September 03, 2007 @01:01PM (#20452887)
    "The problem is that for all the applications you like to run to run Windows needs to backwards compatable with older versions of itself,"

    This keeps getting repeated over and over. It is absolutely untrue. Microsoft bought VirtualPC. They can run a complete version of every previous version of Windows in a virtual machine. This would give darn near perfect backward compatibility, and 0 extra overhead for any new applications moving forward. Add to this the fact that Vista just doesn't have that good of compatibility.

