WGA Meltdown Blamed On Human Error
Erris writes "As commentators like Ars Technica slam WGA as deeply flawed, Microsoft is blaming human error and swears it won't happen again. 'Alex Kochis, Microsoft's senior WGA product manager, wrote in a blog posting that the troubles began after preproduction code was installed on live servers. ... rollback fixed the problem on the product-activation servers within 30 minutes ... but it didn't reset the validation servers. ... "we didn't have the right monitoring in place to be sure the fixes had the intended effect."' Critics were not impressed. '"A system that's not totally reliable really should not be so punitive," said Gartner Inc. analyst Michael Silver. Michael Cherry, an analyst at Directions on Microsoft in Kirkland, Wash., said he was surprised that it was even possible to accidentally load the wrong code onto live servers ... [and asks], "what other things have they not done?"' This is not the first time this has happened, either."
Why didn't they kill the server? (Score:5, Interesting)
It's a fair point (Score:5, Interesting)
WGA is a natural, if not perfect (or even good) business response to the problem of piracy (leaving out all the debate over whether it's a good or bad thing for Microsoft as a whole). But the technical implementation leaves a lot to be desired; if anything, the response to a WGA server failure should be automatic pass (fail safe) instead of an automatic fail (fail deadly).
Sure, for a 24 hour window pirates would have a free-for-all in getting perfectly valid WGA results, but at the same time legitimate customers would not be inconvenienced. As far as I can see, that's the only way to keep WGA while minimising the backlash against it.
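In code, the fail-safe choice is just a question of what an exception maps to. Here's a minimal sketch of a fail-open validation client -- the endpoint and response format are entirely hypothetical, since the real WGA protocol isn't public:

```python
import urllib.request
import urllib.error

# Hypothetical endpoint -- the real WGA protocol is not public.
VALIDATION_URL = "https://validation.example.com/check"

def is_genuine(product_key, url=VALIDATION_URL, timeout=5.0):
    """Ask the validation server about a key, but fail open: any
    network error, timeout, or server fault counts as a pass, so an
    outage never punishes a legitimate customer."""
    try:
        with urllib.request.urlopen(url + "?key=" + product_key,
                                    timeout=timeout) as resp:
            return resp.read().strip() == b"GENUINE"
    except (urllib.error.URLError, OSError):
        # Fail safe, not fail deadly: server trouble is the
        # vendor's problem, not the user's.
        return True
```

During an outage like the one in the story, every call simply returns True; pirates get a free pass until the servers come back, but no paying customer gets locked out.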
Re:Have we gone backwards? (Score:5, Interesting)
Strictly speaking, there are no tasks I do today that I couldn't do in 1997.
Speak for yourself. Just because *you personally* don't use the extra processing power, memory, and storage that are available doesn't mean that lots of others don't. For example, I'm in the middle of digitizing and OCRing 110 years of local newspapers from microfilm into archival-quality PDFs for an historical society. Quite simply, you *cannot* have too much processing power when doing OCR -- I'm running multiple instances of ABBYY FineReader Corporate on a 2x Quad Core Xeon that has been pegged for two weeks now. It's quick, multithreads across all 8 cores and does a great job, but there's simply too much data. Note that this project would have been completely impossible in 1997 -- there simply wasn't enough processing power, memory or storage available to do it on anything less than a supercomputer. And that's not even considering truly bandwidth- and processor-intensive tasks related to video, weather modeling, etc.
The curse of the lost version number. (Score:2, Interesting)
While Microsoft mercifully abandoned year-based versioning for its OSes with XP, it still does year-versions of Office, etc. And even then, it's under huge pressure to get OS updates out on a regular basis -- there was a long wait for Vista.
Now, putting out releases every few years isn't a bad thing per se. However, Microsoft suffers from the same problem that people who version by year do, be they auto makers or video game manufacturers. You need to put your "stamp" on the new version. New features. New ideas. New things to make people go "ooh, shiny!" To give an example, EA's Madden 2008 can't just be Madden 2007 with new player stats. They need to add features, even if they're "change for change's sake." With the death of version numbers, EVERY version is a Major Version, because there's no other point of reference. Every new release needs to be a revolution in software.
And, obviously, the need to have "this year's model" more sparkly than last year's model leads to bloatware. Features you just don't need, or will never use, are now "built in" instead of add-ons for the small number of folks who need them. Design is more important than ease of use. Stuffing in features is more important than efficiency. That's the new game.
paying for updates around the corner (Score:5, Interesting)
The ironic thing is that few people will pay -- and while the level of installed patches will go down, the overall level of security will not materially change, given the poor security stance in the first place. What will happen is that interoperability will begin to fail badly.
Re:It's a fair point (Score:3, Interesting)
It all goes to trust and loyalty. How could a company with such widespread use take all of that potential customer loyalty and fanbase and turn it into a seething hatred? I really don't see why Microsoft cannot make small gestures to bring users onto its side. Even the people that use their product seem to hate Microsoft for their apparent disregard for the consumer.
If I had 20 customers I'd be working at making them my most loyal fans. If I had millions of customers I'd be looking at starting my own country. How Microsoft can squeeze its customers like lemons and sit enjoying the lemonade while being hated, I just don't understand. Microsoft is a bad example of how a company should do business, regardless of the profit margin.
Re:Have we gone backwards? (Score:5, Interesting)
I think you're more on-topic than you think. I feel compelled to respond to your observations with my own:
Keep in mind that 400K is about 20% of the machine's available resources, which doesn't seem too different from today -- although today we have a lot more choice in how many 'resources' to put into a workstation- or server-class system.
There is also the difference between hosting old world text terminal interfaces and the modern high color depth, fancy windowing systems we have today.
Now this is the interesting point, IMO. In the past, you would often lease your 'mainframe' software and need to renew it every year. Often you would have to contact your sales rep, get a new key, and 'activate' the software for another year. With a computer on every desktop, people were sold on the idea that you 'buy' your OS and software from the store and it's yours -- forever. While 'Activation' and WGA are ostensibly anti-piracy measures, in my eyes Microsoft is trying to steer the desktop PC market back to the old mainframe model of paying a yearly (or perhaps monthly) tithe to keep your computer working. Get the market used to phone-home features, and slowly close the net. They've been interested in subscription models for quite a while now.
The problem for Microsoft is that, unlike the mainframe vendors, they suck at reliability. So while Microsoft is eager for a lease-type model, they don't have the corporate culture or experience to build a robust one; they still have a lot of design issues with the tracking and activation back end that a 'rental' paradigm of course requires.
Re:I've said it before and I'll say it again (Score:4, Interesting)
Avoid the rush of stormtroopers at the door (BSA) and go legit. Try Ubuntu. It works out of the box. It will connect to your existing LAN with the ability to log into your existing NFS and SMB workgroup shares. It will use your IPP net-attached printers without difficult Vista configuration problems.
A new Vista machine on my LAN took over 4 hours to figure out how to log into my existing SMB shares and connect to my IPP net attached printers.
The first Ubuntu machine only took 30 minutes to learn and complete both tasks. IPP and networking both worked out of the box without tweaks or tricks.
They said Windows is easy to use... until you need to learn a new version and its set of bugs.
Re:Have we gone backwards? (Score:3, Interesting)
Then perhaps you could have used an example that SHOULD be more efficient on today's computers.
Simply put, Word has never required the full power of a PC (once multi-threading came into play anyway). So who cares if Vista isn't doing anything to help? Or if it is eating more resources? If Word is all you are using, then you shouldn't really notice a difference.
However, if you used a different example - like graphic design, development, 3d modeling, etc., we are doing things today that would have been impossible 10 years ago. True, the OS is taking up a bigger chunk of the pie, but the pie has grown enough that it doesn't REALLY bother me.
This is even more true with more and more multi-core processors coming out. If I have 2 cores at my disposal, I'm going to be even more inclined to let the OS do some extra stuff on one of them.
Re:Have we gone backwards? (Score:5, Interesting)
As for your task, it may not have been doable on a single machine in a reasonable timeframe, and certainly not in a point-and-click fashion. However, you could have easily integrated the ABBYY engine into a networked batch OCR solution and then hired the capacity to run it (e.g., a render farm).
Ahhh, spoken like someone who's never done a project like this before. So easy to plan in your head on Slashdot in 30 seconds, isn't it?
Even if the integration work required to hook ABBYY's OCR engine into some sort of distributed processing farm weren't cost-prohibitive (which it is -- historical societies aren't exactly made of money), how would you suggest I upload over a terabyte of raw image data to said render farm in a timely fashion? And then download it again once completed (not as big a problem, but still an issue)?
The bigger question is whether or not to take on OCR in-house at all. If you want to sub-out OCR, then you have to wait until the scanning is complete (weeks) -- sending partial jobs via hard drive is more expensive than sending everything at once at the end. It's still too much money at the end of the day -- much, much cheaper to keep it in-house, and the QA process is better. The cheapest option is to buy the fastest server your budget permits and run it 24x7 in parallel with scanning and final PDF assembly / burning. ABBYY FineReader multithreads on recognition, but NOT on opening batches or writing out PDFs. That is the real bottleneck, and the reason it's necessary to run multiple instances.
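That multi-instance workaround is a common pattern when a tool parallelizes one stage (recognition) but serializes others (batch open, PDF export): partition the pages up front and run one independent instance per slice. A rough sketch of the pattern -- `run_batch` is a stand-in for launching one FineReader instance, since its real command line isn't shown here:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(items, n):
    """Split items into n roughly equal batches, one per OCR instance."""
    return [items[i::n] for i in range(n)]

def run_batch(pages):
    """Stand-in for driving one OCR instance over a batch. In practice
    this would launch the engine as a subprocess on these files; here
    it just returns the page count so the pattern is demonstrable."""
    return len(pages)

def ocr_in_parallel(pages, instances=8):
    # Each thread merely supervises an external OCR instance, so plain
    # threads suffice; the heavy lifting happens in the instances, and
    # the single-threaded open/export stages now overlap across them.
    with ThreadPoolExecutor(max_workers=instances) as pool:
        return sum(pool.map(run_batch, chunk(pages, instances)))
```

The round-robin slicing (`items[i::n]`) keeps the batches balanced even when page counts per issue vary, so no one instance finishes long before the others.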
Re:Have we gone backwards? (Score:5, Interesting)
Yes, but you paid for those cores, the OS vendor did not. The problem is this: what is that extra stuff, and why should your operating system be doing anything that isn't of benefit to you?
Take Vista, for example. It is a resource hog. Some of that piggishness is the user interface, but there's a lot of other "extra stuff" in Vista that has no right to be there. Hopefully, someone will figure out a way to strip most of it out at some point: maybe then it will actually be usable. Until then, I'm personally going to stick with XP and Linux. There's less extra stuff.
Re:Monitoring (Score:3, Interesting)
Better than most people think.
Once a week, the Internet Time feature of Windows notifies MS that you run Windows.
Every time you search your hard drive, Windows notifies MS and tells them what you just searched for.
As an experiment, I tried setting the ZoneAlarm and Comodo firewalls to deny all network traffic on a fresh Windows install. Packets were still getting past the firewall. MS knows that you run their software.
Seriously... (Score:3, Interesting)
Is this a Roland Piquepaille repeat incident, or a Beatles-Beatles one? Is this something new? Is this a bunch of rejected posters playing sour grapes, or actually something we should give a damn about? Is this whole thing an elaborate troll?
I read this site a lot, and this is the first I've heard of "The Great twitter Affair". Explain yourselves, sirs.
Re:Have we gone backwards? (Score:3, Interesting)
This keeps getting repeated over and over. It is absolutely untrue. Microsoft bought VirtualPC. They can run a complete copy of every previous version of Windows in a virtual machine. This would give darn near perfect backward compatibility, and zero extra overhead for any new applications moving forward. Add to this the fact that Vista just doesn't have very good compatibility anyway.