
How Microsoft Dropped the Ball With Developers

cremou writes "As part of an Ars Technica series on how one developer migrated from Windows to OS X (and why), this second article concentrates on how Microsoft bungled the transition from XP to Vista. The author looks at some unfortunate decisions Microsoft made that have made Windows an unpleasant development platform. 'So Windows is just a disaster to write programs for. It's miserable. It's quite nice if you want to use the same techniques you learned 15 years ago and not bother to change how you do, well, anything, but for anyone else it's all pain... And it's not just third parties who suffer. It causes trouble for Microsoft, too. The code isn't just inconsistent and ugly on the outside; it's that way on the inside, too. There's a lot of software for Windows, a lot of business-critical software, that's not maintained any more. And that software is usually buggy. It passes bad parameters to API calls, uses memory that it has released, assumes that files live in particular hard-coded locations, all sorts of things that it shouldn't do.'"


  • by Jeremiah Cornelius ( 137 ) * on Monday May 05, 2008 @08:17PM (#23306530) Homepage Journal
    Read the article.

    Short answer?

    Windows!
    • by denzacar ( 181829 ) on Monday May 05, 2008 @08:39PM (#23306704) Journal
      Actually it is more like:

      I hate Windows. It robs me of my creative juices.
      Because I am creative, you know... man?
      So I "Switched".
      Now, I code for OS X and every day is a beautiful rainbow for me.
      • Re: (Score:3, Insightful)

        FTA:

        "Regular updates to the OS keep developers on the upgrade treadmill; they work to make their applications fit in with the latest and greatest release, leveraging whatever new bells and whistles it provides, further improving the software ecosystem."

        As we see from the Windows and Mac OS (pre-X) ecosystems, this is completely false. Developers are loath to change custom-written code that they know works in order to implement some kind of link to the OS; and in many cases they'll let old versions of their
    • Re:Long Answer? (Score:5, Interesting)

      by ldhertert ( 833408 ) on Monday May 05, 2008 @09:58PM (#23307396)
      This was modded as funny, but, as a .NET developer, I find it to be exceedingly true. I just made the switch to OS X because I was unhappy with the stagnation of Windows. But, as we speak, I have Windows running in a virtual machine almost solely for the purpose of running Visual Studio.

      I find the article's critiques very hard to swallow. The argument that .NET is limited by the attempt to be simplistic is asinine. Just like Java, there are high-level "simple" functions that may or may not suit your needs. If they don't, you have the capability to dig down into much lower-level functions to do what you need to do.

      He states several problems with the UI capabilities of .NET. Before I even get into the technical components of his argument: if he's trying to say that Java is better in this area, then he needs to get his eyes checked. Every single Java app I've ever used is ugly as sin, and I've seen a lot. Sure, it's portable across environments, but that's not what .NET development is being used for. Aside from that... not happy with the Windows UI standards? Everyone else seems to be. You can write .NET apps that follow the common Windows UI design standards very closely. Not happy with the limitations of the UI? Write/use another UI implementation like GTK. And how do we not even mention that the UI layer has been completely overhauled with the advent of WPF?

      I can say with experience that .NET is very powerful and can be very pleasant to work with. The ability to move from desktop apps to web apps to mobile development with very little effort has been great for me and my career. I'm sure that Java does a lot of things right, but as someone who has seen a lot of the terrible things Microsoft has done, I honestly think that .NET is a crowning achievement of theirs.
      • by lgw ( 121541 ) on Monday May 05, 2008 @10:06PM (#23307462) Journal
        .NET is the first thing MS has done in 15 years that's not tied to Win32 backwards compatibility. It's a fresh attempt to do things well. It's certainly cleaner than Win32, but neither that nor OS X (however wonderful it may be) will matter to me: Win32 is my profession now.

        I heartily encourage everyone else, especially all developers in low-income areas of the world, to Make the Switch ASAP. The less labor supply for these horrible, inconsistent Win32 APIs (really, it's like programming while your face is on fire), the better!
      • Re:Long Answer? (Score:5, Insightful)

        by CodeBuster ( 516420 ) on Monday May 05, 2008 @11:52PM (#23308230)

        The argument that .NET is limited by the attempt to be simplistic is asinine
        I agree. It sounds to me like this guy used .NET for a year or so around 2002, when it was brand new, and hasn't looked at it again in the six (6) years since. He is the first person I have heard accuse .NET of being "too simplistic". The .NET class library (and the Java class library as well) is the definition of everything and the kitchen sink. These are extremely powerful languages and libraries that are, if anything, too complex.

        Before I even get into the technical components of his argument, if he's trying to say that Java is better in this area, then he needs to get his eyes checked.
        Again, I totally agree. The .NET Framework class libraries and the C# language are the definition of "The Right Way" to organize and structure code. They are technically sophisticated, architecturally beautiful, and make use of the best ideas from nearly three (3) decades of object-oriented programming theory and software engineering. If he is going to criticize .NET for its "lack of capability", then he is basically saying that Java is crap too, because .NET borrowed heavily from Java, which in turn borrowed heavily from C++, Smalltalk, and all the way back through C to Algol 60.

        portable across environments, but that's not what .NET development is being used for
        It is supported, and in the future it will become more and more common. The great innovation on the part of .NET was the common runtime type description language and metadata, in addition to the common language assembly. This is what allows different programming languages to fully share types across libraries compiled from different source languages (including cross-language debugging). This was the feature that Java was missing, and it is a major reason why .NET is so popular: the common types and assembly have the potential to end the "my programming language is better than yours" debate by making the argument irrelevant. Use C#, or use VB.NET, or use Eiffel, or whatever other language you want.

        I can say with experience that .NET is very powerful and can be very pleasant to work with.
        I have been using .NET for six (6) years now and I can honestly say that it has fulfilled all of my expectations with very few disappointments, and it is improving substantially with each subsequent version. I know that I sound like a hopeless fanboy, but .NET really is a pleasure to work with, especially given the breadth and depth of the libraries and languages: a tool for every job, and every job done with the right tool.

        I honestly think that .NET is a crowning achievement of theirs.
        Although it probably doesn't hold much water with the Slashdot crowd, I think it is fair to say that .NET is the best thing ever to come out of Microsoft, even though it wasn't a completely original idea (but how many completely original ideas have there been in computer science anyway? Everyone builds on the work of those who came before).
        • Re:Long Answer? (Score:5, Interesting)

          by DrPizza ( 558687 ) on Tuesday May 06, 2008 @07:07AM (#23310202) Homepage

          I agree. It sounds to me like this guy used .NET for a year or so around 2002, when it was brand new, and hasn't looked at it again in the six (6) years since. He is the first person I have heard accuse .NET of being "too simplistic". The .NET class library (and the Java class library as well) is the definition of everything and the kitchen sink. These are extremely powerful languages and libraries that are, if anything, too complex.

          You seem to be confusing broad (.NET supports a lot of things, like sockets and cryptography and distributed transactions and GUIs and XML and oh my!) with powerful.

          .NET does a lot of different things, but .NET doesn't have the greatest underlying abstractions. For example, to name a few:

          • IList<T> requires integer indexing. This makes it unsuitable for some kinds of sequential collection, such as linked lists, which have no acceptable way of implementing integer indexing. Consequence? IList<T>, whose documentation proudly claims to be "the base interface of all generic lists", is not, in fact, "the base interface of all generic lists": LinkedList<T> does not implement IList<T>.
          • IList<T> has no ToArray, but List<T> does. Sure, ICollection<T> has Count/CopyTo, but if the convenience method is good enough for List<T>, it's good enough for IList<T>. OTOH, if the convenience method is unnecessary for IList<T>, it's unnecessary for List<T>.
          • Except to confuse me, why do ICollection<T>s have a Count when arrays have a Length? What's the meaningful semantic distinction that I'm missing here?
          • Why do arrays have LongLength but no corresponding LongCount for ICollection<T>s? (3.5 adds LongCount as an extension method, but that gives it inconsistent (method) syntax, and of course it can never work: even if the collection were extended to support long lengths, an extension method could never exploit that fact, because the extension just works on IEnumerable<T>, which supports only an int Count.)
          • Why do arrays have LongLength, instead of simply having Length be a long? It surely didn't take a whole lot of foresight to figure that one out, did it?
          • Where do I find reverse iteration/enumeration?
          • Where do I find bidirectional iteration/enumeration?
          • Why does LinkedList<T> expose implementation details such as LinkedListNode<T> to users?
          • Why is there no generalized mechanism for storing my position in a container?
          • Why does HashSet<T> have no ISet<T> interface?
          • Why is there no SortedSet<T>?
          • Why is System.Collections.ObjectModel not System.Collections.Generic.ObjectModel, given that it is, in fact, for generic collections?
          • Why are there no static type-inferencing factories for read-only collections or singleton collections? When you have generics, factory methods are good, because factory methods can infer. Don't make me type IList<int> myReadOnlyList = new ReadOnlyCollection<int>(myList); the double specification of the type is spurious. (See the sketch after this list.)
          • Why is it named ReadOnlyCollection<T> when it is in fact an IList<T>?
          • Why is there no true ReadOnlyCollection (i.e., one that is actually useful for plain collections)?
          • Why is it named SynchronizedCollection<T> when it is in fact an IList<T>?
          • Why is there no true SynchronizedCollection (i.e., one that is actually useful for plain collections)?
          • Why is there no deque class?
          • Why is there no IStack<T>?
          • Why is there no IQueue<T>?
          • Why do Stack<T> and Queue<T> hardcode their backing store, even though the performance profile might be better with, e.g., a linked list or a deque?
          • Why is KeyedByTypeCollection so widely useful as to even exist?
          • Where do I find an equivalent to java.util.concurrent? Anyone suggesting SynchronizedCollection<T>, please punch yourself in the face right now.
          • Why are there new (.NET 2.0) classes that use old (non-generic) types, e.g. System.Net.Mail.MailMessage.Headers, which uses NameValueCollection?
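
          A couple of these are easy to demonstrate in a few lines of C# (a minimal sketch against .NET 3.5; it compiles except for the deliberately commented-out line):

              using System;
              using System.Collections.Generic;
              using System.Collections.ObjectModel;

              class CollectionGripes
              {
                  static void Main()
                  {
                      // LinkedList<T> is a sequential collection, yet it cannot be used
                      // where IList<T> -- "the base interface of all generic lists" --
                      // is required. The assignment below does not compile:
                      LinkedList<int> linked = new LinkedList<int>(new[] { 1, 2, 3 });
                      // IList<int> asList = linked;   // error: no implicit conversion

                      // No inferring factory, so the element type is spelled out twice:
                      IList<int> source = new List<int> { 1, 2, 3 };
                      IList<int> readOnly = new ReadOnlyCollection<int>(source);

                      // And despite its name, ReadOnlyCollection<T> *is* an IList<T>;
                      // the mutating members are simply implemented to throw at runtime.
                      try { readOnly.Add(4); }
                      catch (NotSupportedException) { Console.WriteLine("Add() threw"); }
                  }
              }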
          • Re:Long Answer? (Score:5, Interesting)

            by dossen ( 306388 ) on Tuesday May 06, 2008 @11:59AM (#23312996)
            Damn - what a list.

            My own additions - from doing Sharepoint:
            - Key classes (SPSite and SPWeb) that are IDisposable. That's not too much of a problem, except that the documentation is somewhat vague on how to handle them: calling some methods will, e.g., initialize the RootWeb property of SPSite, which you then have to dispose despite never having touched RootWeb yourself. And sometimes you have to take care _not_ to dispose, because you've been handed a reference to a shared instance of an SPWeb or SPSite.
            - Several nice components are sealed and/or internal. My current pet peeve is a webpart whose look and feel is defined via XSLT files (big, poorly documented XSLT files that do a lot of scary things to dump a lot of ugly HTML). Unfortunately these XSLT files are shared by several different webparts, and some of those offer neither the ability to subclass the webpart to use another file, nor the option of configuring which file to use. The only way to change that look and feel is to change the standard files.
            - Checking user permissions in some places requires you to call a boolean method and receive true for access (sounds good so far, right...) and an exception for access denied!
            - Not only is the default rendering a load of IE-specific crap; if you dig deep enough, you'll find hardcoded strings inside the core libraries that, to the best of my knowledge, are not valid HTML in any contemporary standard. And the only way to get rid of it is to capture the output stream and clean it up by hand, one string pattern at a time.
            - And that reminds me of a .NET one: why can't I do regular expressions on StringBuilder objects? I have to convert to an immutable string, and my replacement produces yet another immutable string. That just seems like an awful lot of copying if I'm doing a lot of editing. (A sketch of the round trip follows below.)
            - And another Sharepoint gripe: why do all the interesting parts of my code inherit stuff that makes unit testing a real pain? (I tried to mock my way through it once or twice, but the amount of code needed to test without invoking the database backend is beyond what I could justify.)
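
            The StringBuilder/Regex round trip described above looks roughly like this minimal sketch (nothing SharePoint-specific, and the helper name is made up):

                using System.Text;
                using System.Text.RegularExpressions;

                class StringBuilderRegex
                {
                    // Regex only operates on immutable strings, so every edit pass over a
                    // StringBuilder means: copy out to a string, produce another string,
                    // and copy back in. Three allocations per pattern.
                    static StringBuilder ReplaceAll(StringBuilder sb, string pattern, string replacement)
                    {
                        string snapshot = sb.ToString();                                  // copy #1
                        string replaced = Regex.Replace(snapshot, pattern, replacement);  // copy #2
                        return new StringBuilder(replaced);                               // copy #3
                    }
                }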
          • Re:Long Answer? (Score:4, Informative)

            by VoidEngineer ( 633446 ) on Tuesday May 06, 2008 @12:31PM (#23313402)
            I'm not going to go through your entire to-do list, although I will answer a couple of your questions.

            Except for to confuse me, why do ICollections have a Count, when arrays have a Length? What's the meaningful semantic distinction that I'm missing here?
            ICollections implement an IEnumerator interface and have an Enumerator object which counts the objects in a collection. Think of an enumerator as an odometer (like the one in your car). If the object implements ICollection, then it has an odometer, which you can get the count from. Arrays just have size.

            Why do arrays have LongLength but with no corresponding LongCount for ICollections (3.5 adds LongCount as an extension method, but that gives it inconsistent (method) syntax, and of course it can never work because even if the collection were extended to support long lengths, an extension method can never exploit that fact, because the extension just works on IEnumerable which supports only an int Count)?
            Again, it's the difference between an attribute and a verb. LongLength is an attribute of an object, such as height or width, whereas Count is like an odometer in your car. Your question is kind of like asking why you can travel tens of thousands of miles along the US highway system when the odometer in your car only goes up to 1000. Well, that's just the way the odometer was made. Length of trip is distance, whereas your odometer is a counter and only a *measure* of distance. It's simply a distinction the language makes.

            Why do arrays have LongLength, and not simply have Length be a long? It surely didn't take a whole lot of foresight to figure that one out, did it?
            Chunking. .NET gets used on both 32 and 64 bit platforms, and the performance penalty for splitting a 64 bit word into two is greater than using two 32 bit words. In the first case, you still have to use 64 bit words, but you pad the first 32 bits with zeros, and convert to 32 bit words. Requires an extra pass through the processor to calculate, whereas adding two 32 bit words into a 64 bit word is trivial. I'm not explaining this concept well, but if you look it up you'll find more info on the question you're asking. The design decision was based on current market saturation of 32 bit processors, and the LongLength was an added conversion for the 64bit programmers.

            Where do I find reverse iteration/enumeration? Where do I find bidirectional iteration/enumeration?
            Using the odometer analogy as above, the enumerator only goes forward, although you *can* reset it. If you want to do reverse iteration, copy your collection into a new collection backwards and iterate over that new object. Alternatively, for most reverse or bidirectional iterations, you'll simply want to ditch the 'foreach' loop and use a simple 'for' loop. Then you can start high and use a decrementing index to count down. I also like to use decrementor collections which get an object removed with each pass of the for loop.
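
            A sketch of that workaround, assuming the collection exposes an indexer (a plain IEnumerable<T> can't be walked this way, which is part of the grandparent's complaint):

                using System;
                using System.Collections.Generic;

                class ReverseIteration
                {
                    static void Main()
                    {
                        List<string> items = new List<string> { "a", "b", "c" };

                        // foreach/IEnumerator only goes forward, so reverse traversal
                        // falls back to a plain for loop with a decrementing index:
                        for (int i = items.Count - 1; i >= 0; i--)
                            Console.WriteLine(items[i]);
                    }
                }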

            Why is there no generalized mechanism for storing my position in a container?
            You need to get the IEnumerator object from the ICollection, using the GetEnumerator method. It will have a Position field.

            Why is there no SortedSet?
            Probably implemented somewhere else.

            Why are there no static type-inferencing factories for read-only collections or singleton collections? When you have generics, factory methods are good, because factory methods can infer. Don't make me type IList myReadOnlyList = new ReadOnlyCollection(myList); the double specification of the type is spurious.
            You're splitting hairs here. It's a strongly typed language. It's meant to be explicit, not inferential. In other projects, besides yours, double-specifying types is needed and a useful (if not critical) feature.
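
            (For reference, the kind of inferring factory being asked for is a one-liner, because type inference works on generic methods but not on constructors. ReadOnly.From below is a hypothetical helper, though List<T>.AsReadOnly and Array.AsReadOnly do exist in the BCL:)

                using System.Collections.Generic;
                using System.Collections.ObjectModel;

                static class ReadOnly
                {
                    // Hypothetical helper: T is inferred from the argument, so callers
                    // never have to repeat the element type.
                    public static ReadOnlyCollection<T> From<T>(IList<T> list)
                    {
                        return new ReadOnlyCollection<T>(list);
                    }
                }

                class Demo
                {
                    static void Main()
                    {
                        List<int> numbers = new List<int> { 1, 2, 3 };
                        ReadOnlyCollection<int> readOnly = ReadOnly.From(numbers); // T inferred
                    }
                }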

            Why is it named ReadOnlyCollection when it is in fact a IList? Why is there no true ReadOnlyCollection (i.e. that is actually useful for collections)? Why is it na
            • Re:Long Answer? (Score:4, Interesting)

              by DrPizza ( 558687 ) on Tuesday May 06, 2008 @12:49PM (#23313632) Homepage

              ICollections implement an IEnumerator interface and have an Enumerator object which counts the objects in a collection. Think of an enumerator as an odometer (like the one in your car). If the object implements ICollection, then it has an odometer, which you can get the count from. Arrays just have size.

              Count is not a method; it is not asking "count how many things are in this object". It is a property, and as such an intrinsic feature of the object (that behind the scenes it might have to do the verb thing is beside the point--it's semantically a property, even if it's actually a verb). And even if one were a verb--why should it be?

              Again, it's the difference between an attribute and a verb. LongLength is an attribute of an object, such as height or width, whereas Count is like an odometer in your car. Your question is kind of like asking why you can travel tens of thousands of miles along the US highway system when the odometer in your car only goes up to 1000. Well, that's just the way the odometer was made. Length of trip is distance, whereas your odometer is a counter and only a *measure* of distance. It's simply a distinction the language makes.

              Again, though, the language doesn't make the distinction you are making. Count is a property, just like Length. If Count were a method I could sort of understand the difference (I don't really agree with it, it seems spurious, but I could sort of understand it). But it's not; it's a property.

              Chunking. .NET gets used on both 32 and 64 bit platforms, and the performance penalty for splitting a 64 bit word into two is greater than using two 32 bit words. In the first case, you still have to use 64 bit words, but you pad the first 32 bits with zeros, and convert to 32 bit words. Requires an extra pass through the processor to calculate, whereas adding two 32 bit words into a 64 bit word is trivial. I'm not explaining this concept well, but if you look it up you'll find more info on the question you're asking. The design decision was based on current market saturation of 32 bit processors, and the LongLength was an added conversion for the 64bit programmers.

              But LongLength means we won't have a clean transition, because it means people will have to fix up APIs to take longs where they currently take ints; making everyone pay the price for longs might be a short-term cost (though not a great one), but it'll be a long-term gain.

              Using the odometer analogy as above, the enumerator only goes forward, although you *can* reset it. If you want to do reverse iteration, copy your collection into a new collection backwards and iterate over that new object. Alternatively, for most reverse or bidirectional iterations, you'll simply want to ditch the 'foreach' loop and use a simple 'for' loop. Then you can start high and use a decrementing index to count down. I also like to use decrementor collections which get an object removed with each pass of the for loop.

              That really doesn't answer the question. In both Java and C++ I have iterating objects (java.util.ListIterator, C++ bidi/random iterators) that can go forwards and backwards. I use these quite regularly; why can .NET not provide the same?
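
              (Nothing like java.util.ListIterator ships in the 2.0/3.5 BCL; the closest you can hand-roll is a cursor over an indexed list, roughly this sketch -- and note it only works because IList<T> has an indexer, so a LinkedList<T> is out of luck:)

                  using System.Collections.Generic;

                  // Hand-rolled bidirectional cursor over an indexed list: a sketch of
                  // the facility being asked for, not anything in the framework.
                  class BidiCursor<T>
                  {
                      private readonly IList<T> list;
                      private int pos = -1;   // starts before the first element

                      public BidiCursor(IList<T> list) { this.list = list; }

                      public bool MoveNext()     { if (pos + 1 >= list.Count) return false; pos++; return true; }
                      public bool MovePrevious() { if (pos <= 0) return false; pos--; return true; }
                      public T Current           { get { return list[pos]; } }
                  }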

              You need to get the IEnumerator object from the ICollection, using the GetEnumerator method. It will have a Position field.

              Neither here: http://msdn.microsoft.com/en-us/library/system.collections.ienumerator.aspx [microsoft.com] Nor here: http://msdn.microsoft.com/en-us/library/78dfe2yb.aspx [microsoft.com] So I'm not altogether sure what you mean.

              Probably implemented somewhere else.

              I don't understand what you mean.

              You're splitting hairs here. It's a strongly typed language. It's meant to be explicit, not inferential. In other projects, besides yours, do

    • by Anonymous Coward on Monday May 05, 2008 @10:24PM (#23307590)
      Short answer?

      Windows is bad for developers.

      Long answer?

      Windows is bad for developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers! Developers!

      (The lameness filter complains about my Ballmer joke -- it must be detecting residual Microsoft lameness.)
    • Re:Long Answer? (Score:5, Insightful)

      by dcam ( 615646 ) <david@@@uberconcept...com> on Monday May 05, 2008 @11:34PM (#23308122) Homepage
      Everything I have read and heard about Microsoft suggests that they are cowboys.

      Code first, design later. For example, I note with interest the amount of pain involved in trying to provide server protocol documentation for the EU. Some of the foot-dragging is deliberate, but some of it is that they don't have quality internal documentation.

      There is a severe lack of direction and leadership at Microsoft; they just don't plan ahead. As a result they are tearing themselves to pieces, doing the same work again and again.
  • by dreamchaser ( 49529 ) on Monday May 05, 2008 @08:22PM (#23306574) Homepage Journal
    "It passes bad parameters to API calls, uses memory that it has released, assumes that files live in particular hard-coded locations, all sorts of things that it shouldn't do."

    Those are basically programming errors, not problems with the API. Don't get me wrong, I find Win32 to be a pain in the ass sometimes, but this article just reeks of flamebait.
    • by Ulfalizer ( 881975 ) on Monday May 05, 2008 @08:41PM (#23306716)
      I think you missed the point.

      The problem is that many major legacy applications depend on undocumented behavior because they make sloppy use of the Windows API (e.g. by assuming that a particular function will not segfault when passed a bad argument). For those to keep working, newer revisions of the API implementation must have the same undocumented behavior, which causes a maintenance nightmare.
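
      The trap can be sketched in a few lines (a toy C# example, not a real Windows API):

          class LegacyApi
          {
              // Version 1 happened to tolerate a bad argument. That leniency was
              // never documented, but callers shipped code that depends on it, so
              // it is now de-facto contract: tighten it up in version 2 and every
              // legacy caller that passes null starts crashing.
              public static int BufferLength(byte[] buffer)
              {
                  if (buffer == null)
                      return 0;   // the undocumented behavior
                  return buffer.Length;
              }
          }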
      • by dreamchaser ( 49529 ) on Monday May 05, 2008 @09:05PM (#23306942) Homepage Journal
        No, I didn't miss the point. Using an undocumented API is another example of bad programming. Yes, even HAVING undocumented APIs is bad as well. Like I said, I was not excusing the mess that is Win32, I was just sayin'...
        • by fimbulvetr ( 598306 ) on Monday May 05, 2008 @09:50PM (#23307322)
          Maybe the point was that MS fosters bad programming by keeping legacy API calls around indefinitely, whilst other systems do not. I'm the last guy to ever go pro-Apple on /., having "been there, done that", but he really does have a point. MS is afraid to deprecate bad ways, preferring to keep some minor share of customers happy.

          While this has short-term benefits, it imposes a hefty long-term penalty - the same penalty MS (and some of its developers) is paying now.
          • by fractoid ( 1076465 ) on Monday May 05, 2008 @11:19PM (#23308022) Homepage

            MS is afraid to deprecate bad ways in favor of keeping some minor share of customers happy.
            This is true, but remember that when you sum up the minor share of customers that's made happy by each of these legacy APIs, you end up with a large number of developers. A very large number.
          • by SuperKendall ( 25149 ) on Tuesday May 06, 2008 @01:56AM (#23308894)
            I too have "been there, done that" on a lot of platforms.

            The term you are looking for is "enabler": Microsoft is an enabler for applications that engage in bad practices and for users who engage in bad security.

            As you say it makes a lot of people happy now, but look what else it has given us - lots of decrepit systems, and hundreds of thousands of zombies that make our lives miserable in other ways.

            Sometimes even a business can't just be about making people happy, it has to be about moving the market as a whole to dry land when they see the flood coming.

            Just like the Spiderman quote tells us.

        • Yes, even HAVING undocumented APIs is bad as well.

          I thought we discussed this when Apple did it? Undocumented APIs are there for the use of operating system developers and other people who feel the need to tickle the operating system at a low enough level to fool it into doing things that it was never intended to do for you. This is your prerogative; it's possible to sniff through the structure of the binaries to find new functions, and it's possible to debug functions to see what they're calling, so no one can really stop you anyway.

          At the same time, being upset when they stop functioning correctly is the mark of a whiny idiot, because let's face it, they're undocumented. If you want that functionality exposed, by all means, cry to the OS vendor. Or, you know, you could support open source and/or free software and work with an environment in which you have all the source code and can at least make relatively responsible use of the undocumented functions, and if you are feeling froggy, even submit patches which express this functionality consistently with a documented API.

          Seriously though, undocumented functions are par for the course. If the functions are never intended to be used by anything other than the operating system, and the functionality is expressed through the OS in some fashion (use of undocumented APIs in Win32 has often been done to work around a bug) then there's really no problem. The portions of the OS that need to be changed when the libraries change can be changed in such a situation (and hopefully will fail tests if someone doesn't think to do it beforehand.)

      • The problem is that many major legacy applications depend on undocumented behavior because they make sloppy use of the Windows API (e.g. by assuming that a particular function will not segfault when passed a bad argument). For those to keep working, newer revisions of the API implementation must have the same undocumented behavior, which causes a maintenance nightmare.

        So, your problem is that programmers make use of undocumented API calls. While "undocumented" does not always equal "unsupported", using them is just plain stupid. Whether it is Windows, Linux, MS-DOS, DR-DOS, or OSux, using the system in an undocumented/unsupported way is, well, U N S U P P O R T E D. Don't blame the OS or those that coded it; blame those that wrote against the API in an unsupported way.

        RTFA turns out to be an effort in slogging through another of the author's attempts to explain why anyone on Windows is just benighted. He blames HIS shortcomings on the OS.

    • by 0123456 ( 636235 ) on Monday May 05, 2008 @08:42PM (#23306726)
      "Those are basically programming errors, not problems with the API."

      I think you missed the point. For the sake of backwards compatibility, Microsoft supports applications which do all these things, and drags all the associated crap into future versions of Windows so they still run.

      For that matter, so do hardware developers: back when I was writing drivers for Windows I had to deliberately put bugs in our code to support applications which only worked because of bugs in the Microsoft versions of the drivers and would crash if we didn't replicate those bugs ourselves. We also spent weeks working around abuse of the API by a certain big computer company that can't program PCs worth a damn (or even, apparently, read API documentation).
      • Re: (Score:3, Funny)

        by x00101010x ( 631764 )
        At least with Windows 7, the backwards-compatibility nightmare will be over, with virtualization similar to what Apple did.
  • by erroneus ( 253617 ) on Monday May 05, 2008 @08:26PM (#23306612) Homepage
    The culture of DOS programming was corrupted from the beginning and you can partly blame IBM for a crappy BIOS. Were it not for the crappy BIOS, programmers wouldn't have had to resort to writing directly to hardware to get an acceptable speed on the screen. And it just kept going on from there. And now when a developer wants more "something" from the OS than they can get naturally, they write VxDs to help gain an advantage.

    The culture is all about writing code to get past deficiencies and shortcomings in DOS/Windows.

    Windows programmers don't respect the rules... and if they do, they write what appears to be crappy software.
    • by frank_adrian314159 ( 469671 ) on Monday May 05, 2008 @08:48PM (#23306782) Homepage
      Yes, but making the hardware suck to scrape a couple of pennies off the price didn't help the BIOS. Actually, I blame IBM more for not choosing a better processor than the x86. There were sane architectures out there at the time (e.g., Motorola 68000). A lot of the craptacular nature of the BIOS (not to mention DOS and early Windows programming) came out of that particular decision. But, back then, IBM was a fairly craptacular company anyway. It seems to have improved a bit since then (although, it's hard to tell; with a company the size of IBM, you may be looking at the stern of the oil tanker and everything looks fine, while on the bow, fires are raging).
      • Re: (Score:3, Interesting)

        by Sentry21 ( 8183 )
        The Motorola 68k was their first choice, but unfortunately it wasn't ready in time for them to use it for their systems. Intel's 8086 processor was their unfortunate second choice.
        • by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Monday May 05, 2008 @11:49PM (#23308212) Homepage
          I worked at a computer manufacturer in the late '70s and early '80s in LA; I started on 8-bit micros and was there for the introduction of the 16-bit chips.

          I was used to PDP-11s, keep in mind.

          The problem wasn't that the 68000 wasn't ready; it was ready. There were just no support chips yet. Intel actually delivered a complete solution: CTC chips, PICs, serial ports, DMA controllers, I/O processors (which nobody but us used).

          Motorola had a CPU and that's it. A vastly *superior* CPU, but the hardware guys wanted to build systems, not wait for the rest of the stuff they needed. So we all held our noses and went x86. And bought Amigas as soon as they were out (I have serial #11. Still.)

          This crap is all in one chip these days, but back then computers had several large black chips inside them.
    • by EmbeddedJanitor ( 597831 ) on Monday May 05, 2008 @09:58PM (#23307394)
      In DOS days, there were often 3 ways of doing things. For example, take writing to the screen:
      You could call the BIOS interrupt function.
      You could call the MSDOS Interrupt function.
      You could detect the hardware and write directly to the hardware address.

      Both the BIOS and DOS mechanisms were slow and broken and did not follow the conventions of any programming language. For example, terminating strings with the $ symbol, FFS.

      All commercial programs (and most hobbyist ones) wrote directly to hardware for speed.

      DOS was not really an OS at all. It did only very rudimentary memory management. About the only things you'd really use DOS for were disk access and application launching; otherwise DOS applications were basically "bare metal" programs that managed just about everything (screen, keyboard, serial ports, mouse, ...) internally.

  • by account_deleted ( 4530225 ) on Monday May 05, 2008 @08:27PM (#23306628)
    Comment removed based on user account deletion
  • by timmarhy ( 659436 ) on Monday May 05, 2008 @08:27PM (#23306634)
    "And that software is usually buggy. It passes bad parameters to API calls, uses memory that it has released, assumes that files live in particular hard-coded locations, all sorts of things that it shouldn't do.'"

    And this has exactly what to do with MS? The coding habits of programmers have NOTHING to do with MS.

    • by Adambomb ( 118938 ) on Monday May 05, 2008 @08:47PM (#23306770) Journal
      Well, it does indirectly, though not as blatantly as the article would have one believe. Don't forget that if the APIs were properly designed to begin with, it would have been impossible to pass invalid parameters to the functions or to use memory that has already been released.
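
      The design point about invalid parameters can be sketched like so: a raw handle API will accept any integer, while a dedicated handle type makes the bad call unrepresentable (all names here hypothetical):

          using System;

          class RawApi
          {
              // Any int is accepted; nothing stops a caller passing garbage.
              public static void CloseWindow(int handle) { /* ... */ }
          }

          sealed class WindowHandle
          {
              internal WindowHandle(int value) { Value = value; }
              internal int Value { get; private set; }
          }

          class SaferApi
          {
              public static WindowHandle OpenWindow() { return new WindowHandle(42); }

              public static void CloseWindow(WindowHandle handle)
              {
                  if (handle == null) throw new ArgumentNullException("handle");
                  // Only values minted by OpenWindow can arrive here.
              }
          }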

      I have no idea what he's on about with the hard-pathed file references.

      The problem is that so many corporate coders back in the day (and still) would use whatever shortcuts they could within the API, including "undocumented features" like the two issues above. If Microsoft were to fix these issues without keeping compatibility for these "features", it would break tons of legacy applications. Therefore, ongoing development must carry these already-incorrectly-designed portions of the API alongside whatever they really want to be working on.

      Just because a company did something poorly to begin with and people adapted to it doesn't mean the company isn't to blame for the issues.

      Of course, that doesn't make this article NOT a flaming pile of rhetoric; it just makes it slightly less than complete and utter bullshit.
      • Re: (Score:3, Interesting)

        by timmarhy ( 659436 )
        It's only just better than total bullshit, though. By his own benchmarks, Linux is a piece of crap for dropping features, and we should bash it for each and every poorly coded OSS project (believe me, there are a LOT).
      • by techno-vampire ( 666512 ) on Monday May 05, 2008 @09:53PM (#23307348) Homepage
        I have no idea what he's on about with the hard-pathed file references.


        This goes back even before Windows. I used to have some DOS programs that would only let you save a file to a floppy. Not just games: a poster-creating program had A:> built into the path for saving files, and there was no way to change it. Granted, even the most rabid Microsoft-basher can't blame that on them, but it's part of the way programs used to be written. It's the same mindset that caused game designers in the early DOS days to hardcode timing loops because, of course, the PC would always run at 4.77MHz.

      • by mdarksbane ( 587589 ) on Monday May 05, 2008 @10:31PM (#23307662)
        The difference is that Mac OS X was a cutoff line for all of that backwards compatible crap. A whole lot of stuff broke with Mac OS X, and it sucked for a while.

        Then everyone started using the new shiny APIs and things started getting a lot better.

        He's complaining that the Windows APIs are still hauling along cruft and junk and nastiness from 15 years ago. It hurts MS's ability to improve them, and it hurts developers' ability to use them, because they have to wade through pages of deprecated functions to find the correct ones, or hit strange inconsistencies that have been hanging around for years. It's also just bad for the general consistency of the experience - see the comment about system32 on 64-bit Windows.

        He knows that bad developers doing stupid things isn't Microsoft's fault. But how you react to bad developers is. It's a tough decision to make - do you slap the bad developers on the wrist and break things because *they* were doing something stupid, or do you keep letting them have their way until it's their decisions that rule the API and the platform?
  • by Anonymous Coward on Monday May 05, 2008 @08:31PM (#23306660)
    I am 'this' close to jumping ship. I use Ubuntu on machines at home and find it fast and clean, even on older hardware.

    I have access to all MS software as our MSDN and Gold Certified Partner plan administrator. I have tried Vista on a couple machines. Even on a brand new Dell dual core laptop with 2 gigs of ram, it was sluggish and still could not use the full aero interface. Yet I installed Ubuntu on a 4 year old 600m with 512MB ram and got a full interface with snappy performance.

    I don't need aero to develop code. The features I was most interested in all got cut from Vista... most notably the filesystem upgrades. Now add frequent updates to the framework that require $1200 software packages to use to the fullest extent. Then add the insane cost of a legit SQL Server license on which to deploy it. Plus as a domain admin, I find the administration to be a drag. And I still don't trust them for a second on security. It all adds up to a monumental drag.

    I am a frustrated .NET developer. I don't know that it is that much better on the other side of the fence, frankly, at least as far as the coding environments go. But I KNOW for a fact that I prefer Linux to Windows.
  • by Goody ( 23843 ) on Monday May 05, 2008 @08:32PM (#23306662) Journal
    It's quite nice if you want to use the same techniques you learned 15 years ago and not bother to change how you do, well, anything

    Apparently the author never heard of vi and gcc on Linux...
  • by morari ( 1080535 ) on Monday May 05, 2008 @08:34PM (#23306674) Journal
    How Microsoft Dropped the Ball With... Developers Developers Developers *insert techno beat* Developers Developers Developers Developers Developers Developers Developers Developers Developers Developers?
  • by crt ( 44106 ) on Monday May 05, 2008 @08:39PM (#23306700)
    Microsoft has dropped the ball in a number of areas, particularly with regard to the user-interface APIs this article mostly focuses on, but in other ways it is far and away the easiest platform to develop for - mainly because of the quality of their development tools. Having done lots of development across Windows, Mac, and Linux with all kinds of editors, IDEs, and debuggers, nothing comes close to Visual Studio in terms of functionality, quality, and just being solid. It's not perfect, but it's way better than anything else out there. For that reason alone Microsoft deserves some kudos from developers.
    • by nojomofo ( 123944 ) on Monday May 05, 2008 @09:20PM (#23307068) Homepage
      Holy shit, you've got to be kidding me! Even VS 2008 needs Resharper to even be close to Eclipse in functionality. Have you ever used a different IDE?
  • "one developer" (Score:5, Interesting)

    by Whitemice ( 139408 ) on Monday May 05, 2008 @08:39PM (#23306702) Homepage
    "how one developer migrated from Windows to OS X"

    That pretty much says it all: "one developer"

    The argument about old crufty code in Windows and the Win32 API has been around since... the Win16 API! It didn't really seem to slow Win32 down.

    On the flip side is the argument that the need for backwards compatibility is holding Windows back - yet developers complain about the migration from XP to Vista?

    It all smells like we-will-find-any-way-to-condemn-Windows to me. Note: I do all of my development on Linux, so I'm not a Windows booster. I think lots about Windows just stinks, but there is an issue of credibility here.

    If you want a clean, new, coherent API and you want to develop on Windows, Microsoft has provided an option: .NET
    • Re: (Score:3, Interesting)

      by Drishmung ( 458368 )
      From TFA, he claims that .NET is neither clean nor coherent, i.e., that MS squandered an opportunity.
  • by clintp ( 5169 ) on Monday May 05, 2008 @08:41PM (#23306714)
    The False God of Backward Compatibility has Microsoft by the short hairs. Even new programming environments like .Net have Win32, Win16, and DOS lurking right around the corner. There's no fresh start anywhere in the Microsoft environment, everything reeks of DOS.

    Which would have been fine if DOS (Win16, Win32, etc.) were a multi-platform, extensible OS to begin with -- but it wasn't. It was a quick hack that lives on and on.

    I'm a developer that works primarily in Windows, with 15 years of heavy-hitting Unix programming experience behind me.
    • by daemonenwind ( 178848 ) on Monday May 05, 2008 @09:34PM (#23307160)
      It's not a false god, not when you're coding for a business that has to meet both regulators and profitability expectations.

      Code written for Windows 95/NT (back in 1996) still works today on the Windows platform. 12 years later.

      Try that with System 7 code on OS X.

      Yes, this is part of why writing business-logic code sucks. You seldom get to just re-write anything to be really, truly good instead of something perennially built-upon and increasingly hacked-together. No one will pay for a change that doesn't deliver "business value". (And no, greater stability/performance is almost never enough, as that argument usually demands an associated headcount reduction) But at least the app still works and can continue to deliver. And since some will doubt, yes, I do maintain/enhance such code.

      The market speaks - this sort of backwards compatibility is a conscious choice by MS, and it does sell their OS. Not coincidentally, it also sells mainframes and *UX systems. And I'm convinced it's one of the big reasons Apple isn't bigger in the corporate world. Steve's demands for newer/better/faster totally supplanting the old are well known, and rightly feared.
  • Glory days are here (Score:5, Interesting)

    by icepick72 ( 834363 ) on Monday May 05, 2008 @08:48PM (#23306774)
    Despite what's underneath Windows, programming it through the .NET platform is very slick. Most of what classically had to be linked to in obscure ways is wrapped in the Framework Class Library. Most people complain that it's large, but once you learn the basic structure you can find what you need immediately using the documentation. Microsoft has also abstracted away the trickiness of DLLs, and you can program against almost any functionality using your language of choice [dotnetpowered.com].
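
    To illustrate the point about DLLs: where you once had to declare Win32 entry points yourself, the Framework Class Library usually has a wrapper already. A minimal sketch (needs a reference to System.Windows.Forms):

        using System;
        using System.Runtime.InteropServices;

        class FclVersusPInvoke
        {
            // The old way: bind to the raw Win32 export by hand.
            [DllImport("user32.dll", CharSet = CharSet.Unicode)]
            static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);

            static void Main()
            {
                MessageBox(IntPtr.Zero, "Hello from user32", "P/Invoke", 0);

                // The FCL wraps the same functionality:
                System.Windows.Forms.MessageBox.Show("Hello from the FCL", "Managed");
            }
        }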

    When articles claim Microsoft dropped the ball, I think it's more wishful thinking than anything, because Windows programmers are in their Enterprise glory days right now, no longer restricted to VB and half-assed object models. We now have full OO features and much, much more, and Java is playing catch-up feature-wise. It's nice for a change.

    I don't care how messy Microsoft's underlying code is, as long as they've tested it and ensured it works well enough for me to program against. The Microsoft security updates help a lot too. They're very frequent, which means there are a lot of security flaws, but they take care of them quickly (I'm sure I'll be given numerous examples where they didn't, but if you're on Windows Update you see fixes coming through all the time).

    • Re: (Score:3, Interesting)

      by ronark ( 803478 )
        Sorry, but I must disagree with you. I program daily with .NET at work, and the number of times a P/Invoke is required to get advanced functionality is simply shocking. Not to mention the fact that despite the claim of being a purely object-oriented framework, many parts of its design spit in the face of OO. I'm not talking about rarely used classes either. File, Directory, Math, Convert, and Encoding, to name a few major players, cannot be instantiated, as they are declared static. How this is different from a si
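
        The testability cost of those static classes comes down to roughly this sketch (IFileSystem and friends are made-up names of the kind people typically hand-roll):

            using System.IO;

            // Code written directly against the static File class cannot be tested
            // without touching the real file system, so the usual fix is to hand-roll
            // an instance wrapper and inject it.
            interface IFileSystem
            {
                string ReadAllText(string path);
            }

            class RealFileSystem : IFileSystem
            {
                public string ReadAllText(string path) { return File.ReadAllText(path); }
            }

            class ConfigLoader
            {
                private readonly IFileSystem fs;
                public ConfigLoader(IFileSystem fs) { this.fs = fs; }   // mock in tests
                public string Load(string path) { return fs.ReadAllText(path); }
            }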
      • Re: (Score:3, Insightful)

        by icepick72 ( 834363 )
        Because some classes are static or sealed does not mean the CLR doesn't support full OO features. I know you recognize that, but some people reading your post might misinterpret it, so I'm clarifying. What it means is that some class designers made arguably bad decisions about how to allow their classes to be used. Sometimes the class designer is Microsoft. But I'm there with you, buddy. If only I could count the number of times I've cursed a class designer because they didn't let me instantiate a class. In the cases
  • by TheNetAvenger ( 624455 ) on Monday May 05, 2008 @09:00PM (#23306898)
    No concept of what .NET really is, misleading users.

    No mention or acknowledgement of WPF/WCF, the new APIs that are set to replace Win32/Win64.

    Completely misleads users about API concepts and features of OS X compared to Windows. For example, XAML/XPS compared to Display PostScript is a massive difference in display technologies; these are part of the new Windows API sets and something Carbon or Cocoa cannot provide to developers. (Go to Channel 10 and watch the videos on why XAML/XPS was created and how it trumps every aspect of other display/print technologies - let alone how it is an integrated aspect of the video API system in Vista, making programming freakishly simple for advanced features and new UI platforms like 3D.)

    The author then jumps into UI consistency via dialog wording, and doesn't mention OS X's lack of keyboard support, its inconsistency of delete/backspace, or 100 other things more important than dialog wording - which is not inherently part of Win32 anyway.

    Author doesn't realize Microsoft and IBM wrote most of the GUI and UI guidelines that OS X even uses today.

    Office 2007 is a new direction in GUI paradigms, and it is WELL accepted in the business world. Not something to make fun of when OS X is still using old MENU (textual word list) concepts. Menus were a hack to make features available in a GUI context, but they are a throwback to non-graphical UIs. Vista and Office 2007 moving away from word lists (MENUS) is the right direction; too bad Apple isn't innovating on UI and just keeps throwing the same UI slop at users and telling them it is good. (And don't even mention multi-touch UI - go watch the freaking TED conferences Apple ripped the ideas off from several years ago, let alone the MS multi-touch work that also preceded the TED conference. MS Research has done and is doing more with UI than any other think tank in the world.)

    The author also totally ignores Adobe not providing any 64-bit support for OS X, because Apple dropped the ball on the Carbon 64-bit support it had promised forever. In contrast, 64-bit development on Windows, in both Win32/Win64 and .NET/WPF, is easy and transparent, with clear migration paths. (Let alone that OS X is still a hybrid 64-bit OS, using 32-bit code throughout, unlike Vista x64.)

    So for 'real developers' like Adobe, OS X is a failure with failed migration paths. Which means that if you want a 64-bit version of Adobe products, you will have to move to Windows for the performance and benefits. Oh, how brilliant Apple and OS X are...

    This brings up the horrid Carbon/Cocoa platforms and migration paths, and that's without even touching on the development tool contrast between the two platforms.

    I challenge Mr. Bright to a real debate on the topics covered, maybe he can try to justify some of his misleading and outrageous claims.
    • Re: (Score:3, Insightful)

      by Homburg ( 213427 )
      "Vista and Office 2007 moving away from word lists (MENUS) is the right direction."

      Quite right. The last thing I want in my word processor is words.

  • by buss_error ( 142273 ) on Monday May 05, 2008 @09:08PM (#23306968) Homepage Journal
    Back before my current gig, I was a software developer for companies that hired me to do their work and for several packages I wrote for my own profit. This story comes from the programs I developed for my own profit.

    Because the software I wrote was also licensed for source code if the user wanted it, I picked Visual Basic as the platform to use. I wanted to use Visual C, but you could more easily find programmers who could get by in Visual Basic than in VC. I should have picked VC rather than VB for a lot of reasons, the main one being that if you had experience in VC, you were at least likely not to be a total idiot. Not so with VB. I found that VB programmers were idiots at an approximate rate of 7:10, while VC programmers were idiots at an estimated rate of 1:10... which isn't to say that all VB programmers were idiots, only that they were cheaper labor, and therefore less likely to have a solid background in programming logic.

    That said, we'll focus only on my own development problems, just so we are dealing with only one (possible) idiot... me. I started out with VB 2.x. The upgrade to 3.x went fine, with very few problems. When 4.0 came out, I found I had to rewrite about 20% of my code. Sure, there were conversion programs, but they didn't quite fit in with exactly what I wanted the program to do. It'd get it about 90% right, but then I'd have to slog through the rest of the automated code to correct that last 10%. It was faster to discard that code and re-write it.

    Then 5.x came out. Only about 50% of my code still worked. And again, the automated process to "ease" the transition left something to be desired. When Visual Studio 6.0 came out, it was a nightmare: only 20% of the code ported. At that point, I sent the 5.x code out to all the people who had bought the program (with source or not), told them that the code was now moribund and I would not be maintaining it, and released the source code to the public domain (5 floppies included). As I recall, that was about 1998-1999 or so.

    As late as March 2008, I've been contacted about the code. Of course, it's morphed far past anything I'd written, and I could only help with the general business-case logic involved, not the actual code. But to make me deal once again with Microsoft development tools, one would have to offer me far, far more money than it would be worth. No, I'm done with Microsoft "development" games. I'm done with schoolyard bullies trying to take my lunch money. I'm done, PERIOD, with closed source, whenever I have a choice.
    • Re: (Score:3, Informative)

      by Gazzonyx ( 982402 )
      Just wondering... did your program interface with a database at all? You should see the regressions with DAO/ADO/ODBC/JET/etc. Now all you've got is ADO.NET, and from what I've seen the calls aren't the same as the calls for the pre-.NET APIs.
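
      For comparison, the basic ADO.NET shape looks like this (a minimal sketch; the connection string and table are hypothetical):

          using System.Data.SqlClient;

          class AdoNetSketch
          {
              static int CountCustomers()
              {
                  // Connection and command are IDisposable; 'using' releases them.
                  using (SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=Shop;Integrated Security=true"))
                  using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Customers", conn))
                  {
                      conn.Open();
                      return (int)cmd.ExecuteScalar();
                  }
              }
          }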
  • by Coryoth ( 254751 ) on Monday May 05, 2008 @09:20PM (#23307070) Homepage Journal
    One of the nice things from this article was actually this nice screenshot [arstechnica.com] of a selection of current versions of MS software running on Vista. The thing to notice is that not a single one of those applications has a GUI the same as any of the others. There are different toolkits, completely different look and feel, some have menus, some don't; it's a horrible, horrible mess. And yet despite that, we still get people complaining about GNOME vs. KDE and the clash of different toolkits and how that's what is holding Linux back. You can run GNOME and KDE apps side by side and, while they'll have differences, they'll sit together far more elegantly than the mishmash that is Windows. I think I'll have to bookmark that screenshot so I can bring it up the next time a Windows fanboy starts decrying the excessive number of GUI toolkits on Linux.
  • by abes ( 82351 ) on Monday May 05, 2008 @09:22PM (#23307090) Homepage
    I'm not big on the M$ love. I'm a Mac/Linux proponent. However, I think that M$'s current problem with a really horrible API (I'm saying this having programmed for Win32, GTK, Qt, WX, and Cocoa) isn't an easy problem to solve.

    They could pull an Apple and completely redo their windowing system. Apple benefited from using NeXT's system, which was well thought out, used a language well suited to windowing systems (Objective-C), and could be altered based on previous user experience.

    However, in doing so they would lose all the compatibility they currently have. Keeping compatibility, even if it creates a developer's nightmare, is in the end what keeps them on top of the market.

    That is not to say it's impossible for them to do so. Apple did provide a virtual machine to run old OS 9 software with the first releases of OS X. However, since both Mac and Linux machines offer the same option (I'm currently running Parallels on my machine), it would still take away the clear advantage M$ has in the market.

    It's not clear whether their bad API spells the eventual doom of the company. The more pragmatic developers will still value making products that more people can use over writing nice-looking code. Additionally, wrapper libraries such as wxWidgets or Qt can help hide much of the ugliness.
  • by Opportunist ( 166417 ) on Monday May 05, 2008 @09:44PM (#23307262)
    "When you use undocumented API calls you're in the wrong".

    So far, so right. In theory.

    In practice, though, let's look back at the world of Windows at its beginning. We're in the middle of the '90s, Win95 is fresh out the door, and you're supposed to write for it... er... well, if you can. There are a few documented API calls which will allow you to write a few cute Windows programs, but they will invariably be unable to compete with programs developed by MS. Why? Because you have no access to the same API calls: functions that make your programs faster, or easier to use, or that simply allow you to do something at all. Graphics and network code especially were notorious for being impossible to implement sensibly without resorting to functions that were available only if you dug through disassembled DLLs and guessed what was expected of you.

    So programmers faced a choice: either write programs that cannot compete with programs written by MS (or by companies that somehow got hold of that information), or use calls where a few parameters are described as "set to NULL" or "unknown function".

    MS has a history of releasing information about formats or calling parameters at a trickle, at best. Anyone who ever wanted to get hold of, say, the Office container format can vouch for that. It's not much better for API documentation: usually you get it for a lot of money, and only if you are deemed "worthy" first.

    Programmers don't let a company do that to them, though. They start figuring things out, reverse-engineering libraries and even existing program code to get the information they need. Of course, this results in the occasional mistake.

    Jump to the present. The companies that created software back then don't exist anymore, blown up in the dot-com bubble. Their software, though, still exists. And companies now rely on this software. So MS has to maintain those "buggy" APIs, or else companies running buggy software will refuse to upgrade.

    Who is to blame? Basically, whoever decided that it's a smart idea to withhold the API documentation.
  • Qt (& GNOME) (Score:5, Informative)

    by scorp1us ( 235526 ) on Tuesday May 06, 2008 @01:32AM (#23308784) Journal
    As someone who had to learn C++/CLI and writes code to allow legacy code to interop with C# at work, I have this to say.

    If you are going to learn a new platform for a "modern" app or OS, then let it be one that allows you to target more than one platform. Seriously. Let's take a look at .NET:
    - Everything in the library is new.
    - You can only officially target one platform. (Mono notwithstanding.)
    - You have to learn a new language to use it effectively.

    Now look at Qt:
    - New library
    - Builds with the same C++ compiler you've always used
    - No messy COM, COM wrappers needed for introspection
    - You can target any platform with a modern C++ compiler (VS6 and higher on win32, gcc on all platforms)
    - Ground up C++, clean consistent API.
    - Active development with binary compatibility within major releases.
    - Python and ECMA scripting bindings (some C# support too!)
    - Java version
    - Meta-object compiler adds introspection. (no need to deal with COM)
    - ActiveX interop in the commercial version (You can use Qt widgets in Winforms and vice-versa)

    I don't know as much about GNOME, but it shares a lot with Qt, so should not be excluded.

    About the only thing you miss out on is the automatic garbage collector. Qt emulates this to some degree by allowing every QObject to have a parent that cleans up its children. Then the only thing missing is the ability to defragment memory in the heap. I've only heard of that being caused by lots of small memory allocations, and Qt block-allocates, so this isn't a problem. Also, many types are implicitly shared, meaning they are more like handles to the underlying objects, so 1) they can cross thread boundaries and 2) they are shared references until they are modified.

    All in all, I see you only lose out on the memory defrag. But you don't need to learn C++/CLI or C#. (My opinion of C# is that if you're going to go that far, you might as well take the goals of the language to completion, in which case you end up with Python - and oh yeah, there is a Python wrapper for Qt too.)

  • I don't see it (Score:4, Insightful)

    by golodh ( 893453 ) on Tuesday May 06, 2008 @07:30AM (#23310298)
    Where exactly did Microsoft "drop the ball" with developers?

    According to the article there are three types of developers: (I) the ones who bang Excel macros and Access databases together with VB (not very many), (II) in-house developers for large companies who program in whatever language is in demand (the vast majority), and (III) craftsmen-programmers who look for clean orthogonal programming tools and also program in their spare time (a few).

    The article goes on to argue that Microsoft catered very well for categories (I) and (II), and not at all for category (III.)

    Since I believe that the programmers who make stand-alone third-party applications mostly belong to category (II), I absolutely don't see how or why Microsoft supposedly "dropped the ball" for any developers except category (III). The article points to the messy APIs of Win32 and the shadow they project onto the .NET framework. OK, fair enough, but who cares?

    Not the end-users, and not the managers. And they're the ones who determine where the money, and hence the bulk of the development effort, goes. That means that what end-users actually see and care about - their _applications_ - will continue to be in plentiful supply for MS Windows.

    Sorry, but the author will have to do a lot better to convince me that Microsoft shot itself in the foot as regards development effort. It's not the smartest thing that Microsoft could have done to alienate the craftsmen-programmers but I don't see how that puts a dent in their business.
