
Torvalds Takes Issue With De Icaza's Linux Desktop Claims 616

An anonymous reader writes "Linux creator Linus Torvalds has poured scorn on claims made by the co-founder of the GNOME Desktop project, Miguel de Icaza, that he (Torvalds) was in any way to blame for the lack of development in Linux desktop initiatives. De Icaza wrote in his personal blog: 'Linus, despite being a low-level kernel guy, set the tone for our community years ago when he dismissed binary compatibility for device drivers. The kernel people might have some valid reasons for it, and might have forced the industry to play by their rules, but the Desktop people did not have the power that the kernel people did. But we did keep the attitude.'" Update: 09/02 18:39 GMT by U L : The original source of the comments (and an exciting flamewar between Free Software heavyweights).
This discussion has been archived. No new comments can be posted.


  • WTF. (Score:5, Informative)

    by eexaa ( 1252378 ) on Sunday September 02, 2012 @12:10PM (#41206537) Homepage

    I got linux on desktop.

    It works perfectly.

    Seriously, what's the problem? Just because ever-growing, bloated software megapackages like KDE and GNOME aren't as successful as they were meant to be, even on a platform that isn't meant to favor such big packages, Linux on the desktop is failing? Come on.

    • Re:WTF. (Score:5, Insightful)

      by vlm ( 69642 ) on Sunday September 02, 2012 @12:20PM (#41206623)

      I got linux on desktop.

      It works perfectly.

      Seriously, what's the problem?

      Agreed, "it" has worked properly for a long time. But someone else's "pet project" doesn't, so we have to hear endlessly about how "it" is broken.

      His hammer doesn't install drywall screws very well, therefore we are all supposed to be in a tizzy that the world is not ready for drywall.

      Bye bye gnome, bye bye kde, awesome / xfce / ratpoison are the way to go.

      • Re:WTF. (Score:4, Insightful)

        by MightyMartian ( 840721 ) on Sunday September 02, 2012 @12:34PM (#41206713) Journal

        De Icaza is a rat fink, period. He long ago used up any capital he had in the FOSS community with his dalliances with Microsoft. Frankly, if there was never another /. article involving anything that piece of crap had to say, we would still have about three dozen too many articles out there involving his weaselly mutterings.

        • Re:WTF. (Score:5, Informative)

          by Black Parrot ( 19622 ) on Sunday September 02, 2012 @01:02PM (#41206895)

          De Icaza is a rat fink, period. He long ago used up any capital he had in the FOSS community with his dalliances with Microsoft. Frankly, if there was never another /. article involving anything that piece of crap had to say, we would still have about three dozen too many articles out there involving his weaselly mutterings.

          His "let's make it like Windows!" attitude turned me off years ago. Now he sounds like a has-been, trying to get into the spotlight and blaming everyone else for his failures.

          • Re:WTF. (Score:5, Informative)

            by binarylarry ( 1338699 ) on Sunday September 02, 2012 @01:28PM (#41207109)

            The company that bought Novell completely threw his projects out during the take over.

            Can you imagine how little value Mono and his other projects must have if a holding company just wrote them off?

            • Re:WTF. (Score:5, Insightful)

              by whoever57 ( 658626 ) on Sunday September 02, 2012 @02:14PM (#41207527) Journal

              Can you imagine how little value Mono and his other projects must have if a holding company just wrote them off?

              They have provided excellent value ...... to Microsoft. Stymieing the development of Linux has been priceless to Microsoft, for the cost of refusing to hire him.

              • Re:WTF. (Score:5, Insightful)

                by dbIII ( 701233 ) on Sunday September 02, 2012 @08:24PM (#41209857)
                To be fair, it is actually useful for some commercial dotnet (fucking stupid name) software that has been tested against it and so runs on Linux. It means you don't have to hot-seat an expensive, single-user-at-a-time bit of software, and can just run it over X to wherever the user is sitting (VNC and Remote Desktop performance sucks for local access, and it's more mucking about for the user).
                It fills the same compatibility niche as WINE.
                We can criticise him for some things but mono provides a benefit. Forcing mono into distributions to support some flaky software is a different story that appears to be somebody else's fault - I don't think mono itself is the unstable part. We can't blame him for that any more than we can blame him for the nightmare of gconf on gnome which was somebody else's bit of abandonware.
                • Re:WTF. (Score:4, Insightful)

                  by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday September 03, 2012 @07:19AM (#41212235) Homepage Journal

                  We can criticise him for some things but mono provides a benefit. Forcing mono into distributions to support some flaky software is a different story that appears to be somebody else's fault - I don't think mono itself is the unstable part

                  GNOME pushed Mono because of Miguel and now lots of us have it stuffed into our distribution as its legacy. WINE hasn't been forced on anyone; indeed, my Ubuntu Precise x64 install refuses to install it at the same time as the LSB core package.

      • Bye bye gnome, bye bye kde, awesome / xfce / ratpoison are the way to go.

        Nooo! Put the bottle of ratpoison dow....oh...I see....and you say that's a window manager [wikipedia.org]?

        *shakes head*

        All these young'ns with their confusing softamaware names. Next I'll see a wm called "; rm -rf /" and by golly I'll probably try to apt-get install that sucker.

      • Re:WTF. (Score:4, Insightful)

        by taiwanjohn ( 103839 ) on Sunday September 02, 2012 @02:17PM (#41207553)

        DE designers should be retired, just like the guys who gave us the basic clutch+brake+accelerator layout of pedals in cars. The basic combo of windows, widgets, & menus has served us well enough for decades already. There is nothing "more intuitive" waiting to be discovered... at least not as long as we're still using keyboards and mice.

        FFS, quit mucking about with "innovation" on the desktop!!! (Remember KISS? "If it ain't broke, don't fix it"...?? Any of this ringing any bells?) If anything, you DE designers should be more concerned with convergence than differentiation. Every time you hear the screams of millions of users crying out against the latest "New-Paradigm"[tm] from MS or Apple, that should be your cue to GIVE PEOPLE WHAT THEY WANT -- ie: what they are USED TO -- not an "Even-Newer-Paradigm".

        If you've got time on your hands, and are looking for something to do, please spend it on improving your favorite apps. The UI does not need anything "new", nor do the users want anything new or unfamiliar. It's more than enough hassle to keep up with "innovations" in the app space... please don't make us learn new tricks in the WM too!

    • Re:WTF. (Score:5, Interesting)

      by future assassin ( 639396 ) on Sunday September 02, 2012 @12:45PM (#41206781)

      I got you beat. I have 4 laptops in my house, all using some version of Mint from 11 on. They all work just fine for the 4 of us, even with three of the people (wife/kids) being casual users. People not into computers couldn't care less about eye candy, considering most computer usage will be browsing the web or office work. So why do you need some complicated bloatware for opening programs or changing the desktop background?

    • by perpenso ( 1613749 ) on Sunday September 02, 2012 @01:16PM (#41207007)

      I got linux on desktop. It works perfectly. Seriously, what's the problem?

      Well, it is annoying to have to rebuild things when the kernel is updated; VMware comes to mind.

      These things add up and explain the many defections from desktop Linux to Mac OS X, as attested to by various long-term Linux users in yesterday's article on the subject. The short story is that many Linux users merely wanted a *nix environment; they were not into the politics or crusade. That is desktop Linux's problem: it's becoming a less interesting option for those who just want a *nix environment and don't want to join a social movement.

    • I kind of wish there were two paths to get Linux on the desktop. A cutting edge path, tailored to the distro, and a Standard Desktop for Linux (SDL) that was included with most distros (voluntarily) that was designed for minimal change over the years (bug fixes and minor improvements, but otherwise stay the same).

      During the install, you could just select the desktop environment you wanted.

      I think having a standard desktop that didn't change with every update or vary across distros would actually be business

    • Gremlins (Score:5, Insightful)

      by Unknown Lamer ( 78415 ) Works for Slashdot <clintonNO@SPAMunknownlamer.org> on Sunday September 02, 2012 @01:47PM (#41207269) Homepage Journal

      People like to pretend that Windows and OS X don't have their own unique problems... computing environments in general are still overly difficult to use and all have their own obnoxious quirks (given enough time and people think of them as features).

      My Grandmother runs KDE on Debian testing... she couldn't fix Windows when it broke, and at least Debian breaks less often... and the solitaire game is better, I hear. And when my cousins visit her I don't get the "the kids broke the computer with their stupid websites" calls any more ;)

    • Many years ago, I foolishly attempted to install the "Red Carpet" Gnome on my RedHat system. The install went very well, but the desktop was now so different that I wanted to revert. How to revert? Well, it seems that Ximian did not consider that possibility. I spent a frustrating few days removing many Ximian packages and then replacing them with RedHat packages (without yum and access to up2date, a much more difficult task than it would be now)

      I am sorry, but if you want people to try your stuff, you need to pr

    • Re:WTF. (Score:5, Informative)

      by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Sunday September 02, 2012 @02:32PM (#41207647) Homepage

      Seriously, what's the problem?

      Here is a nice and detailed List of Major Linux Problems [narod.ru].

      • Re:WTF. (Score:5, Insightful)

        by jedidiah ( 1196 ) on Sunday September 02, 2012 @04:21PM (#41208613) Homepage

        That page is hysterical nonsense.

        It conflates "some problems exist" with "nothing ever works for anyone". It also ignores that many of the same exact problems exist for Windows which is a monopoly product produced by a large company and supported by an entire industry of large companies.

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Sunday September 02, 2012 @02:44PM (#41207745)
      Comment removed based on user account deletion
    • Re:WTF. (Score:5, Insightful)

      by LordLimecat ( 1103839 ) on Monday September 03, 2012 @02:21AM (#41211229)

      He (de Icaza) hits "the problem" on the head with these choice quotes:

      Why bother setting up the audio?
      It will likely break again and will force me to go on a hunting expedition to find out more than I ever wanted to know about the new audio system and the drivers technology we are using. ....
      When faced with "this does not work", the community response was usually "you are doing it wrong".

      Anyone who has used a recent distro (last 5-6 years) has probably faced this at least once. It's lots of fun the first time, learning the ins and outs of Linux. Then 6 months pass, a new version of the distro rolls out, you upgrade... and your sound doesn't work again. Somehow, this time, the prospect of spending another Saturday learning about the new sound system is less exciting.

      Seriously, I wish I could blame this on Ubuntu and say "it's all your fault", but we now have how many init replacements? How many incompatible sound systems (OSSv4, OSSv5, ALSA, PulseAudio)? How many X11 backend doohickies?

      Choice is great, but not when things don't "just work", causing this reality:

      This killed the ecosystem for third party developers trying to target Linux on the desktop. You would try once, do your best effort to support the "top" distro or if you were feeling generous "the top three" distros. Only to find out that your software no longer worked six months later.

      When a developer doesn't support the new Windows, it's not a huge deal really. You delay your rollout for 6 months to a year till they get their act together, then you upgrade, and you're good for 2, 3, 4 or more years -- there will be MAYBE one additional Windows version in the intervening time, and it is highly likely your dev will continue to support you with updates throughout that time.

      With Linux, you get maybe 6 months of grace before the new version comes. Will your dev continue to support your version? Will he support the next one? Did he even decide to support RedHat/Fedora, or did they just go with Ubuntu?

      It's a fair bet that if they say "supports Linux" you'll get some kind of script that will probably work, but on occasion it just doesn't, leading to more fun chases figuring out what library is missing or what dependency is unfilled....

      I don't know what the solution is, and I don't really care who the fault lies with, but surely this is not how things should be. All the frills in the world on a composited desktop mean jack squat if your user has no sound and can't figure out why.

  • by rueger ( 210566 ) * on Sunday September 02, 2012 @12:18PM (#41206605) Homepage
    I'm using Mint Cinnamon, and am very happy with it. The "classic" desktop works fine - why the need to reinvent it?

    I had a Mac for several years, and didn't find OS X - much less the idiotic Dock - to be any more useful than plain old Windows XP. I ran Ubuntu until Unity, which simply didn't offer any real added utility, just more pointless doo-dads.

    The reason why so many people stick with XP, or Vista, or even Windows 2000 is because it just works. They understand it. They don't need added gobbledy-gook flying all over the screen, or the OS "hiding" stuff on the assumption that they don't need it.
    • by jones_supa ( 887896 ) on Sunday September 02, 2012 @12:42PM (#41206757)

      The reason why so many people stick with XP, or Vista, or even Windows 2000 is because it just works.

      BTW there is a Japanese guy who has made an improved version [msfn.org] of Windows 2000's KERNEL32.DLL, making it possible to run some software that would otherwise work on XP only.

  • I agree with Linus (Score:5, Insightful)

    by Omnifarious ( 11933 ) * <eric-slashNO@SPAMomnifarious.org> on Sunday September 02, 2012 @12:19PM (#41206617) Homepage Journal

    He's absolutely right. GNOME's compatibility breaking is all GNOME. It's not a cultural norm set by the kernel developers.

    Of course, it's much harder to define a good, stable API for upper layer stuff. It's closer to things that need to change frequently. Though X has done a remarkably good job of that.

    Maybe, if that's what GNOME wants, they should sit down and think really hard about how to do it. And ignore all the current 'hot' technologies and buzzwords. That's what led them to .NET and CORBA, and those were complete dead ends.

    Windows has, more or less, done it. I suspect though that it costs them a great deal. The Windows API has always been an insane mess, and IMHO a great source of the reason it was originally so very unstable.

  • by ibsteve2u ( 1184603 ) on Sunday September 02, 2012 @12:23PM (#41206637)
    FOSS ain't totalitarianism. The point, IMO, of open source is: do it the way you think is the best way. If enough people conclude you're right, your way is incorporated. If too few do, you reanalyze and improve (at least a couple of times) until your approach gains acceptance. All while keeping an eye out for parallel development efforts that look "smarter", "better", "more efficient", or what have you - and then incorporating those ideas if feasible, or abandoning your effort if the general direction you're going becomes a dead end/obsolete before acceptance.

    To summarize, when you have complete freedom failure is a decision you choose for yourself - it ain't somebody else's fault. It can be a community's "fault" if you feel you must attribute fault (we call those who attempt to lay blame and isolate all power to themselves "Republicans" in America, and must constantly duck their accusations that community involvement in any and all things is "mob rule"), but hey - that's democracy.
    • by fm6 ( 162816 ) on Sunday September 02, 2012 @12:58PM (#41206871) Homepage Journal

      Going back to De Icaza's original blog post, I don't see him playing a blame game. He's trying to understand why he can't seem to find an audio driver for his Linux box that doesn't break every time he does a major update. He thinks it's because of certain attitudes in the core Linux community that are driven by Thorvalds's personality. I find his argument pretty dubious, but is he saying it's all Thorvalds's fault? I don't see it.

      The blame game started when the story spread beyond De Icaza's post. You can see it in the headline for this story. The problem is, the hacker community is very big on finding a Good Guy and a Bad Guy, I see this over and over again on Slashdot. Really, we all need to forget all those stupid TV shows we spent too much time watching as we were growing up.

      • by fikx ( 704101 ) on Sunday September 02, 2012 @02:51PM (#41207819) Journal
        "Thorvalds" ... why do I now have an image of Linus holding a hammer , wearing a cape with a penguin on his shoulder stuck in my head?
  • by 0-9a-zA-Z_.+!*'()123 ( 266827 ) on Sunday September 02, 2012 @12:25PM (#41206655) Homepage Journal

    the 'failure' of the linux desktop is basically applications. libreoffice and linux gaming initiatives are the way to win that battle. making a prettier desktop is not.

    • I've been using Linux on the desktop for 11 years. Dual boot just so I can play games once a year. The 'failure' of the Linux desktop is that trivial, stupid things like... sound, sometimes are not working OR are not working properly.
      Besides sound, some (other) desktop components are not working properly. And in every version there's a different problem. I'm not even gonna start talking about the consequences of upgrades (if it works for you, congratulations. Read below. It's not working for me and if you type in google

    • the 'failure' of the linux desktop is basically applications. libreoffice and linux gaming initiatives are the way to win that battle. making a prettier desktop is not.

      Why does it need to win anything? For me, in my home at least, linux desktop won less than a year ago. That's the time I reformatted my last remaining dual-boot Windows/Ubuntu laptop with only Ubuntu (I really didn't need Windows anymore, it's not that I'm some kind of linux evangelist). Now, I have no Windows machine/software in sight.

      I'm perfectly satisfied with Ubuntu/linux desktop as it is. And no, I'm not even one of the "kernel people" Icaza keeps referring to (by the way, I love the imagery of tiny l

  • by John Hasler ( 414242 ) on Sunday September 02, 2012 @12:32PM (#41206697) Homepage

    Thank you, Linus.

  • by Yahma ( 1004476 ) on Sunday September 02, 2012 @12:40PM (#41206735) Journal
    I'll probably get modded down for this, but here goes...

    I agree, at least partly, with De Icaza's assertion that ABI breakage (binary compatibility) in each kernel release is a problem for vendors, and likely helped push hardware vendors away from supporting Linux. While in an ideal world every vendor would release their drivers as open source, this is the real world. There are numerous reasons (legal and otherwise) why companies cannot or will not release their drivers as open source (e.g. Nvidia). With each new kernel release breaking binary compatibility with prior releases, companies are forced to release a new driver every time the kernel gets updated. This might not be a problem for a big company with resources such as Nvidia; however, for smaller companies, this is likely a big reason they do not support Linux in the first place.

    Case in point, Dell paid PowerVR to develop a Poulsbo graphics driver for their Dell Mini netbooks (which at the time were on Ubuntu 10.04). PowerVR developed the driver. As Ubuntu released newer versions, the driver stopped working due to the ABI breakage. Users were entirely dependent upon Dell to pay PowerVR to constantly update the driver for new Kernel releases, which they did not.

    This type of continual ABI breakage is not seen in either the Mac or the Windows world

    • by Microlith ( 54737 ) on Sunday September 02, 2012 @01:08PM (#41206935)

      Poulsbo was a disaster even on Windows thanks to Imagination Technologies.

      This type of continual ABI breakage is not seen in either the Mac or the Windows world

      They also aren't open source. That the kernel ABI doesn't remain constant is something that has held true for Linux since it was created.

      Imagination Technologies is a company that, IME, is very hostile to open source as a whole. If you are foolish enough to license their core without also getting the driver sources so you can rebuild as you see fit, then you deserve the misery you incur. Nokia got this right, acquiring the licenses that allowed things like this project [merproject.org] to continue supporting multiple devices with a PowerVR GPU almost 3 years after the release of the first.

      Intel seems to be slowly learning that lesson as their SoC designs are trending towards an internally developed GPU rather than PowerVR.

    • by unixisc ( 2429386 ) on Sunday September 02, 2012 @01:19PM (#41207033)

      This type of continual ABI breakage is not seen in either the Mac or the Windows world

      Nor is it seen in the BSD world, since they don't keep breaking ABI or API compatibility.

      What's worse is that every variable in the Linux subsystem is versioned, be it the library version, the compiler, the version of GTK or Qt, and so on. Trying to mix and match them would be a combinatorial nightmare - never mind that most Linux distros don't test all these combinations. In short, all this 'openness' just contributes to making a mess of things from a compatibility standpoint.

  • by Pecisk ( 688001 ) on Sunday September 02, 2012 @01:04PM (#41206901)

    I really don't like it when people try to spice up their articles or blog posts with sensationalist claims (Slashdot mods, you are as guilty as Miguel is).

    First of all, the Linux desktop isn't dead. Millions of people use it. OK, we are definitely smaller than Windows (can't be sure about OS X). I personally don't see that as a problem, as long as developers keep the fire of competition alive.

    What Miguel probably wanted to bring up is a regular point of criticism: the instability of the Linux/free desktop APIs (window environment, sound, graphics). While there have been some small fallouts about this in the open source world, in a nutshell the open source desktop guys *care* about backward compatibility. And the many commercial apps which can easily be run on various environments and distributions (most of them even provide packages in mainstream formats like deb and rpm) indicate that it is not that hard.

    As always, yes, there are hardware driver bugs (Windows isn't free from these either, and it has official vendor support), and there is some competition between desktop environments (but let's be honest, in general that's not a big problem). The problem for small software vendors is that they mostly can't compete with free - we don't need five different file compression applications; we usually have one general-purpose one per environment. The problem for big vendors - well, the market simply isn't big enough (for Adobe, for example).

  • Actual discussion (Score:5, Informative)

    by TyFoN ( 12980 ) on Sunday September 02, 2012 @01:08PM (#41206931)

    Here is the actual discussion on G+ [google.com] instead of an article that just quotes everything they say.

  • by bjourne ( 1034822 ) on Sunday September 02, 2012 @01:19PM (#41207039) Homepage Journal

    Is C. GNOME is still 98% built using C, which is crazy in this day and age. And not modern, pretty nice C99, but ancient C89, because the latest GNOME has to compile on some 20-year-old Solaris workstation, otherwise Sun won't support the project. Now Sun is gone and Oracle doesn't give a shit. Novell has given up on using GNOME as a way to push Mono, and only Red Hat remains. Maybe stuff will change now, because previously GNOME has been incredibly resistant to change that is not initiated from within one of those three companies.

    I want to see more changes in GNOME, not fewer. And I want them to finally realize that they are spending 10x as much effort writing GUI components in C as they would in C#, Java or any other managed language.

    • Funny you should mention that, as the GNOME foundation actually has a (modernish) language that can be used to write GNOME programs:

      Vala [gnome.org].

      It compiles to C with all the appropriate boilerplate for Gnome's libraries and introspection files to allow calling from python / java etc.

      Shame very few of the core GNOME devs want to use it, though. I wrote some bindings for Rhythmbox in it that would have allowed the devs to write parts of Rhythmbox in Vala - but they are too invested in C and only wanted to use it for a p

    • by Skapare ( 16644 ) on Sunday September 02, 2012 @01:35PM (#41207173) Homepage

      10x is an extreme overestimate. The actual figure depends on the skill of the programmer. C can in fact increase the programming time, but that increase also buys thinking about how to make the solution work. Skilled programmers who understand what is going on can get that ratio down near 1x. Sadly, many projects just don't have the skilled programmers available, would simply never succeed with C, and must use something cool like Python or Ruby. And I have seen programmers out there working on open source projects who would not be able to get even Hello World working reliably on their own in their preferred language. And too many projects these days end up as "Frankenprojects" which are not much more than a bunch of other things all bolted together. Where's the KISS principle when you need it? It seems C is holding it hostage.

      • by gbjbaanb ( 229885 ) on Sunday September 02, 2012 @02:02PM (#41207405)

        It's true that the cry of "write in a managed language and all your developer productivity problems disappear" is bullshit.

        The problem with developer productivity is documentation - once you learn how to do something, it doesn't really matter if the boilerplate that makes up 80% of your GUI app is C, Java or Python.

        However, even if you don't accept that, you must realise that if writing code in C is "slow" and a higher-level language is faster, then writing in a scripting language is going to be faster still (and perf isn't that big a deal for LoB apps; just look at the perf problems with WPF to see that it isn't a big deal for nearly everyone).

        So... why not write your GUI code in JavaScript using Qt Quick? Anyone demanding Java or C# should know that JS is going to be even faster to write, and that if that's their argument, they need to upgrade past a mediocre managed language for Qt (which has perf covered too, as you can write as much as you like in C++).

        Java or C# indeed: neither as fast as C/C++, nor as productive as script. No reason to accept either of these compromises :)

  • by Skapare ( 16644 ) on Sunday September 02, 2012 @01:25PM (#41207093) Homepage

    ... mess that computers, particularly PCs, are in, blame the peripheral industry. Some of this blame also belongs to Microsoft, for making it easy in DOS and the BIOS for peripheral makers to effectively add drivers. But this is a very small share of the blame, because the full scope of what we could have had was not even envisioned. Flexibility was needed for new kinds of devices and peripherals. But the peripheral industry abused it by making new devices of the same class operate differently, in too many cases. Access to floppies and IDE hard drives escaped much of this just because those were boot devices, and adding BIOS drivers increased the price. The peripheral makers could not even establish compatibility standards within their own product lines. Many new models of a device simply failed to be compatible with the previous interface (and driver), even if all you wanted was to do the same old things the previous model did. This was not just a case of manufacturers trying to protect some kind of intellectual property or lock people in to their own products.

    What was needed was a generalized model of how a CPU based host would access peripherals. A message based model would still have provided plenty of flexibility to expand the capabilities of new devices, as well as the ability to move more device drivers into user space, outside of the kernel. Ideally, all that was needed was one message bus controller interface design, and one driver to operate it to send and receive messages and status reports. Beyond that a ring of trusted device driver processes could be used. Combined with some community and market pressure to maintain compatibility over short time frames (about 8 to 10 years), devices could easily be interchangeable with minimal driver changing.

    Then every once in a while, a class of device would have its standard message interface/protocol upgraded to a new version, and it would be expected that all new devices would adopt that. And this could still be done with full compatibility with the previous version via a version code in the basic standard message header. The new version would include a standard way to access features that were generally available now and had been implemented via extensions in the previous message protocol version.

    Linus is not to blame. He just gets blamed sometimes because his vision of making the Linux kernel more usable for everyone sometimes means others might have to do a little more work to keep up (any vision would, but his is the one we see).

  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Sunday September 02, 2012 @02:00PM (#41207385)
    Comment removed based on user account deletion
  • by SEE ( 7681 ) on Sunday September 02, 2012 @02:59PM (#41207911) Homepage

    Starting with your decision in 1997 to abandon what was the GNU project's official GUI toolkit in favor of GTK.

    If you'd stuck with GNUStep, the discipline of compatibility with a written spec (OpenStep) and the pressure for compatibility with a living rival implementation (OPENSTEP, then Mac OS X) would have avoided the "blow everything up and restart" problem. And you wouldn't have spent any time on CORBA if you already had PDO baked-in.

    And it would have been actually following the kernel approach. Whatever the kernel might do with its internal structure, in its external interfaces it's been stable. Further, that external interface has been a re-implementation and extension of an existing good-enough interface (Unix/POSIX/SysV), rather than running off and implementing its own ideal of how an OS should work.

  • 10 years ago (Score:5, Insightful)

    by hduff ( 570443 ) <hoytduff@[ ]il.com ['gma' in gap]> on Sunday September 02, 2012 @05:26PM (#41209009) Homepage Journal

    Ten years ago, in an editorial in LinuxFormat, I called Miguel de Icaza a "sell-out" and have yet to be proved wrong. His Quisling-esque career would be consigned to the /dev/null of Linux history except for all the damage he has done. Now he serves as a cautionary tale.
