
Linus Torvalds: 'I Still Want the Desktop' 727

darthcamaro writes: Linux has clawed its way into lots of places these days. But at the LinuxCon conference in Chicago today, Linus Torvalds was asked where Linux should go next. Torvalds didn't hesitate with his reply. "I still want the desktop," Torvalds said, as the audience erupted into boisterous applause. Torvalds doesn't see the desktop as a kernel problem at this point, either, but rather one of infrastructure. While not ready to declare a "Year of the Linux Desktop," he still expects that to happen, one day.
This discussion has been archived. No new comments can be posted.


  • by WaywardGeek ( 1480513 ) on Wednesday August 20, 2014 @03:50PM (#47714827) Journal

    All Google has to do is dump that stupid steaming pile called ChromeOS, and admit that Android wins. A desktop-customized version of Android (complete with a real desktop) is still based on Linux (at least Google's fork of it), already has hundreds of thousands of apps, and could be better in nearly every way than Windows or Mac OS X in 2 years, IMO.

    The other broken OS, GNU/Linux, needs a major overhaul before it will ever be popular among anyone but geeks who are willing to accept that their OS is hostile to sharing new apps, or too blinded by fanboyism to notice. I write this from my Ubuntu laptop, where my code contributions are far lower than on Android or even Windows, even though I put in most of my effort here. It's just easier to publish an Android app. It's even easier to publish software for Windows. If Mark Shuttleworth were just a bit smarter, I think he'd realize he needs to abandon managing .deb packages and start this whole mess over based on a more git-like approach. He's done a lot in that direction (user PPAs, for example), but it's still not there. No RPM or .deb based Linux OS will ever become the basis for the Year of the Linux Desktop.

  • Oh it'll happen... (Score:4, Interesting)

    by MikeRT ( 947531 ) on Wednesday August 20, 2014 @03:55PM (#47714887)

    The day that the various desktop environments decide to cut out the middlemen. When I can go grab an official KDE install disk that gives me a polished KDE experience with the latest kernel and Wayland, that's the day Windows will start really hurting. Then I can say to my relatives "Linux? Just go get KDE" and there'll be no confusion anymore. If it's KDE compatible, it's KDE compatible. Load the binary, off you go. Just like OS X and Windows.

  • Re:Infrastructure? (Score:5, Interesting)

    by TWX ( 665546 ) on Wednesday August 20, 2014 @04:03PM (#47714969)
    Well, to an extent he's right; the kernel does what kernels do, and that is, talk to the hardware at the lowest level. It does that just fine.

    Unfortunately the stuff piled on top of it is either not keeping up with trends (X and the way modern video changes on the fly), or not really good at handling what a user would want automagically.

    I attempted to use the most integrated desktop with vanilla Ubuntu 14.04, but I found its window manager to be so restrictive as to be useless to me. It handled a lot automagically, but not what I wanted, and it was also very unclear how to go about getting to what I needed to change. It wasn't even intuitive how to bring up a terminal window, for example, which is basically the bulk of what I use Linux for.

    The lack of documentation is also hurting, badly. I'm working on building a multiseat box at home and LightDM was redone sometime between Ubuntu 12.04 and 14.04, and there wasn't any good support documentation explaining how the configuration files now work. I ended up switching to kdm even though I'm not using KDE, just so that I could configure a display manager that would actually work right.
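
    For reference, the 14.04-era LightDM configuration lives in /etc/lightdm/lightdm.conf (plus drop-in files under /etc/lightdm/lightdm.conf.d/). The sketch below shows only the general shape of such a file; the exact section and key names, especially for extra seats, changed between LightDM releases and should be checked against `man lightdm.conf` for the installed version rather than taken from here.

```ini
# /etc/lightdm/lightdm.conf -- sketch only; section and key names are
# from memory of the LightDM 1.x format, not verified against 14.04
[SeatDefaults]
greeter-session=unity-greeter
user-session=ubuntu

# One section per extra static seat on a multiseat box; the section
# naming convention ([Seat:1] vs. [Seat:seat1]) varies by release
[Seat:1]
xserver-layout=seat1
```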

    I think that the golden age of FOSS documentation is over. For a long time Linux and other FOSS docs were based on how commercial UNIX documentation was written, but slowly more and more developers aren't creating volumes of usage or configuration docs in the UNIX model anymore, and as fewer UNIX-era developers work on Linux and other FOSS, there are fewer people who remember how those documents were made and why. I think that is what will hurt FOSS the most: simply being unable to figure out how to do the things that one wants to do, because the docs don't exist.
  • by 91degrees ( 207121 ) on Wednesday August 20, 2014 @04:05PM (#47715001) Journal
    Successful desktop operating systems have been based on various kernels. Apple used a pretty crummy one before switching to a BSD derived one. The Atari ST and Commodore Amiga each used their own, and they had certain success in their niches.

    The problem is the GUI. People don't like X, and Linux people have no desire to give us anything else. Engineers and enthusiasts may well argue that it's better for various objective reasons, but the end user doesn't care. They use it and they think it sucks. Perhaps the problem is that it still pretty much needs the shell. Perhaps it's large, slow and clunky. Perhaps it's the poor support for games.

    Android doesn't have these problems because the developers didn't cripple themselves with X. TiVos and TomToms (before switching to Android) used Linux without X, and people were quite happy with them.

    Give us a nice, simple, standard GUI without a bazillion customisations, and with the ability to just install an app from the GUI and run it from the GUI, and Linux might actually work on the desktop.
  • by WaywardGeek ( 1480513 ) on Wednesday August 20, 2014 @04:08PM (#47715021) Journal

    It's GNU/Linux's fault. Android, still based on Linux, could likely win the desktop if Google got their act together and stopped pushing ChromeOS. Notice how my binary applications run on *very* many Android devices without recompilation, even when I write in C using the NDK. Notice how Android does not introduce bugs in my applications by swapping in a buggy shared library which I never tested. Notice how nearly impossible it is to publish a GNU/Linux app in comparison. In one case, you just publish your app to Google and wait a day or so. Notice how my app simply installs in a comparatively secure jailed directory rather than having to disperse crap all over the file system. For Linux, you need to write and test different and binary-incompatible installation packages for RedHat, Arch, Debian, Suse, then wait a few years for your package to be accepted and migrate from unstable to testing to stable, and even then you don't run everywhere.

    Just freaking stupid.... year of the GNU/Linux Desktop my butt!

    On a completely unrelated note, WTF is up with the new slashdot site? I had the newly dumbed-down ads disabled with a check-box. The check box is gone, and the ads are back, and dumber than ever! I miss the days of Barracuda ads that made sense on slashdot. The new ones aren't targeted at geeks at all.

  • by armanox ( 826486 ) on Wednesday August 20, 2014 @04:10PM (#47715053) Homepage Journal

    They used to have a link to an OpenSuSE live CD to do just that (well, with XFree86/X.Org; Wayland isn't a priority for KDE). It would appear that is no longer present on the site. Also, KDE doesn't really care to be Linux-only; they target UNIX-compatible systems (AIX, FreeBSD). GNOME, on the other hand, wants to be just Linux, and is largely in bed with the Fedora Project.

  • Re:Infrastructure? (Score:2, Interesting)

    by Zero__Kelvin ( 151819 ) on Wednesday August 20, 2014 @04:11PM (#47715059) Homepage
    No. He isn't saying that. Of course, a big reason he isn't saying that is because Linux is on the desktop, and has been for more than a decade. Linux has also been superior on the desktop for quite some time. I have two laptops. One dual boots to Win 7 and Mageia Linux. The other dual boots to Win 8 and Fedora Linux with Secure Boot / UEFI. I occasionally boot into Windows to apply updates, so that if I ever actually need Windows I won't have to wait an hour between clicking "Shut Down" and the computer actually turning off. I don't use Photoshop, so I haven't actually needed Windows in years.

    Several years ago I installed a new DVD Drive and k3b was crashing. I needed Windows then to see if the hardware was bad or if I had a driver issue. When Windows hung hard the minute I tried to use the drive, as opposed to Nero merely crashing, I knew I indeed had a bad DVD Drive. So yes, Windows has its use, but being productive in 2014 isn't one of them.

    People who purport to know about computers need to stop asking stupid questions like "When will Linux be ready for the desktop?", and start asking intelligent questions like "When will the general populace get a clue?"
  • Re:Infrastructure? (Score:5, Interesting)

    by SQLGuru ( 980662 ) on Wednesday August 20, 2014 @04:15PM (#47715093) Journal

    I think the main problem is that Linux is *TOO* configurable. "Normals" don't want hundreds of options. They want people to tell them which of a limited number of options will work for them.

    Which distro should I pick? Which window manager should I pick? How do I configure my computer to be optimal for *ME*? I'm a techie and I can't tell you which distro is really the best for most people. I can tell you which ones are more stable.....but it isn't just ONE.

    With Windows....and even Apple.....those choices are more or less made for you. All a "normal" needs to do is decide which apps they need to run and whether their OS supports those apps.

  • by Anonymous Coward on Wednesday August 20, 2014 @04:18PM (#47715131)

    That's because unless Linus reins in all the different variations of Linux distros into some kind of "one Linux", there never will be a Linux desktop.

    Hell there still is no such thing as a Linux Desktop. ChromeOS, FirefoxOS aren't desktops, and KDE/Gnome don't work with each other without both being installed.

    It's somewhat ironic that to get a Linux desktop, you need to install everyone's flavor-of-the-week libraries and frameworks, so you end up with a much more bloated mess than if you had simply developed the application for OS X (if you needed UNIX support) or Windows (if that's not important).

  • by sproketboy ( 608031 ) on Wednesday August 20, 2014 @04:24PM (#47715197)

    Microsoft probably has somewhere between 6 and 20 thousand engineers working on device drivers for the various Windows versions out there, making about 80k a pop. Sorry, but Linux simply does not have those kinds of resources. It would be nice, but I don't see it happening.

  • by bADlOGIN ( 133391 ) on Wednesday August 20, 2014 @05:19PM (#47715721) Homepage

    As a monopoly, Microsoft gets to hold the proverbial "gun" to device vendors' heads and say, "support our OS on our schedule exactly how we say, or we'll fucking destroy your market and feed you to your competitors". Thus, Windows drivers get support from device manufacturers. Linux device drivers come from begging, pleading, and sometimes reverse engineering and all-volunteer efforts of the open source community. Sometimes this happens despite hostile responses and legal threats from device vendors. My hope is that some day Linux will get to wield that gun...

  • by Iniamyen ( 2440798 ) on Wednesday August 20, 2014 @05:27PM (#47715789)
    This doesn't sound like a design decision, though. At least not directly. Still wondering what the parent is talking about.
  • Re:Infrastructure? (Score:5, Interesting)

    by 0123456 ( 636235 ) on Wednesday August 20, 2014 @06:19PM (#47716167)

    Superior by what definition? Stability? Sure, I'll give you that. Ease of use?

    1. Take a random Windows XP user.
    2. Sit them in front of two machines, one running Windows 8, one running Linux with MATE.
    3. Ask them to start a text editor on both machines.
    4. See which one takes longer, and results in more bitching and swearing.

    I mean, seriously, if I didn't know about Windows+R, I wouldn't have been able to start freaking Notepad on the Windows 8 machine I played with in a local computer store.

  • by Dimwit ( 36756 ) on Wednesday August 20, 2014 @06:29PM (#47716235)

    This is a much bigger deal than people seem to think. I tried getting my father set up on Linux not that long ago.

    "I need help, this says GNOME needs updating, I thought I was running Linux?"
    "You are, Linux is the kernel, but GNOME is the desktop environment."
    "Well, what's Debian? It says Debian needs updating."
    "You're running the Debian distribution of Linux."
    "I thought it was GNOME?"

  • by Pathway ( 2111 ) on Wednesday August 20, 2014 @06:52PM (#47716401)

    How Linux wins the Desktop

    1. We need a "Default". Not necessarily a default distro, but a set of standards that all distros can follow. Of course, other options will be allowed, even encouraged. Rationale: We need the "fragmentation" problem to be addressed, and I would suggest that a good start would be to have a standard interface that is common across all of "Linux".

    2. We need an easy way to manage a large group of computers. Large or small, businesses and schools want to make the configuration of their computers easy. Examples: Mass-deploy Chrome. Set up a lab of computers to use a single printer. Set up logins with permissions and shared home folders. Rationale: These features are easy to configure on Windows and Mac OS X, but not so easy on Linux.

    3. Easy Deployment. There needs to be a scriptable deployment that can mass install Linux onto multiple computers easily, including initial setup and joining of whatever management system is being used. While "image based" deployment can work, native installation deployment with configuration would be better. Rationale: If it is going to compete against Windows and Mac OS X, it has to be as easy to deploy.

    I'm sure there are some projects that already fill some of these needs... but it's not there yet.
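
    Point 3 above already has a partial answer in the Debian/Ubuntu world: the installer supports "preseeding" for unattended installs. The fragment below is a minimal, hypothetical sketch; every value (locale, mirror, package list, file name) is an example rather than anything from this post, and a real deployment would serve the generated file to the installer over HTTP or bake it into the install media.

```shell
#!/bin/sh
# Sketch only: generate a minimal Debian/Ubuntu preseed file for
# unattended installs. All values below are illustrative examples.
printf '%s\n' \
  'd-i debian-installer/locale string en_US.UTF-8' \
  'd-i mirror/http/hostname string deb.debian.org' \
  'd-i pkgsel/include string openssh-server puppet' \
  'd-i finish-install/reboot_in_progress note' > preseed.cfg

# Show how many preseed directives we generated
grep -c '^d-i ' preseed.cfg
```

    Pointing the installer at this file (e.g. with a `preseed/url=...` boot parameter) is what makes the install scriptable across many machines at once.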

  • Re:Too configurable? (Score:2, Interesting)

    by Anonymous Coward on Wednesday August 20, 2014 @07:37PM (#47716653)


    Installed Ubuntu 14.04 in a VM - largely pleased with the results, but still hit a few snags in a VM running under Fusion on a 27" iMac.

    Also installed Debian Wheezy (7.6) in another VM, also on Fusion, also on the same 27" iMac.

    I'm a software engineer for pay, and a Debian maintainer for kicks, fwiw - so I also have a decent idea of how this shit fits together. I still ran into issues and annoyances:

    Ubuntu - there's no easy and obvious way to set up an IPSec VPN out of the box. Why not include that with the default networking stack in a way that's easy for users to access? apt-get install l2tp-ipsec-vpn did the trick. Then, once connected to the work VPN, I was getting barked at because the system couldn't resolve hostnames for work properly. It turned out to be an avahi-daemon issue I needed to work around, because somebody at my company made the brilliant choice of naming everything with a .local domain name.

    Worked around that, and then noticed that the desktop wallpaper was behaving weirdly when I'd first boot and go into full-screen mode: the upper-left corner would display properly, while the remaining 3/4 of the screen would just end up black. Still not sure of the root cause on this one, but opening Settings and reapplying the wallpaper selection works as a workaround.

    Getting Chrome working in the dock turned out to be a much harder proposition than it should be, as well. I kept clicking my "locked to launcher" Chrome button only to have no browser window come up. Got that working with some trial and error.
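
    The .local collision with avahi has a commonly cited workaround: reorder the hosts line in /etc/nsswitch.conf so unicast DNS is consulted before mDNS. The sketch below operates on a throwaway demo copy rather than the real file; on an actual system you would make the same edit to /etc/nsswitch.conf itself, as root.

```shell
#!/bin/sh
# Demo file standing in for /etc/nsswitch.conf, seeded with (roughly)
# the stock Ubuntu hosts line. Sketch only -- edit the real file on a
# real system.
printf 'hosts: files mdns4_minimal [NOTFOUND=return] dns\n' > nsswitch.demo

# Put plain DNS ahead of mDNS so ".local" names defined in corporate
# DNS resolve instead of being swallowed by avahi's multicast lookup.
sed -i 's/^hosts:.*/hosts: files dns mdns4_minimal/' nsswitch.demo

grep '^hosts:' nsswitch.demo
```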

    Debian - desktop wouldn't even boot into 3d mode because of missing drivers. Desktop resizing issue? check. VPN missing? check. A host of other issues and command line fiddling ensued.

    The net result? Linux, when compared objectively to Mac or Windows, is *much harder* to "just use" on the desktop. And I love my Linux boxes, too. But let's not pretend that there aren't a significant number of issues to getting this working. My bet is that the company that wins the "Linux on the desktop" fight will end up being Ubuntu, because they're devoting so much energy and focus to it. But even still - they're not there yet. There's a lot you can do with Linux quickly on a desktop... but there's also a fair amount of fiddling required.

  • Ugh (Score:4, Interesting)

    by Greyfox ( 87712 ) on Wednesday August 20, 2014 @07:56PM (#47716765) Homepage Journal
    I just started maintaining an old Linux X11 app. A REALLY old app. Some of the function declarations still use K&R style. It's all Motif and Xt. Looking at it with an eye to modernizing it, well... I guess Qt won. Problem is, if I go Qt, I pretty much have to drink all the Qt kool-aid, since they seem to have tried to re-implement the entire C standard library under their API. Other than that, the field's pretty much right where I left it back in the mid '90s, the last time I really looked at X11 programming in a big way. Actually, back then GTK and gtkmm at least looked like promising competitors to Qt. Looking around at an even lower level, I can find a rant from Rasterman about imlib being faster than XRender, and pretty much everyone deciding that OpenGL was a better way to go than XRender anyway. That's pretty much everything, since 1995.

    I think if you want the desktop it's going to take another linux-kernel-level effort around the GUI. The question is do we keep trying to put more band-aids on X11 or do we design something from the ground up that everyone can agree on?

  • by NotSanguine ( 1917456 ) on Wednesday August 20, 2014 @08:07PM (#47716829) Journal

    Now you got Windows 8 because desktops aren't as important a market as mobile phones and tablets.

    Uhh, no. Don't run that garbage except for testing (and laughing at its craptastic-ness) on a VM. I'm sure that Satya Nadella does *all* his work on his Windows phone. Please.

    Is your data in the cloud yet?

    Uhh, no. Why do I want my private data hosted on "someone else's servers"? (That's the phrase you should substitute whenever anyone *ever* says "the cloud".)

    Is your email client a web app?

    No. And it won't be anytime soon. Why should I? Standalone mail clients have *enormously* richer feature sets.

    Still sure about the future of the desktop?

    Eventually, "the desktop" will be commodity monitors and user input devices into which you plug your mobile device, which contains all your data, applications and other stuff. As long as there are people who need to crunch numbers, write code, write prose, etc., there will always be a market for equipment that lets people use computing power in a stationary location. The equipment, software, form factors and input devices may change, but there will always be equipment that provides "desktop"-like functionality.

  • by serviscope_minor ( 664417 ) on Thursday August 21, 2014 @05:10AM (#47718783) Journal

    This means that you have to have code review from the Linux kernel team. And you have to divulge any amateur or buggy code embodied in the source. Which may compromise the imaginary advantage your marketdroids think they have on other platforms.

    God yes this. 1000 times this.

    One particular example I remember well was TV capture cards in the early/mid 2000s.

    Basically the chipset was the Brooktree BT878, which was actually pretty good despite being remarkably cheap. I ended up with a few capture cards that people gave to me because "they didn't work".

    That meant they didn't work on Windows. Every manufacturer wrote their own buggy, unstable, system crashy drivers and put effort into some god-awful shiny TV program which made heavy use of gradients and nonstandard TV controls.

    On Linux, they all. just. worked. There was one BT878 driver that was well written and well debugged and "shitty" capture cards that "didn't work" gave years of stable, flawless performance.

    The same thing cycled around with webcams. It was a wild west of chipsets. They'd all work after a fashion on Windows. On Linux, they either worked perfectly or not at all, due to lack of drivers. The ones that did work were invariably more stable and more featureful, because the driver would be written to expose the full functionality of the chipset.

    These days the situation is better on all platforms, since the standards people have realised that having a standard driver interface makes for a much better experience. xHCI means that any random USB host-controller chipset works. Same for Bluetooth now too. UVC means any camera works, and so on and so forth. It's like magic. You can buy a cheap-ass piece of crap from any random vendor and it will just work: no drivers, no hassle, on Windows, Linux and OSX.

    The thing is vendors are almost uniformly bad at writing drivers. On Linux this means they don't bother. On Windows the drivers are a pile of crap. Having centrally maintained drivers is in fact a large improvement on BOTH operating systems.
