Leo Laporte On UNIX As the Future

TractorJector writes "In a well-written interview with Mad Penguin, techmeister Leo Laporte (formerly of G4/TechTV fame) discusses his vision of the future of proprietary and open platforms: 'I think there's a lot of hope for Linux, although I don't think that Linux is the answer. I think that UNIX is the answer, in some form or fashion. It might be BSD, it might be Linux, it might be some third thing. But UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail.'"
  • by AKAImBatman ( 238306 ) * <akaimbatman@gmaYEATSil.com minus poet> on Thursday July 28, 2005 @10:18AM (#13185325) Homepage Journal
    Unix is very flexible, and it will certainly outlive Windows. However, its development [blogspot.com] will only take it through the near future. In the long term, the very idea of unmanaged code will disappear, as will the traditional concept of the Desktop.

    My predictions are:

    1. Desktops will be replaced with Browser simulations of a Desktop that can work anytime, anywhere.

    2. The traditional PC will then be replaced by a home server through which all activity will happen.

    3. Components for Music, Television, Desktop, and Video Game consoles will (in many cases wirelessly) interact with this server.

    4. The server itself will run an OS based on a managed code environment, making remote attacks difficult if not impossible. (Many Unix concepts would probably be reused in this system, but it won't *be* Unix.)

    That's my thoughts anyway. Sometime in the near future, I'll get them blogged down in detail. :-)
    • The server itself will run an OS based on a managed code environment, making remote attacks difficult if not impossible.

      Unless of course the server runs an MS OS, in which case it will actually open you up to attacks from all of the different types of devices interacting. ;)

      But seriously, I'm not sure this managed code model is the answer. At the very least it needs to be designed very well. I could see it being very restrictive for a lot of legitimate uses. And I think having a model where the OS has
      • And I think having a model where the OS has to approve code before it runs opens the door to monopoly leveraging, unfair treatment, unauthorized runtime limitations, and a whole host of other undesired behavior.

        That's not what managed code is. Managed code means systems like Java, .NET, and LISP that eliminate direct hardware access, thus preventing bugs like buffer overflows. Java is a particularly good example, because it has a very flexible built-in security system that could be leveraged to ensure that a given program ONLY has access to the resources it was given at install time. :-)
        • Ah, my bad. It seems I was confusing managed code with Palladium, TCPA, etc.
        • The problem is, not everything can be done with managed code. There will always be a need to get down to "bare metal" coding to get some things done.

          For example: Has anyone ever tried printing actual TEXT to a printer (not an image created from text input) on the .NET platform? I have, and lemme tell you, managed code won't do it unless you consider creating managed libraries utilizing unmanaged code to be still in the realm of "managed code".

        • LISP that eliminate direct hardware access

          In what way does LISP eliminate hardware access? Not in a "LISP machine", surely? In any case, its two fundamental instructions "CAR" and "CDR" stand for "Contents of the Address part of Register" and "Contents of the Decrement part of Register": fields of actual hardware registers on the IBM 704, the machine LISP was first implemented on in the late 1950s!

          My other CAR is a CDR is fine on your bumper, but don't try to execute it!

          • by Bob Uhl ( 30977 ) on Thursday July 28, 2005 @12:10PM (#13186540)
            I think what he was probably thinking of was the fact that most modern languages prevent buffer overflows and the rest. Lisp actually partakes of both natures, though: by default the language is safe but not as fast as it could be; you can tweak it to make speed more important, and you can even tweak it to make safety less important. It's kinda cool, actually.

            I urge anyone who's not read it to take a look at Practical Common Lisp [gigamonkeys.com], which is an excellent introduction to an excellent language.
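
            As a point of reference, here is a minimal, hypothetical C sketch (not taken from any real program) of the classic stack buffer overflow that bounds-checked, managed runtimes rule out:

            /* hypothetical example: an unchecked copy into a fixed-size buffer */
            #include <stdio.h>
            #include <string.h>

            static void greet(const char *name)
            {
                char buf[16];
                strcpy(buf, name);   /* no length check: a long name writes past buf */
                printf("hello, %s\n", buf);
            }

            int main(int argc, char **argv)
            {
                greet(argc > 1 ? argv[1] : "world");
                return 0;
            }

            Pass a name longer than 15 bytes and the copy silently overwrites adjacent stack memory; in Java or .NET the same oversized copy throws an exception at the array boundary instead.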

    • Interesting commentary. One thing though:
      > Components for Music, Television, Desktop, and Video Game consoles will (in many cases wirelessly) interact with this server.

      This sounds a lot like Network Stations that were tried around eight years ago. They were touted as the next big thing. The idea, though sound, just didn't take off as some thought it would. Perhaps the Network Stations were ahead of their time (i.e. like OS/2)?
      • Perhaps the Network Stations were ahead of their time (i.e. like OS/2)?

        That was certainly part of the problem. But as an admin who ran Citrix, I can tell you that the other half of the problem was Microsoft. After Citrix gained some initial momentum from their NT 3.51 product, Microsoft took notice and refused to license 4.0. Instead, Microsoft worked out a technology transfer deal where they would produce NT Terminal Server. Citrix was "allowed" to install their superior ICA protocol on top.

        The result was that you had the initial price of Terminal Server, plus the price of each "Seat" (which was priced per named user, not per concurrent connection like Citrix), then the price of a full copy of Windows NT for each thin client that would access the system. If you wanted Citrix ICA, you then had to pay Citrix even more.

        The result was that Thin Clients ended up costing *more* than a set of PCs, effectively killing the market.

        Fast forward to today, and we find that Windows now has the RDP client integrated and that Sun has been having reasonable success with their SunRay product. People are starting to become conditioned to the idea of thin clients. Wait a few more years for the WebApp revolution to shift all power away from Windows and the time will be perfect to wrest the market away. ;-)
        • Wait a few more years for the WebApp revolution to shift all power away from windows...

          Ain't gonna happen. Or rather, there are still large sets of problems that need robust applications running locally. I'm supposed to upload my 400MB Photoshop image to a remote server and work on it there?

          The fact is that we had "dumb" terminals before, which gave way to smart terminals, which gave way to PCs running applications and client/server applications.

          Why the change? Because the user experience is several

    • Yeah, I remember hearing years and years ago that in the next few years, all PCs will be nothing more than Java runtime environments, and you'll rent your applications over the Internet from providers.

      Guess what? It didn't happen.

      What you describe in your post would take a substantial amount of work from many companies (not to mention a very slow migration process of the end users to completely shift paradigms). Companies doing this will likely do it incrementally if they do it at all (because software coma
      • I agree.

        The main problem with the thin client solution has always been that if the server goes down, everybody goes dead. And the server ALWAYS goes down. This is unacceptable to any right-thinking CIO - and even more so to the people who are actually doing the work.

        Of course, if you have proper system design, with failover and redundancy, this is less of an issue.

        According to recent trade media reports, thin clients are now on the upswing simply because of Windows - no need to patch ten thousand thin clien
        • by acvh ( 120205 )
          Thin clients have been well received by our corporate IT due to the ease with which applications can be updated, users can be given remote access, and local support people can be terminated.

          Users hate them because there are weird sync issues, files change or disappear at random intervals, they can't listen to music via their "pc".

          There is no one "server" that can go down to screw everyone up. A farm of three or more machines is standard practice here. Thin clients are NOT cheaper than PCs, until you factor
    • 2. The traditional PC will then be replaced by a home server through which all activity will happen.

      This is what I've also predicted. Here are my thoughts:

      A typical family might have two or three computers and a PVR or two. If the hard drives on all of these devices were aggregated into a single, logical server, then there would be benefits in terms of utilization, redundancy and speed - panacea. If we tie everything together with GigE, then we can PXE/network boot the PCs and PVRs with any operating sys
    • This just proves that time IS, in fact, cyclical! Consider trends in fashion: the 70s came back, the 80s came back, the 90s are coming back... Remakes of movies and music, too... The same is true with computers! Remember how we used to have these big centralized machines that occupied cabinets or frames in rooms? And people used these things called 'Terminals' to interact with the 'MainFrame'... Now we call it a Server, and the Terminals "Thin Clients", but it's the same thing! If this was such a good

      • Of course, we abandoned it for exactly the same reasons we'll abandon it AGAIN in the future:

        Because the PTB that ran the mainframe were incompetent assholes who couldn't support our computing needs properly from a centralized position.

        And Microsoft is EXACTLY the worst ITS department anybody could have. And Sun and the UNIX vendors aren't far behind.

        So, yes, most people will use thin clients and complain about network response time - just like every terminal user used to do on an overloaded mainframe.

        The
        • Comment removed based on user account deletion
        • Because the PTB that ran the mainframe were incompetent assholes who couldn't support our computing needs properly from a centralized position.

          I think the problem was an ivory tower one - the IT group's goals were not aligned with the business units.

          I've seen this pattern re-emerging with the re-discovery of shared services in many companies. Here is how the cycle goes:

          1. Start with lots of departments running their own mini-data-centers, help desks, etc.
          2. Somebody comes to the realization that by centr
    • Very interesting, very insightful comments...unfortunately we've been hearing the exact same thing for the past 20 years. You're basically regurgitating what the pundits and "experts" have been saying forever. We've yet to see any of these things....

      That's my thoughts anyway. Sometime in the near future, I'll get them blogged down in detail. :-)

      No, these are not your thoughts. These are the ideas from about a 1000 different people over the years that have been saying the same things. It's bad when you have
      • You're basically regurgitating what the pundits and "experts" have been saying forever.

        Yes and no. Anything I say as a tech professional will ALWAYS be standing on the shoulders of giants. There's simply no way around that. However, these "experts" you're referring to have always been insensitive to the timing, and have offered no solid solutions to solving problems. While I'm making an abstract prediction now, I fully plan to make a solid prediction in the near future. :-)

        We've yet to see any of these things....

        Not true. It is becoming quite popular to purchase a computer with a Video Capture Card, use an LCD TV as the monitor/television, hook your computer up to your Dolby 5.1 speakers/stereo, download music and videos from the 'net, and use applications via WebApps. I'd say it's staring us right in the face.
    • Remember that buffer overflows are not the only security vulnerability. Currently they happen to be a large one, but there were many exploits that were possible simply because of bad logic.

      I remember an IIS flaw that was exploited because the server decoded a URL, checked to see if it was valid (i.e. not pointing to some arbitrary thing outside the document root), then before opening the file, decoded it AGAIN! This second decode was done without a second check, meaning that a URL that decoded twice into so
      • I remember an IIS flaw that was exploited because the server decoded a URL, checked to see if it was valid (i.e. not pointing to some arbitrary thing outside the document root), then before opening the file, decoded it AGAIN! This second decode was done without a second check, meaning that a URL that decoded twice into something harmful passed through.

        This is where Java's security model would have gotten in the way. When the file open request was received, it would have said "You don't have access to these
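
        To make the bug above concrete, here is a hypothetical C sketch of the decode-then-check-then-decode-again mistake (a simplified percent-decoder, not the actual IIS code):

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* decode "%XX" escapes; returns a newly allocated string */
        static char *url_decode(const char *s)
        {
            char *out = malloc(strlen(s) + 1), *o = out;
            while (*s) {
                if (s[0] == '%' && s[1] && s[2]) {
                    char hex[3] = { s[1], s[2], 0 };
                    *o++ = (char)strtol(hex, NULL, 16);
                    s += 3;
                } else {
                    *o++ = *s++;
                }
            }
            *o = '\0';
            return out;
        }

        int main(void)
        {
            const char *request = "/scripts/..%255c..%255cwinnt/system32/cmd.exe";

            char *once  = url_decode(request);  /* "%255c" becomes "%5c" */
            /* the validity check happens here: "once" contains no backslash,
               so the "..%5c" is not recognised as directory traversal */
            char *twice = url_decode(once);     /* "%5c" becomes "\" */
            /* the file is then opened using "twice", which escapes the web root */
            printf("checked: %s\nopened:  %s\n", once, twice);

            free(once);
            free(twice);
            return 0;
        }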
    • Consumer technology follows psychological factors, not engineers' logic. Hence we have iPods dominating the mp3 player market. It will also just be easier to stick in a well-known un*x and let managed code run on top of it. Not really secure, not the best way to do it, but it's easy and time-saving, and pointy-haired managers will like it better than developing an in-house solution.
    • Very insightful, if only I had mod points. I agree. The computer of the future will be sitting next to the furnace and water softener (hopefully on risers), and LCDs will connect to them for functionality.

      How convenient it would be to connect my LCD to an ethernet port in the wall and have full access to the services of the main server in my basement (which would include virtualization capabilities, if I were in charge).

      My house currently has 7 computers in different parts of the house used for different
  • I agree (Score:3, Interesting)

    by MacFury ( 659201 ) <me@NOsPaM.johnkramlich.com> on Thursday July 28, 2005 @10:20AM (#13185346) Homepage
    MacOS X and operating systems that can marry the power of a good command line with the ease of an excellent GUI shall inherit the earth. I'm interested in how the new windows command line stacks up.
    • Re:I agree (Score:2, Informative)

      by MrShaggy ( 683273 )
      HMmmmmmmmmmm

      AMIGA Anyone ??

      >> MacOS X and operating systems that can marry the power of a good command line with the ease of an excellent GUI shall inherit the earth
    • Re:I agree (Score:2, Funny)

      by mnemonic_ ( 164550 )
      I wonder why MS is working on a new command line at all. Do people buy Xserves so that they can use the OS X command line? Do people run linux because they love staring at those grey characters on a black screen? No one really likes the command line... plenty of people get by with it, but it's obviously the most primitive computer interface. So why is Microsoft developing it? Do they really believe that *NIX users like their OS because of the command line?
      • Re:I agree (Score:2, Insightful)

        by Virak ( 897071 )

        Do people run linux because they love staring at those grey characters on a black screen? No one really likes the command line... plenty of people get by with it, but it's obviously the most primitive computer interface.

        Yes, some people (myself included) actually do like the command line. And precisely because it's one of the most primitive interfaces, it's more reliable than a GUI, uses less memory, and for many operations is many times faster. Until we can control our computers by thought, th

      • Re:I agree (Score:5, Informative)

        by kesuki ( 321456 ) on Thursday July 28, 2005 @10:45AM (#13185652) Journal
        Do people buy Xserves so that they can use the OS X command line?

        Yes, powerful command lines are more than 'just' for end use; they open up the entire core functionality of the OS to non-interactive scripting. By having a powerful, flexible shell you can have powerful scripts that run fast, do everything you want, and can be quickly edited. They run as fast as compiled code, but since they're just a text file that gives commands to precompiled binaries, you can modify them much more easily than a full-fledged program.

        System administrators need a powerful command line interface, though standard 'unix' tools sometimes have areas that need improvement. For instance, chroot on BSD requires setting a shell variable to change the shell, while Linux's chroot accepts it on the command line but can't change the user or group(s) that you're chrooting them to. That means you can't create a chroot jail to disable (remote) root access on Linux (that allows remote logins)... but you can on FreeBSD/MacOSX
        • i meant a 'secure' chroot jail. sorry
          you can make an 'insecure' chroot jail on Linux that is vulnerable to buffer overflow bugs in the OS and whatnot... because the chroot jail still leaves you as root, even if you have no access to a shell or a directory tree. You're connected to the machine via a connection protocol that may have a remote vulnerability in it, as root. If chroot can switch you to user none, or guest, or something else locked down, then even if they exploit the jail, they still end up as a u
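
          A rough C sketch of the "secure jail" being described: chroot into the jail, then drop to an unprivileged user and group before running anything network-facing (the jail path, uid/gid and "/bin/service" binary are made-up values for illustration):

          #include <stdio.h>
          #include <stdlib.h>
          #include <unistd.h>

          int main(void)
          {
              if (chroot("/var/jail") != 0 || chdir("/") != 0) {
                  perror("chroot");
                  return EXIT_FAILURE;
              }
              /* drop group first, then user; after this there is no way back
                 to root even if the jailed code is later compromised */
              if (setgid(65534) != 0 || setuid(65534) != 0) {
                  perror("drop privileges");
                  return EXIT_FAILURE;
              }
              execl("/bin/service", "service", (char *)NULL);   /* path is relative to the jail */
              perror("execl");
              return EXIT_FAILURE;
          }

          The chroot(2) and setuid(2) system calls exist on Linux and the BSDs alike; the difference being complained about above is in the chroot command-line tool, not the system call.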
      • Re:I agree (Score:5, Informative)

        by Daniel Dvorkin ( 106857 ) * on Thursday July 28, 2005 @10:46AM (#13185664) Homepage Journal
        I wonder why MS is working on a new command line at all. Do people buy Xserves so that they can use the OS X command line?

        They buy Xserves so they have a choice -- use the nifty OS X Server GUI admin tools (which are really good, I have to say) if they fit the task, and use the command line if that fits the task. Choice is a Good Thing.

        Do people run linux because they love staring at those grey characters on a black screen?

        Very often, yes; (usually multicolored, these days) characters on a black (or whatever) screen may seem primitive to you, but to many people they represent an extraordinarily efficient way to get things done.

        No one really likes the command line...

        *falls over laughing*

        plenty of people get by with it, but it's obviously the most primitive computer interface.

        No, manually unplugging and plugging in vacuum tubes is the most primitive computer interface. It may not be obvious to you -- or to Neal Stephenson, for that matter -- but today's Unix shells represent an extraordinary level of abstraction from the underlying bare metal.

        So why is Microsoft developing it? Do they really believe that *NIX users like their OS because of the command line?

        In a word: yes.

        Look, not everything is best done on the command line. GUIs are wonderful things, if they're done right. (Which pretty much puts any flavor of Windows out of the running, but that's a whole 'nother argument.) But as I said above, they are not the right tool for every task. For power users, especially admins and developers, the command line is very often a better tool. And the best of both worlds, as in Apple's current OS, which Microsoft is again trying (and no doubt failing) to emulate, is being able to switch seamlessly between them as the task at hand demands.
      • Re:I agree (Score:3, Interesting)

        What most of us CLI users dislike is graphical input. A lot of us don't mind graphical displays, as a lot of the time they are better, but there's nothing worse than having to find which of the 1,920,000 pixels my cursor currently occupies so I can move it over to click on something. The best input is the right amount of key bindings, with a command mode like vim's (e.g., :make).

        What I personally would have switched to had it ever been feasible is xmlterm. XMLTerm was a mozilla project to create an xterm clone that
      • No one really likes the command line... plenty of people get by with it, but it's obviously the most primitive computer interface.

        I disagree; I do like the command line - for many tasks it is the most advanced and suitable interface. Every time I'm forced to use a Windows machine, I notice how much I miss a decent CLI and bash.
      • I wonder why MS is working on a new command line at all.

        Try comparing the old CMD shell in Windows to Bash...

        Do people buy Xserves so that they can use the OS X command line? Do people run linux because they love staring at those grey characters on a black screen? No one really likes the command line... plenty of people get by with it, but it's obviously the most primitive computer interface. So why is Microsoft developing it? Do they really believe that *NIX users like their OS because of the command line
      • Re:I agree (Score:3, Informative)

        by sootman ( 158191 )
        Assuming you're not trolling, I'll answer. Command lines are good for lots of things. Here's one set of reasons to like the CLI. Anything you can do at a command line...
        • can be done remotely over even the slowest network link.
        • can be put into a script...
        • ...which can be scheduled with CRON
        • produces textual output, which can be
          • instantly sent to a printer (hack-proof!--hard to delete logs when they're already printed*)
          • emailed
          • shown on a web page

        Without a command line's texty goodness, how could I do

      • > No one really likes the command line... plenty of people get by with
        > it, but it's obviously the most primitive computer interface.

        Speak for yourself, MCSE.

        The command line is the most natural interface possible if you are computer literate. Think of it as comparing books to TV. If you are a literate person you might still watch TV to veg out and because it is a totally different medium it can do some things better. But even though seeing the Battle of Helm's Deep was hella cool, the books tell a
  • by Anonymous Coward on Thursday July 28, 2005 @10:21AM (#13185356)
    "In a well-written interview with Mad Penguin..."

    "'But UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail.'"

    Yep, seems pretty well-written to me ;-)
    • "Whether you're writing an open source browser or you're
      righting a symphony, I don't think there's that much difference."

      I've also seen a few other mistakes, and I'm only part-way through the first page. Well written? nope.

    • Re:Well written? (Score:3, Insightful)

      by MrHanky ( 141717 )
      Yuo must be new here. On this site, "well written" means well intended. If you can guess what it means, and it means well, it's good enough for us.
  • Ho ho ho (Score:5, Funny)

    by gowen ( 141411 ) <gwowen@gmail.com> on Thursday July 28, 2005 @10:23AM (#13185375) Homepage Journal
    UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail
    That's right because, as we all know, the solution that is technically the best will always win out in the marketplace...
  • It's not (Score:5, Insightful)

    by Arthur B. ( 806360 ) on Thursday July 28, 2005 @10:24AM (#13185392)
    PlayStation, Xbox, mobile phone, and DVD player types of operating systems are the future. The OS has been developed far ahead of most people's abilities. The future is going towards less and less user control over the OS. Quite the opposite of UNIX.
  • Control (Score:4, Interesting)

    by Anonymous Coward on Thursday July 28, 2005 @10:25AM (#13185405)
    PC as a thin client browser?

    I don't know about you, but that doesn't satisfy me and I think there will always be room for people who want a traditional desktop.

    As a gamer and just a fan of controlling the computer in front of me completely, without all this abstractness, I don't think that everyone is going to bite on this kind of stuff.

    I'm sure it has its place, but for everyone?
  • some third thing? (Score:5, Insightful)

    by intmainvoid ( 109559 ) on Thursday July 28, 2005 @10:26AM (#13185421)
    It might be BSD, it might be Linux, it might be some third thing.

    Talk about ignoring the elephant in the lounge room [apple.com].

    • Umm, OS X [google.com] is (Free)BSD [google.com] with a few relatively minor changes [apple.com]. Those changes don't change that it's still BSD.
      • Umm, OS X is (Free)BSD with a few relatively minor changes.

        Yeah, some minor changes like the display system, the libraries and APIs, the utilities and pretty much anything else with which a user interacts... ls and cat are the same, though.

        In any case, this is an interview with a Linux site. Laporte is just being polite.

        • I guess you didn't read the differences, and you've never used OS X (or BSD).

          Ok, so changing the display system changes the OS. Are you saying that my Linux system running X.org is different from one running Xfree? Or that mine running Windowmaker is not Linux, while one running twm is? No, those are all Linux, or all BSD, and the article wasn't about the user interface.

          The utilities are all BSD, except for the *additional* utilities for OS X-specific stuff (like the things for netinfo, the disk imager,
  • Arghh (Score:2, Interesting)

    by realmolo ( 574068 )
    Unix is fine. It works.

    But....it's 40 years old! Wouldn't we all like to see a completely MODERN operating system? I know I would. Keep all the good stuff from Unix, update it, and throw out the bad stuff.

    Of course, in the end, we'll still be stuck with Windows and MacOS and Linux because they're the only 3 that have developer support.
    • you mean like BeOS or Plan9?

      yeah those modern OS's sure did take off.

      it's not about modern, it's about who can market the hell out of what they are selling, and how fast can you sucker the other people into buying what you sell.
      • BeOS was a marketing disaster, and Plan9 was never more than a proof of concept, but that doesn't mean that their core ideas were bad, just that their time hasn't come yet.

        Squeak [squeak.org] is another 'proof of concept' system with a lot of promising ideas. (Finally a system where you can rotate your windows 37.5 degrees anticlockwise. Pointless but very cool)

        It will probably take another 50 years or so, before a viable OS is built based on these ideas (Squeak for example needs a host OS and has no security at all, makin
    • Re:Arghh (Score:3, Informative)

      by tomstdenis ( 446163 )
      Naive.

      ReiserFS and O(1) schedulers and IPv6 and ... were not in UNIX 40 years ago. Heck ReiserFS is a relatively new addition. I recall using ext2/ext3 and having to "fix up" the drive after every unclean shutdown.

      I have yet to lose a single file to a ReiserFS on a medium that still operates. Even through several blackouts [before I got my UPS] and other shutdowns [emergency and otherwise].

      I think you just need to reflect on what is actually in the Linux kernel to realize it is nothing like UNIX of 40 ye
    • yeah, and we are in the 21st century primarily using 19th century transportation technology. It doesn't make sense but things are so stable (in an economic and cultural sense) that a radical change would require a huge destabilizing event.
    • Re:Arghh (Score:5, Insightful)

      by Anita Coney ( 648748 ) on Thursday July 28, 2005 @10:35AM (#13185522) Homepage
      Merely because something is old does NOT mean it should be replaced. We're still building houses out of wood after thousands of years. Our cars run on internal combustion engines. And after all these years we're still carbon based life forms.

      You admit it's "fine," that it "works," and that there is "good stuff" in it. If all of that is true, then why replace it merely because it's old?! That kind of mentally makes no sense.
      • Merely because something is old does NOT mean it should be replaced. We're still building houses out of wood after thousands of years. Our cars run on internal combustion engines. And after all these years we're still carbon based life forms.

        Oh boy did you bark up the wrong crowd with those words.

        Houses would be better built with steel and concrete as they do less environmental damage, have a better resistance to natural disasters and depending on where you are from, it might save you some money on yo
        • Re:Arghh (Score:2, Insightful)

          by tootlemonde ( 579170 )

          Because something is old, it needs to be evaluated for replacement.

          On the surface, the criteria for replacing something old are the same as the criteria for replacing something new: is there a better way to do it?

          In practice, things like amortizing the existing investment, vested interests and training are decisive. These economic and psychological issues cannot simply be dismissed as a failure of imagination, since innovations have to work in the real world, whatever else their merits are.

          Sometimes, like the en

      • You admit it's "fine," that it "works," and that there is "good stuff" in it. If all of that is true, then why replace it merely because it's old?! That kind of mentally makes no sense.

        yes, but lord knows it sells widgets. which is what keeps the US economy running my friend.

        what are you, a communist? :-P
    • Re:Arghh (Score:2, Insightful)

      by i7dude ( 473077 )
      While it is old, it addresses the issues associated with current system design. One thing people are overlooking is that the basic architecture of PCs (non-supercomputers) has remained relatively static for quite some time. Perhaps in order to see a radical change in operating system design we also need to rethink the hardware architectures that we have used for so long.

      dude.
  • I'd say ... (Score:5, Funny)

    by hawkeye_82 ( 845771 ) on Thursday July 28, 2005 @10:27AM (#13185430) Journal
    the future is the HURD. Even in the future, the future will still be the HURD.
    • the future is the HURD. Even in the future, the future will still be the HURD.

      If the GNU kernel had been ready last spring, I'd not have bothered to even start my project: the fact is that it wasn't and still isn't.--Linus Torvalds, 1992.

      Some things never change, eh?

    • And the HURD will be running on GaAs processors so we can finally up the clock rate into the GHz range and use optical fiber rather than copper.
  • Let go (Score:3, Funny)

    by Anonymous Coward on Thursday July 28, 2005 @10:28AM (#13185447)
    "Not only is UNIX dead, it's starting to smell bad."

                                                    --- Rob Pike
  • by gatkinso ( 15975 ) on Thursday July 28, 2005 @10:31AM (#13185478)
    ...is that it was back in the day, when really only technically savvy types owned or operated computers, that MS gained their stranglehold on the market.

    We would like to think MS somehow bamboozled the teeming masses, but that is BS. It was us they bamboozled with MS-DOS of all things.

    We did this to ourselves.

  • The Quote? (Score:2, Informative)

    by Saggi ( 462624 )
    Is it just me?

    I can't find the quote: 'I think there's a lot of hope for Linux, although I don't think that Linux is the answer. I think that UNIX is the answer, in some form or fashion. It might be BSD, it might be Linux, it might be some third thing. But UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail.' ... in the article?!?

    It has spawned a discussion, but the linked article is much more about Open Source, than UNIX. Try s
  • Linux is not UNIX ? (Score:2, Interesting)

    by randalware ( 720317 )
    This is a distinction I do NOT understand.

    The underlying code, open or not, is just the implementation.
    And some implementations have different switches on the commands,
    like BSD, Irix, and System V did.

    Sticking to a design (what UNIX standard, POSIX?) is the bigger issue in my opinion.

    But the end result is the same.

    "Everything is a file" ( rea
  • But UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail.

    -----

    WTF??? I mean, really, come on now... WTF!

    ~D

  • Do One Thing Well (Score:4, Interesting)

    by wild_berry ( 448019 ) on Thursday July 28, 2005 @10:40AM (#13185574) Journal
    Laporte says:
    "It's funny, because in the early days of UNIX, the philosophy of a program was, "do one thing well, and then pass the result along and interface with others." We've gotten to the complete opposite, which is do everything kind of okay, and interface with nobody. That was clearly a wrong turn. It's a response to market forces, not computer science forces."

    In the case where there is just the CLI and a list of programs spawned from a single input line, having a whole collection of tools that work well together is a must. But when you move to a graphical interface, so huge is the change in interface mechanics that the idea of the end-user setting up a chain of programs to run from one mouse click should be alien.

    The UNIX mentality of small, modular programs doing one thing well can still be maintained while a graphical environment is running, but his complaint that GUI programs "do everything kind of okay, interface with nobody" can't really be held against them: it's just the way that GUI stuff appears to the user*. The computer system may be organised so that the GUI program you're using shares a lot of libraries and calls a lot of helper programs to do its work, but the user should only see the graphical interface, making his point moot.

    *: Maybe he means something else: that an environment where one program does only one thing, from ground to GUI, does not help people to tinker, develop and hack new features into the software.
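
    For what it's worth, here is a small, hypothetical C sketch of "pass the result along" at the system-call level: wiring two single-purpose programs together the way the shell pipeline "ls | wc -l" does:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        int fd[2];
        if (pipe(fd) != 0) { perror("pipe"); return EXIT_FAILURE; }

        if (fork() == 0) {                 /* child 1: "ls" writes into the pipe */
            dup2(fd[1], STDOUT_FILENO);
            close(fd[0]); close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            _exit(127);
        }
        if (fork() == 0) {                 /* child 2: "wc -l" reads from the pipe */
            dup2(fd[0], STDIN_FILENO);
            close(fd[0]); close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            _exit(127);
        }
        close(fd[0]); close(fd[1]);        /* parent: close both ends and wait */
        wait(NULL);
        wait(NULL);
        return 0;
    }

    Each program does one thing (list files, count lines) and neither knows the other exists; the plumbing between them is the whole interface.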
  • Failure (Score:4, Funny)

    by kryogen1x ( 838672 ) on Thursday July 28, 2005 @10:40AM (#13185585)
    Isn't there going to be some sort of Unix failure in like 2038? How can it be the future if that's true?
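
    The "failure" being joked about is the Year 2038 rollover of a signed 32-bit time_t. A small C illustration of the two seconds on either side of the wrap:

    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t last    = (time_t)INT32_MAX;   /* 2038-01-19 03:14:07 UTC, the last
                                                 second a 32-bit time_t can hold */
        time_t wrapped = (time_t)INT32_MIN;   /* what the counter wraps to one
                                                 second later on 32-bit systems */
        char buf[64];

        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&last));
        printf("last 32-bit second: %s UTC\n", buf);
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&wrapped));
        printf("after the wrap:     %s UTC\n", buf);   /* back in December 1901 */
        return 0;
    }

    A 64-bit time_t doesn't wrap for billions of years, which is one way Unix gets past 2038 while still being Unix.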
  • by Dink Paisy ( 823325 ) on Thursday July 28, 2005 @10:42AM (#13185607) Homepage
    Hrm... I'm not going to say a whole bunch of mean things, but read the interview. Leo Laporte isn't an OS hacker, doesn't seem to know the details of operating systems, and doesn't seem to know the history of Windows or Unix.

    Although this interview doesn't have the controversial tone of a John C Dvorak article, the content seems to be similarly well thought out.

  • OS (Score:4, Insightful)

    by paithuk ( 766069 ) on Thursday July 28, 2005 @10:43AM (#13185618) Homepage
    If there is anything that drives me insane, it's people dribbling on about what OS they use. Dude, it doesn't matter like it used to 20-30 years ago; we're past the OS era, and what Linux or Unix really needs is some good quality, easy to use applications that complement a great graphics engine. Changing the OS is highly unlikely to change the success of a particular system, but changing how you think will...
  • by defile ( 1059 ) on Thursday July 28, 2005 @10:43AM (#13185621) Homepage Journal

    Microsoft's platform is the standard because they focused on the business of the software products market. They promised something to independent software vendors and delivered it-- a single platform that any developer no matter how big or small can target. At the same time they pushed hard to get this platform on as many PCs as possible, breaking kneecaps along the way when necessary.

    They achieved a form of write once run anywhere. In 1985.

    It did not matter what was under the hood; it mattered that the ISV only had to write one binary and not have to spend the money supporting two dozen incompatible platforms. Even Java cannot match this (I know, because I have to deal with it).

    Today there must be half a billion PCs that the ISV can generate one single binary for, and with that you've covered what, 90% of the market?

    Linux needs to offer big marketshare (doesn't have) and good developer support (has, sorta) for ISVs to care about it, because Microsoft proved that most ISVs won't bother targeting more than one major platform.

    • Microsoft's platform is the standard because they focused on the business of the software products market.

      But the fact is, it is not the standard platform. There is none, actually; Linux is the closest to having that title.
      MS had Win9x, Win NT, WinCE, and now .NET.
      MS is incapable of making a standard anyway. You have mistaken a monopoly for a standard.

      They promised something to independent software vendors and delivered it-- a single platform that any developer no matter how big or small can target.

      They didn't then.
  • by suitepotato ( 863945 ) on Thursday July 28, 2005 @10:44AM (#13185638)
    One need only look over this book and do six months of desktop end-user support on Windows to see how insane an idea it is that Unix of any kind is going to win in the market over Windows as long as the Unix community remains ruled by sadomasochistic techie dweebs who love things based on how hard they are, which is the exact opposite of the attitude that has allowed Microsoft and AOL to prosper and thrive in the common end-user market.

    I love my FC3, but once again, don't mistake my technical abilities and the chance to flex them each day on it for meaning that everyone is going to take to it like a fish to water.

    Apple's OSX most definitely is the best Unix-ish distribution ever conceived, built, and sold to end-users without any doubt in my mind. But do the Linux geeks get it as to why? No, they try mightily to avoid the BSD-ish ancestry of it and sit there wishing this beautiful *nix-style OS with such wonderful design and construction were a Linux distro.

    Won't happen. Linux is dominated by the sort of people on whom it is still lost that ease of use, administration, and support are paramount over everything else for end-users. Windows XP and Mac OSX give them what Linux never will as long as the current crop of leaders and movers and shakers controls the Linux scene.
  • by bahwi ( 43111 ) on Thursday July 28, 2005 @10:52AM (#13185718)
    Of course Linux isn't ready for the desktop! But what they don't tell you is that Windows isn't either!

    C'mon: spyware, adware, numerous bugs (my sound card driver crashed the other day. My Microsoft Certified driver completely crashed. A reboot and it worked, but that's unacceptable, as it never has any trouble in *nix). Crazy service packs, bad to no real support.

    Hell, you NEED an anti-virus just to browse the net and check your email, even if you don't download and open any attachments. Just to protect you from the wild internet. You have to combine XP + Norton + Ad-Aware/Spybot S&D just to get a near-usable PC. That's a far cry from a ready desktop.

    The problem is that Windows IS used as the desktop, even though it isn't ready for it yet. That means it is the standard; however, how often has your mother had to call you over to fix it? Linux wouldn't require the same thing, especially if all they want is browsing and email. They're evenly matched at that point. But no, Linux isn't ready. Neither is Windows.

    (I can't speak of OSX, I don't actively use it)
  • by British ( 51765 ) <british1500@gmail.com> on Thursday July 28, 2005 @10:54AM (#13185742) Homepage Journal
    Unix in the backend, handling all the computery stuff (services, servers, etc.).

    A nice, pretty GUI up front (Macintosh, Windows, whatever you like), that grandma can use.

    IIRC OSX does this to an extent already.

    Thus, the reverse mullet approach. Party in the front, business in the back.
  • It reminds me so much of the company I work for where endlessly arguing about how something is impossible and how change is absolutely the wrong thing to do is what we do in and of itself. Explaining to ourselves why we need to continue to safely fail is really what we spend almost all of our time and effort doing.

    I celebrate mediocrity and I cheer that open source is finally in the boat with us!!!!!

    Huzzah Huzzah!!
  • by agraupe ( 769778 ) on Thursday July 28, 2005 @11:07AM (#13185888) Journal
    Is that programmers like to develop for an open source system. It's easier that way, and if they release their code as OSS, it just keeps building. People always ask me, "How do I do X?" where X is a semi-difficult task. I always find myself saying, "Well, I'd do it with this program in Linux, it would take about 5 minutes. The Windows equivalent, on the other hand, takes the afternoon to figure out and get right." If there are any moderately useful programs for Windows, they are usually cheap payware or annoying shareware. The reason that UNIX/Linux/BSD/OS X will work is that you can do almost anything for free.
  • ...Of ZDTV Fame, not G4/TechTV

    I miss the basement studio
  • by tod_miller ( 792541 ) on Thursday July 28, 2005 @11:33AM (#13186147) Journal
    It might be BSD, it might be Linux, it might be some third thing

    So, it's either option A, or option B, or an option C which can be anything.

    He has given himself quite a bit of leeway there.

    If Marshmallows evolve into the dominant lifeform on this planet, his dying breath will be, "I was right, I tell ya!!! It's the third thing!!"

    (yes I RTFA and yes he really says that)
    • For the record, the whole quote is:

      [...] I think that UNIX is the answer, in some form or fashion. It might be BSD, it might be Linux, it might be some third thing. [...]


      So, no, he won't say "I was right, I tell ya!!! It's the third thing!!" when Marshmallows evolve into the dominant lifeform of this planet. Unless they are a breed of UNIX by that time and that UNIX has transformed into a lifeform, which I seriously doubt....
  • UNIX is such a well understood and smart to handle the issues that an operating system has to handle that it ultimately will prevail.

    Leo, what the fuck are you trying to say, dude?
  • by bored ( 40072 ) on Thursday July 28, 2005 @04:42PM (#13189683)
    More clueless crap. For Unix to really be the future, it needs to get rid of its legacy baggage and truly become "well understood". Frankly, a lot of people think they understand Unix because they are stuck in a single-process, text-based environment mindset. In reality the "extensions" made to Unix to support current programming models are full of holes.

    When RAS, threads, async I/O, multiple processors, and many other things that really are the "future" (or rather the current state of the art) are well understood by the Unix community, they will understand what needs to be changed in the model from the 1970s that people claim is Unix. When that happens Unix will be the future, but it won't be "Unix" as you know it.

    Now for some more concrete examples. Let's start with a simple one: what does the system call close() do? That's right: did you know it can fail? What's the solution? Try again. Now think about what happens in a multithreaded environment with open() happening in other threads. I can't find a link to Linus's comments on this, but they are amusing. The bottom line is that in a threaded POSIX environment you have to write code that looks like this (roughly, in C with POSIX threads):

    #include <errno.h>
    #include <fcntl.h>
    #include <pthread.h>
    #include <unistd.h>

    static pthread_mutex_t global_open_lock = PTHREAD_MUTEX_INITIALIZER;

    int app_open(const char *filename, int flags)
    {
        pthread_mutex_lock(&global_open_lock);
        int rc = open(filename, flags);
        pthread_mutex_unlock(&global_open_lock);
        return rc;
    }

    void app_close(int filehandle)
    {
        pthread_mutex_lock(&global_open_lock);
        /* retry until the descriptor is really gone; holding the lock keeps
           another thread's open() from reusing the fd number in the meantime */
        while (close(filehandle) == -1 && errno != EBADF)
            ;
        pthread_mutex_unlock(&global_open_lock);
    }

    If such a simple Unix concept as open/close is screwed up by threads, just imagine what happens when you write code to trap precise floating-point exceptions, or deal with async filesystem I/O over an unreliable network; the list goes on. Basically Unix is good for certain kinds of applications and absolutely blows chunks for other kinds. Everyone doing a lot of these things has tied themselves to a particular Unix implementation and uses system-specific knowledge to solve the problem.
