A Praise To Unix

MotyaKatz writes: "ZDNet has an article from Evan Leibovitch which he calls The Unix Phoenix. As he states, 'I come not to bury Unix, but to praise it'. It mostly deals with how Unix is surviving amid Linux's growth."
  • I remember a few years ago, back when I was just a wee warez pup and lame IRCer. The biggest challenge I ever faced was "well, iF Ur s0 31337, t3Ll me a unIx coomand!", and from that day forward I learned Unix command by command. You may say, you are so fucking lame, and I am, but it was a good building block for someone who was just knowledgeable of the Mac OS. It launched me into the world of Linux and Unix, and if it wasn't for the days of script kidding and Unix challenging, I probably wouldn't be at the level of geekdom I am at today.
  • by mikpos ( 2397 )
    WTF dude? Why does it always have to be Microsoft vs. Unix? There are other software companies out there. There are other operating systems out there. Windows sucks. Unix sucks. Let's call the whole thing off.
  • Linux has only happened in a small area. That is the area of servers. Linux has not yet happened in the mainstream, or (significantly) in the developer area. Your post said that people should be flocking to Linux because it is a better development environment, and that it was pointless to look at it from a user's point of view.

    A) Linux is not a better development environment. NT can do everything Linux can (GCC, VI, etc) plus more (Delphi, Visual Studio, etc.)

    B) Users DO matter. If Linux had stayed hard to use, it would never have had the (limited) penetration it does in the business market. If it stays hard to use, it will never have significant penetration in any market except those where UNIX was already strong.

    C) Your theory about my theory is off. According to your theory, developers should be flocking to Linux, and Linux should be succeeding in Windows's traditional market (consumers.) They aren't. It isn't. You say that the user's point of view doesn't matter. It does. The only reason that Linux is succeeding in the server market is that it is better for the user (the admin, the management) than NT is. It isn't succeeding in the consumer and business market, because for those users it ISN'T better. Smart people are working on making it better than Windows for Windows users. Other people (you) are saying that the user's perspective doesn't matter, the developer's does.

    PS> You miss something quite obvious. By your logic, Linux (or BSD) would have succeeded a long time ago. The development environment was more or less the same back then. So obviously that is not the cause of Linux's success. What is the cause is vendors getting together to make the Linux user's experience more pleasant. Not only a business user's or consumer's, but a sysadmin's too, by making administration more central and offering technical support.
  • Solaris, NeXT, etc. Display Postscript.
    i think there is a program called 'gv' (ghostview) and it displays postscript.
  • A) Not chaotic in terms of spaghetti, but chaotic in terms of change. The Windows API really hasn't changed in a while. (Or are you living in the OS/2 days?) Which one is cleaner is irrelevant, I'm talking about which one is more stable.

    B) Are you kidding? UNIX is only clean if you're doing console stuff. Of course, console stuff is clean on DOS too! X is hard to program, there really isn't a decent sound architecture (ALSA is still too immature) and you have to go through all sorts of hacks to do basic things like change resolution. Of course, all this is encapsulated if you use something like Qt, or GTK, but those are available on Windows too.
  • No, Solaris doesn't "waist" the power of the hardware it runs on; quite to the contrary, some of Sun's best work (IMHO) has been in optimizing their operating system for many processes running on multiple CPUs.

    What makes you think that Solaris is inefficient?

    --
  • by Anonymous Coward
    Please don't post things that are going to be offensive to other people's beliefs. Free speech is one thing, but it doesn't mean that people should go around and hurl racist, sexist or religious abuse.
  • On the contrary. The mode by which you access the code (CVS or FTP) does not make the project more or less open. Please feel free to download a development-tree kernel whenever you like. (Note: The link is to the v2.4-test kernels, which are not mirrored on all the kernel mirrors.)

    Amazing that it actually was moderated up, just goes to show the bias. So you're saying that downloading a RELEASED test kernel is more open than a continuing source branch? Pure sophistry. Why do i waste my time on slashdot.....

  • by benedict ( 9959 ) on Sunday August 13, 2000 @02:40PM (#859397)
    On the other hand, I am told it is much easier to become a Linux developer, so Linux is more open.

    On the other other hand, there are fewer license restrictions on what I can do with BSD software, so BSD is more open.

    On the other other other hand, Linux's license promotes open-source software, so Linux is more open.

    On the other other other other hand, "open" is a word with a lot of meanings and a lot of connotations, and perhaps there are better things to do than worry about which project is "most open".

    --
  • by Fervent ( 178271 ) on Sunday August 13, 2000 @09:14AM (#859398)
    The problem with today's users is that UNIX-mavens naturally assume what they like and use is what the general populace should use. The truth is, regular users don't really care about the same features that mavens like.

    Case in point, I tried installing Linux on my family's home machine. I tried to explain that the system would rarely (if ever) crash, that each person in the family would get their own desktop, and they would get the access to the internet they always had.

    You know what? They hated it.

    'Why do I need to type a password every time I want to get on a machine I own, in my home?' my mother complained. The kids said they couldn't install any games they liked, and the ones they could rarely ran. They wondered why they needed something called a 'superuser' to install Q3A.

    And here's the kicker: 'Why do we have to worry about crashes?' To them a crash was standard behavior, like a car misfiring on startup or a TV channel fizzed out for a few minutes. 'But then you won't have to reboot,' I'd explain. 'But we have to reboot anyway. We shut off the machine every night.'

    You see, the practices of the UNIX/Linux world don't seem to jibe with the world of "Joe User". They've subscribed to practices that, for better or for worse, they don't want to deviate from. Case in point: the whole VHS vs. Beta debate a while back. Regardless of which format was "better", once people started using VHS they never looked back.

    Same thing with Windoze PCs. Once people started looking at the pretty icons, the ability to run all those games, the relatively simple document/folder analogy (even though copied from Mac) and the ability to use all kinds of neat hardware, they were hooked. Password protection, an uncrashable system? This was a family computer living in the den. They didn't care what was the pride of sysadmins, web servers and academic folk in college. Unix never existed in their mind, and if it did, it was that brief glimmer of "nerdiness" they all wanted to avoid.

    My commentary on the state of technology and the world.

  • heh... and, in a few years, suddenly we'll have pre-emptive multitasking, multiuser cellphones based on some new reincarnation of unix. they're already networked... but now, your cellphone can also be a firewall, mailserver, and host shell accounts.

    the scary thing is that i'd buy this. if IBM's labs put linux on a watch, this isn't too far-fetched.
    -legolas

    i've looked at love from both sides now. from win and lose, and still somehow...

  • in the future, they'll get what the programmers give them. that's the nature of software. if the programmers choose linux (for handhelds, etc.), then that's what users will be using.

    You're kidding, right? A lot more projects get made than those programmers would make on their own - the people who pay programmers decide, based upon market perceptions, what will be worked on.

    Think of the difference between production-made cars and sports cars - sure, mechanics might prefer to work on hot sports cars, but working on minivans and SUVs is what pays the bills.

    Don't get me wrong, I believe Linux has a very bright future - just not for the reason you give.
  • by WhyteRabbyt ( 85754 ) on Sunday August 13, 2000 @07:17AM (#859401) Homepage

    I think the real future of Unix looks something like MacOS X, not Linux. Only if you think Unix becomes a single-vendor prospect. Building an (admittedly very nice, very clever) GUI on top of a Unix core, however, is not appropriate for all the areas Unix is used in. Desktop/Workstation areas, yes, I think the Apple stuff will have an influence, but I don't think Apple is about to start competing with the heavy-duty kit SGI, IBM and Sun produce. I don't see Apple's new GUI as being particularly relevant to servers. And that's Unix prime-time.

    Gnome and KDE are just the first iteration towards a useful user experience.

    Yup, just first iterations. How many iterations of MacOS did Apple do, before they dumped most of it for MacOS X? How many iterations of Windows have there been? I get the impression both KDE and GNOME are progressing far faster than either of these did.

    Linux so far is a step sideways at best.

    Sideways from what? A GUI is not an OS. You might think that an OS without a fancy GUI isn't useful, but that doesn't make it a fact.

    Pax,

    White Rabbit +++ Divide by Cucumber Error ++

  • "I think the real future of Unix looks something like MacOS X, not Linux."

    Then you're saying that the future is FreeBSD. Because that's what MacOS X is. And FreeBSD ain't that much different from Linux.

    The GUI is not the operating system. It's a user shell layered or plunked on top. It adds nothing to the computing power of the computer. It only facilitates the user experience. I had to dump KDE and GNOME off of my laptop due to a small screen and sparse memory. I didn't lose any power at all, only some convenience.

    The computing world may be working towards Mac- and Win-like GUIs, but what they end up with will be far different. They will get generic and swappable operating systems, so that the user can use any OS with any GUI on any platform. To get there you will need some standards, and Unix or a derivative of it is going to be the standard.
  • And where does BSD fit into all of this? It is every bit as advanced as Linux. Most of what people think of as Linux isn't really Linux anyway, but stuff that runs on BSD (and Solaris) as well, stuff like KDE, GNOME, Perl, Python, Apache, etc.

    If you've been paying attention, the BSD world is rapidly catching up to the attention Linux gets, after a half-decade of being asleep at the wheel. A mere two years ago Linux was unheard of in the enterprise. What will the situation be like in two more years? You just can't predict the future, especially with computing. Leave the crystal-ball gazing and 0% prediction successes to the Gartner Group.
  • "Thies boxes run two operating systems.. Solarus and Linux..."

    Check again. NetBSD and the newest FreeBSD also run on Sparcs. So does LynxOS, an embedded Unix. I'm sure there are a few more, I just don't know about them.

  • foo

    "The difference being, that MacOS (X)"

    should read:

    "The difference being, that MacOS (pre OS X)"

    bar
    Resistance is NOT futile!!!

    Haiku:
    I am not a drone.
    Remove the collective if

  • So you want a microkernel. Mach provides much of what you ask for, except the fine-grained security perhaps.

    It might be constructive to look at past Mach implementations and other microkernels... there may be good reasons these architectures didn't dominate the market. They solve some problems all right, but not without introducing others.

    Absolute performance is portrayed as vital by the media, and this just might have killed the microkernel. Isolating components in their own address spaces and passing messages doesn't come cheap. Context switching can be fast, but not free. And processor design seems to be evolving in another direction; e.g. VLIW chips will favor long blocks of straight-line execution, a style of code that is becoming extinct as OOP becomes more refined.

    Intel segmentation hasn't been utilized much, that's true, but it is probably impractical for **ix systems. How do you write an efficient C compiler that doesn't share one segment for stack and data? Besides, x86 has a limited number of segment registers, and writing to a segment register is not cheap.

    Look for the ongoing thread "programming for segmented systems" on comp.arch if you're interested.

  • So what, actually, is still the difference between UNIX and GNU/Linux anyway? All the BSDs are Unixes and most of them are open source. You can run KDE on OpenBSD. You can compile most anything without a lot of tweaking on any *NIX.

    Linux is a sapling of the Unix tree; Unix is alive, Linux proves that much. Whatever differences remain are, I think, going to converge; the differences that really exist are historical and commercial.

  • He says "The Unix euphoria of the early days was gone by the end of the decade, which had seen vendors choose sides and celebrate the forking of Unix into so many vendor-specific mutants."
    I suppose that's what competition does. Let's not let this happen to linux. Or is it just going to happen naturally? Is it already happening?
  • i'll be one to agree. you get real plug and play, good hardware support (firewire anyone?), a standard setup (no debian OS X or redhat OS X), easy installation, SMP support, a GUI that more people can understand, great cut and paste, you're not using windows, you can install your favorite Open Source Software.

    still, it's really nothing new. the only new OS out there is the BeOS. that's revolutionary, it's not NextStep part II.

    linux isn't there yet, but it can be, people can make it be. it can sidestep, dodge and not run forward into bullets. the progress of this OS is amazing, and it lets curious geeks who cannot afford the $$ have something they can play with and take apart. it will run on my vaio. =)

    linux will get many enhancements when SGI finishes with it. it too can be a multimedia OS like the MacOS. so don't count it out yet. the power of linux is that you can make it whatever you want it to be, we just need to stop fighting windows to do it.

    but i ramble.

  • People who hang out on Slashdot because they seem to think its part of Windows Gamer magazine get the reaction they deserve.



    --
  • Slimebolics "historical" Lisp Machine? I learned (and loved!) Lisp (on a unix machine :) before any hardware Lisp Machine ever existed.

    Hrrrrm. I wasn't aware that there had been Lisp systems for Unix before the early or mid-80s, and definitely not before 1975 when the first Cons machine was built. In fact, I thought the first compiler for Unix outside of C and PDP assembly had been the Berkeley Pascal compiler, from (IIRC) 1977. Strange timing, eh? :)

    That's why I quoted "historical" ...

    By "historical" I meant that it made history, being years ahead of its time (and only comparable to the Xerox PARC's inventions such as the Alto). In one way or the other, the Lisp Machines did pretty well (in the US and Europe at least) before the big AI bust of the 80s, but now they've been relegated to museums and dusty office corners - at best - and their design lessons have been forgotten by the industry (who is now busy reinventing it, only twenty years late and in a half-assed manner - see Sun's MAJC and Transmeta's Crusoe).

    but in all my days I've never seen any coder quoting "coding", as you did.

    Well, to be honest, the term "coding" bugs me a bit. I translate it in my head to the Portuguese "codificar" (I'm Brazilian), which is associated with negative notions: making things arcane and undecipherable. I like to think my code isn't arcane and undecipherable - unless specifically intended otherwise - so I prefer "programming", "hacking" or just "algorithm designing" :) It's nothing rational or purposeful, just a pet peeve of mine.

  • how it started with single process, single user computers, built up to multiuser, multitasking networked computers (with cross platform support), then, with the pcs and macs, we went back to single user, single process computers, then multitasking, and now we're back to networked, multiuser (with the recent rise in popularity of Linux/*bsd/etc), cross platform OSs on the computers on our desktops. not to mention using services that reside on other computers.

    It's not only clothing that repeats itself. =^)
    -legolas

    i've looked at love from both sides now. from win and lose, and still somehow...

  • So what do you think about the Spring and EROS [eros-os.org] papers? You have read the Spring and EROS papers, right?
  • Many different libraries for the same thing is bad?! There's one word to describe that: COMPETITION (not EGO). More competition means better-quality products.

    No wonder you are back to Win98SE: they have exactly one library for one job. And that's why it is unstable as hell. (And the company that produces it is being sued for being anti-COMPETITION.)

    Linux is not for everybody. It's for people who are willing to trade off some time to learn some new things, and have fun while doing that. If you are not willing to invest time in figuring things out, fine. But don't write inflammatory rhetoric accusing the people who spend their valuable time writing code for free as egoists.

    Now, be a good boy.

  • by WhyteRabbyt ( 85754 ) on Sunday August 13, 2000 @07:22AM (#859415) Homepage

    said it best: "Linux is only free if your time has no value."

    Whereas with Windows NT you're stuffed both ways...

    Pax,

    White Rabbit +++ Divide by Cucumber Error ++

  • I have 1067 .DLL files on my NT systems main drive. I'll bet you a shiny new dollar thats a hell of a lot more than any Linux system.

    That's a good bet... I just checked on mine... 1159 DLLs on my main NT drive, and 2455 DLLs on the drive with my program files. Doing a "locate *.so | wc" on my development web server (Slackware 7.1) returns 423 separate libraries. And that's with X installed.

    Granted, I have a lot more applications installed on my workstation than on my web server. But checking the local NT web server, running Access, Perl, Netscape Server, there are 836 DLLs total. Just shy of twice as many. They're just better documented on Linux :-)
    --

  • AmigaOS? Advanced? Well, ten years ago, in comparison to the operating systems of other home computers, yes it was.

    Remember this: Windows 98 is built on technology that is 20 years old.

    I guess you'd better tell the deluded motherfucking idiots at Dell, IBM, Compaq, Oracle, HP, and SAP that they need to get the shipping orders of antipsychotics in, pronto.

    You're trolling and I fell for it.
    --
  • by Darth RadaR ( 221648 ) on Sunday August 13, 2000 @10:00AM (#859418) Journal
    /* just my opinion & experience. Yours may vary. */

    Let's just face it, Unix's death has been predicted for years. Kinda like rock & roll music, it won't go away. Like rock & roll, Unix always evolves and pops up with different flavors and tools to suit the times & needs.

    Unix is not for everyone. I agree that it's a bit much for someone who just wants to send mail, play games, use AOL, and do word processing. But for people who really need more control & precision over their systems, Unix is the way to go.

    It took until Win 2K Server for M$ to realize that file quotas might be a good idea in their OS, whilst Unix has had them for years. That's how Unix always evolves around realistic and current needs more immediately, whilst (IMHO) Win* is either catching up or putting out vaporware (e.g. J++).

  • by Animats ( 122034 ) on Sunday August 13, 2000 @06:03PM (#859419) Homepage
    what he suggests is far beyond squeak (i've never used genera, so I can't say). He wants efficient, secure cross process objects.
    Yes. Things like this have been tried at the language level. Java is the most visible example. I'm proposing to do it at the OS level, where the protection hardware can make many of the checks. That way you're not locked to a specific language, and can reuse more existing software without trusting it.

    So what do you think about the Spring and EROS papers? You have read the Spring and EROS papers, right?
    Spring was an interesting idea. Many of the ideas ended up in Java, but once they started thinking cross-platform, they were stuck with running on top of an OS, not writing one. Yes, there was a JavaOS. Haven't heard much about it lately.
    As for EROS, I was impressed with that project, even though it never quite got finished. They had several problems, some technical, some political. They never really explained how to get from capabilities to policy. Capabilities are a low-level mechanism, and it's not clear how you get to, say, a web server with secure server-side applets, or its dual, a web-browser with secure client-side plug-ins. That's a classic failing of the capability crowd, all the way back to Norm Hardy and KeyKos. Too much abstraction and not enough explanation. I also tend to think they overemphasized persistence. The opposite of persistence, the pure transaction system, is closer to what people want today. Consider cgi-bin programs, where the transaction program is flushed at the end of each transaction. Hokey schemes like running Perl in the web server's address space are needed to make this efficient. That's a concrete example of where the OS has totally inadequate mechanisms. The persistent part of a server belongs in a database where you have a coherent model of the data. Think about how hard it will be to clean junk out of a persistent object store. It creates the versioning problem from hell.

    Congratulations, you've just described the foundations of Windows NT.
    Early NT was closer to this than NT is now. "Event pairs" were getting close to what I'm talking about here. However, rather than fixing NT's IPC performance problem, Microsoft chose to kick Dave Cutler out of the top NT job and dump much of Windows 95 into the NT kernel. Now it's just another big, monolithic OS vulnerable to a bug in any of millions of lines of code. (My favorite discovery was that the decompressor for RLE-compressed .BMP files is in the kernel, and contains a buffer overflow.)

    Strange, most of what you describe is MULTICS...
    Not quite. Multics security is hierarchical. I'm discussing peer-to-peer, which is more useful. Multics did a good job at security, and was very well regarded in the DoD world. NSA's public access machine was a Multics until just a few years ago.

    So you want a microkernel. Mach provides much of what you ask for, except the fine-grained security perhaps.
    Mach offers a classic I/O type IPC primitive. Actually, QNX does a better job at that.

    You are paranoid. You don't need an operating system; you need a psychiatrist.
    What I want is security that actually works, rather than having to be patched every week. A system with tiny amounts of trusted code, in which untrusted code can't break security. Something where we don't have a major nationwide crisis every time some script kiddie tries something.

  • Yes, the statement in my title is *technically* inaccurate, untrue. But the average computer user is not a technical person who knows the difference, and never will be. That's the fundamental problem I see in the Linux community--it fails to see how most end users will be using their computers, and therefore I fear that Linux will stop making progress on the desktop if another vendor presents a better OS--like if OSX were to go x86, which is doubtful--or that Linux will stop moving onto the desktop when another company offers a proprietary add-on GUI for either Linux or *BSD which offers the ease of use Linux continues to lack.

    For example, people here are talking about how multiuser timesharing machines are making a comeback thanks to home networking and the like. Bullshit. The average house still doesn't have a home network, but has instead one, maybe two, standalone PCs or Macs. Home Networking is still a geek/tech enthusiast thing. Until computers can be plugged together with a little cable, or with a couple of AirPort type devices, with absolutely no configuration necessary beyond a quick Wizard, home networking will remain a small minority market. The closest thing to that right now is Macs using the AirPort.

    And, most home networks probably consist of Windows boxen linked together, not multiuser Linux systems. I know of absolutely no one who actually uses the Windows "Profiles" version of multiuser, much less who uses a real multiuser Linux or NT box at home--NT, sure, but with only a single non-administrator user profile. Most home users view multiuser systems as a burden to be put up with, not as a useful feature. Do you know what home users do instead of using multiuser systems? They use single-user systems with different folders for different users' stuff. If there is a home network, it's usually a network of such single-user machines, not a network of multiuser systems running *nix or NT with different logons for each user.

    Please, don't make the mistake of thinking that home users think the same way or want the same things that tech enthusiasts and geeks do. They don't. And unless Linux developers start thinking about what Joe Windows or Jane Macintosh use and want, Linux will never take over the desktop. My title was "No, the GUI *is* the OS" precisely because that's a true perception of home users--they don't see or care about what's going on underneath the GUI, the GUI is their world, the GUI is their perception of the OS; and if the GUI sucks, as most Linux ones do in the user-friendliness dept. compared to Win/Mac, then the OS sucks as well for the desktop user. It ain't about the server space any more; the desktop market is about look and feel, not multiuser complicated stuff. Just my 2 pence.

  • Personally, I don't care what anyone says, Linux is Unix. If it smells like unix, looks like unix, tastes, sounds and feels like unix...so as far as I'm concerned, it's unix.

    Linux's success isn't surprising to me. Until the last few years, unix never really had a decent desktop environment -- and please spare me the "What about OpenLook? (it's crap)", or the "What about CDE? (it's crap too)". Kudos to the Gnome/KDE people, they're really doing a lot for the survival of unix (Not that I think it's in any immediate danger. NT wasn't quite the unix-killer that Gates' marketing staff hyped it up to be (big surprise). IT people are slowly learning that NT is crap with a nice GUI).

    I've never cared which flavor of *nix survived and I never listen to the distro war garbage or FreeBSD/Linux wars. My only concern was that some form of *nix survive so that lots of companies look for Unix expertise and I can keep avoiding the windows platform like the plague.

  • The new paradigm should be "I am so fricken tired of reinstalling Windows because the stupid user keeps deleting files that they shouldn't".

    You need the multi-user ability to keep the normal user from roaching the system because they were click-happy!
  • by Kaufmann ( 16976 ) <rnedal&olimpo,com,br> on Sunday August 13, 2000 @10:08AM (#859423) Homepage
    Congratulations, you've just reinvented Genera [uni-hamburg.de]. Or Squeak [squeak.org], if you like "objects" better than "functions", or if you want to run it on top of an existing OS, or if you don't want to be tied to a specific (dead) hardware architecture. And ETH Oberon [oberonethz.ch] is yet another OS based on the same ideas. In any case, that you don't see it everywhere doesn't mean it hasn't been invented yet - and that it's (not) popular doesn't mean it's (not) good. (For the first case, see Haskell - "groundbreaking" parametric polymorphism in the late 80s; for the second case, see Windows.)

    In any case, it will do you no good to use CORBA as it is today. Instead, use a dynamic, high-level language for user-level functionality, and just let applications people deal with objects in the language's natural idiom, making no syntactic distinction between "local" and "remote" objects.

    In any case, have fun, and don't let those Unix weenies tell you that systems research is dead - if it were for the conformists and the naysayers, we'd be rendering polygons with abaci!
  • Why do you need a multi-user machine at each desk in an organization, and in each bedroom in a large family's house?

    The machines in the _living_room_ must be multi-user, because I don't want to trek down to my room downstairs to access my files if I'm in the living room.

    Even if you assume that only one person will ever need a machine, they still have to understand the concept of multiple users - otherwise they won't be able to talk to the file server. Do I want to have to use floppy disks if I want to share files with my other family members?

    Why must each person be _bound_ to one machine? If anything, _that's_ a waste of hardware, as it requires one machine per person (you will note that I said my parents share a machine). If I have an extra $1k to spare, I'll add a mail server or upgrade the RAID or replace our aging laser printer, not add a new user terminal.

    I have a feeling I've been trolled here, but what the heck.
  • by Black Parrot ( 19622 ) on Sunday August 13, 2000 @07:26AM (#859425)
    > Linux so far is a step sideways at best.

    Maybe so, if you compare Linux to Unix as a contextless technology.

    However, if you look at the role Linux plays on the desktops of those of us who use it there, Linux is a huge step forward over what most of us would be using otherwise.

    --
  • Of course, when you do "locate *.so | wc" you also count all the symlinks, so the actual number of shared libraries is less than that.

    something like "find / -iname \*.so -type f -print | wc -l" should do it.
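    The difference between the two counts is easy to demonstrate. Here is a minimal sketch (the temp directory and library name are made up for illustration):

    ```shell
    # A symlink matching *.so inflates the count; -type f excludes it.
    demo=$(mktemp -d)
    touch "$demo/libfoo.so.1.0"
    ln -s libfoo.so.1.0 "$demo/libfoo.so"
    find "$demo" -name '*.so*' | wc -l           # 2: regular file + symlink
    find "$demo" -name '*.so*' -type f | wc -l   # 1: regular file only
    rm -rf "$demo"
    ```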
    --
  • I guess you're one of those fuckers who found out the new AmigaOS in the past year or so and think its fucking brilliant, eh? No, I am not. And I have 1460 .dlls on my Win98 system, and I hate it, but I just tend to be able to get more done in Windows. I would like it if there were a solid development plan for linux, rather than the haphazard way it develops. Either Linus Torvalds or someone needs to say "OK, we need fully ass-kicking USB support by such-and-such a date", but because no one is paid to develop it, it's slower to get done. It's a shame, and I know that buying RedHat in a boxed set isn't going to solve it. The Linux standards groups don't seem to have any real impetus. And as far as my use is concerned, while linux doesn't crash much (at all?), neither does Win2k. The problem lies in the fact that the apps (the big ones, not small CLI binaries like ping or whatever) for GNU/linux tend to not be as stable as the ones for Windows. I would love to buy Photoshop for Linux, because Gimp, while excellent considering its roots, has problems because the developers aren't getting paid to do it, are they? No incentive other than righteousness.
  • by WiseAcre ( 222010 ) on Sunday August 13, 2000 @07:29AM (#859428)
    people who talk about linux and unix as a user-platform, comparing it to the Windows user experience or the Mac user experience just don't get it: Unix and now especially linux are programmers' operating systems, designed and built for programmers. Programmers who enjoy coding so much that they'll do it for free like to use unix, and not Windows.

    People who poo-poo the potential that linux has need to remember two things:

    • in the future, they'll get what the programmers give them. that's the nature of software. if the programmers choose linux (for handhelds, etc.), then that's what users will be using.
    • don't believe me? in the past, they never predicted where linux would be today, so that's why we're not listening. and I, as my friends will tell you, was predicting exactly what has happened, and I was predicting it five years ago.
  • It's a pity that so much of the Linux/Free Software/Open Source community is so absorbed in this fin de siecle rush for the friggin' almighty dollar that you don't often hear much from the crowd that kept the personal computer going back in the days when it wasn't good for much: the hobbyist hacker. It sure wasn't the Unix crowd, who dripped contempt on our "toy" machines back then.

    I feel confident that I can speak for at least a few of my ilk when I say that the wonder of Linux is that it has given us something we can tinker with -- for free -- and learn down to the most minute detail if we feel so inclined. It's possible to play with Linux in ways that are terrifically difficult with Windows and which have always been impossible with Mac OS. When Apple deep-sixed their CLI-driven Apple II series in favor of the Macintosh, a lot of us swore never to get vendor-locked again if we could help it. After more than a decade of waiting for an opportunity to play and explore again, Linus Torvalds and company gave us that chance.

    I know the dot-com-wannabe crowd will probably sneer at that, but hell, money isn't much good if you aren't having fun. I'm having fun again. Thanks, free-software-programmers!

    --
  • i don't know what i'm smoking, someone just gave me the pipe.

    OS X is a combination of some good things. it seems usable and configurable. they're working on making it easy to use and it will have really good hardware support.

    linux is small, light and growing to be better and better. right now, it's a good server OS and i use it often on my VAIO when i want to do something other than browse the web and play my games.

    it's still not a NextStep on a Mac. it may be one day, but right now it's not. while it's not, i don't see why i wouldn't want to use the MacOS X.

    if i can use one OS and play my games and still use all my favorite Open Source products and learn, then i'll be one happy man. XFree86 4 isn't ready, and KDE and GNOME are just one more layer on top of an OS.

    i don't see anything wrong with OS X. so why shouldn't i use it? not that i will stop using linux, it has lots of good things about it too.

  • You just said the same thing I've been saying a long, long, time. You probably said it more eloquently than I ever have. Thank-you.

    I might add, it's frustrating when I say I like Windows, and people automatically assume I like NT. Not so. I think NT is a lame server, just like so many others do. But on a consumer desktop, which is where I do most of my computing, Win9x rules.

    You might say that my ideal experience is surfing to websites that run *NIX, using my Win9x PC.

  • One user. One box. Single-user.
    Two users. Two Boxes. Share nothing, except by sneaker network.
    With more than one user and/or box, you get into the business of controlled access to shared resources. Timesharing systems are not obsolete; we're just seeing more of the flip side.
  • Linux has rekindled the early enthusiasm of Unix because it redefines openness beyond anything a Unix vendor could dream of.

    Hmmm.....he claims Linux is the most open project out there. FreeBSD is even more open. I don't have to wait to grab a kernel that Linus/Alan deem worthy of public consumption. Instead I can cvsup myself a snapshot whenever I want. The author was just saying what people want to hear. Deal with it.

  • He says "The Unix euphoria of the early days was gone by the end of the decade, which had seen vendors choose sides and celebrate the forking of Unix into so many vendor-specific mutants."
    I suppose that's what competition does. Let's not let this happen to linux. Or is it just going to happen naturally? Is it already happening?

    I don't think it will ever happen. As I see it, the reason for the forking of Unix into different and incompatible flavors was that most vendors could not share their code. So everyone implemented their own version of the same software, which was, of course, not entirely compatible with their neighbor's implementation. And on top of that, each vendor did their best to get ahead of the competition by including cool features that only their flavor of Unix had.

    Since Linux is free software, "vendors" don't need to rewrite anything to have a product. They can focus on fixing bugs and adding features. And every "vendor" can incorporate into its flavor of the OS the best features of the other flavors. I think this tends to keep most Linux flavors compatible with each other, if only because they share most of their code.

  • CDE. This is not a troll. CDE is the most mature desktop available.
  • >>Those are all evolutionary steps of the same system.

    Yeah. Windows 2000 is really MS DOS 2000.
    Does Windows 2000 have a real mode or is it strictly unreal?
  • > A 32 cpu Intel will smoke a 10000 with same number of chips/memory/etc...

    Might be true if you are comparing Intel's latest generation with Sun's current generation, but not the next. Sun has published their CPU roadmap -- they're getting close again. Besides, when are you going to find an Intel based OS that'll properly scale up to 32 processors and still be halfway efficient?

  • I reject all cookies so it didn't log me in after my second "PREVIEW" :O(
  • I guess you'd better tell the deluded motherfucking idiots at Dell, IBM, Compaq, Oracle, HP, and SAP that they need to get the shipping orders of antipsychotics in, pronto

    You are damn right they do. Big names don't mean shit other than a bunch of sort-of-rich companies trying to become really rich companies by jumping on something that they don't think will cost them much money.

  • I can't speak for BSD, but IMO there's a lot missing from Linux. Most of the high-end management tools you expect from an expensive UNIX system have an analogue on Linux, but not nearly of the same caliber. Same thing with big-ass support contracts... unless I'm hugely mistaken, you can't get the kind of support for Linux that we have, for example for our Unix PBX. Not that you really need it, understand :-) I mean, you can get tech support, but can I get a contract that gets someone out here NOW who doesn't leave until a problem is fixed?

    --
  • by be-fan ( 61476 ) on Sunday August 13, 2000 @07:33AM (#859442)
    A lot:
    Solaris: Massive SMP scaling.
    IRIX: Mature 3D framework.
    IRIX: Stable journaling file system.
    Solaris: Dynamic patching of most kernel code.
    QNX: Real-time scheduling.
    Solaris, NeXT, etc. Display Postscript.
    NeXT, Solaris: Flexible, ObjectiveC object model.
    And of course, most commercial UNIXes offer management tools that are much more integrated and functional than Linux ones.
  • > Why can't people just agree on one graphics library, why do I need a million different libs for similar operations? Why? EGO! Too many programmers with too much ego.

    Back when Rock & Roll was king, they said the basic epiphany that got all the great bands started was when some geek^w musician went into a club, heard the band, and said to himself, "I could do that."

    And then, after listening a while longer, "I could do that better!"

    This phenomenon seems to be the driving force behind OSS as well. Call it ego if you will; IMO it is a fine thing. In a few more years people will wonder how we ever got by with proprietary software.

    --
  • Well I checked further; there are another 1441 .DLLs in my Program Files directory. And my Win95 partition, which has almost nothing on it except a bare OSR2.1 install, IE5, Cubasis, and some games (eight or so), has a grand total of 869 DLLs.

    Pax,

    White Rabbit +++ Divide by Cucumber Error ++

  • Sorry, I forgot to add the addendum. Sure, many of these technologies are available on Linux, but none are production quality. For example, SMP scaling isn't exactly that great even with kernel 2.4. XFree86 4.0 is still technically experimental, and the 3D framework is a lot less mature than other environments. ReiserFS and EXT3 are still experimental, as are the real-time kernel patches for Linux. Linux has Display Ghostscript, but that is not a complete, fully compatible implementation, and it is rarely used in Linux programs.
  • I look forward to the next round of single-user single-task machines!

    Hrm... maybe they're already here: Palm, Cellphones, DoCoMo iMode thingies, dreamcasts...

    ---- ----
  • After all, the other great OS of the 70's was VMS, still going strong now at version 7.2 (VMS revs much more slowly than other OSen) and it's still lightyears ahead of NT in terms of stability, features, manageability and clustering.
    --
  • by Animats ( 122034 ) on Sunday August 13, 2000 @08:00AM (#859448) Homepage
    As an old-line secure operating system designer, I've been thinking about where to go after Linux. My current thinking is along the following lines:
    • The main job of the OS is to facilitate protected inter-object communication. What's needed is the ability to make a CORBA-type call almost as fast as an ordinary subroutine call. Applications can then be built as lots of little objects in different protection domains. (For example, anything that looks like executable web content needs to be sandboxed.)
    • Processes are not a primitive. The basic primitives are threads and address spaces. Threads can cross address space boundaries through call gates. This replaces inter-process communication.
      Why? Because when one object calls another, what you want is a subroutine call. Most OSs make you marshal the parameters into a buffer, make an IPC call that works like an I/O operation with one or more process switches, and then unmarshal on the receiving side. On return, these steps are repeated. All this is time-consuming. This is the main reason software isn't usually constructed out of little objects in different address spaces - it's too slow. Fix that problem, and clean design becomes efficient.
      How? The MMU and protection hardware in x86 machines has lots of stuff that's almost never used, like call gates and variable-length segments. If used properly, cross-address-space calls can be quite efficient. You don't have to copy everything, and you don't have to give up security to get performance. (Or so the data books indicate. I'm assuming all that stuff, like call gates and rings of protection, actually works.)
    • With fast, secure object calls, the pressure to dump stuff into the kernel disappears. File systems, networking, and drivers move outside the kernel.
    • Security is applied at the object call setup level. Look at the CORBA security model for a starting point. Security decisions are made during object loading and call gate setup. Once it's been established that object A can call method M of object B, no further OS intervention is required for that call.
    • Very little code is trusted. Writing a mail handler? Just about the only trusted part is the object that handles putting the mail in a local mailbox. A web browser? Maybe bookmark handling and some certificate functions; everything else is sandboxed. Most code needs no more privileges than an untrusted Java applet.

    This is an idea I've been kicking around for a while. With the GNOME crowd going CORBA, this is starting to look more practical. An OS like this will have something to run on it.

    Comments?
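To make the marshalling overhead concrete, here is a toy sketch of my own (not the design proposed above, and in Python rather than x86 call-gate hardware): the same "object method" invoked directly and then via the marshal/IPC/unmarshal round trip between two processes, which is exactly the detour the proposal wants to shrink to near a plain call.

```python
# Toy illustration: a direct subroutine call vs. the buffer-copy detour
# that conventional OSes impose on cross-address-space object calls.
import os
import pickle
import socket

def add(a, b):
    """The 'object method' being invoked."""
    return a + b

def ipc_call(sock, name, *args):
    """Client side: marshal the arguments into a buffer, send, block,
    then unmarshal the reply -- steps a plain call doesn't need."""
    sock.sendall(pickle.dumps((name, args)))
    return pickle.loads(sock.recv(4096))

def serve_one(sock, dispatch):
    """Server side: unmarshal the request, dispatch, marshal the result."""
    name, args = pickle.loads(sock.recv(4096))
    sock.sendall(pickle.dumps(dispatch[name](*args)))

if __name__ == "__main__":
    parent_sock, child_sock = socket.socketpair()
    if os.fork() == 0:            # child plays the "other address space"
        parent_sock.close()
        serve_one(child_sock, {"add": add})
        os._exit(0)
    child_sock.close()
    print(add(2, 3))                           # direct subroutine call -> 5
    print(ipc_call(parent_sock, "add", 2, 3))  # same result via the detour -> 5
    os.wait()
```

Timing many iterations of each path shows the gap the post describes; the claim is that protection hardware used well could make the second path nearly as cheap as the first.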

  • I only ask 'cause the bastardized quote comes from Julius Caesar...

    As I am sure you know, it was Brutus, Julius' murderer, that said the original words.

    Ironic how the press is actually being blatant in this reference.
  • care to elucidate the fish reference?


    FluX
    After 16 years, MTV has finally completed its deevolution into the shiny things network
  • Programmers who enjoy coding so much that they'll do it for free like to use unix, and not Windows.

    *ahem* Excuse me, but I think someone forgot to send me the invitation to your crowning ceremony. FYI, I am a programmer, and I enjoy "coding" so much that I do it for free (I design programming languages in my spare time). However, I do not like Unix. In fact, it's fair to say that I hate it. (Just to keep you from accusing me of being a mindless Microsoft drone in a while: I also hate Windows.) I also hate C, vi and a bunch of other things that Unix zealots seem to believe to be the be-all, end-all of software. (When Rob Pike's article bemoaned the current state of systems research, and a bunch of people claimed that systems research should be stopped, being useless given that the ultimate system already existed in the shape of Unix... that was when I truly realised the sad state of our little community.) I've seen better things - I've seen truly open systems, built without preconceptions and making all the power of the machine available to the user, such as Symbolics' historical Lisp Machine OS, Genera. Compared to it and others, Unix's continued prevalence can only be explained by the influence and domination of the "worse-is-better" mentality in the IT industry. I hope that goes away in due time.

    Aside from all that, there's a big error in your argument. You see, Free Software is "free" as in speech; it doesn't mean that people don't get paid for programming. I myself work on Free Software for a living, as do many fellows at Red Hat, Conectiva, IBM, Apple and many other companies. These people write Free Software, but they do unto it as management tells them to. If or when their numbers and power in the GNU/Linux community exceed those of "lone hackers", it won't be the programmers' choice any more, but that of the consumers - the people who pay for the software to be written, one way or the other.
  • by Anonymous Coward
    Linux is just another version of UNIX, no? Sure, it's free, open-source, whatever. What makes it significantly (technologically) different than any other UNIX?
  • Somehow I think that Unix, both commercial and *BSD will be around for a long, long time.
    The support for Linux by the commercial Unix companies is just raising the bar. Eventually they will sell you the binary, supported and/or give you the source, unsupported.
    Symbiosis is mutual parasitism. Linux makes an excellent testbed for the advancement of Unix.
  • Sounds like the Microsoft sales dept.'s got into slashdot again...

    1,2,3 - *Sigh*
  • > Linux so far is a step sideways at best.

    This is unfair. It is true that, technologically, Linux introduced few (none?) new OS features. But that's not the point. What it did was raise the base platform. Suddenly any hacker with a few dollars could get a PC with a good-quality OS with commercial features. A generation of Amiga hackers in copper-burnout mode was replaced with a generation of UNIX hackers learning respectable and marketable skills like C/C++, internetworking, and sysadmin.
  • I agree. As an Apple fan and somebody who appreciates a good user interface and wants access to tools for developing web-based software, the future of Unix does look like MacOS X to me. I am not impressed, so far, with most Linux GUIs, which seem to be about either aping Windows or being able to use themes. If a project like Eazel really can address the core usability and maintenance issues, maybe I will change my mind.

    That said, the most promising avenues for Linux are in the server and embedded markets -- the second one isn't even really a promise but a fact. As for the server market, one disadvantage is the more freewheeling development and configuration model, as compared to a commercial Unix or the BSDs. On the other hand, vendors like IBM and SGI are putting significant resources and code behind Linux, and that's nothing to sneeze at.

  • by Morgaine ( 4316 ) on Sunday August 13, 2000 @11:18AM (#859457)
    But Unix still has value that the Linux crowd may vastly underestimate in its haste to issue a death certificate.

    What a total waste of electrons, both in the alleged view of the "Linux crowd" and consequently also in the article.

    Repeat after me, 10 billion times: "Linux is a Unix".

    Who cares a damn about the legal niceties (more like idiocies) that prevent one from using the label "Unix" where it's obviously appropriate. Ask any person with more than a little experience of Unix and you'll always get the same answer: Linux and the BSDs are all Unixes, through and through, every bit as much as the licensed proprietary versions. It's not just by accident either, it's by design. And in many ways (but not all, yet) they're the best Unixes around, with the older "legal" Unixes fighting hard to keep up. Anyone that thinks that the important thing about a "Unix" is its license is just so uninformed that it's sad.

    OK, I know it's summer and good news is hard to come by, but that article was about as empty of point and content as they come.
  • I know of absolutely no one who actually uses the Windows "Profiles" version of multiuser

    Now you know someone! The family computer at home runs Windoze NT4 (with 5 different users); I have it NTFS-formatted and use the user Profiles and rights to a certain extent. None of my family is allowed to install anything, and if they try then I'll scream their heads off. (A strategy that works quite well.)
    Yes, I'm paranoid, I know...but it's the only way to keep the system relatively clean and stable.
    As soon as you've got more than 2 users on the same machine you need some kind of management or it goes right to catastrophe. I once reinstalled the PC of a family with 6 (!) users. Their box kept crashing, and new installs of W9x kept going wrong after 2 months of operation. As soon as I installed NT4 with appropriate user management it worked like a dream for them.

    And don't start saying that I should have installed Linux, because I already tried it for myself and I'm just not convinced. My P120 with a 1.2Gig harddisk (my own computer which I use on a daily basis != the family computer) works fine with W95-OSR2, and I tried different Linux distros (RedHat, Corel and SuSE) and I didn't manage to install something working with X under 900Meg. Without X it's probably crammable onto some 100Meg distro (Yup, ZipSlack runs pretty fine), but then it's just not usable for my brother/sister/mother. Compare it to 300Meg for a W95 with all the rimram installed. The day Linux is easily installable (without httpd, sendmail etc. by default) and relatively lightweight, then it will be ready for the desktop...not a second before.

    I'm way off-topic by now...sorry....gonna post it anyway ^_^


  • What are you talking about? Linux is a piece of crap compared to other commercial UNIXes. For instance, Solaris has the ability to tie threads to particular CPUs. Linux barely runs on multiple CPUs and has no real concurrency facilities (at least not any that I hear people using, but I may be wrong about this last part).

    KidSock

  • Solaris x86 is really poor, especially compared to Solaris for the Sparcs...

    --
  • So now you know why so much of free software is unusable.

    No, much free software is unusable for the same reason much commercial software is unusable. It's just that most software is unusable. If you disagree, go use a piece of commercial software at random. Don't forget all the in-house apps that need a ton of instruction to use, stupid instructions like "don't hit 'all packages', that just crashes, select them all by letter".

    Lots of commercial crap is worthless. Lots of free crap is worthless. It is probably easier to find the worthless free crap, but only because it is free, not because there is more of it.

    Don't forget lots of free stuff is pretty damn nice. Apache. OpenBSD. GCC. The Smithsonian. Love. And some claim Linux even.

  • >
    > ... but can I get a contract that gets someone out here NOW who doesn't leave until a problem is fixed?
    >

    Yes... you can.

    --

    Hmmm.....he claims Linux is the most open project out there. FreeBSD is even more open. I don't have to wait to grab a kernel that Linus/Alan deem worthy of public consumption.
    On the contrary. The mode by which you access the code (CVS or FTP) does not make the project more or less open. Please feel free to download a development-tree kernel [kernel.org] whenever you like. (Note: The link is to the v2.4-test kernels, which are not mirrored on all the kernel mirrors.)
  • It's not a feature in a "stable" kernel. That's what I mean by production quality. Sure, kernel 2.4-pre5 is very stable, but no one in their right mind would use it on a mission-critical computer. Certainly not the same people who don't use .0 releases, or who still don't use XFree 4.
  • by Elvis Maximus ( 193433 ) on Sunday August 13, 2000 @08:32AM (#859475) Homepage

    programmers are secondary to users!... Userbase is king. Windows has the userbase. If Linux doesn't get a big userbase, then people won't develop there.

    That's true if the developers are selling what they're developing. If they're giving it away for free... well it kind of turns that model on its head, doesn't it?

    -

  • by weezel ( 6011 ) on Sunday August 13, 2000 @06:22AM (#859476)
    I think the real future of Unix looks something like MacOS X, not Linux.

    The community development model has so far been unable to do anything other than kludge together something as important as the GUI. Gnome and KDE are just the first iteration towards a useful user experience.

    Apple, on the other hand, has taken the core of a Unix system and used a single vision/goal/thingy to synthesize something new and exciting from two fairly stagnant OSs. Borrowing from the low level functionality of Unix and the elegant UI of MacOS they have made a real step forward.

    Linux so far is a step sideways at best.
  • by fluxrad ( 125130 ) on Sunday August 13, 2000 @06:22AM (#859477)
    i long for the days of yore...with Bell Labs and the like. I long for the openness of the original unix. so i wrote a little play:

    curtain. we see a developer somewhere in california

    Developer: Boy, i sure like this AT&T Unix. Good thing it's open. I think i'll use some of the source for something

    AT&T: Do that and we'll sue your bitch ass!


    curtain

    The original unix was free as in 'not free'


    FluX
    After 16 years, MTV has finally completed its deevolution into the shiny things network
  • by Octorian ( 14086 ) on Sunday August 13, 2000 @06:24AM (#859478) Homepage
    This is what most people think...

    What everyone fails to realize, is that those systems continued to exist throughout the era of the PC. It's amazing how many people don't know that the PC has "not" been the be-all/end-all of computing in the past 10-15 years. Heck, until 4-5 years ago, the PC was an utter piece of crap compared to your average UNIX machines.

    Look at some of what I've been using lately and the dates of manufacture:
    Sun SparcStation IPX '92
    IBM RS/6000 POWERstation 350 '92
    SGI Indigo2 Impact '96

    These WERE within the "era of the PC", are absolutely not PCs, and run UNIX in all its multi-user, networked glory.

    We never moved from high-end/UNIX to PC/Mac and then back. Both worlds existed in parallel, with very little communication between the two.
  • /usr/bin/perl -e 'for ($i = 0; $i < 10000000000; $i++) { print "Linux is a Unix"; }'

    followed almost immediately by ^C because ten billion iterations of anything is just ridiculous.

    Steven
  • I didn't read the post your reply was replying to (I browse at threshold three), so this may completely miss your point, but I've wanted to point all this out anyway, and this seemed like as good a time as any.

    To save readers from my wordy retort, I'm moving the summary I put at the end of my reply to the beginning, which follows:

    You're right. The features you named are "missing" in that there is no glossy package you can install as a slam-dunk solution to the general problems involved. On the other hand, for each item, it's either a matter of time (6-18 months away), or the issues involved are much too complex to treat as a simple feature that can be added like a car accessory.

    Now the long version.

    Taking into consideration your self-reply about these things being available but not production-quality, I would like to add that all of these will be at or above whatever production-quality is _eventually_. Since these things are done when someone volunteers, there is no hurry. If you want it faster, write it yourself. Also, all of these things are a lot closer than they appear:

    • SMP - SGI seems to be trying to speed up development of this. Judging from recent slashdot activity it would appear that SGI is trying to bring Linux up to IRIX level in all of IRIX' specialties so they can dump OS development on the community and focus on what they really make money on. I think this is great. It's the whole point of cooperation. Let everyone do what they want and as long as it stays cooperative, everyone wins.
    • 3D - Again, SGI seems to be helping out here, as well as Creative Labs, 3dfx, NVidia, Matrox, and a few other companies. Anyone with 3D hardware to sell wants as many viable platforms as possible, especially stable ones. No, Linux isn't a drop-in replacement for IRIX or an Evans & Sutherland workstation...yet.
    • Journalling FS - SGI and IBM are both bringing their own filesystems to linux. They aren't even asking the community to do it for them. They're doing it themselves and giving it to us all. It could be a gimmick, but I think they Get It.
    • Dynamic kernel patching - I haven't seen any major threads on this on the kernel list, but I wouldn't be surprised if it was on the horizon for 2.7 or so. On the other hand, with the user space linux kernel doing so well, why bother with a patchable kernel when you can spawn a new kernel with its own VM? Certainly this is an over-simplification, but the point remains that the problem a patchable kernel solves has real-world Linux answers right now.
    • Real time scheduling - There's a project or two dedicated to RTLinux, and I'm sure they're progressing nicely. No, it's not production quality, if by that you mean 'apt-get install rtlinux' makes your debian box "real time". On the other hand, RT scheduling is a pretty small niche, and it's only being worked on because someone has a need and is willing to do the work themselves, not because we're competing with QNX or something.
    • Display Postscript - My first thought is "yawn", but I have to remember that people use computers for many more things than I do. I have no idea what projects are underway to satisfy this goal, but I know that Unix and Linux are similar enough that you could buy Display PostScript for Linux if you really wanted to, and a free version will exist some day. It's just a matter of someone having the time and inclination.
    • ObjectiveC - GNUstep, anyone?
    • Management tools - I had a big argument with my previous boss on this one. He wanted a single window he could consult to feel like he was at the helm of a dozen machines. I told him many times that while there are tools that give you that feeling, nothing gives you the control you get from actually having a clue. I probably could have been more diplomatic. The GNU world has NFS/NIS, LinuxConf is its own beast, DebConf is ... interesting, and there are many other solutions. I don't see this as a problem. If you have more than a handful of machines, you're going to come up with a solution of your own that suits your situation. Otherwise, you don't need "integrated and functional" management. Your situation is too simple. It would be like using warehouse tracking software to manage your household. "Where's the toilet paper? - Hold on, I'll look it up in the database..."
    Maybe this is preaching to the choir on SlashDot, but the point of free software is not to compete with commercial software in the marketplace, but to satisfy the individual programmer's goals, and work cooperatively on goals the community has in common. Whether those goals are compatible with commercial demand is irrelevant. Some day someone will do something for free that already exists for money. It's not that it can't be done, it's just a matter of time. If you want it now, buy it. If you want it free, write it or wait.
  • Solaris, Linux, FreeBSD, AIX, IRIX, HP/UX... I've used them all and they all offer neat features. Religious zeal is nice but in the end you can do the job with all of them pretty nicely (even W2K, sometimes [dons asbestos suit]). They all have drawbacks, too, but those tend to get exaggerated. For the x86 world (i.e. the "affordable" way of getting Unix on your machine) we have the BSDs, Solaris and Linux (and SCO, which I never liked). Linux is the most bleeding-edge one and seems to be becoming the de facto standard, for better or for worse - like the M$ of Unixes, in a way. Things like Beowulf, ReiserFS, XFS and most other interesting open source (and some other, pretty expensive) projects are written for Linux first. If enough decent projects get completed, the other x86 Unixes will have a hard time finding homes in hard drives. For a stable environment, at the moment I would recommend Solaris or FreeBSD. Solaris 8 performs well if set up properly on PC hardware (I set it up with DMA enabled and a combination of RAID 0 and 1 and it screams). Also, you can get software mirroring of the / filesystem which, unless I'm wrong, you cannot with Linux/FreeBSD. However, low-level benchmarking I did with lmbench proves FreeBSD to be faster in things like pipe bandwidth and process forking. Linux is fine but, out of the box, it lacks reliability/performance features (i.e. the way it does only fully-async data+metadata writes [or fully-sync both, which slows things down terribly], which can cause long fsck sessions if there's a problem... NFS and TCP/IP under heavy load...) that are not big issues with other OSes (you can choose the behavior in a more fine-grained way with Solaris or FreeBSD). I'm sure future kernel releases will solve most of these issues.
Ideally, I'd like to see full native support for the SGI XFS (far superior to ReiserFS and in serious use for a long time now in SGI servers, totally bulletproof IMO) and proper RAID functionality before I consider it ready for prime time. The one thing missing from the Unix world as a whole is a decent desktop, but that's beside the point for a server. It will happen, though. But, to get on-topic for once, Unix is far from dead, and never even was close, anyway. The issue is not the server market (which it never had a problem in) but the Average Joe desktop one, which currently it cannot rule for a myriad of reasons. However, that's the way to get the biggest market share: M$ first totally dominates the desktop, creates its proprietary and often silly stuff, changes/breaks well-established standards so that they only work 100% with M$ OSes, then makes server OSes that everyone buys by default. It always wins. Oh, and in countries where software anti-piracy laws are not enforced (i.e. everyone else but the USA and a FEW other countries), software cost is never an issue, so if someone wants to deploy W2K AS (which normally costs around $4K a pop so it's never an option for normal-income, law-abiding citizens and small companies) to load-balance 30 machines or make a failover cluster of 2, they can do it for the cost of burning CDs. Setting it up for load balancing and doing software RAID of any partition you want is easy as pie, and if it doesn't cost you a cent apart from the hardware, try convincing the CEO to go for Unix, especially if he and the rest of the tech team are tech/Unix-phobic/inexperienced. Most people like that want something easy to set up and fairly reliable. They don't care if they can get slightly less downtime and a bit more performance running Solaris (which for $75 you get unlimited licences for, so cost is again not an issue), FreeBSD or Linux. For a techno-geek this is silly, but these are real people making real decisions I'm talking about.
I believe it all boils down to: Unix has the strength in core functionality. Give it easy installation and configuration for EVERYTHING, a standard, decent, easily configurable desktop, decent free apps, and we're all home free. It's all starting to happen, anyway. And sorry for the long post, which probably nobody will read but it's my first so there. D
  • I set my parents' machine up multi-user. It lets my little brother install all his games and wacky desktop settings, while my parents keep the standard Windows settings with the Word icon out where they can get at it in the middle of the desktop.

    My brother's only computer literate at a user level, and my parents are barely that.

    Of course, they don't have passwords or anything, but it's definitely multi-user.
  • First of all, I think anybody that actually does -any- work whatsoever on a computer appreciates the desire to avoid crashes. The agony of losing a 20 page term paper that you worked on until 2AM is enough to convert any college student.

    The rest of your argument - well, points-by-example, I can see pretty well. If you want an entertainment machine (ie, web, e-mail, vidgames) Linux is not there. However -

    Mozilla is cranking right along towards release and will be 'released' within the year.

    X4 is cranking right along and will support 3D and DRI and all that great stuff to make games go.

    Wine is cranking right along, and can actually -run- things. Wow.

    And, of course, Corel and others have slapped some really good installation-and-support stuff on top of the nuts-and-bolts. (And Debian has made the BEST packaging system ever for Corel to build on! No, I don't really want to start a flame war, I'm just a Debian user, it was mandatory to spout that. ;))

    My point is, I've always said, 'Well, -someday- Linux will be good for the ordinary user.' Now, I'd say, 'within 1-2 years, Linux will be good for the ordinary user'; for some definitions of ordinary, anyway.

    I estimate that 1 year from now we should start seeing distributions with a stable X4, stable Mozilla (or, more likely, derivative web browser that spins off from the beta Mozillas, 'cause people are getting impatient and Mozilla is getting stable enough to spin projects off of without having to duplicate effort), a Beta Wine that is pretty much as stable as Windows itself (not saying much, but hey), and with any luck, a stable ALSA.

    Now, if Corel/Caldera/RedHat/etc keep up the 'friendly-front-end' progress, all this core functionality should be easy to use.

    Passwords at the console aren't necessary if people don't want them - though the code isn't in place, it is perfectly technologically feasible to permit login at the console without a password, and require a password only for remote access (if the home user even bothers running a telnetd or sshd in the first place).

    Logging in as root to install things isn't necessary - just make real ('human') users part of the 'installers' group and the installer program -rwsr-x--- root installers -- all behind the scenes of course, so the ordinary user doesn't have to know how to operate 'install'. (install should, of course, only work for people at the console - the more advanced permissions structure of 2.4 might help with this, or not, I'm still on 2.2).
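    The group-and-setuid scheme described above can be sketched with ordinary shell commands. The group name "installers" comes from the comment itself, but the helper's path is a hypothetical stand-in; the runnable part below just demonstrates that mode 4750 produces exactly the -rwsr-x--- bits mentioned, on a scratch file:

```shell
# Root-only setup, roughly (the helper path is illustrative):
#   groupadd installers
#   usermod -a -G installers alice
#   chown root:installers /usr/local/bin/install-helper
#   chmod 4750 /usr/local/bin/install-helper
# Demonstration of the resulting mode bits on a scratch file:
f=$(mktemp)
chmod 4750 "$f"          # setuid + rwx owner, r-x group, nothing for others
ls -l "$f" | cut -c1-10  # prints -rwsr-x---
rm -f "$f"
```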

    Any-way. Ordinary users -do- want some of what linux can offer them... namely -
    remote access to their machine (mostly to check e-mail)
    stability (though they don't want to pay an ease-of-use price for it)

    ip-masquerading. No. Really. I meant it. Every windows user I know wishes they had ipmasq. Of course, they don't want to deal with the IPMASQ howto, or even the WinGate install program, they want it to just work, so they can end their fights over who gets the modem line now.

    If the end-user-distributions can get Ethernet/IPMasq setup cleanly automated, that'll be a big step forward towards grabbing the web-and-e-mail sorts of 'ordinary users'. (And a minor-+ for game users that like networked games, though what they really want is a stable and automatic X4-with-DRI + ALSA + Joystick, and they want it all without having to know any technical terms. ;o)
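    For what it's worth, the 2.2-kernel masquerading setup the IPMASQ HOWTO walks through boils down to a few lines. This is a hedged sketch of a typical home firewall script, not anything from the post: the 192.168.1.0/24 range is an assumed home-LAN addressing scheme, and it must run as root on a kernel built with masquerading support.

```shell
#!/bin/sh
# Minimal ipchains masquerading sketch (Linux 2.2 era, run as root):
echo 1 > /proc/sys/net/ipv4/ip_forward         # enable packet forwarding
ipchains -P forward DENY                       # default: forward nothing
ipchains -A forward -s 192.168.1.0/24 -j MASQ  # masquerade the inside net
```

    This is the sort of thing a distribution could generate automatically from a couple of questions, which is really all the "just works" crowd is asking for.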

    Interoperability with MS-Office is still, and will continue to be, I'm sure, a big stumbling block with the 'productive' sorts of 'ordinary users'. (Funny how many categories of 'ordinary' there are, isn't it?). Any-way, between the opening staroffice, the Corel-WPOffice, and KOffice, there are options for those who don't need -perfect- interoperability, and if even a few percent go to other office suites, it will discourage MS-core-users from shipping MS-only documents. (ie, ... they might send the easily converted Word5.0 format instead of the brand-new-nobody-knows-what-was-added Word-2001 format.)

    Any-way, if you want to give your family Linux right now, expect to spend days, nay, -weeks-, configuring, optimizing, installing KDE, removing KDE and installing Gnome, switching back, setting up sudo so they can do root tasks transparently and then setting up ipchains and tcp_wrappers so the Evil Crackers don't then exploit the weakened security, etc, etc, until it's a smooth ride, 'cause frankly, I don't think out-of-the-box linux -does- have much ease-of-use, but I know that tweaked-until-it-begs-for-mercy linux sure does.
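    The sudo step mentioned above is mostly a one-line configuration. A hypothetical /etc/sudoers fragment (the group name and the command list are purely illustrative; edit with visudo, never directly):

```
# Let members of the (hypothetical) "family" group run a couple of
# admin commands as root without being prompted for a password:
%family  ALL = (root) NOPASSWD: /usr/bin/apt-get, /sbin/shutdown
```

    Locking it down to a short, explicit command list is what keeps the "weakened security" problem manageable.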

    Anyway. Are we offtopic yet? ;)


    --Parity
  • So now you know why so much of free software is unusable.
  • by be-fan ( 61476 ) on Sunday August 13, 2000 @07:55AM (#859494)
    Aside from the fact that that's awfully arrogant of you (programmers are secondary to users!), your theory doesn't hold water. Two points.

    A) Userbase is king. Windows has the userbase. If Linux doesn't get a big userbase, then people won't develop there.

    B) Developing for UNIX is not necessarily better.
    - There are a lot of commercial developers that grew up using Microsoft code generators. Those people will be the last to switch to *NIX.
    - For a lot of smaller developers (database developers, etc) the fastest way to do something is still to use generation tools (like database generators, or stuff like Delphi.) Those developers won't switch until a similar body of apps is available on Linux.
    - Any tool you can use in *NIX, you can use in NT. (GCC, Emacs, etc.)
    - The Windows APIs are generally less chaotic. While Linux is in a constant state of flux, the Windows APIs are more stable. For example, in Linux, you have the major DEs changing heavily quite often. In Windows, the major desktop APIs haven't changed all that much since Win95.
    - Linux suffers from API overload. Which sound system do you program for? If you want features and speed, you use ALSA. However, can you expect your users to have it? You could use OSS, but why not just forget it and use DirectSound? Which DE should you program for? KDE or GNOME? In the end, most commercial developers will just give up trying to choose and use Motif. And by using Motif, they lose out on the cool features of KDE and GNOME. What toolkits? How can I make sure the user has the correct toolkit? What version of glibc? What version of X? You don't have these kinds of problems with Windows, mainly because MS forces people to stay on an upgrade path, and thus most users will have NT4, service pack 6a. And if they don't, it's relatively easy to just include it on your CD, so you can be sure of the state of the system.
    - In terms of multimedia, the Windows APIs are still much better.
    - Developers still care about user experience. When Sierra didn't let people choose the install directory in its utilities, people had fits and threatened to boycott their programs. In Linux, many programs have pages of installation instructions. Recently, I read a review of the 3 IDEs available for Linux. In the case of RedHat's GNUPro toolkit, the installation instructions consisted of half a page of CLI commands. Compare this to Visual Studio's, where you pop in the CD, pick which options you want, and then go away while it installs.
  • The article almost seemed to suggest that UNIX is dying and the future is Linux. While Linux does have a very bright future, one UNIX vendor is not only doing very well, but thriving. None other than good 'ole Sun Microsystems.

    They've absorbed a great deal of market share. It's because they are truly innovating -- in hardware and software. What they've done today is great, but I've received my non-disclosure briefing on the new technology coming out. You think Microsoft listens to their customers? Sun has ABSOLUTELY listened to their customers about the hardware and innovated new products which address the complaints about the current line.

    The software/hardware combination (which is hard to get without a major backer -- maybe Compaq can pull it off) is incredible. When you've got a production system which is choking all of a sudden and needs more CPU and memory, what other system do you know of that will allow you to add (or subtract) CPU/Memory/IO devices on the fly, while the operating system continues to run, without missing a beat? It's saved our butts more than a few times.

    I'm just a little put off on the spin of the article that seems to ignore that UNIX is alive and doing well. Sun absolutely has their act together.

  • by Legolas-Greenleaf ( 181449 ) on Sunday August 13, 2000 @06:40AM (#859506)
    hehe... i'm quite aware that the large servers kept going with UNIX/VAX/etc. However, once the pc appeared, the average user didn't have to log into the mainframe to wordprocess.

    we have an sgi iris from 1992 at our work. despite its age, it's a pretty impressive piece of equipment.

    My point was simply this - the focus of the average user went from large timesharing systems to the personal computer, and now it's almost as if networked, timesharing systems have come back into vogue. i know, the whole while, there were multiuser computers running networks and databases and whatnot, but now we're even getting that sort of os on our desktop (i forgot to mention winnt as a multiuser os.)

    it was just a dumb thought provoking-type comment. don't worry too much about it. =^)
    -legolas

    i've looked at love from both sides now. from win and lose, and still somehow...

  • In any case, it will do you no good to use CORBA as it is today. Instead, use a dynamic, high-level language for user-level functionality, and just let applications people deal with objects in the language's natural idiom, making no syntactic distinction between "local" and "remote" objects.

    CORBA is already location transparent (that's how GNOME works), it's been around forever, used a lot, and carefully maintained. I would suggest going with CORBA as opposed to creating Yet Another custom inter-process communication mechanism. CORBA already lets applications deal with CORBA objects as if they were normal objects in the language's syntax... that's what bindings are for.
  • Compare

    -- functional pseudocode: using a high-level language with a dynamic object system and built-in facilities, search entire environment and remote hosts for urgent documents which have been modified today

    let docs = [ doc from World | doc :: Document and doc.modification_date > YESTERDAY and doc.urgency >= MEDIUM ]
    in display_numbered_list(docs)
    >> ask_user("Document number: ")
    >>= \n -> docs[n].display_nicely ;;

    to

    -- quasi-functional pseudocode: using a high-level language without a dynamic object system but with CORBA bindings, search entire environment and remote hosts for urgent documents which have been modified today

    namespace An_Arbitrary_Protected_Namespace ;;
    use ORBs_Special_Namespace ;;
    use Document_Class_Skeleton ;;
    use Special_Class_For_Object_Reference ;;
    use Special_Class_For_Object ;;

    let orb = implem_dependent_initialise_orb(lots_of_params)
    in do orb.implem_dependent_initialise_boa(lots_of_other_params)
    >> orb.get_naming_service -- if orb.supports_naming_service
    >>= \ns -> filter (\s -> s.query("BIZARRE_QUERY_IN_THIRD_PARTY_LANGUAGE: DOES THIS SERVER CARRY OBJECTS OF TYPE Document?").cast_to_boolean) ns.all_hosts
    >>= \hs -> map (\h -> h.all_object_refs) hs
    >>= \refs -> filter (\ref -> ref :: Reference(Document) and ref->modification_date.compare(YESTERDAY.cast_to_integer) and ref->urgency.compare(MEDIUM.cast_to_integer)) refs
    >>= \refs -> do
    map (\r -> display_list_item(r->title)) refs
    >> ask_user("Document number: ")
    >>= \n -> refs[n]->display_nicely ;;


    A few other comments:

    it's been around forever

    "Since the late 80s/early 90s" =/= "forever". In any case, why not try and invent something new and potentially better, rather than to stick with an old and potentially obsolete method? "Reinventing the wheel" is worth doing if the wheel already invented is square.

    carefully maintained

    What do you mean by this? Are you talking about OMG's formal description of the architecture? Or about each ORB's code? Or about each CORBA program's code?

    I would suggest going with CORBA as opposed to creating Yet Another custom inter-process communication mechanism

    But the original poster doesn't want an IPC mechanism - he understands that today's heavyweight processes should be abolished altogether. All the conceptual overhead added by CORBA's architecture is just not worth it, any more than it would be in a self-contained program.

  • Of course, this was *not* the Original Unix(TM), but what happened after the Suits realized that the giveawayware had "commercial value" (pause for ooos and aaaahs), and must be "protected".

    I still have one of Ken's free Unix tapes from the PDP days. Original Unix was. We must remain ever vigilant, as the saying goes.
  • Compare

    I suppose it's dependent on language and libraries. One can easily make a library for an *existing* language to make CORBA look like the former pseudocode example. Never underestimate the value of reusing a language people already know (bindings for whatever language you want).

    "Since the late 80s/early 90s" =/= "forever"

    Yes it does. And certainly with respect to any new high-level language that somebody is going to invent today for some special purpose.

    In any case, why not try and invent something new and potentially better, rather than to stick with an old and potentially obsolete method? "Reinventing the wheel" is worth doing if the wheel already invented is square.

    Well CORBA is far from obsolete, and what the poster was suggesting is nothing more than building CORBA-like semantics into Yet Another High Level Language. I don't really see that as reinventing anything. CORBA does its job well.

    What do you mean by this? Are you talking about OMG's formal description of the architecture? Or about each ORB's code? Or about each CORBA program's code?

    Yes, no, no. The CORBA spec, as far as I can tell, goes through a lot of scrutiny, and is accompanied by detailed documentation (and a lot of vendor support) as the spec is revved. On the other hand, if we created a new language to subsume this functionality it would be prone to all the typical new-language requirements and features problems, would have to go through ANSI, etc. etc.

    But the original poster doesn't want an IPC mechanism - he understands that today's heavyweight processes should be abolished altogether. All the conceptual overhead added by CORBA's architecture is just not worth it, any more than it is using it in a self-contained program.

    And how exactly do threads share address space in a "lightweight" manner across a wire? The "conceptual overhead" of doing this will be the same whether or not you call it "CORBA". *Something* has to format and send stuff over the wire and receive it on the other end. It will just be built into a language library instead... it's not avoiding the overhead.

    You can argue that the poster is not concerned with over-the-wire IPC (message passing, shared memory, whatever you want to call it), but only doing IPC optimized on the same host. But CORBA can always be trimmed down to do this (like X was modified to optimize for a client and server in the same address space), while also still supporting over-the-wire communication if necessary. On the other hand, if you invent some great novel high-level language to do everything CORBA does, except not over the wire, then, when you decide you actually DO want location transparency, you have to create a whole new protocol to deal with that. I say use CORBA, trim it down for local use if necessary (which I believe is what GNOME has done), but if you need to go over the wire, it's there for you on both ends already.
  • You do make some good points but ...

    I agree, he does make some good points... but they don't prove his point.

    Userbase is king. Windows has the userbase. If Linux doesn't get a big userbase, then people won't develop there.

    Certainly, market size influences people downstream, but it's also true that in a high-growth industry there are more buyers in the future than there are in the market at the moment. According to his logic, the PC would never have taken over from the Apple II, etc. Linux is here today, and it wasn't yesterday. That calls for explanation, not dismissal.

    There are a lot of commercial developers that grew up using Microsoft code generators. Those people will be the last to switch to *NIX.

    ... so? what's your point? Hoping they remember to shut the last windows machines off?

    For a lot of smaller developers (database developers, etc) the fastest way to do something is still to use generation tools... Those developers won't switch until a similar body of apps is available on Linux

    Oh gee! I don't have an answer to this!!! Oh wait, you answered it yourself:

    - Any tool you can use in [NT], you can use in [*NIX].

    What version of glibc? What version of X? You don't have these kinds of problems with Windows, mainly because MS forces people to stay on an upgrade path,

    They must have really good crack on his planet! Due to the way Windows centralizes the installation stuff, you frequently need to reinstall the whole OS when you screw up or install a piece of screwed-up software. (Admit it: especially when you help your friends and relatives.)

    ... page of CLI commands. Compare this to Visual Studio's, where you pop the CD, pick which options you want, and then go away while it installs.

    ... but once it installs, don't go anywhere: you can't do unattended builds when you need to click buttons in a GUI to get the source code, build it, etc. Oh, the perils of a CLI!

    But, as easy as it was to refute most of his points, or at least show the other side of the same coin, none of it is important. The most influential people make decisions for the masses, so it does not matter what the masses expect. Look at the Computer Science labs of all of the top CS universities (MIT, CMU, Stanford, etc.) You will see more unix and linux than Windows, and if you talk to people they can articulate 1000s of reasons why. Even if they work on more advanced or experimental OSs or environments, chances are they do their work from Unix. They respect it. Nobody with credentials respects Windows.

  • Yeah... windows has pre-emptive multitasking, memory protection, threading, SMP support, and all the other buzzwords that make up a "modern OS."

    Too bad for the drones that bill's implementation of all of the above blows goats.

    The Amiga, as you say, had all of the above in the '80s. Didn't save the commode from getting flushed.

    It's not at all uncommon for real operating systems to have uptimes measured in *YEARS*. At my previous job, we had a Solaris NFS server that had been up and running WITHOUT A REBOOT since 1993!!! Show me a windoze box with even HALF that uptime. Bueller? Bueller? Bueller? Anyone? I'm waiting...

    Yeah, windows is buzzword compliant. But gates' horde of trained monkeys typed out one piss-poor imitation of a pre-emptive, protected, multithreaded, "modern" OS. My SuSE mail/web SERVER at home has been up without a single reboot since I moved into my new apartment two MONTHS ago. My windoze workstation at work (which has duty no more stressful than running Exceed to xterm into the real computers there) crashes, on average, every two DAYS!!! (always at the WORST time)

    Hell, even my Macintosh, with no protected memory, minimal threading support, and a cooperative multitasking OS, crashes less than the windoze box... averaging only one crash a week, and then only while running a unique combination of Netscape and AIM.

    The difference being that MacOS is an EXCELLENT implementation of a co-operative OS, whereas windows is a shoddy, piss-poor implementation of a pre-emptive OS. (With Linux ranking GOOD and Be EXCELLENT in the pre-emptive competition.)

    It doesn't matter if you have infinite money to throw at infinite code-monkeys. The OS will still SUCK, compared to one designed by a smaller number of COMPETENT, dedicated, inspired programmers.

    john
    Resistance is NOT futile!!!

    Haiku:
    I am not a drone.
    Remove the collective if

  • I only code stuff that I'm going to use. So my userbase is one. As a programmer, I respond to what my userbase needs. I don't program just to program.

    TGL
  • So, do most people need multiuser, multitasking, networked operating systems? No.

    In the days when there was one computer in the house and it was used by one person, this was true. However, it is becoming less and less the case.

    If I have a computer, my brother has a computer, and my parents share a computer, I definitely need networking if I want to be able to do anything useful if I sit down at any machine other than my own.

    I also don't want to deal with synching my home directories on several different machines, which means having one dedicated system for storage.

    As I don't want my brother rearranging my home directory and I don't want my directory stomped if Windows crashes badly on one of the other machines, I need to have some concept of user-based permissions to protect my directories from others' mistakes... which means multi-user support.

    As I've had a few drives die on me over the years, and as I don't want to keep my own machine on all of the time, it's prudent to put user home directories on a dedicated fileserver with a software RAID...

    ...and so forth.

    Whenever you have more than one machine and more than one user in a house, you have something that looks less like the old "one person, one PC" model and more like a university computer lab - complete with multiuser support and networked services.

    So, I would indeed argue that these aspects of OS design are becoming more important for Joe Average user.
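    The home-directory protection argued for above comes down to owner-only mode bits. A minimal, runnable sketch (the scratch directory is a stand-in for a real home directory):

```shell
# Protect one user's "home directory" from other local users:
home=$(mktemp -d)
chmod 700 "$home"            # rwx for the owner, nothing for group/others
ls -ld "$home" | cut -c1-10  # prints drwx------
rmdir "$home"
```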
  • I started out on UNIX about 10 years ago at Uni where all the computing students had to work on Sun boxes. I used to think UNIX was great. It amazed me at the advanced state of a UNIX workstation/server. Looking over at other users using MS-DOS I used to laugh. It was like looking at "pretend" computing.

    After leaving Uni, I didn't want to have to work on MS-DOS but I couldn't afford my own Sun box. So, for the past 6 years I have been working on Linux. To start off with I thought Linux sucked a bit. It was cool to have the same sort of environment as at Uni but there were so many things missing.

    These days, however, Linux rocks! I mean, everything I want to do I can do under Linux. Also, Linux is moving at a fast pace and every day people are bringing out new things, kernel patches and security enhancements etc.

    Recently, I have had to work on Sun Solaris machines. It was the first time I had had to do this since Uni. I was quite excited at first when my new Solaris workstation and server arrived in the office. Soon though, disappointment sank into my heart. Solaris really sucks! I couldn't believe how antiquated and stagnant Solaris was. I couldn't believe that some of the things I used to hate about Solaris years ago... still hadn't been fixed.

    I am not saying that Solaris hasn't changed at all. Most of the changes though are for the high-end server market. Maybe this is where Sun sees their market niche. However, they had better watch out, because Linux is racing and they are going to lose out.

    What annoys me today is when I hear people say "The trouble with Linux is it is not mature". These people should take a closer look, and the commercial UNIX distributions should look too. If the commercial Unices don't start proper innovation across the market they are sure to lose out. The battle isn't over. We still need to innovate more ourselves in pushing Linux forward. But it isn't the commercial Unices who are going to compete with Linux in taking the market.
    My point was simply this - the focus of the average user went from large timesharing systems to the personal computer, and now it's almost as if networked, timesharing systems have come back into vogue. i know, the whole while, there were multiuser computers running networks and databases and whatnot, but now we're even getting that sort of os on our desktop (i forgot to mention winnt as a multiuser os.)

    But the deeper point is that the hypothetical "average user" changed his focus by changing who he was, not by individuals changing focus that much. The big thing that the PC did was to take the computer from being a piece of equipment used by a comparatively small number of people who could wrangle terminal time on an expensive time-sharing system to anyone who could plunk down a couple of thousand dollars for a cheap desktop machine. Most of the business types who were using big iron kept using big iron, and today they're laughing at the PC zealots who claimed big iron was obsolete - only to find that their PCs are relying heavily on centralized servers.

  • by WhyteRabbyt ( 85754 ) on Sunday August 13, 2000 @07:05AM (#859524) Homepage

    Linux is a mess of different libraries

    I have 1067 .DLL files on my NT system's main drive. I'll bet you a shiny new dollar that's a hell of a lot more than any Linux system has.

    I am back to using Windows98SE and AmigaOS - why? One runs everything, and is faster for desktop use than *nix. The other is the only OS that had an advanced design and implementation. I really hate those fuckers who found out Unix in the past year or so and think it's fucking brilliant.

    Maybe Win98 is faster, maybe it isn't. I work alongside someone who does OpenGL development, and he finds Linux far faster running and compiling his code than Windows is, on the same code. So 'is faster' probably isn't true for everything. Linux is more stable. That's a win for some people, more than speed. Meanwhile, BeOS probably has AmigaOS beaten in terms of implemented advanced design and implementation. Don't get me wrong, I loved my Amiga, and the OS was fantastic, but just because it's the newest and shiniest doesn't make it better. I guess you're one of those fuckers who found out the new AmigaOS in the past year or so and think it's fucking brilliant, eh?

    Linux is the past. It's merely a free implementation of something that's been around for years.

    And that's a problem why? Because it's not new, it would appear. Sorry, but that's not good enough. Linux provides a solution for a problem: a modern Unix, capable of using a very decent subset of modern hardware, that has an aggressive development policy. Why is that somehow 'not good'? Lack of GUI apps? Sorry, GUI software doesn't make an OS better or worse; it just means you have GUI software. Desktop Linux is in its infancy. So what? It's progressing faster than the Mac or Windows ever did. And meanwhile, the core OS is still exactly what Unix always was: stable, secure, powerful.

    Pax,

    White Rabbit +++ Divide by Cucumber Error ++
