The Re-Unification of Linux

ESR has written a piece about the re-unification of the fragmented Unix world, as seen in the growing position of Linux. Click below to get the full read.

In the wake of the wildly successful Red Hat IPO, stories mooting the possibility that Linux might `fragment' under corporate pressure seem to be proliferating. The memory of the great proprietary-Unix debacle of the 1980s and early 1990s is constantly invoked -- N different versions diverging as vendors sought to differentiate their products, but succeeded only in balkanizing their market and inviting the Windows invasion.

But amidst all this viewing-with-alarm (some of it genuine, much of it doubtless seeded by Microsoft) something ironically fascinating is happening. Unix is beginning to re-unify itself.

SGI's recent decision to drop IRIX and focus on Linux is one telling straw in the wind. Another is SCO's launch of a Linux professional-services group, clearly a trial balloon aimed at discovering whether SCO's branded-Unix business can be migrated to a Linux codebase. I visited a Hewlett-Packard R&D lab last week, and learned that many people there expect HP to deep-six its HP-UX product in favor of Linux in the fairly near future.

What's causing this phenomenon? Open source, of course. Whoever you are -- SGI, SCO, HP, or even Microsoft -- most of the smart people on the planet work somewhere else. The leverage you get from being able to use all those brains and eyeballs in addition to your own is colossal. It's a competitive advantage traditional operating-systems vendors are finding they can no longer ignore.

Playing along now and trying to defect later won't work either -- because running away from the community with your own little closed Linux fragment would just mean you didn't get to use those brains any more. You'd be swiftly out-evolved and out-competed by the vendors still able to tap the literally hundreds of thousands of open-source developers out there.

What we now have going is a virtuous circle -- as each of the old-line Unix outfits joins the Linux crowd, the gravity it exerts on the others grows stronger. The Monterey and Tru64 development efforts, the last-gasp attempts to produce competitive closed Unixes, can't even muster convincing majorities of support inside the vendors backing them; both IBM and Compaq are investing heavily in Linux.

Linux fragmenting? No way. Instead, it's cheerfully absorbing its competition. And the fact that it is `absorbing' rather than `destroying' is key; vendors are belatedly figuring out that the value proposition in the OS business doesn't really depend on code secrecy at all, but instead hinges on smarts and service and features and responsiveness.

These are all things the worldwide community of open-source hackers are really good at supplying. Vendors become packaging and value-add operations that never have to re-invent the wheel again. Customers get better software.

By joining the Linux community, everybody wins.
--
Eric S. Raymond

This discussion has been archived. No new comments can be posted.


  • Well, I'd rather not be part of a group of people that have an exclusionary and/or elitist attitude. I have tried Linux (I installed Slackware 3.0 last year), but I wasn't impressed enough by its technical merits to continue using it. The RMS philosophy and the GPL intrigue me, but they're counterbalanced by the rest of the crap I see in the "Linux community." On purely technical grounds, ignoring any ideologies, I see no reason to switch. Sure, Windows sucks in many ways, but so does X. Windows crashes, but it has a usable, consistent interface. X doesn't crash nearly as often (though XF86Setup required me to reboot a few times), but its interface pretty much sucks, and is not consistent in the least.
  • I'd say the reason why the "bulk of the interest is in Linux" *is* in fact (at least partly) due to the GPL. The GPL means that companies who adopt Linux have to be very public about it, because the GPL requires them to make their source & changes publicly available. In contrast, a company could be running a *BSD kernel in their shiny embedded box and no one need know.

    And market adoption is driven by "what the other guys are doing." Because the GPL forces publicity, it creates its own fad, and thus its own momentum. Not a technical reason, but a sheer dynamics quirk. A particularly successful meme.
  • It is likely that if company X wants to improve linux with patch Y and Linus says no they will just apply it in their distribution.


    And each time company X wants the latest features of the next kernel, they will need to re-apply their patches to the unfragmented kernel, or remain using the old kernel. If their patches are worth incorporating into the new kernel, developers will be screaming at Linus to fold it in. If not, they will find it prohibitively expensive to keep re-applying their patches with every new kernel release. And if they continue using the old, patched kernel, they will find themselves at a market disadvantage, eventually to the point that no-one buys their distribution. Fragmentation of the kernel is a non-issue.



    Oh yeah, how will we know if their patches are worth applying?
    Easy! We have the source. This is how the GPL prevents fragmentation, while the BSD licence allows it.

  • Their product can be downloaded for free and there are dozens of competitors offering the same thing for a lower price (other linux distros). Let's face it-- most linux people don't need tech support-- and that is the only thing Redhat -really- sells.
  • Can you say more about what the benchmarks were?

    I've read some very convincing papers asserting that various kernel critical paths (and in particular the system call mechanism) are much faster in Linux than in the AT&T-derived unices.

    Of course, it may well be that the libraries and userland are faster on *BSD for what you're trying to do. Or it could be that the papers I read lie. I'm rather curious to know which it is.
  • Ahh, I see..."It depends on what the meaning of the word 'is' is."

    Cheers,
    ZicoKnows@hotmail.com

  • I'll agree that there's a long way to go yet, both for Linux and all of the Open Source movement, but I think this is definitely the direction things are moving in. SGI is in the process of porting some of IRIX's big features to Linux and hopefully will use their knowledge to make Linux a high-end server OS. The article Friday about Compaq terminating its NT for Alpha project is also evidence to support this. Sun will probably be the last holdout, and they may never go to Linux except through emulation. But they've got the revenue to pull it off. But there is starting to be some reunification of Unix.
  • This is probably the most important part of re-unification. Each *nix has its strengths, and the convergence of these strengths is an awesome thing.
  • Even if there are only a few unices, hopefully there will always be many mail transport agents, mail readers, servers and such to limit the spread of a monoculture virus. Microsoft has seriously weakened its defenses by monopolizing its application market: everyone not only uses the Windows OS, but reads mail with Outlook, distributes mail using Exchange, serves pages with IIS, etc.

    Apache's dominance on the web is potentially a Bad Thing, for the same reasons.
  • If Linux really does unite the unix world by simply replacing all others, I very much doubt that regular users of HP-UX, Digital Unix (erm..), IRIX, AIX, Solaris etc will see this as an improvement.

    I agree. The best company-adoption-of-Linux story so far has got to be SGI, who's promised to port the groovy IRIX features to Linux. Iff this happens, then IRIX users might not feel they've been downgraded... but this is a big iff.

  • You, sir, are a clueless numbskull. The Linux community doesn't want you anyway.

    We need people who actually try to argue and persuade, not insult. Calling someone a "numbskull" because they feel uncomfortable with the amount of chest-pounding rhetoric that has infused the Open Source community only shows your lack of tact. As others have pointed out, talking to a rabid Linux advocate is much like asking a Marine why they love the Corps: "I love the Corps." "But why?" "I love the Corps!" This can scare anyone off.

    I thought this movement was about choice; apparently you learned your lessons from Ford: "You can use any OS you like, as long as it is Linux."

    If you took all the files associated with FreeBSD, and replaced its kernel (and support programs like ps, lsof, etc) with the Linux kernel (&etc), you would be running Linux. Wouldn't you?

    In fact, no. I would be running a FreeBSD system with a Linux kernel. And I suspect it would be pretty much useless.


    Everyone is distributing libc6. Some people are still running libc5. Backwards compatibility is achieved by distributing libc5 as well. Forward compatibility is achieved by installing libc6.

    Let us see... During the last 6 months, I have been receiving about 4 emails a week (on average) dealing with library incompatibilities between different flavors of Linux. I have seen libc5, GLIBC, and GLIBC2; each had its own problems. In addition, there are different versions of GLIBC and GLIBC2. Do you actually expect normal people (that is, excluding Linux fans ready to fiddle with their system just for the sake of it) to use something like this? Even more, do you still insist that this is not fragmentation?


    Has FreeBSD never had changes which are not forward-compatible?

    It had. The changes tend to be slow and gentle on the userbase though. Maybe because most of the userbase treats FreeBSD as a tool, not as a fetish?
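A quick way to see the library split described above in practice (a sketch, assuming a Linux system with `ldd` installed) is to ask the dynamic linker which C library a binary was built against:

```shell
# List the shared libraries /bin/sh needs at run time. The libc entry
# shows which generation the binary was linked against: libc.so.5 for
# the old libc5, libc.so.6 for glibc ("libc6").
ldd /bin/sh | grep 'libc\.so'
```

A libc5 binary and a glibc binary can coexist on the same machine as long as both libraries are installed, which is the backward-compatibility arrangement the earlier poster describes.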
  • The whole concept of shutting your computer down is obsolete. First of all, the heating/cooling cycles will shorten the life of your hardware. Secondly, the computing and networking industries are moving to a world of persistent network connections via cable modem, xDSL, etc. Users want access to their information now. Also, there is a strong movement towards home networking. It would be a shame to not be able to participate in the future of computing because your OS can't be depended upon to stay up and running.
  • Of course they can contribute patented ideas, if they own the patent themselves. If an idea is patented, anyone who wants to use it has to ask the patent holder for permission, so the holder can just license the idea to anybody, free of charge (e.g. under the GPL). So, if they really want to, they can.


    ---
    Ilmari

  • A Unix operating system is typically named for its kernel. FreeBSD runs the FreeBSD kernel, OpenBSD runs the OpenBSD kernel, Solaris runs the Solaris kernel, and Linux runs the Linux kernel.

    Umm, the reason why "{Free,Net,Open}BSD runs the {Free,Net,Open}BSD kernel" is that the {Free,Net,Open}BSD kernel is called the {Free,Net,Open}BSD kernel because it's part of {Free,Net,Open}BSD - i.e., the kernel, in those cases, is named for the operating system.

    As for Solaris, well, "uname -s" seems to think it's running the SunOS kernel. :-) (And regardless of where you sit on the "SunOS vs. Solaris" debate, the kernel is called the {SunOS,Solaris} kernel because it's part of {SunOS,Solaris}, so the same point applies there.)

    Linux systems are a bit different, as they've been assembled from pieces constructed and maintained by different groups; there's no One True Linux System, whose entire source can be found under "ftp://ftp.linuxsystem.org/src"; there's no single complete OS from which the kernel takes its name.

    If you took all the files associated with FreeBSD, and replaced its kernel (and support programs like ps, lsof, etc) with the Linux kernel (&etc), you would be running Linux. Wouldn't you?

    No. You'd be running a BSD/Linux hybrid; it would feel different from many Linux systems, as the APIs would be a bit different, the administrative commands would be a bit different, the twisty little maze of "/etc/rc" files would be a bit different, etc. - and it's not at all clear that it'd be less different from a Linux distribution using one of the usual collection of Linux-distribution userlands than those distributions are from one another.

    (If you took all the files associated with Windows NT, and replaced its kernel with a Linux kernel, and wrote an "ntdll.dll" that implemented all the NT system calls atop a possibly-extended Linux API, would you be running Linux? :-))
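The `uname -s` point is easy to check from a shell; the command reports the name of the kernel/OS implementation, which is why it says "SunOS" on Solaris boxes (output naturally varies by system):

```shell
# On a Linux distribution this prints "Linux"; on Solaris it prints
# "SunOS", the naming quirk discussed above.
uname -s
uname -r   # the kernel release, e.g. a 2.2.x version in the era of this thread
```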

  • One thing about how Mandrake is different; I believe that it's optimised for Pentium series processors rather than i386. I could be wrong, but I'm pretty sure of this.
  • As far as I'm aware, development of IRIX may be slowing, but it's far from stopping.
    Oh, despite ESR's tendency to assume all things good are a result of open source, it's a damn fine article.
  • Linux is too chaotic to code for.

    All of this chaos and fragmentation is caused by programming for Linux only: using /proc, /dev/sound, using x86-specific assembly code, etc. When you code, make your code as portable as possible. Learn and use APIs that are supported on multiple OSes and hardware architectures.

    For example if you do multithreading use gnu pthreads.

    To reply to the Microsoft internal uniformity rant: Microsoft has tons of APIs, last I remember -- AFC, MFC, the base API, COM, ActiveX, .... Windows code has a sprinkle of 16-bit and 32-bit; it is far from uniform, but since it is not open I guess we will never truly know. Linux will do just fine since I can use modules: if the sound card code is acting up, I can get rid of it and wait for a fix. And yes, a fix will be up soon since everybody can see the code.

    To make a long story short the more portable the code the more people see your code, fix your code, enhance your code, hence it lasts longer.

    Write once, rewrite, rewrite, rewrite, .....
  • I agree with Raymond; actually, I wrote an article to this same effect which was posted in March on Linuxpower.org. You can read the article at http://www.linuxpower.org/display_item.phtml?id=111 .

    But there is no dispute that the Unix world is slowly unifying. And even as vendors like Sun and IBM try to beef up their own Unixen, they add features to them to make them more compatible with Linux (e.g., Solaris runs Linux binaries).

    I don't think everyone should pat themselves on the back just yet though. There are so many companies relying on proprietary Unix systems with closed source tools (the company I work for uses Solaris exclusively for everything except a few of our front-end apps running in Windows). It will take much to move these companies over to Linux.

  • It would be stupid to tell someone "You need to ./configure that and then run a make. Then you need to do a make install."

    Every idiot knows what the word TYPE means. Sorry, but what you're complaining about shows an inability to teach, not a flaw in Linux.

    And believe me, most people can infer what they're doing from what they're typing, as long as you remind them to pay attention.

    From that point on ./configure, make, make install are the most intuitive statements you could make.

    I worked at a help desk, I should know.

    Besides, CLIs are easier to teach over the phone than GUIs.

    As for those who shut down their PCs incorrectly: I don't have a clue who that is. It's certainly not the majority I've seen in comp labs and while I was working. The most tight-ass, PC-phobic, I-type-600-keys-per-second-so-I-can-leave people I've seen do in fact shut down properly. The rest sometimes don't even care if they have to fsck. And they never forget.

    Course there's that granny who needed memory warm-up exercises.
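The three-step mantra is easy to demonstrate end to end. The sketch below fabricates a tiny throwaway "package" (the `configure` script and the Makefile it generates are stand-ins, not any real project) so the whole ritual can run in a scratch directory:

```shell
#!/bin/sh
# Walk through ./configure && make && make install on a fake package.
# A real package ships its own configure script; this one just emits a
# two-target Makefile.
set -e
dir=$(mktemp -d)
cd "$dir"

# Fabricate the package's configure script (the inner printf runs when
# ./configure runs, expanding \n and \t into the Makefile's newlines
# and mandatory tab-indented recipe lines).
printf '%s\n' '#!/bin/sh' \
  'echo "checking the system... ok"' \
  "printf 'all:\n\techo hello, world > hello.txt\ninstall:\n\tcp hello.txt hello-installed.txt\n' > Makefile" \
  > configure
chmod +x configure

./configure     # step 1: probe the system, emit a Makefile
make            # step 2: build
make install    # step 3: copy the result into place
cat hello-installed.txt
```

Over the phone, the part you dictate really is just the last three commands; everything before them stands in for what a real tarball's author ships.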
  • How happily is Sun going to fork over the only technologies that are left to differentiate Solaris from Linux? Like NUMA. Will they just keep selling their product as long as there's a poor sucker left buying it? And then embrace Linux before they're crushed in its path? Will the separate Unix vendors be cutting their R&D departments? Oh, so many questions!
  • There are multiple ways to do everything. Just choosing which API to use can be frustrating. On top of that there is no common way to access multimedia systems such as sound and video. The kernel provides OSS.. but there is also ALSA. Not everyone uses X either.

    "End-users", in the sense you appear to be using them, don't use OSS or ALSA; they use applications. If the OS can support applications written either to the OSS or the ALSA API, and you don't have to know which API an application uses, why does the availability of multiple APIs make any difference to the end user? (The same applies to any other situation where you have multiple APIs; OSS vs. ALSA is just an example.)

    Back to the problem of GNOME/KDE and other GUI abstractions. No matter what GNOME abstracts it will never be fully in touch with Linux. Abstractions are generalizations. You can call an apple and an orange fruit. But a fruit is a fruit. See my point? No differences can be made with that abstraction. If animal_X likes apples and animal_Y likes oranges and you feed animal_X a "fruit" (which just happens to be an orange) animal_X will die (crash).

    To what exactly is your metaphor referring? To repeat the question I asked in a previous message - a question you didn't bother to answer - in what way would, say, an office application be an "animal" that "likes oranges" or "likes apples", i.e., in what way would it want to use Linux-specific features in a way that can't be abstracted away? (Don't just assert that it would - without an example, I have no reason whatsoever to believe such an assertion.)

    Since GNOME/KDE do not implement OS-specific features they will not take full ability of the OS. And when OS-specific features are implemented they will be mere hacks to the metaphor system.

    Presumably by "they will not take full ability of the OS" you mean "they do not currently make full use of the OS's facilities", given that you say, right after that, "and when OS-specific features are implemented", i.e. that it's not impossible for them to implement OS-specific features.

    By "they will be mere hacks to the metaphor system" do you mean that the UI would have to hide necessarily platform-dependent details because the entire desktop environment will be providing a completely platform-independent metaphor? I have no reason to believe that the desktop environment is obliged to do so; the bulk of the desktop environment may do so - just as the bulk of the Windows desktop environment may provide a metaphor independent of whether you're using Windows OT or Windows NT - but there's stuff under the Control Panel, say, that's not the same in the two OSes (if Windows had this wonderful metaphor that completely hides the differences between Windows OT and Windows NT, you wouldn't have control over power-saving stuff in Windows OT, because NT doesn't have that yet).

    Also, I meant "knowledgeable" as in comfortable with the metaphors presented by the GUI system. Once they get beyond that and get deeper "into" the system I believe they will get confused since the metaphors are invalid.

    In what way is this any different from the problems a Windows user might have if they "get beyond" the desktop and start playing with Your Friend Mr. MS-DOS Prompt? If the answer is "you don't have to fire up a DOS prompt in Windows", then perhaps the answer, for those users, is to arrange that they not have to do so in Linux, either.

    Linux calls a pipe a pipe, but a GUI system might use "pathway" or some other terminology to make it portable.

    "Portable" to what? Pipes are called pipes in all UNIX-flavored OSes (and in Win32, for that matter...), so why would a GUI system have to use "pathway" to make it portable to multiple flavors of UNIX?

    KDE/GNOME are basically making an operating system in an operating system. They are creating object sharing systems and using virtual file systems and various other operating system ideas. But why? Why can't we just use Linux's VFS?

    For one thing, because, for better or worse, most UNIX-flavored OSes, including Linux, don't generally have file systems plugged into their VFSes to support things such as HTTP or FTP access, which, if the "virtual file systems" to which you're referring are the ones I suspect they are, the VFSes of KDE and GNOME offer. There are some who argue that HTTP and FTP access should be provided through the OS's file system API, and implementations of that do exist (often done as, e.g., user-mode NFS servers, or other types of user-mode file systems, so that you're not obliged to shove FTP or HTTP client code into the kernel).

    Why can't Linux have object sharing at the kernel level?

    What do you mean by "object sharing"? Are you referring to the object models like KOM and Bonobo? If so, why should Linux have it at the kernel level? There's nothing Magically Wonderful about implementing stuff in kernel mode; I think the bulk of Windows' COM runs in userland.

    One might reasonably ask why the object model should be part of a desktop environment, rather than being a thing unto itself (which could be provided as part of a Linux distribution, say), to encourage non-desktop stuff to use it (COM isn't, as far as I can tell, desktop-only in Windows).

    Why libraries for GUI?

    The alternative to a library being? All the APIs offered, at least to programs written in compiled languages, on UNIX-flavored OSes and Windows, come from libraries (or code loaded at run time) - even system calls are called as library routines that contain a trap stub.

  • Most true, Jedi master. I see that you know the Force well and it flows as fluid as water. But the fragmentation argument does have some valid points. Maybe not about the Linux kernel itself, but about the OS in general, as put forward by Debian, Red Hat, SuSE, Caldera, or anyone else. File system standards are fine, but I don't really feel this is enough.

    Recently my company went through a whole reorganization (I was heading it). We moved from Red Hat-based servers to Debian-based servers (some are even running potato now). I was technically more at ease with Debian, but my fellow workers, those that learned Red Hat from a book, spent many a day bickering about how ugly Debian was (when in fact it was the other way around), and about how lost they were when they wanted to do something in Debian vs. how they did it in Red Hat. For normal users such OS changes are fine, but for administrators it means a completely different thing. I had to give quite a few seminars to my fellow workers and bosses to make them comfortable with Debian. Why is this happening? Aren't they all supposed to follow some standard, even if the packaging systems are different? This is a sad case and getting worse day by day.

    libc is another problem: some distributions just refuse to go up to glibc2 when others are already on glibc2.1, and some companies just put their newest products out in glibc2.1 (e.g. Oracle), when most people are running standard OSes that contain nothing but glibc2 at best. Our Oracle upgrade needed a potato upgrade in Debian. This came with its own problems, since potato was an unstable OS.

    I suspect this kind of fragmentation will keep going on. Why can't we have some meetings and iron out the differences in where files are stored (filesystem hierarchy standards) and the rest? Till that day, I have to waste more time educating people and learning different OSes just to install a Linux kernel on a box. Good day.
    --
  • Let's face it-- most linux people don't need tech support-- and that is the only thing Redhat -really- sells.

    I have the impression that Red Hat expects (and, presumably, hopes for) that to change. In their S-1 [sec.gov], they say things such as

    Operating systems based on the Linux kernel are some of the better known open source products. Linux-based operating systems represented 17% of new license shipments of server operating systems in 1998, according to IDC. Despite strong initial market acceptance, these operating systems have been slow to penetrate large corporations at the enterprise level due in part to the lack of viable open source industry participants to offer technical support and other services on a long-term basis.

    and

    OUR STRATEGY

    We seek to enhance our position as a leading provider of open source software and services by:

    ...

    - expanding our professional services capabilities to capture large corporate business on an enterprise basis;

    ...

    and (in the list of risks)

    WE MAY NOT REALIZE ANY BENEFIT FROM THE PLANNED EXPANSION OF OUR SERVICES BUSINESS

    We have recently begun to expand our strategic focus to place additional emphasis on consulting, custom development, education and support services. Historically, we have derived virtually all of our revenue from software product sales. Although we intend to continue to develop and sell Official Red Hat Linux, we anticipate that product sales will represent a declining percentage of our total revenue if our strategy is successful. We cannot be certain that our customers will engage our professional services organization to assist with support, consulting, custom development, training and implementation of our products. ...

    Whether the bubble will burst or not is an interesting question. I could imagine it bursting (although it's not the only stock market bubble I could imagine bursting...), but I wouldn't assume that it'll necessarily burst because Linux will necessarily remain the province of those who "don't need tech support".

  • This is a really nice piece of Linux propaganda, which ESR produces at an impressive rate. However, his assertions seem a bit premature, considering that only one Linux-centric public company exists to date. How can one assert that the *nix industry is converging on Linux, when Linux hasn't even begun to experience the level of commercial pressure felt by its cousins?

    Yes, it seems that several big Unix players have come out with modest support of Linux. Don't forget, however, that these companies are still massive entities, and the support that most have flung in the Linux direction is so token (for them) that they can hardly be credited with anything but protecting their own potential interests.

    Don't get me wrong. I really like Linux. I use Linux exclusively at home and at work. But the Great Linux Migration is still in its infancy, and there is a LOT of room for corruption and division.
  • I think that change must be facilitated if the improvements are going to push the boundaries of the possible. Easy changes aren't as interesting.

    We agree to disagree. That's fine.
  • Yes, but as soon as some /. reader notices the GPL-mandated disclaimer, an article goes up saying "WOW! FooBar systems is running LINUX!!!" and then the tech press --- which increasingly uses /. to do its homework for them --- obligingly follows up with a "mainstream" article some short time later. The press releases aren't required, but the current dynamics make press inevitable.
  • ...ftp.kernel.org, but a custom kernel of their own, that differs not just in how it was configured, but in the code itself.

    Umm, no. The Red Hat kernels make more use of modules than, say, Slackware's do. For instance, ppp and sound are modules with the kernels Red Hat uses. You basically don't have to recompile the kernel for sound with Red Hat. I know.
  • I can't believe so many people willingly drink the Eric Raymond koolaid. One bullshit article after another for him, and people are begging for more.

    His constant factual errors are bad enough, but what really makes his writing so terrible is the constant fake bluster he exudes. I honestly don't think that he believes a lot of what he writes -- it's as if he wants to puff himself to be some badass cartoon character, always foregoing reality and truth in search of an oh-so-pithy one-liner.

    I've written before at Slashdot (sufficiently long ago that it's no longer in my User Info) that I think BSD usage will increase and that Linux will see a downturn. And I say this as someone who, while preferring NT and Sun machines, is an owner (at home), an administrator (at work and home), and a fan of Linux, as well as someone who has never even touched or seen BSD except as a user. If not BSD, then something else, but a lot of people who actually use their computers as a means to get things done, and not as a religion or fetish, are both irritated with and embarrassed by the Linux zealots out there. Count me as one who is considering the switch as well. Plus, BSD will also unfortunately get a lot of the zealots who are currently an embarrassment to Linux -- who will in turn be an embarrassment to BSD -- because they won't feel so 31337 anymore when "the clueless" are able to install Linux.

    The King is dead. Long live the King...

    Cheers,
    ZicoKnows@hotmail.com

  • Perhaps you would call them both Linux, but I wouldn't; were somebody (perhaps Microsoft, to squelch irrelevant "you can't do that, the source isn't available" arguments) to implement a full-blown Windows environment atop a Linux kernel, without providing a userland that looks anything like that of a Linux distribution, I wouldn't call the resulting system "Linux", because it wouldn't feel like Linux, either to a programmer or to a user - I'd call the kernel a Linux kernel, but that's it.

    Yeah, perhaps you could then add a Linux userland atop it - that'd be the moral equivalent of Interix [interix.com], which provides an environment with a UNIX API atop the NT kernel. Once you added the Linux userland, I'd be willing to call the resulting system a Linux system (just as an NT system with Interix is still an NT system)...

    ...but that's not solely because it has a Linux kernel; it includes all the other code that makes a Linux system look like a Linux system.

    Similarly, a FreeBSD userland atop a Linux kernel wouldn't be a Linux system to me unless the Linux userland was present as well.

    Of course, in some cases the userlands would collide - would the FreeBSD-and-Linux userlands atop a Linux kernel have, say, a FreeBSD-style or a Linux SV-style or a Linux BSD-style "init"? Were the system to present both flavors of userland where it was possible to do so, but chose one particular flavor of userland for the stuff where it wasn't, if that was a Linux flavor, I'd call the system "Linux with an XXX compatibility package", and if that was a FreeBSD flavor (or an NT flavor), I'd call it "a hybrid, neither fish nor fowl".

  • You must think Linux users sit around fretting about compatibility with other distros.

    Are you *BSD users this clueless?

    Slackware Linux as a system has been incredibly safe and stable (hasn't crashed on me yet!). None of the utter confusion (including fragmented kernels) which IS happening in *BSD.

    My impression is that no one (except developers) really cares about fragmentation all that much. Most programs written for Unix are written well enough that fragmentation isn't a problem. The interface does not seriously change from Unix to Unix.
  • No one listens to me; it takes ESR before they listen... maybe I need to feed him my thoughts... There will be fewer Unix versions next year. Tru64 will probably become part of Linux, as it is moving that way, and IRIX is already planned to go that way. In order to defeat M$ this must happen. Over the next few years there will probably emerge only 1 or 2 Unix or Unix-like versions, probably comprising Linux code. It may or may not be Linux, but it will contain much of the kernel code, and many of the OS parts. I imagine it will contain much-improved Java parts too. Possibly some sort of Java-Linux-Unix mix, where the OS runs on most hardware (Linux) and the apps run on most OSes (Java).
  • The GPL means that companies who adopt Linux have to be very public about it, because the GPL requires them to make their source & changes publicly available.

    "Publicly" doesn't necessarily mean "put out a big press release"; it could just mean "put the text of the GPL in your documentation, along with a URL people can go to download the source". I have the impression that one or the other of TiVo or Replay uses Linux in their box, but, if so, I haven't seen either of them announcing this broadly (which cannot in any way be taken as a certain indication that they don't use Linux).

    Yes, the GPL does require you not to keep it a complete secret that you're running Linux inside your box, unlike the BSDL. However:

    1. a lot of the interest in Linux is from general-purpose computer companies, who might have to work harder to hide the Linux or *BSD derivation of their systems (and note that Apple is touting that a lot of MacOS X Server comes from BSD, albeit not so vigorously as those touting their moves to Linux);
    2. as I said, the GPL doesn't require press releases saying "Linux Inside(TM)".
  • I've talked to our CTO about using Open Source stuff, and he thinks Linux is "cool" and OSS is definitely viable, but there are simply too many things which would have to be ported from Solaris.

    That's the problem... There are many companies which might want to switch, but the paths of migration are very uncertain. I know for us that at least our "core" environment would pretty much re-compile under Linux, and we use Sybase 11.5 for our database, but we also use OEC Entera for middleware (I think BEA Tuxedo would do the equivalent job under Linux), Rogue Wave tools, and quite a few C toolkits for various things.

    Actually, the systems where I work are very heterogeneous, every developer uses what they want to get the job done (I have a linuxbox that I use for development, but I need to compile on one of our development Sun servers).

    What would be nice to see would be some sort of migratory database, showing what apps and functions under different Unices could be replicated under Linux, and how.
  • First, bear in mind that both KDE and GNOME are works in progress.
    GNOME does not deserve to be beyond version 0.1, while KDE just
    made it to 1.0 with the 1.1.1 release. Neither can be taken as a good
    example of a Linux desktop. KDE 2.0 promises some real apps, and
    so does GNOME 2.0. They will probably have enough features to
    be competitive by version 3.0, by which time they may run the same
    CORBA backend and the same drag-and-drop, so coding for one would
    be roughly the same as for the other (esp. if KDE adds more language
    bindings, regardless of how many people need them).
    Most people in the Linux world do use zip (gzip), so I am not sure what
    the difference is, except that WinZip is not available (bfd).
    Most compressed programs you'll see have the extension .gz, so there is
    quite a bit of uniformity there.
    As far as APIs go, it is not clear that it is a good thing to have only one.
    Besides, they are in no way a part of the "end-user world".
    I do think the LSB is good, and it would be better if it were folded into
    POSIX, so that no one out there could ignore it. But one set of widgets?
    Yuck. If people listened to you, we'd be using Motif without any
    alternatives. IMHO, that's worse than all Windows crashes times 100.
  • That's, like, ten minutes of work.
  • For the desktop user, the advantage of FreeBSD is that it is more 'consistent' and easier to install, configure and maintain than any distribution of Linux I know, but the differences probably aren't big enough to be worth switching if you have a system that already does what you want it to.

    Read what Daemon News [daemonnews.org] has to say about this issue; also check the back issues.
    ---
  • I'm with you on this one, though maybe for different reasons.

    I really don't believe that Eric Raymond represents us (for almost any value of "us" you might care to choose). He seems like an OK guy, but it's increasingly obvious that he has lost track of what we (well, I) thought he was doing - representing us - and is now doing something very different - preaching to us.

    Matthew.

  • Why was this guy moderated up?

    If by that you mean "moderated up to 1", the answer is "because he didn't post as an Anonymous Coward"; see this Q in the Slashdot FAQ [slashdot.org], which says

    ...Anonymous posts start at 0, Logged in Users start at 1....
  • I rarely shutdown my machine, but I don't keep it on the net either.
  • we will see some very natural fragmentation in the community: the fragmentation that occurs when developers realise that Linux isn't bleeding-edge anymore and go on to work on something else, which in time will probably replace at least the Linux kernel.


    Go Hurd! :)

    Alejo.
  • What's with this title? This article has nothing whatsoever to do with "The Re-Unification of Linux"... the re-unification of Unix, maybe, but even that's stretching things a bit. How many Linux distros are there? What are the irritating differences between them? Which ones are better for which applications? Answers to these questions would make an article that could justify the buzzword Linux in its title.

    I've been a Linux guy for a looong time and it's this kind of misleading hype that makes me want to switch to xBSD just so I can hang out with those seemingly less loud and obnoxious BSD people. Is this kind of crap going to drive us linux users into the closet and make us develop secret handshakes and stuff due to the embarrassment?

    Hey.... if I write an article about changing a tire and put the word Linux in it, will it get posted at Slashdot?
  • by Azog ( 20907 )
    You have a weird way of choosing what operating system to use.

    Why don't you just use whatever works best for you, instead of using whatever OS has spokespersons you agree with?

    And what about the "half-truths, omissions, and outright lies" told by the spokespersons for Windows, your current chosen OS?

    How bizarre.

    Azog
  • The name is Linux Standard Base. You can find more about them at http://www.linuxbase.org/ [linuxbase.org]. They have mailing lists for you to contribute. I suppose it is `taking so long' because the number of persons asking why it is taking so long is far bigger than the number of persons actively working on it.

    Alejo.
  • Well, I shut down my Linux box regularly. I live in a small
    apartment, so my computer is next to my bed and I can't sleep
    with fans being as noisy as they are (OK, so the cover on my
    case is permanently off :).
  • Actually, Sun was discussing open sourcing or community licensing Solaris, but the problem is there is too much OEMed code, which makes it nearly impossible. I suspect that is why SGI and others drop their Unixes instead of opening them up.
  • The Linux world would be a better place if all distributions used the same package format, the same system management tools and followed the same filesystem standard. The fact that they don't will not stop Linux growth or acceptance, but it won't help it either.

    As for ESR's speculation that Compaq will eventually move from Tru64 UNIX to Linux, well, I'm normally with ESR on most things, but on this one I think he's way off base. For the next few years at least. My reasons for this are:

    * The Linux kernel has some way to go yet on the scalability front before it could be considered a potential replacement for Tru64's kernel. Compaq's next-generation Wildfire systems will be out soon, with possible configurations of up to 256 CPUs. Tru64 V5.* can scale that high and make good use of it. The Linux kernel will need to be able to match that.

    * No one does clustering like Compaq's VMS clusters, and now Tru64 UNIX is getting the same functionality. This puts Compaq's UNIX way out ahead of any other UNIX with the rest of the field (and NT) left behind with failover style clustering. Porting TruCluster V5 functionality to Linux would be a big job. New drivers for the cluster software, new hardware (Memory Channel), the new Cluster common FileSystem, the advanced filesystem (AdvFS), etc. I just can't see Compaq wanting to Open Source any of this, as it's what will set them apart from the competition.

    Macka

  • Does Red Hat not make its own changes to the kernel distributed with the rest of their distribution?

    Also, you can say that each distribution is a fragmentation. You can't run Red Hat programs on Slackware due to libc problems, that sort of thing.

  • Does Red Hat not make its own changes to the kernel distributed with the rest of their distribution?

    Also, you can say that each distribution is a fragmentation. You can't run Red Hat programs on Slackware due to libc problems, that sort of thing.


    Well, ever heard of *recompiling*?
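    For what it's worth, the libc mismatch the two posters are arguing about is easy to see. ldd prints the shared libraries a dynamically linked binary expects at load time; a binary built against one distro's libc can fail to start where a different, incompatible libc version is installed. The binary path below is just a typical example, not anything from either distro in question:

    ```shell
    # Sketch: inspecting which shared libraries a binary needs at run time.
    # A binary built against one distro's libc can fail to load on a distro
    # that ships a different, incompatible libc version.
    ldd /bin/ls

    # Recompiling from source, as the reply suggests, sidesteps the issue:
    # the freshly built binary links against whatever libc is installed locally.
    ```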
  • > Linux and the Linux community are both too immature to make a dent in places where 24/7 is a requirement, not a feature.

    And you call NT or its ilk 24/7?!??! I've seen Linux boxen running for months without fail, longer, even. Find me an NT server that's run for more than a week, two at best.

    If you are referring to legacy Unix distros, however, they still have the upper hand in this regard.
  • I can't picture it. Sun has made too much of a commitment to Solaris and is still profiting from it. And as much as I like Linux in particular and open source software in general, I must admit that Solaris is a quality product that doesn't need to be abandoned.

    But then, you have to remember, IBM ditched its own web server software in favor of Apache, even though it had invested tons of money and resources into its own web server software.
  • According to their latest press release which describes their restructuring, SGI will continue to develop and support IRIX on MIPS hardware. They also go on to state that they will continue to develop MIPS processors through 2002 when they will transition their high-end machines to IA-64 and Linux.
  • I'm saying, write a piece *for* Slashdot. If it's well-written, original, and on-topic, it'll run.
  • And I'm sorry, Mr. Raymond, but Cathedrals are things of beauty. Your bazaar vision, well... the peasants can roll up the tents and booths and move on when the weather goes bad. 200 years later there is still a beautiful Cathedral standing. There's a bare patch of dirt over there were the bazaar once sat.

    Heh, that's a nice one. What is left standing when an earthquake occurs? The peasants can rebuild the bazaar in a day; the cathedral will need another few hundred years to be repaired. And in the case of a flood, the peasants can pick up the wood from the bazaar, swim to land, and build a new bazaar.

    Point being, don't take the metaphor too far.

    TeeJay
  • Yes! Think of it!

    Your Slashdot Title (Score: 3 or higher!)
    Your name and cute link

    Yes! I can envision myself at the top of the Comment listing, I mean hell! I've got /. on a 2 min refresh so I can be in the GNOW!

    This is it! This is the one! This post right here will make me famous, and I alone shall represent the community! This is going to hurl me to the top, where my dog and I can manage Linux with a bird's eye view and I'll never be interrupted by an ICQ message ever again!

    Yep. This is the one. I've made it. :-)
  • wow...you must really be something amazing to be so much better than everyone else. Perhaps you would like to demonstrate your alleged brilliance sometime?
  • Dude, you have that backwards.

  • Strange that ESR didn't mention Sun and Solaris, I would like to have known if he expects them to join the trend.

    I can't picture it. Sun has made too much of a commitment to Solaris and is still profiting from it. And as much as I like Linux in particular and open source software in general, I must admit that Solaris is a quality product that doesn't need to be abandoned.
  • When I first started using Linux, I had the same opinion a lot do. It was "all this fragmentation in the different distros, wms, etc. isn't good".

    After you USE it for a while, you see it's a strong point, not a weak one. It's not like Windows (not trying to bash it here), where using Win95 for anything but a desktop is like putting a round peg in a square hole. A Linux distro can be centered around easy to use (Red Hat, Caldera), very configurable (Debian, Slack), or anything else. But at the core, it's still Linux. Linux gives you the power to shave off the sides of the peg for a perfect fit.

    Where Linux DOES need a standard is on file locations. A program that follows the "Linux standard" will use libraries in /usr/lib, install its binary into /usr/bin, etc. You can talk to 10 different people, and they will tell you 10 different places to put things. Red Hat will dump every binary in /usr/bin. People that compile like to put things in /usr/local/bin, /opt/bin, and other odd places I can't even think of.

    I like to use both GNOME and KDE apps, and I like to change my window manager every couple of weeks. I don't like the fact that when I go to compile something, it can't, because a library is in a different place than where it's looking.

    So I guess the fragmentation is a two-edged sword. But at least it's a lot sharper on the good side. I think the sharper side helps shave that peg (^_^)
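    The file-location convention this comment is asking for can be sketched with the usual GNU-style prefix variables. Everything here is illustrative: the paths follow the common layout the comment describes (/usr for distro packages, /usr/local for hand-compiled software), and a real build system would set them via something like ./configure --prefix:

    ```shell
    # Sketch: the install locations a "standard" Linux package would use.
    # PREFIX is the only knob; everything else hangs off it.
    #   /usr       -> what a distro package (e.g. an RPM) typically uses
    #   /usr/local -> what a hand-compiled program typically uses
    PREFIX=/usr/local
    BINDIR="$PREFIX/bin"    # executables
    LIBDIR="$PREFIX/lib"    # shared libraries
    MANDIR="$PREFIX/man"    # manual pages
    echo "binaries in $BINDIR, libraries in $LIBDIR, man pages in $MANDIR"
    ```

    With a build that honors the prefix convention, the same source tree can land in whichever layout the local convention demands, which is exactly the uniformity the comment wants standardized.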
  • I didn't miss his point at all. I offered Sam a chance to do something positive instead of complaining. You can do the same. ESR makes his points in a logical manner and signs his name to them. If you don't think ESR's piece was intelligent, go ahead and write one that is. Be original instead of reactive, stick to your chosen topic, and keep it between 800 and 1500 words. And be prepared to get flamed if/when it runs, because no matter what you say, *somebody* will disagree with it. ;-)
  • I'm not sure why, but I never had to do that. make install took care of everything for me.

    --
    Interested in XFMail? New XFMail home page [slappy.org]
  • Apparently SGI is not "dropping" IRIX, nor are they spinning off a subsidiary. They do, however, feel that "from an applications standpoint[,] Linux is the right answer".
    -russ
  • Well, I can say again: Thank You, ESR, for reminding me why I use Be. I am sick and tired of the fighting about who has the best distro, blah blah blah. Look, I use my computer to get WORK done, not to install, patch, build and debug the OS just to run a simple app. Be gives me a superfast OS that has a simple and clean feel right out of the box.
    I just test-drove two different distros, Red Hat and Caldera. I am sorry to say, unless you have a plain vanilla box, each install craps out. Now, I install my Be in under seven minutes and one reboot. No six million questions and blinking screens and scripts that give cryptic responses.
    If any of you out there just want a clean, FAST and easy OS, just give Be a shot. You will be pleasantly surprised.
    Eric, I hear what you are saying, but you really need to wake up and see the world as it is.
  • >I am aware of at least 20 distributions of Linux

    Um, you're wrong. There really aren't at least 20 distributions of Linux. What you are seeing is releases of Linux that have been customized for particular purposes and given away to anyone who may have a use for them. For example, I know of 4 or 5 "distributions of Linux", as you put it, that are really Linux on a floppy disk to be used as a rescue disk. A lot of the other "distributions" are basically Red Hat customized for a particular language, like Spanish or Chinese. This isn't the definition of fragmentation.
  • That's, like, ten minutes of work.

    I don't know what kinda supa-dupa-fly computer you have, a 1000-node Beowulf cluster? I know for a fact it takes a bit more time on my PPro to recompile all the packages that come with my Red Hat. ;-) Really, I would hate it if I had to recompile my XFree!

    Also, I personally thought the standard Red Hat desktop was a bit messy, and the folks at Mandrake did a nice job pre-configuring the KDE desktop (add this to the list of differences between Red Hat and Mandrake). This could also have saved me a lot of time; I did it by hand instead, after which I ended up with a nice (and a bit bloated) mix of GNOME, KDE and Window Maker.

    It just looks like Mandrake is the more polished RedHat. As if Mandrake is the RedHat RedHat should have made themselves. Thank GPL and RedHat that this thing is possible.

    TeeJay
  • Comment removed based on user account deletion
  • by witz ( 79173 )
    More mindless elitism. Nice "community".


    -witz
  • Well, I'd rather not be part of a group of people that have an exclusionary and/or elitist attitude.

    You *did* say that you were considering FreeBSD, didn't you? :-P

    But seriously, running an operating system doesn't mean you have to be part of any particular group. You just use it, and get on with your life.

    I just don't get it when people say, "I won't run (os), because (os) advocates are jerks." In case you haven't noticed, advocates of all OSs are mostly jerks (probably including myself). So, by your standard, you'd have to go back to paper and pencil.

    Who cares what people who advocate Linux say? Measure it by its usefulness, not by its advocates.

    If you prefer Windows, that's fine. I'm certainly not going to tell you that it doesn't suit your needs; it just doesn't happen to suit mine. As others will no doubt point out, X has no interface, since it is a protocol. There are various interfaces that you can use via X, including GNOME, KDE, WindowMaker, fvwm, and, of course, twm. KDE 2.0, slated for early next year, will probably be the most advanced UNIX desktop environment to date. Check it out when it's released. In the meantime, save often. ;-)

    --
    Interested in XFMail? New XFMail home page [slappy.org]

  • Well, ever heard of *recompiling*?

    Yes.
  • Microsoft works. Some think they are evil, some don't.. I don't care personally. Microsoft WORKS.

    Define "works". I'm using Windows 98 on a souped-up Dell PC as my workday computer, and several times a day I deal with crashes and lock-ups. Even simple tasks like writing a document in Word are so prone to crashing that I hit File-Save every few minutes as a hedge against my operating system.

    They might not be totally uniform (WinNT->95/98/2000). But they are generally 90% compatible and uniform. They allow people to create programs which run everywhere (Windows is closest to everywhere). Write once.. run everywhere.

    While that is Microsoft's definition of "write once, run everywhere," it isn't shared by many people outside of the company. A Visual Basic program compiled on Windows 98 isn't going to have much success on a Linux system or a Macintosh. On the other hand, I'm executing the same Perl script on a BSD system and my Windows 98 machine.

  • Actually, Linus said that you can call it GNU/Linux if you want. I don't know where you got that "the official name of Linux is GNU/Linux" crap. Give me a URL or retract it.
    -russ
  • OK, that subject sounds like FUD, but it would make one hell of an attention-grabber if used as a headline for an article. :)

    I believe that a strong argument can be made that the oft-reviled fragmented nature of Unix was one of the driving factors behind the emergence of the "share and share alike" Unix software culture which many of us have enjoyed long before the term "Open Source" was invented.

    (I would also list a second driving factor, namely the fact that the development of Unix was so fundamentally tied to the development of the Internet, and that Unix users therefore had a means to form a close-knit community from the start.)

    For instance, compare MS-DOS. Why did the "shareware" concept take over on that even more prevalent platform and not on Unix? Why did commercial software become the norm on DOS, while we Unix users were used to the fact that whatever we really needed, we could find "out there"?

    Because developers could get away with spreading their apps as binaries, that's why.

    Binary code as a means of software distribution would never have worked during the early days of Unix, when almost every single installation was so highly tweaked by its local operators as to be a flavour unto itself. If you wrote something cute, you could only spread it as source. Or keep it to yourself.

    Even in the early 90's you had to be the size of a Netscape Corp to be able to develop your app for several flavours of Unix simultaneously and distribute the binaries for all of these. Show me the home developer who has a Sparc, an Irix box, an HP workstation and an AIX box sitting on his desk.

    If my argument makes sense, then it raises the question of whether the emergence of One True Unix (read: Linux) won't have a potentially very negative effect on what is now called Open Source software.

    If it becomes easy for anyone to spread a Unix (read: Linux) app in binary format, won't we see the greed factor (profit motive?) taking over and commercial apps (or shareware or some other form of binary distribution) become the order of the day?

    Or is the open source genie out of the bottle once and for all? Will the community factor mentioned above be enough to prevent this from happening?

    Just wondering...

  • -I won't run , because advocates are jerks.
    +I won't run (os), because (os) advocates are jerks.

    Tried to use angle brackets. Arrgh.

    --
    Interested in XFMail? New XFMail home page [slappy.org]

  • Why is it that having a dozen vendors shipping products which are only mostly compatible is worse than a single vendor shipping a product utterly unlike anything else? I'll grant you that a dozen vendors being only mostly compatible isn't as good as a dozen vendors being completely compatible, but that wasn't the choice.

    No, the "Windows Invasion" had nothing to do with Unix's balkanization (nor would a lack of balkanization have averted it). Instead it was a consequence of IBM granting Microsoft a monopoly on the PC OS market, a decision no one in the Unix world had any control over. The Unix balkanization problem was raised after the fact as an excuse ("it's not Microsoft's fault, really!").

    If balkanization is a problem, then Windows has it as well (NT/2K, 95/98, and CE being three _different_ and only mostly compatible OSs). And thus it is primed for an (un-balkanizable) Linux invasion...
  • ...who said he expects 3 Unix to survive, Solaris, Linux, and something other than IRIX. Sounds pretty clear to me that its long term future has been de-assured.

    --
  • My redhat 6.0 boxes use 2.2.5-15 SMP. It's 2.2.5 with some ac patches. I find it very weird that several people seem to think redhat maintains their own "proprietary" kernel or something. Huh? If anyone thinks they have real evidence of this, please come forward with it.
  • > [...] why don't *you* write an intelligent piece espousing *your* point of view and send it to me?

    OK. Chances are it won't be covered by Slashdot, but it'll make me feel better to write it.

    -Sam
  • "a pathetic weenie like Torvalds"? All right, Mr. Eponymous Troll... let's see that whiz-bang kernel you've got there. I'm sure it's a thousand times better than anything some "dime a dozen" developer could put out in his spare time. Put your code where your mouth is.

    Why should Linus be the only one to decide what goes into the kernel? First of all, because he started the whole shebang. Second, because he's done a pretty good job of it so far. Third, because he has no particular axes to grind, like the hundreds of ppl from HP, SGI and IBM who could "think him under the table". And finally, because you don't have to be a f*cking Einstein to realize, "Hey! JoeDeveloper from HP just submitted a patch which triples the performance of the scheduler. Should we add this to the next release?" All it takes is a reasonably intelligent individual who knows what they're doing, has their priorities straight and their head screwed on right. (IMHO, I'd say Linus qualifies)

    Heaven help us if these decisions were made by committee....
  • Obviously you did not read his paper carefully. Nobody stepped up to take his place so here he is. That was the whole point of his "take my job please" paper. The moral of the story is if you don't like how he advocates open source then do it yourself.

  • May Linux be harming the 'unification' of unix?

    First of all, Linux seems to be pushing the commercial brands of Unix out of the x86 market. SCO is in trouble, and I wouldn't be surprised if Sun stops supporting Solaris86. Linux does not seem to have much effect at all on the sales of Windows NT (correct me if I'm wrong, and please include some links in between your insults to back it up).


    Besides that, the avalanche of media attention Linux has been getting lately, in combination with the Halloween document (am I the only one who suspects MS may have leaked this to ESR on purpose?) must be greeted with cheers by a certain company that is currently in court trying to convince the US government that it does not have a stable and untouchable monopoly in the OS market. Since none of that company's direct competitors seem to be getting any richer thanks to Linux, it is probably not seen as a real threat...


    Of the available unix variants, Linux seems to be one of the 'strangest', least standard (and perhaps least compatible?).


    If Linux really does unite the unix world by simply replacing all others, I very much doubt that regular users of HP-UX, Digital Unix (erm..), IRIX, AIX, Solaris etc will see this as an improvement.


    SCO, IBM, Sun, HP, SGI and others must be supporting Linux, one of their own potential competitors, for a reason. My guess is they are so afraid of Win2k-on-Merced that they will support anything that may slow Windows NT even a little, and are quite happy with the successful FUD campaign against NT by the Linux community.


    Telling everyone that something good (linux) is actually the very best thing that has ever happened in the history of the universe may eventually make it look like a disappointment.
    ---
  • ESR is right in that the huge number of *nix variations are slowly being abandoned. Over the years there have been hundreds of *nix variations, and it got to be ridiculous to try and support an application on more than a few of them.

    It's a good thing the *nix vendors realize there is more money to be made in service and support, rather than tricky features and special proprietary hardware. As more of them are absorbed by the OSS model, they realize exactly where the profit comes from and focus on it.

    It would be a bad thing if there were too few *nix variations, as many knowledgeable slashdotters point out whenever there is a melissa style virus sweeping thru the media. If there were only 10 or so variations of *nix just like there are only 10 variations of Windoze, then an exploit could hurt many more people with less effort.

    I doubt there will ever be only 1 version of unix in the future, but it would be nice to see no more than 20 or 30, with most of them touting their adherence to a common standard for libraries and structure.

    the AC

  • I am amazed at all the Microsoft FUD. Microsoft has what, now, 30 people who do nothing but read Slashdot and spread FUD?

    I work for a software vendor. We make commercial software for Linux. It works on all distributions. It ain't pretty to make it work on all distributions (we basically have to distribute a statically linked copy, along with a dynamically linked copy in order to comply with the LGPL), but so it goes. We run our entire internal infrastructure off of Linux, and our developers have various Linux distributions (heck, my desktop is FreeBSD!). WordPerfect and Applix are our internal word processor and office suite, and both work on every machine in our office, even on my FreeBSD box.

    In other words, we're talking pure FUD. Yes, it takes a bit of care to make your software work on all Linux distributions, but a commercial vendor can do it without much problem. WordPerfect does it. Applix does it. BRU does it. If vendor X doesn't do it, that's vendor X's problem, not Linux's.

    -E
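    The static-plus-dynamic trick described above can be sketched as follows. The file names are made up for illustration; the point is that -static folds the C library into the binary, so the shipped copy no longer depends on whichever shared libc the target distro happens to have:

    ```shell
    # Sketch: building both a dynamically and a statically linked copy
    # of the same (hypothetical) program, as the vendor comment describes.
    cat > hello.c <<'EOF'
    #include <stdio.h>
    int main(void) { puts("hello"); return 0; }
    EOF

    gcc -o hello.dynamic hello.c          # depends on the host's shared libc
    gcc -static -o hello.static hello.c   # carries its own copy of libc

    # The static binary runs even where the shared libc differs:
    ./hello.static
    ```

    Shipping both copies, as the poster describes, covers distros whose shared libc matches (the smaller dynamic binary) as well as those where it doesn't (the bulkier static one).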
  • ESR's comments here truly outdo the negative attitude seen following the RHAT IPO. In fact, taken in comparison with suck.com's article a few days back, they show how the community is (or at least should be) reacting to the IPO vs. how the rest of the world views what occurred in the community during the past 2 weeks.

    He concisely addresses the whole "shareholder demands" argument by showing that these publicly owned companies are seeing the advantage in adding to the Unix codebase via the Linux community.

    I argued the other day (in response to the suck article) that shareholders outside of the community don't mean squat in the matter of development. And this is precisely due to the way that linux evolves. However, I do believe that shareholders within the community now realize the importance of their contributions since it breaks down monetarily.

    Finally, I believe that the end result, once we've looked past the IPO, will be more of the same. And this is good. The group that did not get the letter will still (hopefully) continue to contribute. Some naysayers say the contributions will be due to the promise of tomorrow's IPO and this may very well be the case for some. But I say the contributions will continue since people enjoy contributing.

    If RedHat or any of the other companies must develop something to meet the demands of shareholders, then the product must also meet the demands of the community for two reasons.

    1. It must be useful for the community for our own reasons or adoption will not occur, causing the doomed fragment to be weeded from the standard Linux distribution.

    2. It must be well developed within the community or someone will be compelled to develop something else to compete. And the competing project may indeed have an advantage simply due to the "anti-establishment" vibes that are prevalent within our group.

    Well, I'm glad to see another article in which I can agree with ESR. Sometimes they seem few and far between.

    And that's my whole take on things.

  • Hey, dude, you need to read the FUD 101 HOWTO [tripod.com], you're doing a lousy job!

    You did okay on FUD Method #1 (exaggerating weaknesses) with the "mediocre device support". But you need to apply some FUD Distraction Methods for FUD #2 (outright lies) where you state "no meaningful GUI", since Linux has at least two meaningful GUIs (GNOME and KDE). I suggest that next time you try more extensive "Sandwiching" (Distraction Method #1), preferably by using FUD Method #1 to attack various attributes of those GUIs. Same goes for the 'ages of cruft' and 'so-so performance'; you really need to use some distraction methods to make your FUD stick. If you're stuck on how to do that, go to Microsoft's very own "Linux is a poor value proposition" [microsoft.com] page, which is a masterful blend of FUD #1 (exaggerating weaknesses), FUD #2 (outright fabrication), and FUD #3 ("spinning" a strength as a weakness).

    Sheesh, how much is Wagged paying you anyhow? Whatever it is, it's too much, 'cause you're doing a LOUSY job of FUD! I know you can do better, after all, your firm did an excellent job on the "Linux is a poor value proposition" page...

    -E

  • Go to http://www.estinc.com

    Yes, our shrink-wrapped commercial software product will run on every commercially available Linux.

    I am happily running my 1997 vintage Applix Office on SuSE, Red Hat 6.0, and FreeBSD with no problems. So it's not just BRU that runs pretty much everywhere.

    Next FUD, please!

    -E
  • "Linux is fragmented moreso than any commercial flavor of Unix ever will. (snip) We have GNOME and KDE (need I say more?)"

    Umm... yes, you need to say more, as I don't understand what you're getting at. KDE and GNOME, while different beasts, interoperate pretty damn well. I recently installed GNOME on my (previously) KDE system and had no problems at all. KDE software ran very well under GNOME (including the KWM!). If you prefer to code for KDE, use Qt; if you prefer GNOME, use GTK. Software written for one will run under the other.
  • Yeah. My only real grief with the dependency tree right now is with Mesa, GGI, and all these other things where dselect keeps on insisting I want mesag3+ggi when I don't. It seems such a foreign concept to it that since I have GGI installed and I have MesaG3 installed, I must therefore want MesaG3+GGI instead of plain MesaG3 (which, of course, then breaks other stuff, such as xscreensaver and other things that depend on MesaG3).

    Why the G3, anyway? I've never quite figured that out... I thought it was called 'Mesa' or 'Mesa3D'. It's at v3.x right now; maybe that has something to do with it.
    ---
    "'Is not a quine' is not a quine" is a quine.

  • I'd have to disagree with that. The main allure GNU/Linux has for me is that it's free software. On its technical merits alone, I don't like it. I've used it before (Slackware 3.0), and I was not impressed. X especially is pretty shitty for a windowing system, unless you need to do networked stuff (which I don't). The whole "configure everything by editing textfiles" thing doesn't impress me either.
  • Well, the main reason I was going to use Linux in the first place was because of its advocates. The GNU project and free software seem like worthwhile causes. If I ignore the advocacy from both sides, then I really don't have a need to switch. I don't need extensive uptime, and this box isn't a server.

    As for X being only a protocol, yes, but it's also the foundation of the whole windowing system. Many of the problems are caused by X itself, and the various window managers try kludges that sometimes work around them, but usually only partially. Even something as basic as cut and paste proves problematic in X, while a 1984 Macintosh can cut and paste between apps without any problems.

    Lots of other things seem strange, nonintuitive, or just downright dumb to us Windows users. Why can't you configure X within X itself? What's with the separate XF86Setup? Why do you have to run XF86Setup to run X? Why doesn't X have good auto-detection routines and some decent defaults so you only need to run XF86Setup if you wish to further customize X? Why is installing a new kernel an 8-step process? Why isn't there a decent archiver (one that lets you extract a single needed file, like zip, rar, arj, zoo, ace, etc., rather than tar.gzip which requires you to extract the whole thing)? Why isn't there a decent simple text editor? (no, pico doesn't count, and if you consider vim "simple" you're insane) Why is ppp so damn hard to configure?

    I can think of a few more, but that's enough for now.
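    For what it's worth, GNU tar can list an archive and pull out a single member without unpacking everything (though it still has to read through the compressed stream to find it). A minimal sketch, assuming GNU tar and gzip are installed; the file names are made up for the demonstration:

    ```shell
    # Build a small .tar.gz to demonstrate with
    mkdir -p demo/sub
    echo "just this one" > demo/sub/wanted.txt
    echo "lots of other stuff" > demo/other.txt
    tar czf demo.tar.gz demo
    rm -rf demo

    # List the members without extracting anything
    tar tzf demo.tar.gz

    # Extract only the one file you need
    tar xzf demo.tar.gz demo/sub/wanted.txt
    cat demo/sub/wanted.txt    # -> just this one
    ```

    The other members (demo/other.txt here) stay inside the archive untouched.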
  • SGI IS dropping Irix on many of its platforms. Instead, it will focus on hardware platforms in the workstation market, and on graphics chipsets in the PC market (SGI's real strong points, IMHO). That does have a unifying effect on the Unix world.

    On the big iron, SGI is staying with Irix. That's also a good call. Making Linux ready to go for those platforms is still several steps away.

    I certainly DON'T want SGI to go away. XFS and other things for Linux are all GOOD THINGS! I also hope my next machine has SGI graphics chips and bus architecture.

  • "All Linux distros use the same kernel."

    No they don't. Red Hat Linux distros use a custom Red Hat kernel. Usually, it's an older kernel than the current latest "stable" kernel, but with some of the newer features and bugfixes added in, to make for a truly stable kernel (the "stable" kernel tree itself is somewhat of a misnomer).
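    As a quick check, the running kernel's release string usually gives a vendor kernel away; a minimal sketch (the example version strings are illustrative):

    ```shell
    # Print the running kernel's release string. Vendor kernels typically
    # carry a packaging suffix after the upstream version (e.g. "2.2.5-15"
    # on Red Hat 6.0, versus a plain "2.2.5" built from ftp.kernel.org).
    uname -r
    ```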
  • by Trepidity ( 597 )
    I'd say ESR is one of the main reasons I'm still using Windows rather than Linux. Articles like this, filled with half-truths, omissions, and outright lies are what's kept me away. Of course, ESR isn't the only guilty party, much of the Linux "community" behaves likewise.

    IRIX is not being dropped, nor is it being replaced with Linux. IRIX is still being supported and developed for SGI's high-end servers, which Linux cannot, and most likely will not, run on. Linux is for low to mid end computers, not enterprise-class servers. That's what IRIX is, and will continue to be, for.

    Linux is not "re-unifying" UNIX. There are still many different fragments of UNIX, ranging from Linux to FreeBSD to Solaris. The various BSDs seem to mess up ESR's arguments, so he just omits them. Typical.

    Anyway, RMS's writings had almost convinced me to switch to Linux. Bruce Perens has done a good job as well. Unfortunately, the rest of the Linux community, along with ESR, has done the opposite. That, and the fact that I REALLY dislike X, is going to keep me in Windows, at least until I get some spare time to install FreeBSD.
  • ESR really needs to check his facts before he goes spouting off.

    That's what I've thought after nearly every single article of his I've read. Apparently he'd rather generate good PR than be accurate and truthful.
  • I wasn't implying that the Red Hat kernel was different because it was a different version. It's not even part of the Linux kernel development tree. It's not *any* kernel that you can find on ftp.kernel.org, but a custom kernel of their own, that differs not just in how it was configured, but in the code itself.
  • I wrote a comment on this at LinuxToday too, and I don't want to duplicate the effort, but I think it's important to point out that we will see some very natural fragmentation in the community: the fragmentation that occurs when developers realise that Linux isn't bleeding-edge anymore and go on to work on something else, which in time will probably replace at least the Linux kernel.

    Eventually, the Linux kernel will be kept alive by corporations that have an interest in it because they can make money off it. These companies might be working together to reunify Unix, but we'll see some fragmentation between the companies and the bleeding-edge hackers. And I think we'll see this very soon.

  • I wouldn't recommend switching operating systems based on what members of the "Linux community" (whatever that is) have to say. It seems to me to be a pretty dumb way to choose an operating system.

    What I did was install Linux (many different times). When it got to the point where I found it more useful than OS/2 (around 2.0.29), I switched.

    If it works for you great. If not, oh well. But, really, when it comes right down to it, who really cares what ESR, RMS, or anyone else has to say? (Don't get me wrong: I think they have interesting, valuable things to say; I just mean that their writings don't have much effect on your productivity.)

    Put another way: I don't use Windows, not because of anything Gates or Ballmer have to say, but because it's a steaming pile of dog shit.

    --
    Interested in XFMail? New XFMail home page [slappy.org]

  • Open Source is anything but central control. It means nobody can force a single vision on the market, anyone can branch out at any time. Control is totally given over to market forces (i.e. the users).

    However, as long as the individual needs of a majority of the users are better served by options in a single development tree, that is what most users will get. When the users are better served by divergent trees, that will become more widespread.

    That is the difference between free software and proprietary systems. With free software, control is in the hands of the users, including control over when to fork the project. With proprietary software, control is in the hands of the company owning the software.
  • ...has nothing to do with configuration of packages. It's a nice (if sometimes aggravating with certain packages on the bleeding-edge release) "GUI" frontend to the various package management tools, namely dpkg and apt.

    Thus, your response had nothing to do with the message you were responding to. :)
    ---
    "'Is not a quine' is not a quine" is a quine.

  • I suspect the smarter technology press journalists follow /. closely. So memes originating here might end up in the press. Especially if your name is ESR.
  • I can't figure out how to install a new kernel in Red Hat -- the standard process appears broken, and I'm not interested in working out a way around it.

    This is what worked for me:

    make xconfig              # choose kernel options with the X-based configurator
    make bzImage              # build the compressed kernel image
    make modules              # build the loadable modules
    make install              # install the new kernel image
    make modules_install      # install the modules under /lib/modules/<version>
    xemacs /etc/lilo.conf &   # add an entry for the new kernel
    lilo                      # rerun LILO so it picks up the change

    YMMV. HTH. HAND.

    --
    Interested in XFMail? New XFMail home page [slappy.org]

Professional wrestling: ballet for the common man.
