Let's Make UNIX Not Suck
The above is the title of the talk that Miguel de Icaza, of GNOME and now Helix Code fame, gave at OLS concerning the look and feel of the UNIXes. From what I've heard from attendees the talk was great - and now you too, in the privacy of your own home/cube/lean-to/car, can read it.
Re:What's this got to do with UNIX? (Score:3)
But Unix doesn't use those things enough! The philosophy hasn't carried over to the graphical applications, so we have a schizophrenic Unix where the little text tools try to do one thing well, and the GUI applications are monolithic and try to do everything.
If you like the idea of building a specialized text-processing app by combining some generic tools (e.g. grep, awk, sed, etc), then wouldn't you like to be able to build a specialized GUI app the same way (e.g. HTML renderer, image editor, calendar, etc)? Don't you see a difference between the beauty of grep and the disgusting bloat of Netscape Communicator?
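A concrete (if toy) illustration of that composition style - a word-frequency "tool" assembled entirely from stock filters, with nothing invented beyond the sample input:

```shell
# Build a "most frequent words" tool out of generic parts.
top=$(echo "the quick fox and the lazy dog and the cat" |
  tr ' ' '\n' |   # one word per line
  sort |          # group duplicate words together
  uniq -c |       # count each distinct word
  sort -rn |      # most frequent first
  head -3)        # keep the top three
echo "$top"
```

Each stage does one thing well; the specialized behavior lives entirely in how they are glued together.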
---
Insufficient Coffee (Score:2)
Insufficient Coffee in Operator - System halted.
Sorry, got the band name wrong. That should read "Great Big Sea" (I always get it wrong), and the song is on their album "Up".
Re:Proposal: Linux Unified Model (Score:4)
The point is that there needs to be some sort of policy that is loose enough that anyone can use and understand it, but rigid enough to enforce a metaphor that works and is understandable by non-computer-clueful people.
Bonobo not GNOME Dependent (Score:2)
Check out the full description at
http://www.helixcode.com/tech/bonobo.php3 [helixcode.com]
Just to quote from that page if you can't be bothered:
Bonobo is a set of CORBA interfaces that define the interactions required for writing components and compound documents. These CORBA interfaces are not bound to GNOME, the X11 windowing system, or the UNIX system.
The Bonobo distribution as shipped by the GNOME project includes the Bonobo CORBA interfaces, and a GNOME/Gtk+-based implementation of these interfaces.
Re:First make GNOME not suck (Score:2)
TM'd title (Score:3)
Posix systems aren't really aimed at beginners - that's what people keep forgetting. They were designed for use by people who know what they are doing and how _they_ want to do it - not the way a Redmond-based drone wants them to...
Richy C. [beebware.com]
--
Re:TM'd title (Score:2)
No disagreement here -- I just don't understand why we have to reinvent the text file layout for every application, and why some of the basic functions (like cut and paste) can't be described in a single text file, like the win.ini file of yore.
Having a binary registry with everything dumped together is the most colossal mistake of the entire Win32 system these days, but that's not to say some values shouldn't be centrally configurable (and if you had a standard XML or whatever format, you could easily make full-featured config editors!).
Note that due to the standard file layout of the MS
I'm an investigator. I followed a trail there.
Q.Tell me what the trail was.
Re:TM'd title (Score:2)
Microsoft does not (and does not want to) sell powerful and flexible computers/interfaces. They want to take advantage of the fact that computer hardware manufacturers sell a universal Turing machine to allow themselves to sell black, inflexible, easy-to-use boxes (like word processors, web servers, web browsers, spreadsheets, etc.).
This is just like a TV company saying "You can buy a computer with a large monitor cheaper than our TVs, so I'll sell you a computer with a large monitor and software to make it act like a TV to save money."
Anywho, the moral of the story is that black boxes and unnecessarily specialized systems are bad. The power of a computer comes from the fact that it is not an unnecessarily specialized system... and a powerful user interface must inherently be a programming language. Now, it may be an easy to use and learn programming language (like that scripting language Macs had), it may even be possible for a really stupid person to use a computer for years without noticing its additional abilities, but it must have the full power of a Turing machine.
Note: I'm not talking about having a scripting language alongside the user interface. I'm talking about the very underlying user interface being a scripting language. This is necessary to make the transition for people from user to programmer as painless as possible.
Anyway, it's only when everyone has some limited programming instincts that people will not waste time doing the same thing over and over again. This should be the goal of user interfaces, and Unix got a nice start on this goal (via shell scripting), but it's necessary to expand this today to include GUI user interfaces too.
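A minimal sketch of that "shell scripting gets you started" instinct (the files and scratch directory here are invented for illustration):

```shell
# Lowercase every .TXT filename in a scratch directory with one loop,
# instead of renaming each file by hand.
d=$(mktemp -d)
touch "$d/README.TXT" "$d/NOTES.TXT"
for f in "$d"/*.TXT; do
  base=$(basename "$f" | tr 'A-Z' 'a-z')  # lowercase the name only
  mv "$f" "$d/$base"
done
ls "$d"
```

The same instinct - notice the repetition, write the loop once - is exactly what GUI interfaces should also make possible.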
Re:UNIX's "problems" are really flexibility (Score:3)
Bollocks, pure and simple. The first Real-World deployment of Unix (i.e. outside ken's lab) was to the secretaries in the patent department of AT&T to enable them to write up patent applications, hence the emphasis on text-management and typesetting software in the early releases of the Unix system.
As for Unix being intended as a server OS - quick reality check required. Unix dates from the late 60s/early 70s. The client-server separation was fifteen years in the future - back then there were no clients. Unix was designed to support multiple terminals (tty0 - n) hanging off a central computer (originally a PDP-7, IIRC) which displayed plain text in black and white: termcap files, full-screen editors and the like didn't arrive on the scene until the mid-70s and prior to that people edited with ed.
Get yourself a copy of the Unix Programming Environment and read it, then come back and post an informed comment.
--
Cheers
Re:Policy and lack thereof (Score:2)
There are all sorts of things that you can expect to be able to do on any Unix-like OS. You expect X-Windows, NFS, vi, etc. Yet at one time or another most of the Unix "standards" had some sort of competition.
Miguel is proposing to standardize Linux (and Unix as well) simply by creating Free software that is so cool that it becomes ubiquitous. Instead of writing their own HTML widget, XML parser, Address book, text editor widget, etc. programmers will simply borrow the GNOME widgets. Programmers will do this because GNOME is free, comes with source code, and because GNOME can be installed anywhere.
GNOME will simply become one more layer in the existing framework that is UNIX. Some people won't use it in much the same way that some Linuxers don't use X-Windows. But most of the new software will happily use these shared components in much the same way that they happily use the existing shared Unix libraries.
UNIX trademark (Score:3)
Re:TM'd title (Score:4)
No, people keep forgetting that everyone is a beginner at some point.
Making a system good for beginners does not mean crippling it for advanced users. I think we should be past the point where complexity for complexity's sake should be attractive...
Re:Component systems and versioning (Score:2)
Saying that CORBA's interface versioning solves this problem is like saying that a hammer solves the problem of nails. It's a tool, valuable to the extent that people know how to use it properly by making sure that the interface definition accurately and completely describes the behavior of a component and that the "actual interface" doesn't change without corresponding change to the "official interface" - something that doesn't just happen on its own. Arguably, what you really need to solve this problem (and, even more arguably, what some languages or programming models give you) is a way to ensure that all behavioral dependencies are captured in the interface definition and that nobody can even begin to depend on something that could change.
Re:Great Article... (Score:4)
There has to be a way that a system can serialize data, provide access control to serialized objects, and store those objects. This is what Linux is missing in a big way.
Components are cool, but if we are going to add components, let's add a consistent model or policy, and flexibility. Let's also make it so that there is a common configuration interface, a block or object interface, and some hierarchical or non-hierarchical way of looking at it.
*Smack* (Score:2)
So what's the problem? What DOES suck is the attempt to make Unix into a friendly operating system. Great.....IF you know what you're doing.
The Microsoft Windows 95 interface is successful because they had Human/Computer interaction experts working to design the interface. (Don't start with the 'Microsoft copied Apple copied Xerox' crap. Each interface is different enough that SOME design went into it.) The problem here is that we have the Gnome people, and the KDE people, etc., trying to make a Graphical Interface - and they're trying to be too much like the other interfaces out there, and the rest of the design is just being assumed.
UNIX doesn't suck. It was designed to be a powerful network operating system - and that's exactly what it is. Sure, 10 years down the road it's going to be different. However, the basic functionality will be the same.
To say an Operating System sucks because of bad design of an additional interface is ignorant.
-- Give him Head? Be a Beacon?
Re:UNIX's "problems" are really flexibility (Score:2)
``Unix Sucks'' was not the title! The title was ``Let's make UNIX Not Suck''. I don't think there really could have been a better title.
--
Ski-U-Mah!
Let's make UNIX suck like Windows (Score:2)
Code Reuse was the mantra in the eighties, not the Next Big Thing for the year 2000. There are obviously some cases where it makes sense, but it's not a cure-all for derailed software projects. Code reuse is well suited to projects with a short development cycle, a short life cycle, and a small user base. Most internal IT applications fall in this category. But it doesn't make much sense for projects with ongoing, constant development, long life cycles, and millions of users. This is where fine-tuning "custom" code makes sense, despite the current aversion to hand-written code by people who consider themselves to be knowledgeable about such things. Every re-write is an opportunity to improve the code. Software reuse effectively eliminates this opportunity, and in time leads to an obsolete and brittle code base. Code reuse is a part of effective software development, but it shouldn't be the central tenet around which it revolves. That distinction belongs to good design and well-written code.
The 'no policy' policy of X is a Good Thing, in my opinion. There are some advantages to having a consistent UI, but they are outweighed by the disadvantages. A certain amount of chaos always accompanies innovation, but chaos is better than being subjugated to an inflexible policy. There are always going to be people who want to force their ways of doing things upon others, and UI policies are a great way for them to accomplish this. The best policies are the ones that are voluntarily adopted by large numbers of users over time. If it works for that many people, then there's probably something good about it. But if there isn't consensus in a particular area, trying to enforce policy is not going to meet with much success. MSFT gets away with this because they have a more or less captive user base.
Miguel has a few good points here, but overall this article could be more appropriately titled "Let's make UNIX suck like Windows".
Re:First make GNOME not suck (Score:2)
That is a misfeature of RPM then (and other packaging systems which have the same problem). Dependencies should not be an either/or thing -- different versions of libraries can and do coexist just fine. I agree that -U should not nuke old libraries by default, but an alternative would be --install --force, whereby both library packages are in place. Of course, one must use this with caution, as many packages include other files (headers, config files, etc.) with libraries, which IMHO is also a mistake. Libraries are LIBRARIES, header files and config files are something else. The -devel scheme tries to address this and (mostly) succeeds, but a more explicit approach might be better:
package-0.0-1.bin.rpm
package-0.0-1.conf.rpm
package-0.0-1.lib.rpm
package-0.0-1.devel.rpm
Whatever scheme is used, I agree it is a mistake (and disingenuous to some degree) for popular packaging systems to actually undo one of *nix's tremendous strengths: library versioning and peaceful coexistence.
Re:What's this got to do with UNIX? (Score:2)
The same could be said for StarOffice, which is a big bloated mess, but can't share its code.
The real future of Linux is looking towards things like Mozilla and galeon, where galeon can reuse the code from Mozilla to create a small browser.
Re:Missing the point... (Score:2)
The primary difference, in my eyes, is that with open source, there's a chance of breaking this cycle. If people can be persuaded to support a larger project, or combine efforts when their applications are similar in purpose, then they can learn from each other's mistakes, and come up with a much better product. And, the little projects that fork off from the "core" codebase sometimes evolve into something that's better suited to a particular niche.
Would a tool like thttpd still exist if it had been written exclusively for Windows, and the source had been closed? Sure, Microsoft might have simply bought the rights to it, if users wanted that functionality, and rolled it into the IIS codebase. But then it would just be another checkbox on the same monolithic monster, instead of a tiny, hella-fast "ace in the hole" for certain applications.
I would hardly say that Windows programmers don't "recycle shit that's already written," since they largely put together their projects out of Microsoft-supplied code. The Windows platform basically forces you to use the "blessed" libraries. That may help to limit duplication of effort, but it also limits your choices.
Some real examples please? (Score:2)
Your examples of software projects that fail to have code reuse are mostly (or all, I'm not sure) proprietary non-free projects?!?
I've seen studies on the amount of code reuse among free software projects, and it's quite surprising how much is reused - from libraries you don't mention, like readline and regex, to small bits and pieces here and there.
"The only common denominator on those applications is libc and Xlib."
Perhaps you aren't aware of a little program called `ldd`. Try this:
ldd /usr/bin/* 2>/dev/null | grep "=>" | sort | tee /tmp/ldd.1 | cut -f 1 -d " " | uniq > /tmp/ldd.2; for i in `cat /tmp/ldd.2`; do echo -n "$i : "; grep -c "$i" /tmp/ldd.1; done
That's what my "computing experience" is about, and it's not something I would ever want to try with a drag and drop interface.
Re:TM'd title (Score:2)
Proposal: Linux Unified Model (Score:2)
Instead they should be looking at a unified policy. The reason that Linux will never rule the desktop the way things are is because Microsoft is simple. You want to change a setting for the system? Go to the control panels. To change a setting for an application, go to Options->Preferences.
I am not saying that these are the options to use, but instead imagine the following. Envision something like the
Better yet, do it in RDF and let any arbitrary program write to the RDF tree like a filesystem. For example, a translator could map the filesystem into something like
Use translators to set options for apache etc. Use mozilla or IExploder to set your configuration. You can probably serialize objects into the RDF tree and bingo, you have a poor man's DCOM and object sharing. Wouldn't it be cool if you could write something like this in perl
#!/usr/bin/perl
use UnifiedModel::RDF;
my $print = new RDFNode("/world/www.ahost.com/objects/default_pri
$print->isA("Printer") || die "Could not open\n";
$print->print($text);
Why should we do this? By having a loose policy implemented via RDF or something, we have a single simple policy for storing information and a powerful way to share components.
I am seriously considering coding something like this. Anyone else interested?
Re:Wanted: Killer Apps for World Domination (Score:2)
>Sorry, but you've got no defense here. Balsa, Mutt, even emacs will read mail. Gnome folks are even building an Outlook clone.
>4. Editor. Uhh, I use vi and emacs when there is absolutely, positively, nothing else available. Don't get me wrong, I first learned emacs over 8 years ago. But there are some basic functions which I rely upon that don't exist in emacs. Give me something like HomeSite on a linux box and you've got a convert.
Try Screem.
That's what I liked about the redhat distro. It came with Pico and Pine. 2 great tastes that taste great together. If Corel Linux was made for native win95 users with its fancy interface, why were Pico and Pine left out? Those to me are the easiest to use console mail/editor apps I have used since 1993.
Just my 2 cents.
Re:Missing the point... (Score:2)
Middleware (Score:4)
What he's really pointing out is UNIX doesn't have a modern middleware layer.
The history of modern middleware begins with Visual Basic "buttons", which were invented by Alan Cooper. (Microsoft bought this technology; it wasn't developed there.) Visual Basic made it possible to write medium-sized business applications with graphical user interfaces without much pain. Code reuse worked well in that environment, and it was easy to access a database. You could have the good programmers write a "button" and let the lesser programmers drag and drop the button into their app. This was a major driver in moving corporate America off the green-screen IBM mainframe terminals and onto Windows.
Programmers were pushing the "button" idea way beyond its intended uses. So Microsoft expanded it into COM, DCOM, and ActiveX. It turned into a huge proprietary do-anything object system. But a lot of work gets done with that toolset, even though it's become ugly.
de Icaza has correctly identified the lack of a comparable middleware system as a serious problem in the UNIX world. Whether he can fix it remains to be seen. It's very hard to get this right. The past is littered with failed middleware environments: OpenDoc and NextStep come to mind.
A big problem is that if you let the object fanatics design the thing, it ends up too abstract and complex. If you let the UI designers design the thing, it ends up not powerful enough. CORBA and Prograph are at opposite ends of the spectrum here. (If you let the hackers design the thing, it ends up like Perl. Perl, remember, started as a tool for reading text logs. It's a special-purpose language pushed way beyond its design basis.) This requires really good engineering judgement.
For some insight on how to make design decisions here, read Weaving the Web [amazon.com], by Tim Berners-Lee (you know, the guy who invented HTML and the Web), page 182, where he discusses why HTML isn't a programming language like TeX. He says it better than I can, and I'm not going to repeat him here.
I hope de Icaza can pull it off. From reading his article, he has the basic good sense needed to get it right. Best of luck to that project.
library dependencies (Score:3)
This is exactly one of the reasons UNIX is in so much trouble -- even something simple like the concept of shared libraries (at least in practice) is a relatively new thing in much of the UNIX world. It's not very well understood, let alone welcomed.
People bitch and moan about having to deal with getting a whole bunch of libraries because of runtime dependencies. Deal. It's the cost of componentizing software, and it's not like there aren't tools that can manage the complexity for you (e.g. apt-get).
And, for those of you who don't think you need shared libraries, try replacing all the binaries on your system with statically linked versions and see how much disk space you have left...
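A rough way to see the point on a live system (a sketch; the exact count will vary by machine):

```shell
# Count how many programs in /bin link against a single shared copy of
# the C library -- each would carry its own copy if statically linked.
count=0
for f in /bin/*; do
  if ldd "$f" 2>/dev/null | grep -q 'libc'; then
    count=$((count + 1))
  fi
done
echo "programs in /bin sharing one libc: $count"
```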
There is also a good deal of management flexibility and efficiency that you gain through using higher-level component systems, as well. I know the Unix pipeline has been offered as a viable model, but it's not really complete enough.
Unless you extend the role of the filesystem abstraction as in Plan 9, the UNIX file/pipeline metaphor is simply not a viable component model in and of itself -- if it were, every application could be written as a shell script (which you can mostly do in Plan 9, btw, even if it would be a bit slow).
There's a point where you just have to stop managing all of the complexity yourself, and delegate some of it to software. The untyped data streams of the Unix pipeline (without the additional policy/flexibility imposed by Plan 9) don't sufficiently allow that.
Failing a Plan 9-esque level of abstraction, you're going to have to settle for a higher-level layer like Bonobo to really function in a modern environment.
Re:First make GNOME not suck (Score:2)
Miguel
Re:Policy and lack thereof (Score:2)
For instance, the Open Group has required CDE as part of a distribution in order for it to be called UNIX. Although I personally do not like CDE, the use of a standardized window manager that allows multiple flavors of UNIX to look and feel similar is a necessity for its survival. It allows application developers to hit a target that is not moving; they may have to deal with some library issues, but they don't have to deal with customizing the UI to several different standards too.
At the same time we do need to keep the flexibility that *NIX enjoys. That is one of the reasons why I switched to Linux. But I have quite a few friends who want to switch to Linux and just want a standard GUI. They don't want to deal with which window manager, which application manager, etc... They have enough problems deciding what distro to pick. They don't want to take sides in the geek religious wars that are being waged; they just want it to work, and consistently.
Linux needs more standards, especially in the GUI arena. I say keep the religious wars in the background, but allow them to be accessible to those who want to participate. Decide on standards and then send them to the forefront. This will make companies trying to support Linux happier, it will allow for easier customer support, and will draw in a larger group of users.
UNIX's flexibility is linguistic and powerful (Score:2)
The biggest source of flexibility in UNIX is that everything can be manipulated with linguistic tools. If Microsoft ever ships a truly easy to use from-the-command-line scripting system that can easily interact with and manipulate COM objects, they'll have achieved much of the flexibility of UNIX. If they ship Perl/Python, then they'll have a language rich enough to generate its own hashtables and lists of data that can then be analyzed/treated as if it were a text file, and regexp's and the like will be available to work with their full COM suite.
I've spent almost five years now developing a GUI, object-based sysadmin project for UNIX [utexas.edu], and it has taken almost that much time to convince some die-hard UNIX traditionalists here that the very powerful consistency safeguards, error checking, privilege delegation and support for n-user simultaneous editing that it provides was worth giving up the ability to do grep on the passwd file.
I'm with Miguel all the way on this.. something like COM can mean *more* flexibility, as long as we have good scripting tools that make working at the higher level easy, and as long as the COM-style interfaces are designed with a *lot* of thought towards flexibility and security.
If we get COM-style interfaces that prevent us from doing things that the designer thinks we shouldn't do (such as setting a pre-hashed password for a user account, which NT doesn't provide an API for), then a COM would become a barrier. If the interfaces are designed to be as open as possible while still hiding implementation-specific details and providing safety and error checking, then a COM becomes a tremendous strength.
I sure wish Miguel would have talked some about security, though.
When I was a beginner... (Score:2)
Soon I was looking up man pages on various commands like vi, as I tired of the limits of pico. I was looking up various things on how to customize my shell and use it better. A little while after that I was installing Linux on my home system (this was a few months before kernel 1.0), and a couple months after I managed to get that working with X and PPP, I wiped out my DOS partition. I've been learning much more since.
I don't think I would have started out liking UNIX at all, if I didn't have some sort of guide like that character based menu, that told me exactly what was happening behind the scenes in the shell.
Something of this concept, perhaps in a GUI even, would be very good for beginners, and would do nothing to cripple things for advanced users.
Do we need both Gnome and KDE? (Score:2)
Re:Proposal: Linux Unified Model (Score:2)
(damnit Rob, i have the sentence "use CORBA" and that fucking lameness filter freaks.)
Re:First make GNOME not suck (Score:2)
I totally agree that GNOME, KDE, Linux, and Posix based systems should try to avoid being like Microsoft (with the exception of COM / CORBA type stuff). Actually, I would say that user interface developers should say "Microsoft did it this way, that means it must be bad, unless there is real academic research backing this idea up" (real academic research means research not being performed by the "Microsoft did it all right" morons who sometimes get tenure in CS departments).
Unfortunately, I do not see how the complexity you describe relates to that (except the "associate files of this type" complaint). I do not think I'm the only one, since all your other replies go to answering the library question without answering the associate-file question or the larger question of copying Microsoft. This means you really need to express yourself more clearly.
Anyway, I agree with you that there are many stupid things which GNOME (and KDE) get from Microsoft. They should really try to emulate a good user interface like Plan 9's.
Re: (Score:2)
Re:UNIX's flexibility is linguistic and powerful (Score:2)
Right, I agree that WSH is some pretty nice stuff, but so far as I know, Microsoft hasn't completed the picture by shipping out-of-the-box a Perl or Python (or anything even remotely similar) runtime that can be used from the command-line. I know you can write VBA, JavaScript, or VBScript code that interacts with COM, but until that capability is as ubiquitous as the gawdawful .BAT file handling, and doesn't require a GUI application, then all versions of Windows will be really limited.
I want to be able to ssh into a Windows NT system and run a script to interact with the operating system.. create users, synchronize account groups, and the like. Today I have to install a third party RSH/SSH service along with ActiveState's Perl and the Win32::NetAdmin and Win32::AdminMisc modules. It should be possible to script things without any GUI tools on NT today.
And .BAT doesn't even come close to being the same concept as what I'm talking about here, of course. ;-)
Re:Components at install level (Score:2)
Some very good points here. When I get moderation I'll remember to come back here and bring this up.
However, I'm not convinced that XML should be used. I would insist that the configuration file abstraction be one layer up, meaning any configuration format is allowed as long as the common parser can parse it into whatever the tree might look like in memory. Then you're at the same point you would be if the file were XML. IOW the memory model is the same but persistent storage is configurable.
I have looked at XML quite a bit. I haven't written code for it, but think about what a typical config would look like in XML. Try something as trivial as syslog and you'll see how ugly it can get. Not all configs look like they map into XML as well as apache.conf or smb.conf.
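To make the syslog comparison concrete, here is what one native syslog.conf line might look like in a hypothetical XML rendering (the element and attribute names are invented purely for illustration):

```xml
<!-- native syslog.conf equivalent:  mail.*  /var/log/maillog -->
<rule>
  <selector facility="mail" priority="*"/>
  <action type="file" path="/var/log/maillog"/>
</rule>
```

Several lines of markup for one tab-separated line - which is exactly the ugliness in question.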
KidSock
Re:Miguel is sadly mistaken (Score:2)
Re:No. Go read the docs. (Score:2)
It is based on COM or whatever name they give to COM these days (COM+).
miguel.
Re:*Smack* (Score:2)
I never said it was bad. I said it was bad when it's not done WELL.
Re:OFF WITH THEIR HEADS !!!!!! (Score:2)
Miguel
Re:First make GNOME not suck (Score:5)
lynx -source go-gnome.com | sh
We take care of the library issues for you, and you can focus on compiling Galeon (which we plan on including in Helix GNOME as well in the near future).
Miguel.
UNIX's "problems" are really flexibility (Score:2)
That said, Miguel has a point about code reuse, and my guess is that many people reinvent the wheel because they enjoy the challenge of coding. We certainly are on the way to reuse with things like Bonobo, and I think we'll see more reuse in major projects than we have in the past because it makes architectural sense.
Miguel also should have picked a better title. "Unix sucks" is *not* going to get the mainstream to read the article, they're too used to one-line sound bites.
What's this got to do with UNIX? (Score:3)
Or maybe I just don't try to use UNIX as a component-based system, and as such don't see it suck. Maybe I'm not fitting a square peg in a round hole (or vice versa). When I want object-ness, I do use BeOS. UNIX != User-friendly object-mish-mash-component-SOAP-XML-Hype. UNIX is a way of thinking that's different from other paradigms, and because of this UNIX sucks? I hardly think so.
Re:Miguels Ignorance (Score:2)
The same reason people rarely say "GNU/Linux" - it's not that we don't love the GNU tools, but give me a break, it's Linux.
Coca-Cola would have an easier time trying to convince people to stop calling it "Coke". We like shorter, easier ways to say things...
Re:TM'd title (Score:2)
Posix *wasn't* designed for beginners, but surely there is no reason why it should not *now* incorporate GUI aspects that enable users to access it without the need to become propellerheads; and without stopping PHs from doing things how they want to do them.
There is no real excuse for elitism except for insecurity.
EXACTLY! (Score:2)
The ability to mix and match objects.
Fairly transparent object invocation, and the ability to access your information anywhere.
What if we had some kind of system that stored everything in a browsable tree? I.e., any object could serialize itself to the tree, and anyone (provided they had access permissions) could access, store, read and run parts of the tree. Even more so, what if everyone could access their tree locally.
For example, in C++:
umGeneric = new obj_from_tree("oft://objects.blah.com/World/objec
assert(umGeneric->isA("printer"));
umGeneric->print(string);
Parts of the tree could be simple dictionaries (in Java, or hashes in Perl) that have configuration files etc.
Sorry, but wouldn't this be cool?
Anyone interested?
Re:Slashdot shows it's bias again.. (Score:2)
Part of the problem is that there is no "Unix", in a concrete sense. Unix is more of a Platonic ideal than a specific system. Solaris is a kind of Unix, HP-UX is a kind of Unix, Irix is a kind of Unix, Linux is a kind of Unix (OK, maybe not legally, but philosophically), but none of them is Unix. To say "Unix does foo" is misleading; Unix doesn't do anything - specific implementations do things.
Miguel uses examples like ssh, Samba, and Apache. Are those "Unix"? To my way of thinking, no; they're applications written for implementations of Unix. Bad applications do not make a bad OS - you can write a Windows application that doesn't use any reusable code and breaks every standard, but that doesn't change the strengths or weaknesses of Windows. Now, Miguel is right when he says (or implies, at least) that certain systems make it easy to write component-izable stuff, and Unix isn't one of them. But I think he's wrong in implying that a monolithic component architecture is the answer.
Unix applications don't use reusable code or talk to each other? Sure they do - and I'm not talking about pipes, either. Does your web browser care what network card you have? No, because it only talks to the layer of the stack directly below it, and each layer of the stack is a black box only providing a fixed set of services.
Part of the problem with Unix GUIs, I think, is that they've broken with the stack-oriented model to some degree. The application is responsible for too much, which is part of why X apps tend to look very different. Sure, you can link to Motif without worrying about Motif's internals, but that only exemplifies the problem - imagine if your web browser was "linked" to a specific network protocol!
I guess I'm partly agreeing with Miguel here, in that you should be able to code without worrying about how services are provided. The difference is that Miguel seems to think that centralization is the way to go, whereas I think that decentralization with proper understanding of responsibilities is the way to go. It looks to me like in Miguel's world, you'd be stuck with one model for everything, whereas in a protocol-layer world, you can change any layer without affecting others.
Re:TM'd title (Score:2)
And I agree, choice of interface is good. BUT - when dealing with specific apps, it is very difficult with today's technology to create 2 UIs to the same underlying code. Perhaps with Glade and similar apps this will become more convenient in the future, but for the moment programmers usually need to try to satisfy both groups of users (newbies, power users) with 1 interface.
What's wrong with Unix-like systems??? (Score:2)
Re:Missing the point... (Score:2)
And my patience is thin. I need to get laid.
Re:Great Article... (Score:2)
Is it so strange that the GNOME team primarily writes GNOME components? GTK+ is a small and fast widget set, and I think huge, complex components like the GNOME canvas are the last thing it needs.
Example: In the previous article about Miguel speaking (sorry, no reference), one poster mentioned how he had gotten flamed for taking the GNOME html component and removing the GNOME dependencies. Clearly, an html component that everybody can use is a good thing. Requiring GNOME to use this html component is not a good thing.
AFAIK the only "GNOME html component" around is the GtkHTML component (used for example in the Helix Code Installer, Updater and some wizards). I'm pretty sure it works in GTK+ only apps too.
OODS (Score:2)
What you describe is similar to what the OODS [oods.org] team is trying to put together. Their idea is to have a single API for interacting with any kind of data held in a directory-style structure, with the OODS software providing client access, permissions controls, and back-end data store drivers.
They are just at the point of planning and discussing everything... it's not clear that any significant amount of code will be written in the foreseeable future, but if you want to join a discussion with people who are working on this sort of thing, check it out.
Re:Miguel is sadly mistaken (Score:3)
I was asked once at Usenix and once at OLS, "How long have you been using Unix?". At OLS someone just assumed I was a newcomer who had used a Mac all his life, that I had no idea what I was talking about, and that I would be better off clicking icons on my Mac.
I have been using Unix since the early 90s. My first contributions to free software were in 1992.
I was the main author of the GNU Midnight Commander, a file manager that was a clone of the DOS file manager called the Norton Commander.
Later, I started working with David Miller on the SPARC port of Linux: I worked on the kernel and on a bunch of device drivers that made the system usable. I also ported three libcs and did significant work on the various dynamic linkers used on the port (libc4, libc5, and partial work on the GNU libc port).
Afterwards, I worked with Ingo and Gadi on the Linux software RAID-1/4/5 implementation. Ingo later perfected it to the beautiful levels you see now.
Later I joined the Linux/Indy team, in which I worked on various tasks to bring up a complete Linux system on the Indy. I abandoned that work when I began working on GNOME, three years ago.
Miguel
Re:Complexity isn't the only problem here (Score:3)
No one is forcing you to use GNOME or KDE or XFce, and some people genuinely like them.
Something it's not made to have (Score:2)
Some OSes weren't made to be pretty, and I think Unix is one of them. I'd much rather have its power than any other operating system's GUI any day...
Re:UNIX's "problems" are really flexibility (Score:3)
If you only write stuff for hackers, how do you think Linux is going to succeed in the desktop market? You can't just ignore your users' needs.
There has to be a standard way of doing things.
Re:Proposal: Linux Unified Model (Score:2)
CORBA is slow, overweight and has no policy. These are problems with it. DXPCOM would be my first choice, but it does not exist yet.
If I were to actually code this, I would probably use XPCOM. I have enjoyed using it for the most part.
Re:TM'd title (Score:3)
I'm an investigator. I followed a trail there.
Q.Tell me what the trail was.
Re:Interesting, but ESR dissagrees (Score:2)
Miguel
ESR says:
These are opposing viewpoints?! It looks to me like Miguel and ESR are on the same page.
---
Re:um, one foot before the other, you know? (Score:2)
I would love to do something like this.
Features which don't exist in Emacs? (Score:2)
There are things to complain about in Emacs, but lack of features? You must not be looking hard enough. There are always at least 2 or 3 different implementations of any conceivable feature...
Made sense until he praised MS's gui. (Score:2)
Re:UNIX trademark (Score:2)
The Downward Spiral (Score:2)
I, for one, love the way applications are written now. He mentions how things like inetd and ssh have no code reuse. That's because they don't really do anything similar! libc does most of their work: socket(), connect(), select(), etc. There's no reason for these applications to share--and hence depend on--components.
Stay the fuck away from our Unix.
Re:Component systems and versioning (Score:3)
One of the real weaknesses of the MS way is that (as it has been explained to me), there is no way to extend a COM interface - any new functionality requires creating a completely new interface that exists alongside and is (usually) a superset of the old one. Of course, you must still support the old interface for backward compatibility, but this isn't always done. (This really makes some sense, since the alternative is code bloat, but it breaks things, especially if app vendors "update" a MS-supplied DLL.)
The DLL hell problem is quite serious, and has some significant and largely unknown side effects - here is one big reason why even W2K isn't up to enterprise duty: the DLL problem prevents running test and production versions of the same application simultaneously. Of course, this is something the Unix folks have handled forever, simply by starting in another directory and/or tweaking the search path variables for executables and libraries. (For those of you MS folks who think it can be done, I have it on good authority (Microsoft's) that it cannot be. It is possible to tie a particular DLL version to a particular app, but there is no way of ensuring that you will get the right DLL if another version of the same DLL has already been loaded into memory by another application (or another version of the same application.))
This sort of behaviour *MUST* be avoided at all costs!
As an aside, although I'm starting to be quite impressed with GNOME and its rate of improvement (although it's an inexcusable resource pig), I still wonder how much farther we might be if this had all been done in Java, leveraging all those other components that are already built? (And yes, I realize the freedom issues of a year or two ago. I also think they're almost totally fixed and/or irrelevant today - there are a lot of alternative implementations out there.)
It just pains me to see so much effort thrown at reinventing the wheel yet again, but without the benefit of portable binaries and the attendant ability to automatically and dynamically define the client/server(s) split point(s). This ability will eventually make Java or something like it the winner, since you can only pull that trick off with binary code that runs wherever you decide to send it...
Miguel, if you read this, I'd be interested in your take on this latter point in particular. And keep up the good work, you may convert me yet...
COM for UNIX??? (Score:2)
One question, though... Are there many *n?x people out there who feel this is something we really need? Or is it just an acute case of code envy on the part of Miguel?
UNIX was one of the first component-based systems (Score:3)
Just because something's based on a command line with independent processes running in separate address spaces, and isn't object oriented, doesn't mean that it's not a modular, component-based architecture.
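To make that concrete, here is a small sketch of pipe-style composition, driving the standard `sort` and `uniq` tools through Python's stdlib subprocess module (the sample data is invented; this assumes the usual Unix tools are on the PATH):

```python
import subprocess

# Each Unix tool is a black-box "component": it reads bytes on stdin,
# writes bytes on stdout, and knows nothing about its neighbours.
# The pipe is the composition operator.
sample = "carrot\napple\nbanana\napple\n"

# Equivalent of:  printf '...' | sort | uniq -c
sort_proc = subprocess.Popen(["sort"], stdin=subprocess.PIPE,
                             stdout=subprocess.PIPE, text=True)
uniq_proc = subprocess.Popen(["uniq", "-c"], stdin=sort_proc.stdout,
                             stdout=subprocess.PIPE, text=True)
sort_proc.stdin.write(sample)
sort_proc.stdin.close()
sort_proc.stdout.close()   # drop our copy so uniq sees EOF when sort exits
out = uniq_proc.communicate()[0]
print(out)
```

Neither `sort` nor `uniq` was written with the other in mind, yet they compose into a word-frequency "application" with no shared code at all - which is exactly the component model the parent describes.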
Re:TM'd title (Score:3)
Basic typing and arrow keys work as expected.
Then I use search a lot: C-s.
undo is C-_ (control-underscore), or C-x u.
replace is (OBVIOUSLY...) M-%
cut can be achieved by C-SPC at one end and C-w (wipe) at the other. Copy is the same but M-w. Paste (yank) is C-y. Any M-y after a C-y pastes things that used to be cut/copied; very neat, as you never lose any text. C-k cuts the rest of the line.
And of course quit is C-x C-c
then there are some long commands I use a lot
M-x enters the minibuffer for command typing
line-number-mode shows line numbers
sgml-mode, cc-mode, c-mode, html-mode etc go into those appropriate modes so you can use:
font-lock-mode for syntax highlighting.
Obviously there's much more, but that's 99.9% of what I use. It can be learned in 15 mins, really. Stick it to the side of your screen.
Also useful is the tutorial C-h t
Re:Some real examples please? (Score:2)
All well and good, but pointing to closed source projects and claiming they don't reuse code between themselves detracts from his message.
There is actually quite a lot of code reuse. Perhaps not as much as he would like to see, but let's start by looking at examples of open source products instead. Projects where code reuse is at least an option.
There are also no bricks floating in the clouds. Duh.
Can we get a damn linking standard? (Score:2)
I swear, reading slashdot is like playing myst some days.
Re:Vicious racism on /. (Score:2)
Hah, just the sort of West Undershirtian reply I'd expect. You are people too? Not from what I can tell. You sub-human visigoths are what give America a bad name abroad.
Go back to your single-wide, artificial stucco, outhouse-using life. You pig.
<SARCASM type=":)">
Re:Middleware (Score:2)
If you want to buy Tim Berners-Lee's book without violating the Amazon boycott [noamazon.com], you could try Fatbrain:
Weaving the Web [fatbrain.com]
Just a minor question: (Score:2)
Icaza is ignorant. (Score:2)
First make GNOME not suck (Score:3)
If I want to use some little GNOME program (say, Galeon), what do I need to do?
1. Download the program.
2. Figure out which libraries are needed for the GNOME stuff.
3. Figure out which libraries are needed BY the GNOME stuff.
4. Locate and download all those libraries.
5. Find a place to put all those libraries.
6. Debug all my existing applications, because I just upgraded all my libraries (can you say "DLL hell"?).
7. Occasionally: answer "NO" to a program that wants to "associate files of type ABC with this program".
I'm all for making things easy to learn. I am NOT in favor of making them just like Microsoft.
--
Re:Miguel is sadly mistaken (Score:2)
Do me a favor. Go to a shell and try this:
$ cd
$ find . -name '*.[ch]' | xargs grep Miguel
Take a long hard look at what you see. Then think about your statement.
Perhaps you should let Miguel know about some of your concerns. You can easily reach him at miguel@kernel.org. Or miguel@gnu.org.
--
Ian Peters
Posix (Score:3)
Posix is not the generic term for UNIX, because even NT is Posix compliant (barely, but it is) and we all know that NT is not UNIX.
As someone already mentioned in this thread, the UNIX trademark was sold by AT&T after the antitrust ruling; AT&T was under major restrictions on anything not related to long-distance communications. AT&T sold it to Novell, who sold it to SCO. From what I have been told, SCO gave the trademark to some non-profit standards organization, or something along those lines.
UNIX is not just a trademark but a standard: for a product to legitimately be called UNIX, it must follow certain conventions.
A more generic term is *nix, which means "UNIX-like". It covers UNIX, Linux, Minix, and several others.
Re:Wanted: Killer Apps for World Domination (Score:2)
You've tried to find a gui for unix equivalent to Windows, which last time I checked doesn't exist, and when you failed you started ranting. If you want to keep using the Windows gui, stick to Windows. Really, lots of people use it for business and leisure; it certainly isn't that bad.
But let's look closer at your beef with unix, and why in my opinion it is totally irrelevant. Has it ever occurred to you that the majority of people actually using unix on the desktop are happy with it, or they wouldn't be using it, and that they'd probably hate being forced to use Windows, or anything else for that matter, because it would be totally unfamiliar to them, just like unix was to you?
Has it ever occurred to you that these users couldn't care less whether there's a gui & set of apps functionally & visually equivalent to Windows, since they don't *need/want* that functionality?
And before you step in and say that if more people are to start using unix on the desktop such a gui must be created, has it ever occurred to you that such people might not give a damn whether more people start using it or not?
Why should they care what other people use on their home box? I'm using something that I know, like and am familiar with. If you don't like it, I honestly couldn't care less. I don't care what my users are using on our network either. They should be, and are, free to shoot their feet any way they please.
Lately I've been using windowmaker + wterms + gtk (gtkstep rocks!) with some gtk apps (gimp, gtksee, xchat...) and netscape etc. During the last year or so I seem to have stabilized on a consistent setup on my laptop, workstation at home & work etc.; before that I would experiment often, try new things etc. Thing is, it's been almost a year since I last made a dramatic change to the setup, apart from the odd upgrade here and there, the occasional tweak, and the odd background image change =P
I can say that I've finally found the ideal gui for myself - I'm very productive, everything is in the right place, and it's readable (contrary to 99% of the themes on theme.org). Granted, this path to nirvana wasn't painless, but hey, there's no way an out-of-the-box gui can please all people. Accepting anything prepackaged is bound to be a compromise, including windows.
Chances are, you're probably not gonna like my setup. My point is though, that I'm not gonna like your windows setup either. You miss some windows apps on unix, I'd miss some unix apps on windows.
Oh, and for the curious, here are some shots of my gui nirvana:
shot 1 [cc.duth.gr]
shot 2 [cc.duth.gr]
Component systems and versioning (Score:4)
Is the interface definition used to determine "compatibility" of an object for a particular purpose? Can interfaces evolve? Can an object add functionality, but still be used by other, older objects for the older purposes? Must an evolving object conform to several interfaces (adding bloat), or can there be a v2.0 of an interface, after the designer realizes there's a Better Way to do it?
These are hard problems, and ones I was not able to answer to my satisfaction. Evidenced by their software, it seems that M$ hasn't either. Do you really want to embed an editable spreadsheet in a document, and deal with the bloat and crashes that will occur? Or is there a Better Way?
Of course, I could probably answer all these questions by digging into the Bonobo and CORBA documentation, but stimulating discussion is good too.
--Bob
Policy and lack thereof (Score:3)
The core of Miguel's argument is that the Unix world is in chaos because the designers of Unix have failed to form and enforce policy down the years. A good point.
But let's look at the history of Unix here:
Now, Miguel, could you please tell me precisely how one is going to enforce policy on such a disparate user base, most of whom are going to react with instinctive loathing towards anybody attempting to throw their weight around, to say my thing is The Right Thing, damnit, for whatever reason?
Unix has survived precisely because there is no hallowed policy handed down from above. It evolves. It changes to meet new needs. Those components of Unix that are shared, like glibc, have developed through consensus and bitter experience. If you want to develop in an enforced-policy environment, well, there's Windows NT or VMS or OS/390.
The Cluetrain has already left the station, Miguel. You on it or under it?
--
Cat Mara
Love me, I'm a liberal!
Re:Wanted: Killer Apps for World Domination (Score:2)
And none of them handle multiple POP accounts. Mutt certainly isn't going to display that scan of the new baby in the message, or show a company logo at the top. I guess no one ever included pictures in snail mail either or ever wrote on letterhead either. But YOU have no use for these features, so they're useless, right?
Re:CORBA isn't the answer (Score:2)
As for Java, remember that the GNOME guys are all rabid free software zealots and wouldn't dream of depending on a proprietary language like that :). Nor do they have the marketing team to force people to switch to another, superior language (see what happened to ObjC, Eiffel etc). So C compatibility is really the only way to go.
Re:Proposal: Linux Unified Model (Score:2)
XPCOM is nice, but it's in-process only, and any attempt to use component server middleware would be a grotesque hack a la DCOM.
Re:Meanwhile, the "beast" lumbers on (Score:2)
A Dick and a Bush .. You know somebody's gonna get screwed.
Two questions (Score:2)
Re:other way around (Score:2)
Re:UNIX's "problems" are really flexibility (Score:2)
One advantage for both newbies AND experienced admins is that across ALL applications, the UI is consistent. This is a major complaint I hear about *nix. There is little to no consistency in UI layout between applications. Hell, there isn't even a common clipboard that supports more than just plaintext.
You aren't restricted to their toolkits, though. In both OSes you could use alternate widget sets if you wanted (GTK is available for Win32). But then you run into the problem that your UI is somewhat out of place (i.e., it doesn't follow the user-selected color scheme at times), and you're duplicating code unnecessarily. What Win32 and MacOS are famous for, GUI-wise, is the consistency of the UI. Makes life easier for everyone, including end users and developers.
CORBA isn't the answer (Score:3)
But I think neither COM nor CORBA are the answer. COM and CORBA are both rather complex systems because they are trying to patch up deficiencies in the underlying languages, C and C++. In an environment that encourages reuse, you should be able to just serialize and send objects to other components without lots of error-prone declarations. Such systems exist, and have existed for decades. But you simply can't build them reliably on top of C/C++.
Ten years ago, Objective-C was a pragmatic and efficient answer to that problem. Objective-C is simpler than IDL and gives programmers more power. Today, the obvious answer would seem to be Java, although even it is still more complex than it probably ought to be.
While I appreciate the short term utility of Gnome, I think in the long term, the effect of the Gnome project (and KDE, for that matter) is going to be harmful. It continues to encourage people to develop in and for an environment that is fundamentally not well suited to building software components and getting a lot of code reuse.
If people want to do something relevant for end users in an industry-standard environment, I think they should contribute to Java-based desktop application efforts. The Gnome programmers are smart and capable: if even a fraction of the Gnome effort went into open source Java implementation (e.g., kaffe [kaffe.org]) and Java desktop apps (e.g., JFA [javalobby.org]), we'd soon have a good environment that would be much easier to extend with new components than a big C/CORBA system.
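The "just serialize and send objects without error-prone declarations" point is easiest to see in a language with runtime type information. Here is a rough Python sketch of the concept - pickle stands in for the kind of built-in serialization Java or Objective-C offers; the Appointment class is invented for illustration, and no IDL or stub compiler is involved:

```python
import pickle

class Appointment:
    """An ordinary object: no IDL, no stub compiler, no proxy classes."""
    def __init__(self, title, hour):
        self.title = title
        self.hour = hour

# "Send" the object to another component: the language serializes the
# object -- fields, type information and all -- in one call.
wire_bytes = pickle.dumps(Appointment("dentist", 14))

# The receiving side reconstructs a live object just as easily.
received = pickle.loads(wire_bytes)
print(received.title, received.hour)   # dentist 14
```

Doing the equivalent in CORBA means writing an IDL interface, running it through a stub compiler, and keeping client and server declarations in sync by hand - exactly the overhead the parent post is complaining about in C/C++.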
Components at install level (Score:3)
Miguel touches on the mess of configuring services. He proposes a solution for working with existing configuration files using a perl backend and GUI frontend. This is an admirable short term solution for a larger, significant problem.
The inherent problem is that standard unix
In the spirit of the changes proposed by Miguel, I propose that applications, and all other packages too, be components even in the way they live in the system. Let every package have an arbitrary, unique directory, and let everything owned by a package live only in that directory. Let there be a common system component that exposes packages and their configuration on request. Let all packages find and expose other packages only through this component. Let the system package component internally record at most where to find other packages. Further configuration is stored in the package's own directory.
There are a number of advantages to this model:
1. First order installation becomes trivial. Just dump everything into a directory. The system package component will automatically find it.
2. Complete uninstall becomes trivial. Just blow away the package's directory.
3. Exposing a package's configuration is standardized, stable, and protected through the system package component.
4. "Custom" packages and their configuration are trivially persistent across reinstalling the operating system.
This is a problem that has been clumsily attacked by both RPMs and the MS Windows registry. Both tried to solve it by making prodigious use of massive amounts of internal data - data that is subject to unnecessary and unwanted management and corruption. With the proposed system package component, the small amount of internal data is easily reconstructed by scanning the file system. If you assert that packages access even their own configuration data through the system package component (much like the interface to a registry), then each package's configuration data can be stored in something standard and sane, like config.xml.
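A toy sketch of that "reconstruct the index by scanning the file system" idea, assuming one directory per package with a config.xml inside it (the layout and names are illustrative of the proposal, not an existing tool):

```python
import os
import tempfile

def scan_packages(root):
    """Rebuild the package index purely from the file system: every
    subdirectory of `root` containing a config.xml is a package.
    No registry, no database -- the index is always reconstructible."""
    index = {}
    for name in sorted(os.listdir(root)):
        conf = os.path.join(root, name, "config.xml")
        if os.path.isfile(conf):
            index[name] = conf
    return index

# Demo with a throwaway tree standing in for the real package root.
root = tempfile.mkdtemp()
for pkg in ("apache", "samba"):
    os.mkdir(os.path.join(root, pkg))
    with open(os.path.join(root, pkg, "config.xml"), "w") as f:
        f.write("<config/>")
os.mkdir(os.path.join(root, "scratch"))   # no config.xml -> not a package

print(scan_packages(root))
```

Note how install and uninstall fall out for free: dropping a directory into the root adds a package on the next scan, and deleting the directory removes it, with no stale state left behind.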
I code. If you want help, I'll give it.
Down with global data! Down with
- Cory
Speaking of things that suck... (Score:4)
Miguel de Icaza seems like an otherwise intelligent guy, so I have to assume that CORBA is forcing the use of reference counting here. If that's so, then CORBA sucks even worse than I thought.
um, one foot before the other, you know? (Score:5)
You need something like DCOM implemented first before you can even think of implementing something like
Re:Component systems and versioning (Score:3)
Re:Great Article... (Score:4)
Miguel's article is spot on. I love everything about Unix except the fact that Component Based programming is so underused. If there is only one thing Microsoft has done right, it is the way they have developed and pushed COM. With COM, I can write a piece of software that performs a task (be it a Widget or piece of middleware) and COMify it.
Except that GNOME is going about this entirely the wrong way. They're writing a lot of useful stuff (the canvas, html components, etc.), except they can't figure out why somebody would want to use this stuff outside of GNOME. GTK+ could benefit from the standard inclusion of some of these things, and it's like fighting for a firstborn to move them out of GNOME into GTK+.
Example: In the previous article about Miguel speaking (sorry, no reference), one poster mentioned how he had gotten flamed for taking the GNOME html component and removing the GNOME dependencies. Clearly, an html component that everybody can use is a good thing. Requiring GNOME to use this html component is not a good thing.
Write the reusable software at the right level; don't GNOMEify everything in the name of "software reuse".
-Nathan
Scary! A Linux nerd bashing Unix! (Score:4)
Wow, it's always tough when a true Indian wanders off the reservation!
Well, he has a point. Unix should be the first OS to use modularized components with rampant code-reuse, not one of the last. Remember part of the Hacker Ethic: do not re-invent the wheel.
Imagine! Maybe Microsoft does do some things very well! (I know IE has much better support of CSS than Netscape does -- not to beat a dead horse, but Mozilla isn't looking all that great either on several fronts). Could it be that this modularity (even done as slipshod as it is on Microsoft OSes) is part of what encourages people to write software for Microsoft? Ease of development? (I'm not a True Programmer, so <TAKE type="salt" size="grain">
I wish the best for Helixcode -- just before you get carried away with making it "easy to use", try to get some UI experts in there to help design things. Just because it has a button doesn't mean it's easy to use. Where the button is placed is just as important as having the button.
Great Article... (Score:5)
Miguel's article is spot on. I love everything about Unix except the fact that Component Based programming is so underused. If there is only one thing Microsoft has done right, it is the way they have developed and pushed COM. With COM, I can write a piece of software that performs a task (be it a Widget or piece of middleware) and COMify it.
Once this is done, anyone can use it regardless of what language it was written in; fast XML parsers can be written in C++ and used from Javascript or VB. This way developers of business apps do not have to choose between a.) putting up with a slow app or b.) writing one themselves, with all the attendant bugs therein, especially if they have little C/C++ skill. They can also get on with actually creating their application instead of worrying about whether they malloc()ed enough space for their char*'s.
Lots of *nix people believe this implies laziness but fail to realize that reinventing the wheel dozens of times over is folly.
Example I:
I am currently designing and implementing a project management system on Windows(TM) for a small business with a few of my friends. Two of them are *nix hackers, and they balked at using an XML-based protocol to transfer data between the client and server. So instead of simply designing our protocol and then using one of the dozens of available parsers [xml.com], they decided that we should invent our own binary protocol and write our own parser for it.
Our project involves code written in both C++ and Javascript/ASP. We could have used a single COM-based parser to interact with the data consistently from both the C++ and the Javascript code, but instead it's been 2 weeks and counting and our homegrown parser is still being written, tested and debugged. In my opinion this is nothing but a waste of time. When I ask them why not just use XML and an existing parser, their replies boil down to "It just feels wrong." It is not unlikely that a bug or two will slip through testing, or that there is a buffer overflow in our parser, considering that most early versions of parsers written in C++ have a few bugs like this hidden somewhere. In this situation, component-based programming would have allowed us to focus on building and designing our actual application instead of spending time and energy on a tangential one.
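For comparison, here is roughly what reusing an existing parser buys. Parsing a small client/server message with a standard-library XML parser (Python's xml.etree here) takes a few lines; the message schema is invented for illustration:

```python
import xml.etree.ElementTree as ET

# A made-up request message; the point is that the parser already
# exists, is debugged, and took zero weeks to write.
message = """\
<request type="update">
  <project id="42">
    <task owner="kim" done="true">write spec</task>
    <task owner="lee" done="false">review spec</task>
  </project>
</request>"""

req = ET.fromstring(message)
open_tasks = [t.text for t in req.iter("task") if t.get("done") == "false"]
print(req.get("type"), open_tasks)
```

Every edge case - escaping, nesting, malformed input - is the parser library's problem, not the application's, which is exactly what the homegrown binary protocol forfeits.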
Example II:
At work, an MBA intern asked me if it was possible to create an application housing a search engine that searched a database of MBA students by criteria like concentration, work experience, graduation date, etc., and then displayed results with links to their resumes in MSFT Word(TM) or HTML format, which could be stored on a CD to give to recruiters at career fairs. Their first attempt had been to use VB and Access, which turned out to be a disaster because of DLL Hell [desaware.com] based issues. My simple solution was for them to store all the students in an XML file and to write a Javascript page that used the COM-based XML parser (written in C++) to perform the search. Writing this page took less than 2 hours.
Now they have this search functionality they can press on a CD and give out at career fairs which any recruiter can view without needing more than MSIE 4.0 or greater.
Without component-based programming, their request would have been impossible to fill in their time frame, and it would also have required the recruiters' machines to meet a stricter set of requirements (like having a webserver installed, or they'd have to install an app).
In conclusion, my question is "Why has it taken so long for a major *nix push towards component-based technology?". After all, we've had CORBA for almost a decade [omg.org], but there hasn't been that big a push towards components. Frankly I am eagerly awaiting MSFT's
FOOD FOR THOUGHT
Wanted: Killer Apps for World Domination (Score:3)
Linux on the desktop: does indeed SUCK.
I've been using Unix in a server environment since 1992. Never had any major problems. On the desktop, I started with Mac, fiddled with NeXT, tried Sun and DEC workstations, and eventually moved to M$ Windows (for gaming, nothing else compares).
All of those OS's have their strengths and weaknesses. And, in hope of creating a better world, last week I bought an extra hard drive and installed Linux (RedHat 6.2; I'm told Debian is better, but no CD was available) on it to play around.
In general, a less than fulfilling experience. Here are my observations:
1. I have to choose a desktop environment? GNOME or KDE? I'm supposed to know which has better apps? Great idea - split a limited developer pool between two environments, so instead of getting one set of applications that work well, we get two sets of applications that are in perpetual beta.
2. Web Browser. At no time while using a PC do I have less than 4 or 5 browser windows open. Trying to work without a functional browser is difficult, if not impossible. I just don't enjoy opening NN and seeing my available memory disappear. (Last week, Mozilla was declared dead - how could this happen when it hasn't even been born yet?)
3. Mail Client. I spent days looking for a mail client for GNOME which supported multiple POP mailboxes. I found a few, but they ended up in wild-goose chases for libraries to replace those which were outdated, too new, etc. Never actually got anything to compile. I heard there's a good mail client for KDE, which means I made the wrong choice back at #1.
4. Editor. Uhh, I use vi and emacs when there is absolutely, positively, nothing else available. Don't get me wrong, I first learned emacs over 8 years ago. But there are some basic functions which I rely upon that don't exist in emacs. Give me something like HomeSite on a linux box and you've got a convert.
5. Word and Excel. Regardless of how much other Microsoft software sucks, these two products are hard to beat. Also, they are practically industry standards. If you work in any office environment, you'll be sure to get these sent to you all the time. Of course, you can read them from your linux box - but if you want to edit them, it's lilo:dos yet again.
I use my computer to work. It is a tool which I need to function efficiently. I played with my new Linux Desktop for a few days, then when I had real work to do, I rebooted back to DOS. A real disappointment.
I know, it's open source, help and code it instead of complaining. I do code open source software, but for web applications. I don't code for the desktop. To grow, linux needs the desktop. To win on the desktop, Linux needs the killer apps - at least a browser, a good mail client, and an editor.
To get there, I'll argue that Linux needs fewer developers rather than more. I'm tired of seeing 2000 new apps which are v.0.0.0.1beta0.0.5-unstable. The paradigm of "release early and often" needs to be rethought. Release when you have a functioning application. If you have an idea for a new app, look around to see if anything else is out there first. If someone is already working on the same application, join them rather than creating a new tarball which will never get out of beta.
Open Source can and will take over. But it won't do so without the Desktop. And the desktop is all about applications.
Missing the point... (Score:3)
What I see as one of the points here is that a lot of people are wasting a lot of time writing support code for their applications because they are not reusing code. This hurts us because that time could be spent more effectively on the logic of the application, rather than on rewriting yet another HTML parser or whatever.
On a few pieces of software I have written, I ended up using glib, because there are just so many nifty functions that programmers are constantly rewriting. And I can see his point after using what is still a fairly low-level interface.
Also, a lot of people saying "well, we have pipes and that's all we'll ever need" is just silly. I mean yes, pipes are neat, but god damn, how do you really expect to write anything complex and have it be relatively fast when you're shoving data through pipes and firing off a bunch of new processes via fork()?
Modularity is really the key to an extensible OS. Linux is modular to some extent, but not really. Take a look at the HURD, for example: from a design viewpoint, it's a beautiful kernel. Sure, microkernels are a bit slower than monolithic kernels at this point, but what does a 3% performance hit matter?
Code sharing and reuse is really what open source programs should be about. There should be common APIs and interfaces. Let's let go of some of the baggage that has accumulated over the years, stop trying to be a UNIX workalike, and do something innovative. Linux and GNU are really the standards that the rest of the Unix community is trying to live up to now; we should show a bit of leadership here.
Re:Component systems and versioning (Score:4)
For instance, the published interfaces in Microsoft Windows have not changed since they were published in the first version of OLE 2.0.
When Microsoft has expanded the functionality, they have created new interfaces or new methods, and they have retained the behaviour and the previous interfaces.
The DLL problems in Microsoft applications are of a different nature, and cannot be attributed to faults in their component system. It is a separate problem, still a problem for end users, but a separate one.
Miguel.
Meanwhile, the "beast" lumbers on (Score:5)
First of all, to those of you who are criticizing Miguel by saying "Miguel is wrong because being Object Oriented isn't necessary", or "Miguel is wrong because XML isn't necessary", I hope you're keeping this in mind: Miguel's comments can be broken down into two parts ("You know, there are two kinds of people in the world..."):
1) We should be thinking about ways in which the UNIX philosophy is deficient, rather than continually reassuring ourselves that it's all okay. Look at it pragmatically: Who's got the biggest market penetration? Whose system is easier for the beginner to learn to program in, ignoring cost?
Okay, these are total flamebait questions, so please, please don't respond to these in particular. Use your imagination, and think of some ways in which Windows is better than UNIX, rather than touting all the advantages of your pet operating system. Otherwise, you're just brainwashing yourselves with your own marketing.
The question here isn't which way we should take things, it's how we should think about them. If you want to respond to this half of the question, address what the community should expect of UNIX, not how it should be done.
2) UNIX needs standards, reusability, etc. This is a set of recommendations to the community about where things should go specifically. If you agree with Miguel's motivations in the first part, then read on. His argument is based on looking at "the competition", and I can give you a concrete example.
He mentions IE, and how it's actually made up of a large collection of components rather than being a monolithic application. True. If I want IE's rendering capabilities in my application and I'm using something like Delphi (an example because I actually had to do this once), Hell, I'll just draw myself a window and drop the browser component into it. You can argue about whether it's bloated code or not, but the end result is that I didn't have to reinvent the wheel to get something pretty momentous done. Further, I can now focus on doing something with this browser component that hasn't been done before.
For those of you who aren't interested in looking into it, Microsoft is working on something called dotNet. There's a lot of argument about what it all is, and whether it's useful, a product of the devil, etc. The thing that excited me about it is that components from one language can be used in another. And here's where I must admit that I didn't read the details about Bonobo. But my point is that Microsoft is going to have a fully operational Death Star of interoperability between languages pretty danged soon. Miguel rattles off a list of languages:
And this is exactly what isn't going to be the case with dotNet.
I know most of you have lost interest by now, and are happily moderating me down, flaming me, etc., but I have an appeal to those serious programmers and geeks amongst you who bore with me this far. It doesn't matter who came up with it, but isn't that just a bitchin' cool idea???
As you know, everyone who writes about their new features admits that you can already do the same thing in plain old C, but you also know how the rest of it goes.
By now, I've totally lost track of any other points I was going to make, if any. Please fill in the blanks with anything relevant you see: