Autonomic Computing 152
pvcpie writes: "The New York Times has a story today about Autonomic Computing, which is described as 'a biological metaphor suggesting a systemic approach to attaining a higher level of automation in computing,' and they have published a paper (pdf) on the topic. Apparently some universities have already signed up for Autonomic Computing projects; more info is available on the website and in the NYT article. It also appeared in CNET."
The future (Score:5, Interesting)
The more that can be done automatically, the more of the IT staff's precious time can be dedicated to more complex tuning tasks, and/or new development. This will make IT more effective, not obsolete.
Is there a life expectancy? (Score:3, Interesting)
For that matter would there be analogous doctors, hospitals and life support systems? How about gymnasiums for keeping in shape? (and I ask that last one only half-jokingly...)
Gordon
What do you think you are doing, Dave?
Control (Score:2, Interesting)
I would hate to see my web server decide to bump up the number of allowed simultaneous connections in response to a denial of service attack, or decide that the ogg encoder in the background is indeed more important than domain control services.
... and of course the mandatory gripe - that my system decides that it doesn't like my pirated MP3s and deletes them automatically.
If computers become smarter than the people who design their software, how are they any use as a tool anymore?
Repackaging the future (Score:5, Interesting)
In the end it turns out that the most complex problems arise in trying to coordinate a collection of "autonomic" (?) components. Distributed systems with unruly objects... This is what the autonomous agent community is mainly concerned with (see the UMBC [umbc.edu] agents page or this very useful overview paper [liv.ac.uk] for example).
Of course, with IBM pushing this, it might mean a kick up the rear for academics to actually get some of this potentially cool stuff working. Chances are you never want the end user to know how it works anyway.
Re:Is there a life expectancy? (Score:3, Interesting)
The computer's life expectancy doesn't change much due to the self-tuning properties, but of course these self-tuning properties will stress the machine more (more usage of CPU, disk, and I/O in general), and hardware fails after some time. It may take long, but it fails.
Now consider a self tuning database system which includes a shelf of backup tapes and a robot arm to switch tapes (or CDs) as part of its maintenance. Moving parts add to the stress, which reduces "life expectancy".
Just to add to the mess, human life expectancy is also related to environment conditions. Being hit by a meteor or burned in a fire is just as bad for a computer as it is for a human.
Derisive laughter coming from the Mac lab techs (Score:2, Interesting)
Re:What virus writers have to teach.... (Score:4, Interesting)
Actually, I have, and I know that many are very amateurish, but you come across the occasional gem - I once found a very cunning polymorphic macro virus lurking around. Funnily enough, those tend to be the ones that do the least damage - correlation?
Re:Is there a life expectancy? (Score:2, Interesting)
There is also a phenomenon called apoptosis, which is the spontaneous death of seemingly healthy cells. It is part of the body's self-regulation -- cancer seems to be, in some sense, a failure of the apoptosis mechanism.
So we may have software vendors building, instead of planned obsolescence, apoptosis into products. They could even make it a feature -- if nothing ever dies, evolution stops.
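Software apoptosis could be as simple as a component that tracks its own age against a built-in lifetime and reports itself dead once it expires, so a supervisor can replace it. A minimal sketch in Python; the class and field names here are invented for illustration:

```python
import time

class ApoptoticComponent:
    """A component that deliberately retires itself after a fixed lifetime."""

    def __init__(self, lifetime_seconds):
        self.born = time.monotonic()
        self.lifetime = lifetime_seconds

    def alive(self):
        # Past its programmed lifetime, the component reports itself dead,
        # prompting a supervisor to swap in a fresh (possibly newer) instance.
        return time.monotonic() - self.born < self.lifetime

c = ApoptoticComponent(lifetime_seconds=3600)
print(c.alive())  # True for the first hour of its life
```

The "feature" angle the poster mentions falls out naturally: the supervisor that replaces dead components is exactly where new versions get introduced.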
Autonomic Virus? (Score:2, Interesting)
1) An autonomic virus, written with the capability to "heal itself" once installed. Does this make sense? It seems to me that some existing viruses already have some self-healing properties, such as those that hide a copy of themselves on a user's HD and insert a key in the Windows registry to have themselves restored at reboot time. Thoughts?
2) A virus designed to insert itself into an autonomic system would conceivably be able to use the system's "self-healing" properties to protect itself (a funny memory springs to mind. I went to remove Outlook Express from my Win2K box at work, and discovered that Win2K does not have the option to uninstall Outlook Express. Undaunted, I went into the folder the executable was in and deleted it. Within five seconds, the system detected my "user error" in deleting a system file, and restored it. It took me a while to figure out how to prevent this, but it really threw me for a loop when I first saw it happen).
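The restore-on-delete behavior described above can be sketched in a few lines: keep a protected copy elsewhere, and if the original goes missing, copy it back. This is only a toy illustration (the file names are made up; the real Windows mechanism, Windows File Protection, monitors system directories at the OS level):

```python
import os
import shutil
import tempfile

def heal(protected_path, backup_path):
    """Restore protected_path from backup_path if it has gone missing.
    Returns True if a repair was performed."""
    if not os.path.exists(protected_path):
        shutil.copy(backup_path, protected_path)
        return True
    return False

# Demo: create a "system file" and a backup, delete the original, then heal it.
workdir = tempfile.mkdtemp()
system_file = os.path.join(workdir, "oe.exe")
backup_file = os.path.join(workdir, "oe.exe.bak")
with open(system_file, "w") as f:
    f.write("pretend executable")
shutil.copy(system_file, backup_file)

os.remove(system_file)                        # the "user error"
repaired = heal(system_file, backup_file)
print(repaired, os.path.exists(system_file))  # True True
```

Which is exactly why a virus sheltering inside such a mechanism is worrying: the same loop that protects system files would protect the intruder.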
Finally, somebody gets it. (Score:3, Interesting)
That was in the original Apple "Inside Macintosh".
The hardware side is reasonably close on this. All the newer interfaces (USB, IEEE-1394, PCI, PCMCIA) have identity info on all devices. And it's been that way for a few years now. It's time to pull the plug on the old stuff and insist that everything autoconfigure.
This is key. And again, Apple almost had it right, once. The original Apple model was that the system had two main repositories of system state - the Desktop file and application preferences. The Desktop file could be regenerated if needed (and had to be, due to lousy database design), and application preferences were cosmetic only - you could delete preferences at any time, and the application just went back to the defaults.
Apple never faced up to checkability, though. And it hurt them, because they were running an unprotected OS with a tendency to trash its internal data structures.
Broken things must not contaminate other things.
It's unacceptable to ever get bad data from a disk. Reported errors, yes; undetected errors, no. Everything must have error checking. Memory parity must always be on. (And ECC ought to be standard.)
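The "reported errors, yes; undetected errors, no" rule is end-to-end checksumming: never hand data back without verifying it. A minimal sketch in Python using CRC32 (the function names are invented; production filesystems do this per block with stronger checksums):

```python
import zlib

def store(block: bytes) -> bytes:
    """Prepend a CRC32 so corruption is detected on read, never passed on silently."""
    crc = zlib.crc32(block) & 0xFFFFFFFF
    return crc.to_bytes(4, "big") + block

def load(stored: bytes) -> bytes:
    """Return the payload, or raise a reported error instead of returning bad data."""
    crc = int.from_bytes(stored[:4], "big")
    block = stored[4:]
    if zlib.crc32(block) & 0xFFFFFFFF != crc:
        raise IOError("checksum mismatch: refusing to return corrupt block")
    return block

good = store(b"important data")
assert load(good) == b"important data"

# Flip bits in the payload: the read now fails loudly instead of lying.
corrupt = good[:-1] + bytes([good[-1] ^ 0xFF])
try:
    load(corrupt)
except IOError as e:
    print("caught:", e)
```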
Re:What's next? (Score:1, Interesting)
Stafford Beer has been writing about this topic for three decades. There was the Chilean Experiment in 1973, which was an experiment in autonomic computer control systems. This is nothing new, it's just something that almost everyone is ignorant of. Perhaps because the US government staged a coup in Chile in order to stop the experiment. In all my years on Slashdot I have yet to mention Stafford Beer and have anyone say, "Yeah, I've heard of him".
No, it's really nothing new, and it's not as complex as the IBMers would have you believe [and it has nothing to do with XML!]. It's not some new way of writing software - Beer's system was implemented I believe in straight COBOL. It's a new way of designing software, and it is indeed a paradigm shift in the true sense of the word - which is how the research has gone unnoticed for 30 years.
Re:There is no shortage (Score:3, Interesting)
Art is a fairly rare field, and the pay generally sucks. Acting, music too. Sucks, unless you become popular, then you become a natural monopoly on "you" and your pay skyrockets.
about 30 years behind the times (Score:3, Interesting)
Homeostasis and self-regulation are not properties that you implement once in some abstract data type and that henceforth work for everything, nor do they require breakthrough new technology; they are design goals that you need to take into account when you design each and every part of a system. Biological organisms have been forced from day one to deal with these issues. The reason real software systems don't do this is not that people don't know how; it's that software developers don't bother and aren't trained to do it, and they can get away with it because there are always smart humans around to help out.
So, next time you write a new piece of software, think about how you can make it more self-adapting and less reliant on numerous environment variables and other arguments supplied by the user. The pathsearch library is a simple example of this.
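In that spirit, a program can probe its environment for what it needs instead of demanding that the user configure it. A small Python sketch (the `find_tool` helper and its fallback directories are invented for illustration, not from the pathsearch library itself):

```python
import os
import shutil

def find_tool(name, fallback_dirs=("/usr/local/bin", "/opt/bin")):
    """Locate a tool by probing, rather than requiring an environment variable."""
    found = shutil.which(name)     # first, the ordinary PATH search
    if found:
        return found
    for d in fallback_dirs:        # then a few conventional install locations
        candidate = os.path.join(d, name)
        if os.access(candidate, os.X_OK):
            return candidate
    return None                    # degrade gracefully instead of erroring out

print(find_tool("sh"))
```

The caller can still override the result explicitly, but the common case needs zero configuration - which is most of what "self-adapting" means in practice.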
Re:Evolution proceeds towards what works... (Score:2, Interesting)
My high school biology teacher must've said a thousand times, "Evolution proceeds towards what works, not towards what is best."
Erm, sort of true. Evolution actually works to do what is "best" in terms of the fitness function - i.e. it seeks to maximise or minimise the result of some metric. If you pick your fitness function correctly, you can make the system optimise towards any required goal.
Just make sure you don't have any bugs - because GAs and GPs will find and exploit bugs that give higher fitness scores faster than the programmer can find them.
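The point about the fitness function steering everything shows up even in the tiniest evolutionary loop. A sketch in Python of a (1+1) strategy maximising "OneMax" (count of 1-bits), standing in for whatever metric you actually care about:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def fitness(bits):
    # The metric the search optimises. Change this and you change the goal -
    # including, if it has a bug, optimising straight toward the bug.
    return sum(bits)

def mutate(bits, rate=0.1):
    # Flip each bit independently with the given probability.
    return [b ^ 1 if random.random() < rate else b for b in bits]

# (1+1) evolutionary strategy: keep the child only if it's at least as fit.
best = [random.randint(0, 1) for _ in range(20)]
for _ in range(2000):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child

print(fitness(best))  # approaches the maximum of 20
```

Nothing in the loop "knows" what a good bitstring is; the fitness function alone defines "best" - which is exactly why a flawed metric gets exploited rather than questioned.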