Technology

Autonomic Computing 152

pvcpie writes: "The New York Times has a story today about Autonomic Computing, which is described as 'a biological metaphor suggesting a systemic approach to attaining a higher level of automation in computing,' and they published a paper (pdf) on the topic. Apparently some universities have already signed up for Autonomic Computing projects; more info was available on the website and in the NYT article. It also appeared in CNET."
This discussion has been archived. No new comments can be posted.

Autonomic Computing

Comments Filter:
  • Karma Whoring (Score:2, Informative)

    by Skynet ( 37427 ) on Tuesday October 16, 2001 @09:46AM (#2435777) Homepage
    IBM has done quite a bit of research on autonomic computing. IBM just keeps getting cooler and cooler, IMO. Although I have to say that this quote frightens me:

    "Civilization advances by extending the number of important operations which we can perform without thinking about them." - Alfred North Whitehead

    Anyways, here is the link to their Autonomic Computing R&D site:

    http://www.research.ibm.com/autonomic/ [ibm.com]
  • Re:Hmmm... (Score:5, Informative)

    by saridder ( 103936 ) on Tuesday October 16, 2001 @10:13AM (#2435895) Homepage
    A program/database is only as good as the knowledge of the person who wrote it and fed it input. If a PC thinks that wiping its hard drive is the best way to fix itself, I'd blame the author of the maintenance program. No expert would wipe a hard drive as a general fix for a PC, and would not write those instructions into the PC.

    In my opinion, part of being an autonomous PC is being self-sufficient, not acting like a lemming and following other PCs just to follow.

    Plus, just as humans have a basic survival instinct, I think you'd write this instinct into the PC as well and not have it destroy itself (unless it was doing major harm to its master, etc. Remember Asimov's laws of robotics).

    Finally, I agree humans will never be replaced as the final decision maker in fixing and running PCs, servers, networks, etc., but when I was a sysadmin, I'd have killed for PCs smart enough to do some of the basic, mundane, laborious tasks, such as upgrading service packs when I told them all to do it, installing programs, etc. Then I could have done more fun stuff. Plus, when I had to fix a problem, people weren't glad to see me, because I was only there when something went wrong. Granted, they were happy someone was there to fix the problem, but all would have preferred that there had been no problem in the first place (the PC fixing itself).
  • by MarkusH ( 198450 ) on Tuesday October 16, 2001 @10:24AM (#2435933)

    "The only way to get efficiency gains in information technology is to take some of the people out."


    They're called managers.

  • What's next? (Score:2, Informative)

    by clone304 ( 522767 ) on Tuesday October 16, 2001 @10:47AM (#2436038)

    I think most of you, so far, are missing the idea here. I also think the good Dr. from IBM is too, but that is beside the point. The point of redesigning the way systems work from the ground up is to make them more capable of doing what YOU as users/admins actually want them to do. The idea is that YOU set the policy and the computer learns how best to implement it.

    I, personally, don't like this very much. It sounds like the next step in closing off the workings of the "operating system" from the user. What happens to Linux and open source when Windows starts to dynamically rearrange its code to optimize for your preferences and specific uses? It gets left behind, is what.

    I've been thinking about where operating systems are headed and what I want in an operating system, lately. I had pretty much defined what I wanted, when I started to run across projects like this: TUNES [tunes.org], and ideas like this: Flow-Based Programming [http]. I then realized that I wasn't entirely original. People have been thinking about the same things and trying to work them out for some time. But there has been little mainstream work done to get things to happen.

    In my opinion, the design of TUNES and the ideas expressed about Flow-Based programming are a perfect fit for open source programming. And, there's no reason that autonomic computing couldn't fit right into the mix as well, as long as it's an open-source feature rather than a built in proprietary unified piece of the system.

    The new system I'd like to see would be completely dynamically restructurable, and reprogrammable from the ground up. I think this would be a prerequisite for full-blown autonomic computing, but I have a feeling that the corporates are going to slip it into Windows in such a way that Windows stays the same on the surface, but just tells you less and makes more decisions for you than it already does. Problem is, that's what most users think they want. What I suggest is doing it in such a way that each user has total choice about how his system is designed and operated. Of course there would be predefined templates for certain types of systems (web servers, web/e-mail clients, gaming system, desktop publishing workstation, etc). So a user could pick one or more open source templates on which to base his system and then modify it to his needs as he goes. These templates would define what optimum scheduling and resource allocation should be done for specific tasks and merge this at the lower level with the needs of other tasks and the priorities set by the user or learned dynamically by the system.

    I think we'll see some very interesting advances in the next 10-15 years. Let's hope the open-source community doesn't miss the boat. Microsoft sure as hell won't.

  • by n-baxley ( 103975 ) <nate@baxleysIII.org minus threevowels> on Tuesday October 16, 2001 @12:57PM (#2436721) Homepage Journal
    The first is, will IT workers embrace this? I know quite a few Oracle DBAs who are none too happy about turning over the optimization of their database to the computer. Maybe these people are just afraid of change and of having to learn a higher art than the one they've got, but if they tell their manager that they can do much better than the computer can, who will the manager believe? For this to progress, you must find a way not to scare the geeks.

    Secondly, how do we accomplish this without advancing machine technology too far? If a machine becomes self-aware and protective of itself, what happens when we want to shut it down? "What are you doing, Dave?" I know there are ways of preventing this, but will they work, and will we be able to find out whether they work before it is, so to speak, too late? I'm not trying to be paranoid, but this is a real concern.

    Another piece of this that someone else mentioned: if the computer is maintaining the basic stuff, what happens when the computer dies and no one knows exactly how it did what it did? A very real example is the ubiquity of calculators. How many of you can still do long division in your head? There was a story I read in high school where a guy who could do simple math without a computer was such an oddity that he became a king or something like that.

    Keep doing those math problems.
  • by Bobo the Space Chimp ( 304349 ) on Tuesday October 16, 2001 @02:50PM (#2437331) Homepage
    The most likely cause of aging and death is that evolution never bred for longevity. By the time things start winding down -- late 30s to early 40s -- a dozen children have been squoze out, thus instantiating "evolution" as highly successful for that individual -- and with it only the genes that guaranteed "youthfulness" into the late 30s to early 40s.

    Yes, continued breeding into the 40s, 50s, 120s, etc. would indeed extend lifespan via genetics, but the effect is relatively minor, and the old must overcome the energetic, female-seeking, up-and-coming next generation to actually get the females.

    Furthermore, we do probably gain the slow advantage of increased age from evolution. Do any other mammals live as long as humans, even given ideal nutrition? Close, yes, but we should be dying in our 40's or 50's of old age after a healthy life based on our size, not 70's and 80's. For thousands of years, some people, mainly kings and the wealthy, have lived to old age even by modern standards, and they have continued to breed up into that great age, passing along their genes. I will bet this is the source of what appears to be the relatively unnaturally long healthy lifespan vis-a-vis other mammal species.

    We'll probably get the first big forays into extended life (well, second after good nutrition) via replacement parts. More $$$ for acephalous cloning experiments now! After that, chemistry (or other stem cell research) into preventing/reversing brain breakdown. More $$$ for cloning research now!

  • by KidSock ( 150684 ) on Tuesday October 16, 2001 @02:51PM (#2437342)

    It tells your heart how fast to beat, checks your blood sugar and oxygen levels, and controls your pupils so the right amount of light reaches your ...

    There's an OO principle called the Law of Demeter which advocates as few dependencies as possible between objects. This sounds like a lot of hooks all over the place, which is not a model of simplicity. It would be better for "it" to step out of the way and let each object adjust itself based on its surroundings, just as in natural systems. Nature has a tremendous advantage over computers: it is far more efficient because everything happens literally in parallel. Computers can really only do a very limited number of things at a time, although the user sometimes perceives concurrency due to very rapid time-slicing.

    As a result, programmers are forced to make tremendous compromises given the comparatively limited medium they have to work with. It will take well-established techniques and objective analysis to determine the best way to utilize bits on silicon.

    Over the years I have recognised one principle that transcends this issue of dealing with complexity. Oversimplified, it is Recursive Composition. This "pattern", or OO construct as it is sometimes called, does not prescribe a particular Class or set of relationships between objects; it's completely arbitrary. The idea is to recursively delegate responsibility for one part of the system to yet another module. At the leaves of this tree you have the primitive operations, and at the root you have one simple instruction for triggering a potentially very complex cascade of instructions. Thus you have reduced the complexity of the overall system. The key difference between this and just another group of functions calling one another is parameterization.

    As a simple example, imagine trying to encode or decode a database file. The database file has a header, a record list, and data chunks. Like this one on PalmOS PDB files [palmos.com]. If one were to apply the principle of Recursive Composition, the API for this PDB codec would be, at the top level, PDB_decode(char *src). At the next level down you have operations like Hdr_decode(char *src) and Record_decode(char *src). At the leaves you have dec_uint32be(char *src) to decode an unsigned 32-bit integer in big-endian byte order.

    If you can parameterize cleanly exactly what is required to perform a task and delegate it to another module, you have broken the problem into at least two smaller problems, which reduces the order of complexity. Simple! ;-P

