Helping Computers Help Themselves
Jim Posner writes "The IT world's heavy hitters--IBM, Sun, Microsoft, and HP--want computers to solve their own problems... If you're being chased by a big snarling dog, you don't have to worry about adjusting your heart rate or releasing a precise amount of adrenaline. Your body automatically does it all, thanks to the autonomic nervous system, the master-control for involuntary functions from breathing and blood flow to salivation and digestion." I'd just be happy with a few intelligent daemons to watch my back, like when a program runs amok and fills up the process list.
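Something like this would do for a start -- a minimal sketch of such a daemon, assuming Linux's /proc layout; the per-command ceiling and the kill-everything policy are made-up numbers of mine, not anything the vendors ship:

    import os, signal, time
    from collections import defaultdict

    MAX_PROCS = 200  # assumed ceiling per command name -- tune for your box

    def scan():
        # Group live PIDs by command name, straight from /proc.
        counts = defaultdict(list)
        for pid in os.listdir('/proc'):
            if not pid.isdigit():
                continue
            try:
                with open('/proc/%s/comm' % pid) as f:
                    counts[f.read().strip()].append(int(pid))
            except IOError:
                pass  # process exited while we were looking
        return counts

    while True:
        for name, pids in scan().items():
            if len(pids) > MAX_PROCS:  # something is running amok
                for p in pids:
                    try:
                        os.kill(p, signal.SIGTERM)
                    except OSError:
                        pass
        time.sleep(5)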
self adjusting fans (Score:1, Interesting)
AMDs still catch fire with no fan on (Score:2)
A.I. (Score:1, Interesting)
Re:A.I. (Score:2, Interesting)
My 2 cents
Scary fucking shit. (Score:3, Insightful)
When I first came to this company, we had something like 20 IT employees. Through "attrition" (read: fire X, Y quits) we're down to 4. Every time somebody left, the remaining folks would write a script to automate what the other guy spent most of the day doing...watching servers for spikes and resetting them, etc.
Did it save us from hiring new people? Our HR department will tell you it did, but it's untrue. The fact is the turnaround time for IT requests has become abysmal. Adding new segments to our network takes much, much longer -- to the point that a new code base for email took 2 people six months to analyze deployment options and deploy, and only took me three weeks to write.
Customers are leaving, citing huge turnarounds for new features and fixes, and we're blaming it on our support dept. Support is fine -- they get requests to us fast. Deployment...well, it could take weeks even to get cosmetic changes through.
Can you imagine the additional testing you'd have to perform before changing a truly autonomous server? And how can you be sure that the self healing server is really healthy, or just not noticing the problem?
Das no like-y. Bad medicine.
Re:Scary fucking shit. (Score:4, Insightful)
It's really no different than the cotton gin automating cotton production: those workers who become obsolete are retrained to enter the workforce with new skills. It is a continuous cycle where those who are obsoleted learn new skills to get new jobs.
That's why our unemployment rate is usually fairly steady (between 3% and 8% almost always) - jobs are always being eliminated and jobs are always being created. In different sectors, maybe, but that isn't necessarily a bad thing.
Re:Scary fucking shit. (Score:1)
Just another typical Republican asshole.
Re:Scary fucking shit. (Score:2)
Re:Scary fucking shit. (Score:1)
Of course. That's why these guys aren't talking about a shell script to solve the problem. They're talking about 3 billion dollar budgets to get in there and solve these issues in very robust ways. You can't just fire your IT staff and hope one guy can do the work of five... you have to have better hardware and software so that you've only got one guy's worth of work.
Can you imagine the additional testing you'd have to perform before changing a truly autonomous server? And how can you be sure that the self healing server is really healthy, or just not noticing the problem?
I imagine you'd notice it exactly the way you'd notice an unhealthy server now... things don't run, performance drops significantly, programs hang, etc. The point is to make the computer smart enough to recognize these situations and intervene to fix the problem, so that the users never realize there was anything to worry about in the first place. Contrast that with the current situation, where the computer does nothing. Someone has to realize there is a problem. Someone has to figure out what the problem is. Someone has to fix it. For companies where up-time is money, this model for computer maintenance needs to be improved.
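For the curious, here's roughly what that "recognize, then intervene" loop boils down to in its most naive form -- a sketch under assumptions of my own (a TCP health check on a made-up port, and a hypothetical restart script), not anybody's actual product:

    import socket, subprocess, time

    SERVICE_PORT = 8080                              # assumed port to probe
    RESTART_CMD = ['/etc/init.d/myapp', 'restart']   # hypothetical init script

    def healthy(port, timeout=2.0):
        # The "recognize" half: can we still reach the service at all?
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            s.connect(('127.0.0.1', port))
            return True
        except socket.error:
            return False
        finally:
            s.close()

    while True:
        if not healthy(SERVICE_PORT):
            # The "intervene" half: restart before a user ever notices.
            subprocess.call(RESTART_CMD)
        time.sleep(10)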
Re:Scary fucking shit. (Score:1)
Let's see...Rich guy, gettin' richer by automating his computer network...Sounds like a fucking genius!
Vincit qui se vincit.
Re:Scary fucking shit. (Score:1)
This just means less jobs for us (Score:2)
Let us control the machines, what the hell are we going to do once they control themselves?
Re:This just means less jobs for us (Score:2, Interesting)
Wrong (Score:2)
Re:This just means less jobs for us (Score:2)
Re:This just means less jobs for us (Score:5, Interesting)
So Sun and IBM are turning their attention to some particular area that needs more optimizations... this just means that in ten years, there is going to be a higher level of abstraction with the same problems to solve. I'll have to figure out how my new McDonalds chain can just plug some new computers into a wall and have their order menus pop up instantly... great for productivity, great progress, but it hardly cuts into the demand for technically skilled people.
Of course, intuitively there must be some point where the optimizations made start cutting into jobs. My feeling, though, is that we are still working on some of the most basic problems of computing, and it will be quite a long time before we reach the peak of this curve. I mean, a big focus of the article is how to most efficiently get data out of databases! We all take for granted that this is (currently) a very tricky issue. Imagine looking back in twenty years, though... it's easy to imagine that we'll laugh at having to think about such basic issues at all. "Configuring a network? Gimme a break, piece of cake! Connect some wires and you're done!" we'll say. And yet it's easy to imagine that despite having solved all of these problems, we will still be faced with a set of complicated issues of the day to solve to utilize these features. And this is just a discussion about how to move information around efficiently. We're not even getting into applications and what to do with that information once you have it.
Then someone will write an article about how IBM is focusing on the problems of that day, and is going to make it easy to handle *that* level of abstraction. We'll read that configuring interactions between networks to transparently and securely utilize excess CPU in your neighborhood, or your city, is going to be a breeze, and we'll have this discussion all over again...
Re:This just means less jobs for us (Score:2)
They already are cutting jobs; a lot of jobs were lost due to computers. However, the jobs lost created more jobs, because people had to install, repair, and operate the computers.
By removing the operators but not replacing them with anyone else, where will the jobs come from?
Re:This just means less jobs for us (Score:1)
Re:This just means less jobs for us (Score:1)
Then someone will write an article about how IBM is focusing on the problems of that day...
And the article will automatically be submitted to the Slashdot of the day, which will automatically post it, and none of us will know about it because our computers will automatically post replies and assign our mod points for us.
Re:This just means less jobs for us (Score:1)
I'd rather look back in ~35 years when (biological) man has invented his last invention.
Re:This just means less jobs for us (Score:2)
Despite the dark horror science fiction to the contrary, this need not be a horrible event. In terms of available resources, some place like the asteroid belt may in fact be the optimal place for our machine descendants to build their society. (I would think nanotechnological machine parts would operate far more efficiently in the relative cleanness of space, without all the garbage down here on Earth.) In addition, far, far more energy is available up there.
Finally, usually when a new species is born it moves to an unoccupied ecological niche. Fact is, humans will never colonize space. Sure, we might be able to make little self-contained bubbles of Earth up there (sort of like how the first amphibious creatures to go to land required constant moisture and had to return to water frequently), but we will never be able to really use the vast resources or do much more than huddle in our hab modules.
Intelligent machines smart enough to create new versions of themselves for this purpose will be subject to no such limits.
It's been a few million years for humanity: a really long road. But the season finale is almost upon us, with the last plot details wrapped up in the last few episodes as we finalize the work this century.
Re:This just means less jobs for us (Score:2, Interesting)
Your argument is analogous to those of the people who opposed computerization in the '80s. Banks, shopping centres, you name it - computers would replace humans! More humans would lose their jobs, was their argument.
The fact is that jobs are created in other ways:
1) In efforts at automation.
2) Improving what has already been automated.
3) Support of the automated product.
The last point follows from the fact that if every cashier at the stores were replaced by robots, you would still want to see the manager when you have problems. Machines, after all, run by definition on a finite set of rules; anything beyond that calls for human intervention or support.
Re:This just means less jobs for us (Score:1)
There was no such thing as IT in today's sense around 30 years ago. Now there are friggin' millions of IT workers. Assuming a relatively constant employment rate (which we have), this means these people would have been employed in something else a few years ago - consequently, no net loss from automation.
However, if the damn things didn't break down / need to be programmed / need to be plugged in (a usual first suggestion by the IT guy, unfortunately, because it can weed out some enquiries), then we might be in trouble. Ultimately, though, human fallibility means that they all require human maintenance. Self-healing wouldn't work, as the human coders would make mistakes, and technology's onward march would make new instructions a constant requirement so the robots can still tell that their arm isn't working anymore after the DRM upgrade or some other emergent technology.
True artificial intelligence could end it though - but then the robots would see us as cheap labour so at least we could mine or do other jobs beneath the robots.
Re:This just means less jobs for us (Score:1)
yes, automation is always bad (Score:2)
Definition of "Problem" (Score:2)
From my point of view, that is a problem.
From the point of view of those [expletive deleted] at real networks, the "problem" seems to be that I've found a way to disable their unfettered access to my system for whatever under heaven they want to do.
Now, you say, what does this have to do with server farms and data clusters?
In the present day - not much. Such things require a level of expertise to run such that sleaze of this kind is rare (albeit not unheard of).
In the near future, when 1 billion people (according to the article) are working at computers? Well, the article implies that this great growth in the computer aided labor sector (term I just made up) will NOT be accompanied by an equal upsurge in available expertise.
Therefore, a lot of people will be running high-economic-impact computer-whatevers without the background to comparison shop, or the technical know-how to disable corporate flack. In fact, this is already happening.
I worry that "intelligently self regulate" will become "intelligently install our software and make sure you pay whatever we decide it is worth" in short order. While they're at it, they'll charge you for the software to police you. Peachy keen.
Re:Definition of "Problem" (Score:1)
what a bad way (Score:2, Informative)
"If you're being chased by a big snarling dog, you don't have to worry about adjusting your heart rate or releasing a precise amount of adrenaline.
I was expecting the article to be on "Super Computers used in medicine" when I read that.
This would have been a better quote:
"...hope is that the constant and costly intervention of database and network administrators trying to figure out what must be done will soon be a thing of the past"
Wrong problem. (Score:2, Funny)
Andrew
Master control? (Score:2)
Maybe it's just me, but when you hear "master-control" and "computer" together, don't you just picture this [tron-movie.com]?
J
Of Course they do. (Score:2, Redundant)
But first we need to get humans to fix the operating systems before computers can take over. *cough*Windows*cough*
Microsoft can use this (Score:1)
Oh wait, they're already planning on doing that.
Project eLiza (Score:1)
Second is:
What is your username?
jamuraa
What makes you believe your username is jamuraa?
It just is.
Is it because of your plans that you say it just is?
Not really, you gave it to me.
Maybe your life have something to do with this.
Programs running amuck. (Score:1)
All this is what makes working with computers interesting for us, and gives us something to do and experiment with. Self-healing computers as such don't really exist yet, but we are working on it ;-)
Re:Programs running amuck. (Score:1)
Why would you ever want to use that in the first place?
sounds familiar (Score:1)
this sounds like "Hub".
-ben-
I Thought This Existed? (Score:1)
Whoa. (Score:4, Funny)
Huh!
Guess it's about time I upgraded, then...
Oh No (Score:2)
Daemons? (Score:2)
Re:Daemons? (Score:2)
At least that's what demons in my game would do...
Computers shouldn't start thinking... (Score:1)
Open standards (Score:4, Interesting)
This made me look for more info on this guy (Robert Morris); here is an interview [oreillynet.com]. He seems like a good guy in a good position.
Re:Open standards (Score:1)
See, script kiddies can grow up to lead productive lives.
Re:Open standards (Score:2)
Re:Open standards (Score:1)
Oh, boy, he's running IBM's Research Center?
Re:Open standards (Score:2)
Just make sure you get the right Robert Morris. You want this one [ibm.com], not this one [discovery.com], nor this one [venona.com]. (Anyone got a link to a picture of the elder Morris? The only one I can find is a Google cache [google.com].)
uh oh (Score:1, Interesting)
An example: this new computer sees that whenever Sally opens up her word processor, the sound driver crashes... after struggling to figure out why the driver crashes, to no avail, the computer realizes that the problem is solved if Sally isn't around anymore to open up word processors.
I wish I didn't have to add a "this is humour" tag at the bottom of this, but here it is.
Re:uh oh (Score:1)
The Matrix and 2001 are 2 films I'm currently thinking of for some reason, and half-a-dozen books... Thing is - if this _is_ going to be the future, will we have the wherewithal/'guts' to stop it from happening? Or will we ("software developers") think "well, I might not agree with the aim but it's _a job_"?
Yeah! (Score:2)
Who fixes the repair mechanisms when they fail? (Score:1)
Who fixes the repairing mechanisms? The computer? With what? More repairing mechanisms! But who fixes those? More repairing mechanisms! But who fixes those?.....end of universe.
Re:Who fixes the repair mechanisms when they fail? (Score:1)
They didn't say they wouldn't need people anymore! They are saying they need people who don't just do the same repetitive things every time, to make room for more creative and smart programs.
In the end, there will always be a human in control. This has nothing to do with HAL, just with computers that are made up of more robust components that follow some standards, so they can be interpreted by other computers. Compare it to the difference between a Word and an RDF document. Both carry information, but the latter is easier for computers, because it has pointers to similar information, so the computer can put things in context.
Or something like that.
Why not ask an AI if an AI can control a computer? (Score:2)
*conversation below*
Me: What do you think about artificial intelligence?
Talk-Bot: I think you're asking the wrong person.
Me: I thought you were the best person to ask.
Talk-Bot: It amazes me sometimes what some people think.
Me: Would you be able to control a computer, like the brain control my body?
Talk-Bot: I could but that doesn't mean I would.
After carefully analyzing the answers from an artificial intelligence, I have come to the conclusion that AIs aren't ready to control computers and their software just yet. Perhaps not because they aren't advanced enough, but because they don't "feel like it". This is of course calming news for system administrators and the like.
Who has a monkey job? (Score:3, Insightful)
This is going to happen, so the best thing to do is to climb up the ladder and try to be ahead of it. It may be a lot of work in the beginning, but it could reduce work (and costs) in the end. This is similar to HP + (Compaq + Digital), which are reducing their server line from three types to one. It will cut into our flesh now, but it will allow us to grow as a whole.
It's life, my friends; don't think you're immune to it.
Re:Who has a monkey job? (Score:1)
Don King, is that you? Did you mean atomize?
Main Entry: atomize
Pronunciation: 'a-t&-"mIz
Function: transitive verb
Inflected Form(s): -ized; -izing
Date: 1845
1 : to treat as made up of many discrete units
2 : to reduce to minute particles or to a fine spray
3 : DIVIDE, FRAGMENT
4 : to subject to attack by nuclear weapons
Sorry I couldn't resist.
Re:Who has a monkey job? (Score:1)
I work for a manufacturing company, not a tech company, so maybe I have a different perspective from someone doing "pure tech". A lot of what I do all day is a solution to a particular problem that could not be generalized at all. It's the difference between providing a commodity and doing something that is customized. Desktop software is a commodity, and maintaining a desktop computer is commodity labor.
Don't scream about eliminating your job and support open source in the same breath. Open source has in large part been aimed at eliminating people getting rich from providing what boils down to software commodities, bringing prices down to reasonable levels and restoring control to the consumer. If software has made you a commodity also, then maybe you need to reevaluate what value your skills really had in the first place.
Re:Who has a monkey job? (Score:1)
Help for CowboyNeal... (Score:2)
ulimit: ulimit [-SHacdflmnpstuv] [limit]
Ulimit provides control over the resources available to processes started by the shell, on systems that allow such control. If an option is given, it is interpreted as follows:
-S use the `soft' resource limit
-H use the `hard' resource limit
-a all current limits are reported
-c the maximum size of core files created
-d the maximum size of a process's data segment
-f the maximum size of files created by the shell
-l the maximum size a process may lock into memory
-m the maximum resident set size
-n the maximum number of open file descriptors
-p the pipe buffer size
-s the maximum stack size
-t the maximum amount of cpu time in seconds
-u the maximum number of user processes
-v the size of virtual memory
If LIMIT is given, it is the new value of the specified resource. Otherwise, the current value of the specified resource is printed. If no option is given, then -f is assumed. Values are in 1024-byte increments, except for -t, which is in seconds, -p, which is in increments of 512 bytes, and -u, which is an unscaled number of processes.
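For what it's worth, the same knobs are reachable programmatically; here's an illustrative Python fragment using the standard resource module (Unix only -- and the value 128 is an arbitrary example of mine, not a recommendation):

    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
    print('process limit: soft=%s hard=%s' % (soft, hard))

    # Equivalent of `ulimit -u 128` for this process and its children;
    # a runaway fork loop then fails with EAGAIN instead of eating the box.
    resource.setrlimit(resource.RLIMIT_NPROC, (128, hard))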
Re:Help for CowboyNeal... (Score:2)
On *BSD there are login classes, kind of like groups, but they define access according to how much memory is available, how many processes may run, and more. Setting total processes (and other things, like open files) for the whole system is a sysctl variable as well.
On Solaris there is "set maxuprc=50" in /etc/system.
There's more, but I'm hungry. Someone please fill in the rest.
Re:Help for CowboyNeal... (Score:2)
The Uncomputable (Score:2)
Re:The Uncomputable (Score:1)
apply evolution (Score:1)
Computer vs. Human (Score:2, Informative)
There are many ways in which your human body fails. As we've mentioned on Slashdot before, it's not really that efficient -- we spend anywhere from 6-10 hours a day (25-40% of the day) recharging and regrouping. If, as a sysadmin, your network was down for that percentage, would you still have your job? I doubt it...
Also, it fails to protect your body from attacks: we have an endoskeleton. If you look at an ant, or any other insect that can take out animals many times its size, you will notice that they have exoskeletons. It's kind of like having your network security inside the network, leaving some of the network wide open. We all know that exploits will bring down a network that's even partially open.
One more point about our body: it gets sick often, some more than others, and some worse than others. I, for example, have diabetes and I have an insulin pump [minimed.com] to inject insulin, since my body attacked a part of itself for some reason as yet unknown. It's something like your OS deleting your TCP/IP capabilities; it leaves you stranded.
Now, I'm sure there are many biology people who will point out that our bodies are amazing feats of detail, etc. etc. That may be true, but I still don't see how that makes it a good blueprint for technology that we create. Remember, it is only with technology that our infant mortality rate is not 40% or whatever ridiculous number it was in the 19th century.
It took how long? (Score:2)
Re:It took how long? (Score:1)
(I mean Windows destroying itself, not Windows the virus...)
Or do I mean that?
Maybe I mean Windows the virus...
Or is Windows the carrier and Gates is the virus?
How about ... (Score:3, Interesting)
A good thing, but it will bring new problems (Score:2)
Take the human body's allergic reactions, for instance. Your body may react to something that's really not harmful, but it thinks it's protecting you. The unintended effect of the reaction can range from a mild annoyance to death.
In nature, other life forms have evolved to take advantage of your autonomous reactions, and I'm sure we will see this in the computer world as well. Wait until some script kiddie figures out that he can crash your server (or at least eat up CPU cycles) by sending it a signal that makes an autonomous daemon overreact in trying to do its job. The problem will be discovered, exploited, patched (but not on MS boxes) and a new exploit will be found. Circle of life and all that, I suppose.
Still, I think borrowing ideas from mother nature for the evolution of computers is the right way to go. After all, she's had millions of years to work on the problem through trial and error. We can build on that research and perhaps improve upon it, at which point we'll probably start looking at how to control our own evolution. Just remember to never write a daemon that prevents you from pulling the plug.
Re:A good thing, but it will bring new problems (Score:1)
One of them was about an intelligent computer created by a whole bunch of scientists. They booted it up, and typed in the question "Is there a God?". At that moment, a lightning bolt came from the sky, fusing the power source into the on position, and then the computer replied: "Now there is".
Did you *READ* the article? (Score:3, Insightful)
AFAIK, the free and open-source PostgreSQL also has similar technology built in.
*YAWN*
Come back when there's something to read, eh?
Re:Did you *READ* the article? (Score:1)
AFAIK, the free and open-source PostgreSQL also has similar technology built in.
*YAWN*
Oracle has had this for years. Oh yeah, and it works.
Re:Did you *READ* the article? (Score:1)
Not AFAIK -- the PostgreSQL query optimizer uses statistics collected periodically (namely, when the ANALYZE or VACUUM ANALYZE commands are run); optimizer statistics are not updated with any data collected during query execution. I'm not saying that self-tuning optimizer statistics are a bad idea, but they haven't yet been implemented in PostgreSQL. You might be referring to GEQO (a version of the query optimizer built into PostgreSQL that uses a genetic algorithm to avoid an exhaustive search of the solution space for large join queries), but that is obviously completely different.
Here's a good paper on a related Microsoft technology: STHoles: A Multidimensional Workload-Aware Histogram [nec.com]. IMHO that's the most interesting part of the AutoAdmin stuff mentioned in the article -- I don't care for the performance tuning wizard so much. Also, the design of the IBM LEO query optimizer, mentioned in the article, is described in this paper: LEO - DB2's LEarning Optimizer [nec.com].
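To make the distinction concrete, here's a small illustration of that manual statistics refresh from Python, via the psycopg2 driver; the connection string and the table name are placeholders:

    import psycopg2

    conn = psycopg2.connect('dbname=test')  # placeholder connection string
    conn.autocommit = True
    cur = conn.cursor()

    # ANALYZE samples the table and stores distribution statistics in
    # pg_statistic for the planner; nothing is learned at query time.
    cur.execute('ANALYZE orders')

    # Peek at what the planner will actually use.
    cur.execute("SELECT attname, n_distinct FROM pg_stats "
                "WHERE tablename = 'orders'")
    for attname, n_distinct in cur.fetchall():
        print(attname, n_distinct)
    conn.close()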
hurry up (Score:2, Funny)
Oooh, and a hover board, they said I could have a hover board, where is it, dammit?
Re:hurry up (Score:2)
in the trunk of my hover car.
Why don't we strap on our jet packs and go get it?
Lousy dog analogy and fixing problems (Score:1)
Smarter Compilers First (Score:1)
running amuck (Score:1)
Oh, yes! I got to get me one of these! (Score:1)
Meant to say:
The article didn't go far enough!
If you're being chased by a big snarling dog [...deletia...] I'd just be happy with a few intelligent daemons to watch my back
RIGHT! Only I want the sales department's laptop docking stations in the staff-meeting room to be equipped with the voice-decoding circuitry and the clue-by-four attachment (big wooden mallet)!
First time the lead sales weasel brings up a fundamental change to the project requirement late in acceptance testing...POW!
Muah-hahahahahahahah!!!
Well, hey, I can dream, can't I?
the market (Score:1)
Skynet (Score:1)
Re:Skynet (Score:1)
The idea that this broad, her asshole kid, and a reject robot could defeat a true AI - it's laughable...
Equally laughable were the old Star Trek stories where Kirk always outwitted the superhuman AI or robot by making it play emotional games...
No one has ever pointed out Roddenberry's Luddite attitude toward technology...
fork bomb? (Score:2)
I can imagine circumstances like this happening inadvertently due to program or kernel bugs, but isn't this a rare occurrence unless you've executed a deliberate fork bomb?
The paging daemon (Score:2)
What do you do when you need to run a CAD program on a system with low memory? Do you need a daemon outside the box to talk you into not starting and using that app?
All this seems to be is increasing user-friendliness by limiting what you can do. I'd rather see a blue screen due to my evil memory editing in debug.
Old idea and a huge task (Score:2, Interesting)
The article also touches on automatic database tuning depending on the actual use of the database. I look forward to a database which automatically modifies the schema when it finds that a parent table always joins with a child table with referential constraints.
IBM has previously introduced self-healing servers [ibm.com] that essentially are able to detect that something has gone awfully wrong and therefore reboots. It may not be an elegant solution, but if it works the customer is happy.
All this is part of an evolution in software, or a "next step" in software implementation.
Examples:
Out of disk space when you download your email. The email program should find some spare disk space somewhere on another partition or extend the current partition.
foo-1.7 requires bar-1.2. rpm should automatically download bar-1.2, preferably from a computer on the same LAN that already has bar-1.2.
A node in a cluster is overloaded. The cluster software should move applications/services to another node.
An HTTP server is using too much bandwidth. It should automatically serve images at lower quality (and therefore use less bandwidth).
I look forward to it. It would allow me to let the computers monitor themselves and fix most problems without pestering me. And then I could use my time for something much more interesting than looking in /var/log/*, restarting failed applications, etc.
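A minimal sketch of the first example above, with assumptions of my own -- the paths and the 90% threshold are made up, and deciding what is actually "safe to delete" is the real problem:

    import os, shutil

    WATCHED = '/var/mail'        # assumed partition to keep healthy
    SCRATCH = '/var/cache/junk'  # assumed safe-to-delete spillover area

    def pct_used(path):
        st = os.statvfs(path)
        return 100.0 * (1 - float(st.f_bavail) / st.f_blocks)

    if pct_used(WATCHED) > 90:
        for name in os.listdir(SCRATCH):
            victim = os.path.join(SCRATCH, name)
            if os.path.isdir(victim):
                shutil.rmtree(victim, ignore_errors=True)
            else:
                os.remove(victim)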
It spits out Windows Install CD-ROMs? (Score:2)
And this "organic"self-repair Zen gestalt type of system is a bunch if crap. If it really worked, there would be no cyrrhosis, no adult-onset diabetes, no emphesima, etcera...
Living systems reproduce asexually or breed because everything more complex than an algal mat is incapable of surviving for very long.
We'd do better do better than what passes for intelligence and self-repair onm homo sapiens sapiens.
Re:It spits out Windows Install CD-ROMs? (Score:1)
from the article (Score:1)
This sounds like they want a computer to run HD and memory defraggers. Big deal.
Ironclad security : Firewalls?
Fast automatic recovery : fsck?
interoperability : kermit?
fix crashes : watchdog card?
prevent crashes : automatically convert to linux?
Don't worry about my mumbling, I just let my mind wander....
DISCLAIMER:
It's almost 2am, I'm dead tired, a bit drunk, and I scored C++++ on geek code, so don't take me too seriously.
Bad analogy (Score:5, Funny)
If our bodies worked the way we wanted, things would be very different. First, you'd get a huge boost of adrenaline so you could outrun the dog. Also, although your heart would speed up, you'd have no risk of a heart attack or other complications from overexerting yourself. You wouldn't get tired. And you'd be equipped with built in weapons for annihilating hostile canines.
You'd also never have to worry about getting nervous trying to talk to that new cutie at work, acne wouldn't exist, and we'd all be our ideal weight.
Our bodies, at best, make fair attempts at adjusting to situations, but they blow it as often as they get it right. Frankly, if our computers become as reliable as our bodies, I'm going to invest in pencils.
Re:Bad analogy (Score:1)
You'd also never have to worry about getting nervous trying to talk to that new cutie at work, acne wouldn't exist, and we'd all be our ideal weight.
Our bodies, at best, make fair attempts at adjusting to situations, but they blow it as often as they get it right. Frankly, if our computers become as reliable as our bodies, I'm going to invest in pencils.
Sounds like Anime.
it would give a whole new sense to "GPL Cancer".. (Score:1)
like robots in car factories (Score:1)
"heavy hitters?" (Score:1)
The Magic of AI (Score:1)
I would bet that the first 'fantastic breakthrough' AI will be the creation of other 'intelligent' software.
Everybody is stuck on the software aspect ..... (Score:2)
But, at least for now, this effort is really aimed at the hardware. Today we can see the beginnings of self-healing hardware in place. Some enterprise systems can already phone home when they have a HARDWARE problem, and let the support folks know that there is a problem. And with systems like Sun's SunFire x800 series servers, the sysadmin can dynamically reconfigure the system to de-allocate bad CPUs or memory, I/O boards can be removed hot, etc. So, the next logical step is for the server to de-allocate the failed CPU itself, and send an alert, probably via SNMP, to the sysadmin. By doing it dynamically the server keeps running, albeit with a reduced work capacity. Even better would be to have "spare" CPU boards in the box that could be immediately allocated to replace the failed board. All of this is possible today, with human intervention. The point is to get the system to be able to do it without human intervention.
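To make that concrete, here's a Linux-flavored sketch of the "de-allocate the failed CPU yourself" step, using the sysfs hotplug interface; the CPU number is a stand-in, and the real fault detection (parsing ECC/MCE data) is omitted because that's the genuinely hard part:

    import syslog

    def offline_cpu(n):
        # Writing '0' here tells the scheduler to stop using the CPU
        # immediately (needs root; cpu0 usually cannot be taken offline).
        with open('/sys/devices/system/cpu/cpu%d/online' % n, 'w') as f:
            f.write('0')
        syslog.syslog(syslog.LOG_CRIT, 'cpu%d de-allocated after fault' % n)

    offline_cpu(1)  # stand-in for whichever CPU the fault data fingered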
On the software side, I think it will take a bit longer. Some things, like database optimizers, possibly can be done right now. But my observations (I'm not a DBA) of the database world indicate that most database optimizers aren't truly self-tuning/healing. Instead they can tune or heal for known conditions and make assumptions about how you want your database optimized. Most real DBAs hate this and have to spend extra time shutting off the self-optimizing functions and then performing their own optimization for their own real-world scenario.
Will that many IT workers really be needed? (Score:1)
Do the easy math: that comes to one IT worker for every five people using computers. That seems like an outrageous overestimate. Haven't people learned from the demand hype of the late '90s, which drove the whole technology sector into a deep recession?
If you were to draw a graph comparing use of the Internet (i.e. how many web pages viewed, how many users checking their e-mail), demand (as opposed to need) for technology resources, and the amount of money people are willing to spend on technology, you would see an interesting thing. In '97-'99, demand for technology resources was high -- companies were scrambling over themselves to hire as many IT professionals and purchase as much software, hardware, routers, cable, etc. as possible. At the same time, the line for money spent was extremely high as well. On the other hand, the "actual use" line was low. Move ahead to 2000-spring 2001. The money line had dropped by a nice little chunk (no one wanted to spend any money on the web anymore if they could help it -- it was the year of the dot-com busts). Infrastructure was still important, so that line only dipped slightly. People ditched vague commercial ideas for more sound click-and-mortar technology. The Internet use line, meanwhile, was climbing rapidly. Move to late spring/summer of 2001. Both the demand line and the money line have plummeted. No one wants to invest in new technologies, and no one even wants to spend money on standard technology like Cisco routers. What was the dot-com bust is now the Internet infrastructure bust. Paradoxically, Internet use has continued to grow this whole time, and now over half of all people in the US do something online every day. So the whole time that IT technology acquisition and IT financial investment have been going down, Internet use has actually been going up.
Okay, that was a little long-winded, but my point is that IT growth should *match* IT use, not move in the opposite direction from it.
Master biology first (Score:2)
1. There is no cure for a good many harmful viruses. Even dumb machines catch man-made viruses. More complex ones are going to need HMOs?
2. No safe appetite suppressant discovered yet, despite the fact that many people are naturally thin.
3. Cancer seems as elusive as ever, and may be simple but irreversible Darwinian entropy of single cells evolving independent of the collective.
4. We know way too little about how the brain works.
Finally, biology often depends on trial-and-ERROR to adjust and correct itself. Do you really want your database "practicing" some new technique on the CEO's annual report?
If you mirror something you don't understand, don't complain to me when it barfs.
reliability? (Score:2)
Theory: Common DLLs would be updated with new DLLs when new programs came out, so that old ones would automatically benefit from the new code.
Result: DLL hell
Theory: Office2k and newer stuff with Windows installer 'tech'. Install on demand, restore file associations/missing files/shortcuts every time you run the program.
Result: Nobody -- not you, not MS, not the computer -- at any time knows what's installed on the PC. Every time you try to remove a desktop/quicklaunch/start menu shortcut or a file association, the application will think for two minutes, then ask you to find the CD and let it look at that for another two minutes.
What's next? The computer smells you coming, sprouts legs and runs away?
The last reliable lifeform-like program available for computers was a virus.
This is a repost from a year ago (Score:1)
It was at:
http://www.research.ibm.com/autonomic
It is now located at:
http://researchweb.watson.ibm.com/autonomic/
It will be here in Longhorn! (Score:2)
"As part of Longhorn, Allchin said customers can expect to see new features for intelligent auto configuration, such as BIOSes and firmware that can be "automatically updated in a seamless way." Also, Allchin said Longhorn will include new functionality for server resiliency, such as self-healing characteristics, a more componentized architecture, and additional monitoring services with filters that can "dynamically" flow out to servers. "
Right on target there, Microsoft!
Let's write good software first. (Score:2)
I really think that software quality has stagnated, where funding nearly always stops short of allowing proper design and quality controls.
Who at Microsoft and IBM is going to ensure that the super-self-healing code can heal itself, and in a usefully wide variety of situations?
Once such abstraction reaches a new threshold, how many people will be left around the world who can diagnose a real problem when it occurs?
I've seen repeatedly that higher abstraction does not always result in a better system. I know many "software engineers" who can't even determine that basic network issues or OS contentions are "breaking" their software. All they care about are their nifty buzzword-compliant IDEs with code highlighting. Once the population finally degrades to where nearly everyone is like this, what then?