Software Technology
Computer Software to Predict the Unpredictable

Amigan writes "Professor Jerzy Rozenblit at the University of Arizona was awarded $2.2 million to develop software to predict the unpredictable — specifically relating to volatile political and military situations." From the article: "The software will predict the actions of paramilitary groups, ethnic factions, terrorists and criminal groups, while aiding commanders in devising strategies for stabilizing areas before, during and after conflicts. It also will have many civilian applications in finance, law enforcement, epidemiology and the aftermath of natural disasters, such as Hurricane Katrina."
This discussion has been archived. No new comments can be posted.

  • bullshit flag (Score:5, Insightful)

    by Anonymous Coward on Wednesday October 17, 2007 @07:16PM (#21018097)
    Go ahead and predict the weather for a week. I will be impressed.

    Predict it for 2 weeks, I will blow you.

    You cannot predict something with so many variables that you don't understand. You certainly cannot do it regarding how people will react.

  • Ridiculous (Score:5, Insightful)

    by Reality Master 101 ( 179095 ) <<moc.liamg> <ta> <101retsaMytilaeR>> on Wednesday October 17, 2007 @07:19PM (#21018129) Homepage Journal

    Apparently there are those who have forgotten the old computer law of "Garbage In, Garbage Out." Even if we had a perfect model to predict this sort of thing, we don't have any way of supplying the data the model would require. What's the computer going to do, go undercover in secret groups? Read the web sites? Listen to radio chatter and analyze their conversations?

    Maybe someday when we have a real science of A.I. something like this might be possible, but all it shows is that this university professor will happily take government money for delivering absolutely nothing.

  • Trantor (Score:2, Insightful)

    by danilo.moret ( 997554 ) on Wednesday October 17, 2007 @07:32PM (#21018301)
    You just need to find one single planetary system complex enough, some basic axioms, a lot of spare mathematicians and Hari Seldon to come up with a solution for predicting the unpredictable, as long as the unpredictable isn't the Mule.
  • by orkysoft ( 93727 ) <orkysoft@myMONET ... om minus painter> on Wednesday October 17, 2007 @07:41PM (#21018431) Journal

    ...the program will still fail to predict it. By definition.

    But it's magic! It's a computer program, which is magic to most people.

  • Re:Ridiculous (Score:3, Insightful)

    by ChrisMounce ( 1096567 ) on Wednesday October 17, 2007 @08:01PM (#21018655)

    Maybe someday when we have a real science of A.I. something like this might be possible, but all it shows is that this university professor will happily take government money for delivering absolutely nothing.
    He has already perfected the software and is using it to game the grant system.
  • Re:bullshit flag (Score:5, Insightful)

    by zeromorph ( 1009305 ) on Wednesday October 17, 2007 @08:09PM (#21018739)

    bullshit flag

    I second that.

    The whole article is totally bizarre, buzzword-populated, and begging for attention. Not only will it predict the actions of nearly every bunch of lunatics, it will also "display data in graphical, 3-D and other forms that can be quickly grasped".

    Please! We have a highly complex situation, with a lot of different agents, a long genesis, and literally millions of different contextual factors influencing it, and they take all this, munch and crunch it a little with fancy buzzword concepts, and put it in a pie chart?

    This is insultingly brazen self-adulation.

    While the software ultimately could save millions of lives,...

    OK, I changed my mind; I'm gonna die laughing.

  • Re:Ridiculous (Score:3, Insightful)

    by Jeremi ( 14640 ) on Wednesday October 17, 2007 @08:17PM (#21018827) Homepage
    Apparently there are those that have forgotten the old computer law of "Garbage In, Garbage Out" [...] all it shows is that this university professor will happily take government money for delivering absolutely nothing.


    Unless, of course, garbage is what they are after. Last time it was "Curveball" that gave them the necessary disinformation to justify a war; next time they won't even need to bother with informants, they'll just look to their computer program to tell them an invasion is necessary. Accuracy would only get in the way of the political goals anyway.

  • by Kjella ( 173770 ) on Wednesday October 17, 2007 @08:30PM (#21018969) Homepage
    ...using a computer model is like driving by looking in the rear-view mirror. I bet after the first major miss, they'll claim to have added that to the model. Then the next big thing will add five new factors and make three others irrelevant. Computers can't predict from factors they don't even know the meaning of. I'd much rather take a well-reasoned human analysis of a unique situation than patterns that are spurious at best and plain wrong at worst.
  • by Jeremi ( 14640 ) on Wednesday October 17, 2007 @08:33PM (#21019003) Homepage
    I always thought that was funny. If his prediction models were so good, couldn't he just factor in the fact that people were aware of the results?


    That would give you an infinite recursion, no?


    i.e. then you'd have to factor in the fact that people were aware that you had factored in the results, and then factor in the fact that they were aware you had factored in the fact that people were aware you had factored in the results.... and so on until your head explodes.
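The regress described above can be sketched concretely. A minimal toy model (entirely hypothetical, not anything from the article): each round, the public reacts against the published forecast, and the model updates to chase the outcome. Whether the recursion settles or "explodes" depends on how strongly people react:

```python
# Toy self-referential forecasting loop (hypothetical illustration).
# Each round: a forecast is published, the public shifts behavior in
# reaction to it, and the model updates to match the new outcome.
def self_referential_forecast(base=0.5, reaction=0.9, rounds=50):
    """Iterate: published forecast -> public reaction -> revised forecast."""
    forecast = 0.0  # initial (wrong) published forecast
    for _ in range(rounds):
        # The outcome shifts away from whatever was predicted,
        # in proportion to the reaction strength.
        actual = base + reaction * (base - forecast)
        forecast = actual  # model chases the outcome it just caused
    return forecast

# reaction < 1: the regress converges to a fixed point (base).
# reaction > 1: each correction overshoots; the "head explodes" case.
print(self_referential_forecast(reaction=0.9))  # settles near 0.5
print(self_referential_forecast(reaction=1.5))  # diverges to huge magnitude
```

The fixed point exists only when the feedback is weak enough; with a strong enough reaction there is no forecast that survives its own publication, which is the commenters' point.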

  • You don't get it (Score:4, Insightful)

    by passthecrackpipe ( 598773 ) * <passthecrackpipe AT hotmail DOT com> on Wednesday October 17, 2007 @08:34PM (#21019021)
    It isn't about actually being able to predict anything useful. Think of it like this. As a "World Leader" [sic], how much would you spend on the Ultimate Cop-Out(tm)? Yeah, a few million is a *bargain* for what this thing can do. None of the people involved in this project are actually interested in the predictions. What they are interested in is that the *next* time they have a royal screw-up, they can say: "Well, it's unfortunate this happened, but you see, we have a really smart supercomputer. It has 3-D and stuff. And it tells us what is most likely to happen. This wasn't on the list. We only have limited resources, and this is the best way to focus those resources where they are most likely to do us good."

    It's the ultimate repudiation. As far as I can predict, they will spend lots and lots more money on this, get some buddies in on the gravy train to boot, and they'll still have gotten themselves a bargain.
  • by Pedrito ( 94783 ) on Wednesday October 17, 2007 @08:35PM (#21019027)
    "...and the aftermath of natural disasters, such as Katrina."

    Dealing with the aftermath of Katrina wasn't a matter of applying rocket science. It was simply a matter of simple logistics and a government that gives a shit about people. Unfortunately, the U.S. government has shown time and again under this administration that it couldn't care less for the lives of its citizens, let alone the citizens of other countries. These problems can't be fixed by software. They can only be fixed by real leadership, something the people of the U.S. haven't shown much interest in electing...

    It doesn't take software to predict that going into Iraq was a huge mistake. Just ask Cheney circa 1994 [noctaluca.com]. He knew it would be a major mistake, and he wasn't the only one. A lot of us were yelling and screaming to stop it before it started...

    Software can't predict the future, nor can it predict what stupid leaders will do. On Sept 10th, could anyone (or more importantly, any software) predict what things would be like in this country today? Even remotely? The war in Iraq, a country completely disconnected from 9/11; Guantanamo; spying on our citizens and other erosions of liberty... I doubt it. A single event and the responses by inept leadership led to a variety of disasters that nothing and nobody could have predicted.
  • Ah well. (Score:3, Insightful)

    by Colin Smith ( 2679 ) on Wednesday October 17, 2007 @08:46PM (#21019147)
    You see.

    People will trust the answer a bit of software gives them when they won't trust exactly the same answer, calculated in exactly the same way, presented directly by the expert who wrote the software in the first place...

    Particularly if the software system cost 8+ figures.
     
  • by Oldav ( 533444 ) on Wednesday October 17, 2007 @09:08PM (#21019347)
    "On Sept 10th, could anyone (or more importantly, any software) predict what things would be like in this country today?"

    On 9/11, I wept not only for the tragedy of that day, but for the inevitable homicidal overreaction that would, and did, come from the US: "someone must be punished; they are someone, so they should be punished." If you didn't see this coming, you have no vision of the future at all.
  • Re:computer? (Score:3, Insightful)

    by Jame_Retief ( 1090281 ) on Wednesday October 17, 2007 @09:16PM (#21019429)
    You are, of course, presuming that Asimov was doing more than writing good FICTION. I sincerely doubt that any program will have any noticeable success in predicting anything, regardless of the wads of cash thrown at it to make it 'better'. Remember all the computers-in-the-classroom priorities of the last few decades? How many of you used a computer in ANY bloody classroom that did not relate directly to the class (i.e., a C++ programming class, or more likely FORTRAN)?
  • Re:computer? (Score:3, Insightful)

    by smellotron ( 1039250 ) on Thursday October 18, 2007 @12:14AM (#21020899)

    It is by no means obvious that human society is complex enough to be called unpredictable in principle... While human beings may not be predictable in a strictly deductive sense, most people are (for better or for worse) rather mundane in terms of how eccentric they can be (in a way that actually affects other parts of society).

    There's still the issue of dealing with the tail end of any distribution. I don't care about the 99.999% of people who, in the aggregate, fit a model. I care about the 0.001% of people who are going to completely blow it (because, as always, "past performance does not imply future performance").
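The tail-risk point can be made concrete with a toy simulation (the numbers are made up purely for illustration): even when 99.999% of actors fit the model, the handful of outliers can dominate the aggregate impact:

```python
import random

# Toy tail-risk simulation (hypothetical numbers). Almost every actor
# is "mundane" and well modeled; a tiny fraction is extreme, and each
# extreme actor carries a million times the impact of a mundane one.
def total_impact(n=1_000_000, tail_prob=1e-5, seed=42):
    rng = random.Random(seed)
    mundane, extreme = 0.0, 0.0
    for _ in range(n):
        if rng.random() < tail_prob:
            extreme += 1_000_000.0  # one rare actor, enormous impact
        else:
            mundane += 1.0          # a typical, well-modeled actor
    return mundane, extreme

mundane, extreme = total_impact()
print(mundane, extreme)
```

With these invented magnitudes, roughly ten tail events are expected out of a million trials, yet their combined impact dwarfs the million mundane actors put together; a model that nails the 99.999% still misses what matters.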

  • by mveloso ( 325617 ) on Thursday October 18, 2007 @01:24AM (#21021275)
    Your analysis seems valid for situations that are relatively stable. For systems that are in flux (such as a combat area), reality is substantially more fluid than, say, the traffic patterns in Queens.

    It takes events like 9/11, or the invasion of Kuwait by Iraq, to adjust the normal state of affairs. In a flux situation, small actions (and individual actors) can cause tremendous instability... or crystallization, depending.
  • Music (Score:5, Insightful)

    by Stooshie ( 993666 ) on Thursday October 18, 2007 @07:58AM (#21022957) Journal

    A nice analogy for when people think computers can make decisions or have "Artificial Intelligence" of any merit.

    Pupil (excited about AI): I have just written a programme that writes music in the style of J.S. Bach.
    Tutor (seen it all before): Really? How does that work, then?
    Pupil: I programmed in all of the known manuscripts by Bach, and the computer uses them to write new compositions.
    Tutor: Great, can it write in the style of Mozart?
    Pupil: Sure, give me all the compositions by Mozart and I'll show you.
    Tutor: You misunderstand: can it decide, of its own volition, to write in the style of Mozart?
    Pupil: Well, no, it needs to base its composition on something.
    Tutor: It has the entire works of Bach; is that not enough?
    Pupil: No, it needs the entire works of Mozart to write in the style of Mozart. Hell, even music students need to have heard Mozart in order to write in the style of Mozart.
    Tutor: Oh, so how did Mozart do it, then?
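The pupil's program is essentially a corpus-trained generator, and the tutor's objection can be shown in a few lines. A minimal sketch (a hypothetical order-1 Markov chain over note names; nothing to do with the article's actual software):

```python
import random
from collections import defaultdict

# Order-1 Markov "composer": learns which note follows which from a
# corpus, then recombines those transitions. It can only imitate
# styles it has been fed, which is exactly the tutor's point.
def train(corpus):
    """Map each note to the list of notes observed to follow it."""
    transitions = defaultdict(list)
    for melody in corpus:
        for a, b in zip(melody, melody[1:]):
            transitions[a].append(b)
    return transitions

def compose(transitions, start, length, seed=0):
    """Walk the transition table from a starting note."""
    rng = random.Random(seed)
    notes = [start]
    while len(notes) < length:
        nxt = transitions.get(notes[-1])
        if not nxt:
            break  # dead end: no observed continuation
        notes.append(rng.choice(nxt))
    return notes

bach_ish = [["C", "D", "E", "F", "G"], ["G", "F", "E", "D", "C"]]
model = train(bach_ish)
print(compose(model, "C", 8))  # only ever emits notes from the corpus
```

Ask it for "Mozart" without a Mozart corpus and the transition table is simply empty; where Mozart's own novelty came from is the tutor's unanswered question.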

  • by Der Einzige ( 1042022 ) on Thursday October 18, 2007 @11:48AM (#21025883)

    Two money quotes from TFA:

    The software will predict the actions of paramilitary groups, ethnic factions, terrorists and criminal groups ...

    Since the end of the Cold War, our opponents have behaved in ways that defy what we would consider normal logic, pursuing actions that we find almost inconceivable ...

    Note especially the appeal to "normal logic" in the second quote. This assumes that "normal logic" predicts actions that conform to American cultural and ideological biases.

    This is not about predicting truly unpredictable behavior. This is about predicting behavior that is perfectly logical from the viewpoint of the people doing it. US planners can't admit that the behavior is logical, because that would be granting rational legitimacy to opposing US government policy, and that would mean admitting that the US is not always right.

    This project would never have been thought of if we tried to understand our opponents instead of demonizing them. This project is only necessary because the US is committed to pretending that only irrational people oppose US policies.

    If, for example, Americans understood how Muslims feel about the Holy Cities of Mecca and Medina being defended by infidel soldiers, they wouldn't be so quick to call bin Ladin and al Qaeda irrational.

    US writers like Thomas Friedman and Victor Davis Hanson try to diagnose militant Islam as "irrationality" and "immaturity," because they refuse to consider that anyone might have rational reasons for opposing US military and commercial power, or for trying to defend their traditional culture and religion from US cultural influence.

    Note that I am not saying that America's enemies are "right." This is not about right or wrong; this is about competing political and cultural interests.

    Whatever money is going into this project would be better spent on hiring more State Department people who speak foreign languages and understand foreign cultures.
  • no (Score:3, Insightful)

    by slew ( 2918 ) on Thursday October 18, 2007 @05:58PM (#21032389)
    Not true.

    Giving kids everything they could possibly need makes them _spoiled_.
    Giving kids everything they could possibly want makes them _entitled_.

    Eventually, a kid will need to get the things they need by themselves. Delaying a kid's recognition of this fact will make them spoiled (at least a little, in the best case). In addition, giving them everything they could need deprives the kid of the ambition and self-confidence they would gain from doing things for themselves. Of course, as a parent, it's prudent to provide a safety net in case things don't work out as expected, and it doesn't hurt to give them _some_ things that they need or want, but there has to be something the kid needs to do to grow up and be a contributing member of society (and it seems to me that that's a parent's primary job).

    Of course another way to approach this is to want your kid to be dependent on you for all their needs the rest of their lives (I know parents that desire this type of outcome, so it's not actually a rhetorical statement).

    Feel free to substitute parent-kid with government-citizen at your convenience...
