Wielding Supercomputers To Make High-Stakes Predictions
aarondubrow writes "The emergence of the uncertainty quantification field was initially spurred in the mid-1990s by the federal government's desire to use computer models to predict the reliability of nuclear weapons. Since then, the toll of high-stakes events that could potentially have been better anticipated if improved predictive computer models had been available — like the Columbia disaster, Hurricane Katrina and the World Trade Center collapse after the 9/11 terrorist attacks — has catapulted research on uncertainty quantification to the forefront of science and engineering." (Read this with your Texas propaganda filter turned to High.)
Like Global Climate Change? (Score:5, Interesting)
Seems to me all the supercomputer models are predicting that the disaster called global climate change is driven by human CO2 emissions. We have predicted it. It has a definite human cause against which we can take direct action. Over the next 50 years billions of people will be displaced. Trillions of dollars or more of infrastructure will be lost to rising oceans.
Are we doing anything? Seems to me the whole prediction thing is useless if we are unwilling to take action on the results.
Is it because the results are wrong, or is it because it involves money in people's pockets?
We can make the predictions; we need to remove the barriers to action.
federal government's desire to use computer models (Score:5, Interesting)
What was old is new again. [usatoday.com]
The title is misleading (Score:2, Interesting)
The title is misleading because it doesn't describe the main thrust of the project. What the group at Texas is trying to do is change the way computer models make predictions. They recognize that predicting events like Katrina or 9/11 with any accuracy, based on essentially no data, is basically impossible, and that even when prediction is possible, it's still full of uncertainty.
They don't want the models to spit out a single answer (e.g. "There will be a sea level rise of 10 centimeters within 20 years"), but rather a probability distribution ("The sea level rise over the next twenty years can be modeled as a normal distribution with a mean of 10 centimeters and a standard deviation of 5 centimeters"). The distribution is supposed to be based on uncertainties that arise at various stages in the process of modeling, such as model assumptions and data collection.
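To make that concrete, here's a minimal sketch (not the Texas group's actual code) of one common way to get a distribution out of a model: Monte Carlo propagation, where you sample the uncertain inputs and summarize the spread of the outputs. The toy sea-level model and every parameter value below are hypothetical, chosen only so the mean lands near the 10 cm figure in the example above.

    import random
    import statistics

    def sea_level_rise_cm(rate_cm_per_yr, accel_cm_per_yr2, years):
        # Toy model: linear trend plus a quadratic acceleration term.
        return rate_cm_per_yr * years + 0.5 * accel_cm_per_yr2 * years ** 2

    N = 100_000
    samples = []
    for _ in range(N):
        # Uncertain inputs drawn from assumed distributions (illustrative only).
        rate = random.gauss(0.45, 0.1)      # cm/yr
        accel = random.gauss(0.005, 0.003)  # cm/yr^2
        samples.append(sea_level_rise_cm(rate, accel, years=20))

    # Report a distribution, not a single number, as described above.
    print("20-year rise: mean %.1f cm, sd %.1f cm"
          % (statistics.fmean(samples), statistics.stdev(samples)))

The point of the exercise is that the output spread is traceable back to the assumptions you fed in, so a decision-maker can see how much of the answer is model and how much is guesswork.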
Personally, I think in certain cases these techniques are great, and in other cases they are worse than useless. If you have a model that's supposed to predict terrorist attacks, it will happily tell you that with 90% confidence the probability of a terrorist attack in Location X in the next year is between 1 and 3 percent. This may be perfectly correct, but highly misleading, because fundamentally the event is not probabilistic, and the only reason it appears to be is that a key piece of data is missing. As such, what the computer should really do is say the following. "Dear DHS: I don't know whether the terrorists are planning the attack. If they are, it is very likely to occur. If they aren't, it won't happen. Please do your job and go find out whether they are or not, and let me do more interesting things. Sincerely, Computer."
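That objection can be restated in a few lines: the apparently probabilistic 1-3% figure is really a mixture over a hidden yes/no fact, not genuine randomness. A hypothetical sketch, with all numbers invented for illustration:

    # If "planning" is a hidden yes/no fact rather than a random event,
    # the marginal attack probability is just a weighted average:
    p_planning = 0.02             # epistemic: we simply don't know (assumed)
    p_attack_given_planning = 0.9
    p_attack_given_not = 0.0
    p_attack = (p_planning * p_attack_given_planning
                + (1 - p_planning) * p_attack_given_not)
    print(p_attack)  # ~0.018, "probabilistic" only because a fact is missing

Collecting the missing fact collapses the answer to roughly 0.9 or roughly 0, which is exactly what the computer's hypothetical letter to DHS is saying.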
Re:The title is misleading (Score:2, Interesting)