The Military | Supercomputing | United States

Fear of Thinking War Machines May Push U.S. To Exascale

dcblogs writes "Unlike China and Europe, the U.S. has yet to adopt and fund an exascale development program, and concerns about what that means to U.S. security are growing darker and more dire. If the U.S. falls behind in HPC, the consequences will be 'in a word, devastating,' Selmer Bringsjord, chair of the Department of Cognitive Science at Rensselaer Polytechnic Institute, said at a U.S. House forum this week. 'If we were to lose our capacity to build preeminently smart machines, that would be a very dark situation, because machines can serve as weapons.' The House is about to get a bill requiring the Dept. of Energy to establish an exascale program. But the expected funding level, about $200 million annually, 'is better than nothing, but compared to China and Europe it's at least 10 times too low,' said Earl Joseph, an HPC analyst at IDC. David McQueeney, vice president of IBM Research, told lawmakers that HPC systems now have the ability to not only deal with large data sets but 'to draw insights out of them.' The new generation of machines are being programmed to understand what the data sources are telling them, he said."
This discussion has been archived. No new comments can be posted.


  • by Ambvai ( 1106941 ) on Friday June 21, 2013 @05:33PM (#44074599)

    "...compared to China and Europe it's at least 10 times too low..."
    "Mr. President, we must not allow a mineshaft gap!"

    • by SuricouRaven ( 1897204 ) on Friday June 21, 2013 @05:36PM (#44074613)

      If you're going to have an arms race, it might as well be in an area with significant civilian applications.

      Shame the space race died once America hit its target and the USSR fell apart. If that had kept going, we'd be living in apartments on Mars by now.

      • by sincewhen ( 640526 ) on Friday June 21, 2013 @05:49PM (#44074713)

        Shame the space race died once America hit its target and the USSR fell apart. If that had kept going, we'd be living in apartments on Mars by now.

        Or perhaps deploying weapons on Mars by now...

      • If that had kept going, we'd be living in apartments on Mars by now.

        I dunno, people on another thread were complaining that NYC is expensive. Silicon Planet?

      • Re: (Score:2, Insightful)

        by cold fjord ( 826450 )

        Shame the space race died once America hit its target and the USSR fell apart. If that had kept going, we'd be living in apartments on Mars by now.

        Maybe, but the rent would be too darn high.

      • by amiga3D ( 567632 ) on Friday June 21, 2013 @06:48PM (#44075161)

        I thought nukes were bad, but this is worse. Nukes are so drastic that no one has gotten up the balls to fire one since they saw what happened to Hiroshima and Nagasaki. This stuff is too easy to use and could actually end up being as bad as nuclear war. Just what we need: Berserkers.

        http://en.wikipedia.org/wiki/Berserker_(Saberhagen) [wikipedia.org]

    • by durrr ( 1316311 ) on Friday June 21, 2013 @05:37PM (#44074623)

      How ironic: our fear of Skynet will lead to us building it pre-emptively.

      • by Paul Fernhout ( 109597 ) on Friday June 21, 2013 @06:45PM (#44075145) Homepage

        http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net]
        "... Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic institutions in many ways. Despite probably having more computing power per square foot than any other place in the world, they seem not to have thought much about the implications of all that computer power and organized information to transform the world into a place of abundance for all. Cheap computing makes possible just about cheap everything else, as does the ability to make better designs through shared computing. I discuss that at length here: http://www.pdfernhout.net/post-scarcity-princeton.html [pdfernhout.net]
            There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all.
            So, while in the past, we had "nothing to fear but fear itself", the thing to fear these days is ironically ... irony. :-)"

        And your point about the irony of how our fear of Skynet will lead to us building it preemptively is a great example of this general theme. It would not be much to worry about except that these technologies are so powerful -- so powerful that we don't even have to fight over material resources... See Marshall Brain's Manna at the end for another vision of what might be possible if we build a different sort of infrastructure with these technologies.
        http://marshallbrain.com/manna1.htm [marshallbrain.com]

        That said, people may always find ways to compete to show off for status. So, we as a global society need to redirect those urges into more productive (or less destructive) areas...
        "Evolution for competition & cooperation"
        http://slashdot.org/comments.pl?sid=3866253&cid=44019221 [slashdot.org]

        "Re:Helping the NSA transcend to abundance thinking (Score:3)"
        http://yro.slashdot.org/comments.pl?sid=2773253&cid=39629001 [slashdot.org]
        "To start with the bottom line: the very computers that make the new NSA facilities possible mean that the NSA's formal purpose is essentially soon to be at an end. Nothing you or I say here will reverse that trend. The only issue is how soon the NSA as a whole recognizes that fact, and then how people there choose to deal with that reality. ..."

        The increase in global spying is only one technology-driven trend of many going on right now.

        • Re: (Score:3, Interesting)

          by BlindRobin ( 768267 )

          Very interesting take on things; I like it. There is, though, one factor you seem to omit: the most powerful and influential people and collective entities in the world, and those that they employ, see the world very differently than you and I. To those that wish to rule and wield power, to those that always want more regardless of how much they have, scarcity is not an issue. The well-being of others, or of society as a whole, beyond being a resource or a problem when insurrection looms, is not an issue for consi

      • So long as I get a cybernetic body, I'm down to serve my future robot overlords.

        • So long as I get a cybernetic body, I'm down to serve my future robot overlords.

          Will your robot body have the strength of ten gorillas or chainsaw hands?

    • by Aighearach ( 97333 ) on Friday June 21, 2013 @10:05PM (#44076311)

      I just don't see how they get from supercomputer to "smart machines" or even to a weapon.

      "The new generation of machines are being programmed to understand what the data sources are telling them, he said"

      Complete and total nonsense designed to trick non-technical people. Why is this drivel making it to slashdot? I know this place isn't what it used to be, but... is it really that much to ask that you hire actual nerds to edit submissions?

      Computers don't "understand" what they are doing. And to the extent that they can, they do already. It is a stupid semantic game with nothing to win. Does your calculator "understand" what it is doing when you're adding up a parts list? Most people are going to say "no." And that answer scales up to whatever calculations your exascale supercomputer is doing. It is a basic philosophical question. Computers do not "think," they do not "understand," and yet, (or therefore) they make great expert systems.

        Computers don't "understand" what they are doing. And to the extent that they can, they do already. It is a stupid semantic game with nothing to win. Does your calculator "understand" what it is doing when you're adding up a parts list? Most people are going to say "no." And that answer scales up to whatever calculations your exascale supercomputer is doing. It is a basic philosophical question. Computers do not "think," they do not "understand," and yet, (or therefore) they make great expert systems.

        One may even say that the human brain does not understand. It is capable of logic and hammers out bazillions of logical derivations, evaluates and reevaluates until it determines a select few ideas that are granted the highest usability or belief.

        But from this view of the mind, what is to say a fast enough computer, with some elegant programming, can't compete with people in the department of "producing text that is considered valuable"? People are not even close to perfect in this department, when you thin

      • Complete and total nonsense designed to trick non-technical people.

        Someone else who doesn't recognize the significance of developments such as IBM's Watson.

      • Complete and total nonsense designed to trick non-technical people. Why is this drivel making it to slashdot?

        When you use complete and total nonsense to trick Congressmen into believing the nerd-related stuff you are paying/lobbying them to fund is useful, that is news for nerds.

  • by Anonymous Coward

    Didn't your mamma teach you that?

  • by icebike ( 68054 ) on Friday June 21, 2013 @05:36PM (#44074611)

    Selmer Bringsjord, chair of the Department of Cognitive Science at Rensselaer Polytechnic Institute, said at a U.S. House forum this week.

    Seems we have plenty of supercomputers lying about, having only recently been booted from top place in the never-ending game of leap-frog in high-end machines.

    We prefer to use them for weather and spying on our own citizens, rather than making better weapons, especially when we can hide the funds for computer systems in the weapon funding.

    Not sure I'm buying the hand wringing act.

    • by cold fjord ( 826450 ) on Friday June 21, 2013 @06:11PM (#44074863)

      The big new Chinese supercomputer is ~ 34 petaflops. Exascale is 1,000,000 petaflops. That is a pretty big difference in scale. Although current supercomputers have tended to be "more of the same" stacked higher, the difference in scale here may signify a difference in kind. At the least there are likely to be some new engineering and programming challenges involved if they are really going to exploit that kind of potential.

      • by HuguesT ( 84078 ) on Friday June 21, 2013 @07:15PM (#44075421)

        1 exaflops (10^18 flops) is 1000 petaflops (a petaflops being 10^15 flops). The Chinese are 3.4% of the way there. Exaflop-scale computers are realistically expected around 2019, i.e. in six years' time.

        Meanwhile, India is supposedly building a 140 exaflop computer for 2017 [defencenews.in]

        Better get a move on I guess.

        • This is what the article said. I'll let it speak for itself.

          "That amount of money is well short of what's needed to build an exascale system, or a computer of 1,000 thousand petaflops."

          The foibles of units: How many is a billion? [oxforddictionaries.com]

        1 exaflop is a billion billion flops. At the current rate of a billion flops per watt one is looking at a computer that will require a billion watts of power. They better hope to get the power up to a hundred billion flops per watt. This is way beyond anything a personal computer could do, so I would assume that this means the death of faster personal computers. If someone does need a computer with any power, then one would send the problem to a supercomputer where it could calculate the answer using one
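
To put the figures in this subthread side by side, here is a minimal back-of-the-envelope sketch in Python; the 33.86-petaflops figure for Tianhe-2 and the ~1 gigaflops-per-watt efficiency are simply the numbers the posters above are using, not verified specs.

```python
# Rough arithmetic behind the exascale subthread above (posters' numbers, not specs).
PETA = 10**15
EXA = 10**18

tianhe2 = 33.86 * PETA        # Tianhe-2 peak, ~2013, in flops
exascale = 1 * EXA            # 1 exaflops = 1000 petaflops

# Fraction of the way to exascale:
print(f"Tianhe-2 is {tianhe2 / exascale:.1%} of an exaflops machine")   # ~3.4%

# Power draw at ~1 gigaflops per watt (the grandparent's assumption):
print(f"{exascale / 1e9 / 1e9:.0f} GW at 1 GF/W")     # ~1 GW

# At a hundred billion flops per watt it drops to a still-huge 10 MW:
print(f"{exascale / 1e11 / 1e6:.0f} MW at 100 GF/W")  # ~10 MW
```
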
  • by markhahn ( 122033 ) on Friday June 21, 2013 @05:37PM (#44074625)

    it's funny how the consultant-lobbyist-industrial complex is so good at winding up our computer-phobic politicians. just look at all the cyberwar crap (which can be solved by simply making our infrastructure secure. two-factor authentication for the power grid, imagine!).

    there is vanishingly little justification for exascale computing. yes, I AM in the HPC field. just ask yourself: what would a "thinking war machine" actually "think" about? it's not as if war is just a boardgame - heck, it's not as if the political and military moves we make are even carefully thought-out at all!

    • Sorry but what the fuck is exascale and HPC? Mega mechs controlled by handheld PCs? I should be able to read the news without needing to consult the Acronymicon, the only volume more likely to induce severe brain damage than the Necronomicon.

      • by Samantha Wright ( 1324923 ) on Friday June 21, 2013 @05:48PM (#44074705) Homepage Journal
        Speed measured in exaflops (quintillion floating point operations per second) and high-performance computing, respectively. HTH, HAND.
      • Re: (Score:2, Informative)

        by Anonymous Coward

        High-performance computing = HPC
        exascale = scale on which supercomputers are capable of 1 or more exaflops
        1 exaflops = 1000 petaflops = 1 quintillion floating point operations per second (FLOPS) = 10^18 FLOPS

      • News for nerds. Sorry, while I agree acronym overuse is annoying, you should know these things if you are going on a tech news website (even if slashdot is doing a mediocre job at that lately).
      • No acronyms here, only SI prefixes, well documented: http://en.wikipedia.org/wiki/SI_prefix [wikipedia.org] -- and note that zetta and yotta are the next levels up.
      • by gl4ss ( 559668 )

        Sorry but what the fuck is exascale and HPC? Mega mechs controlled by handheld PCs? I should be able to read the news without needing to consult the Acronymicon, the only volume more likely to induce severe brain damage than the Necronomicon.

        it's just faster computers than half a decade ago.
        so they're making the same AI claims they were making half a decade ago.
        so the US government pays them more money, just like they did half a decade ago. and a decade ago. and two decades ago.

        yet we don't have thinking machines that could design us a fusion reactor.

    • by ahabswhale ( 1189519 ) on Friday June 21, 2013 @05:52PM (#44074727)

      I agree, especially since we can defeat their war machines by just making them play tic-tac-toe and realizing there is no real winner.

    • it's not as if the political and military moves we make are even carefully thought-out at all!

      Which is why they need a mega computer to do the thinking for them, silly!

    • When it's not engaging targets with drones, it can read all our emails and listen to our phone calls to identify new targets. It's got a million uses in and out of the kitchen!
    • by radtea ( 464814 ) on Friday June 21, 2013 @06:10PM (#44074851)

      just ask yourself: what would a "thinking war machine" actually "think" about? it's not as if war is just a boardgame - heck, it's not as if the political and military moves we make are even carefully thought-out at all!

      In fact, war itself is well-known to be fundamentally irrational. There's even something in economics called the "war puzzle" or "war problem": under the economic model of rationality, war is irrational.

      Actors can always generate better outcomes by negotiation, and in real-world case studies typically both sides believe they have a much greater than 50% chance of winning (which violates the law of conservation of probability...)

      As Clausewitz might have said if he'd known about Darwin: war is reproductive competition carried out by other means.

      As such, creating bigger and bigger machines to prosecute wars is the stupidest thing humans could possibly do. On the other hand, if you think a weapon is a tool for changing your enemy's mind, then machines that educate are the most powerful weapons of all.

      If we want to dump billions into making the world safe for American Imperialism, teaching machines of the kind envisioned in "The Diamond Age" would be a far better investment than exa-scale hardware that won't be able to think, but will be able to knock one more decimal place of uncertainty off of opacity coefficients for thermonuclear simulations.

      But human beings are too stupid and irrational to do that, and would far prefer to engage in the least efficient, least effective strategy for solving any human problem: war.

      There are people who are so stupid that they believe, for example, that because war was required to end slavery in the US that it was somehow a good solution, and they are so ignorant that they are unaware that slavery was eliminated in many other places without warfare. Simply because some bunch of idiots somewhere were too stupid to solve their problems without war doesn't mean that war should be the go-to solution for any problem that faces us.

      • by ShanghaiBill ( 739463 ) on Friday June 21, 2013 @06:31PM (#44075021)

        In fact, war itself is well-known to be fundamentally irrational.

        War is irrational in the same way that the prisoner's dilemma [wikipedia.org] is irrational. Sure, the world would be better if everyone were peaceful. But if you choose peace unilaterally, you end up like the Moriori [wikipedia.org].
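
For anyone who hasn't seen the game-theory framing the parent is invoking, here is a minimal sketch in Python; the numeric payoffs are illustrative assumptions, not taken from any particular source.

```python
# Two-player arms race as a prisoner's dilemma.
# Payoff tuples are (my payoff, their payoff); "arm" is defection, "disarm" is cooperation.
payoffs = {
    ("disarm", "disarm"): (3, 3),   # mutual peace: best joint outcome
    ("disarm", "arm"):    (0, 5),   # unilateral peace: the Moriori outcome
    ("arm",    "disarm"): (5, 0),
    ("arm",    "arm"):    (1, 1),   # arms race: worse for both than mutual peace
}

# Whatever the other side does, arming pays strictly more for me...
for their_move in ("disarm", "arm"):
    assert payoffs[("arm", their_move)][0] > payoffs[("disarm", their_move)][0]

# ...so both "rational" players arm, and both end up with 1 instead of 3.
```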

    • there is vanishingly little justification for exascale computing

      640k ought to be enough for anyone.

    • by PPH ( 736903 )

      just ask yourself: what would a "thinking war machine" actually "think" about?

      Philosophy [youtube.com]

    • Re: (Score:2, Insightful)

      by kat_skan ( 5219 )

      just ask yourself: what would a "thinking war machine" actually "think" about?

      HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE.

      And now some lowercase letters to keep Slashcode happy: haaaaa

  • we are 14 trillion fucking dollars in debt, and they want to spend it on fucking acronyms where they sit around building shit we cant sell to anyone. fuck these people.

    • we are 14 trillion fucking dollars in debt, and they want to spend it on fucking acronyms where they sit around building shit we cant sell to anyone. fuck these people.

      No, you are only off by about 2.9 trillion; it's 16.878 trillion.

      • by lgw ( 121541 )

        That is the most disheartening comment in this whole discussion. I think it's time to stop reading /. for the day, lest I be proven wrong.

  • Does this fucking militarist stupidity ever end?

    • by ebno-10db ( 1459097 ) on Friday June 21, 2013 @06:16PM (#44074889)

      Shut up and take the money. Later on we can employ the exa-tech for something useful.

      What do you think funded the development of the earliest computers, like ENIAC and Colossus? How about the USAF being about the only customer willing to pay for the first IC's? Or so many of the comm techniques we use today, like CDMA, frequency hopping, and FEC?

      • by khallow ( 566160 )

        Shut up and give the money.

        FIFY. Let us recall that taxes aren't free money. This development may be a better way to squander money than say actually waging war, but it's still squandering money.

    • by AK Marc ( 707885 )
      IBM wants to guarantee a cyber cold war. The new Military Computational Complex, like the old complex. IBM is selling to both sides, and this time they can claim nobody is killed by the people they help, unlike when IBM helped the Nazis commit the Holocaust. Note, this isn't a Godwin, because nobody has compared anyone or anything to a Nazi. Just stating the fact that IBM or its subsidiaries sold computational devices used to identify (and thus persecute) innocents doesn't count.
  • It's not fear of 'thinking war machines', it's corruption that allows government to steal enormous amounts of money, be it via taxes and/or inflation and borrowing, which can be used to pump money into the pockets of various connected enterprises, which in turn is pumped back to the politicians that do it. Oh, and the fear and corruption found in the minds of the useful idiots make it all possible by not challenging the government as long as it keeps the free bread and circuses flowing, of course.

  • If we were to lose our capacity to build preeminently smart machines, that would be a very dark situation, because machines can serve as weapons.

    Oh no, think of all the lovely new weapons we won't have to kill each other with if we don't jump into this field of research! Oh, the humanity!

    Seriously though, we could be looking into this with a view to helping solve economic problems, improving quality of living, eventually looking towards machines that do our labour for us. Instead, no, the first thing that always pops into their heads is fucking weapons.

    It's so utterly pathetic.

  • Anyone else misread that last word in the title as "escalate"?
  • Buzzwords (Score:5, Insightful)

    by Chris Pimlott ( 16212 ) on Friday June 21, 2013 @06:00PM (#44074779)

    7 paragraphs into the article before they bother to define what "exascale" means...

  • This guy's home district includes Fermilab, which has an exascale computer program.

    It's just another bridge to nowhere.

  • by Animats ( 122034 ) on Friday June 21, 2013 @06:03PM (#44074795) Homepage

    Look who's pushing for this program. It's Selmer Bringsjord, a professor at Rensselaer who wants to build Skynet and Terminators. For real. From his 1997 paper: [rpi.edu] "Our engineers must be given the resources to produce the perfected marriage of a trio: pervasive, all-seeing sensors; automated reasoners; and autonomous, lethal robots. In short, we need small machines that can see and hear in every corner; machines smart enough to understand and reason over the raw data that these sensing machines perceive; and machines able to instantly and infallibly fire autonomously on the strength of what the reasoning implies."

    Yes, he really published that. The next paragraph is even worse:

    If you are wearing explosives of any kind outside a subterranean environment, you will be spotted by intelligent unmanned airborne sensors, and will be instantly immobilized by a laser or particle beam from overhead. If you are working with explosives underground (or toiling to enrich uranium), sensors on and beneath the surface of the Earth will find you, and you will be killed soon thereafter by AI-guided bunker-boring bombs. If you are a murderous dictator like Saddam or Stalin or Amin, or a leader (e.g., Ahmadinejad or Kim Jong Il) heading in the direction of such evil, a supersonic robot jet no bigger than a dragonfly will take off in the States, thousands of miles from your "impregnable" lair, and streak in a short time directly into your body, depositing a fatal poison like Polonium therein. If you, alone or along with equally doomed cronies, seek to seize a jetliner with a plan to blow it up or use it as a missile, one biometric scan of your retina before boarding, and lightning-quick reasoning behind the scenes will flag you as a fiend, and you will be quickly greeted by law enforcement, and escorted into a system of interrogation that uses sensors to read secret information directly from your brain: lying will be silly. Want to bring a backpack bomb somewhere, and leave it behind? The contents of your pack will be sensed the second you bring it toward civilization, and it will be vaporized. Interested in the purchase of handguns for Cho-like mayhem? The slightest blip in your background will be discovered in a second, and you will be out of luck. In fact, guns can themselves bear the trio: If you have one, and wish to fire it, it must sense your identity and location and purpose, and run a check to clear the trigger pull -- all in a nanosecond."

    Read his paper. This guy is scary. And Congress is listening to him.

  • by ebno-10db ( 1459097 ) on Friday June 21, 2013 @06:06PM (#44074817)

    More than half the people here are opposed to this because it's vaguely associated with the military. Get a grip. The military ties are a hook to get funding, since defense is the sacred cow of the federal budget. Better money spent on this than turkeys like the F-35. Technology like this is so general and widely applicable that it's useful no matter what excuse is used for development.

    • More than half the people here are opposed to this because it's vaguely associated with the military. Get a grip. The military ties are a hook to get funding, since defense is the sacred cow of the federal budget. Better money spent on this than turkeys like the F-35. Technology like this is so general and widely applicable that it's useful no matter what excuse is used for development.

      Exaflop computing isn't that widely applicable, except to highly parallel algorithms, and we more or less have that covered by adding bunches of PCs together, rather than actually building faster computers capable of solving linearly dependent problems, which are the new interesting problems.

      Frankly, I think this guy is a little more interested in keeping people who want to build exaflop computers employed than he is in actually solving problems (surprise: he happens to be a member of the group that would b

      • Exaflop computing isn't that widely applicable, except to highly parallel algorithms, and we more or less have that covered by adding bunches of PCs together

        The "bunches of PC's" works great for some algorithms, but not all. Furthermore the economy of it depends largely on people donating computing power. There are limits to how far you can go with that. "Exaflop computing isn't that widely applicable" reminds me of the 1950's prediction that 5 computers could satisfy the entire world's needs.

        I think this guy is a little more interested in keeping people who want to build exaflop computers employed than he is in actually solving problems

        What else is new? The same is true of everybody who tries to sell me something. The question is always whether it's worth it.

        • I can answer that question. A supercomputer arms race is *not* worth it. I've worked 10 years in HPC and 10 more spying on Americans, so I know both the means and the ends.

          Since the demise of Thinking Machines (CM-2) and other integrated supercomputers (e.g. Cray vectors and T3E, IBM Cell, etc.), the future of SCs has become nothing more than a pissing match -- "Golly. I have more pizza boxes than you. Neener."

          That's never been more true than now that big-SC core counts have risen into the stratosphere... such m

        • Exaflop computing isn't that widely applicable, except to highly parallel algorithms, and we more or less have that covered by adding bunches of PCs together

          The "bunches of PC's" works great for some algorithms, but not all.

          I just said that.

          Furthermore the economy of it depends largely on people donating computing power. There are limits to how far you can go with that. "Exaflop computing isn't that widely applicable" reminds me of the 1950's prediction that 5 computers could satisfy the entire world's needs.

          No, it's not. Google has Exaflop capability; they're using it for search, indexing, and data transfer. These are all highly parallelizable operations.

          I'm saying it's not as widely applicable to as many problems as, say, even one vector processor (and I mean Cray's idea of vector processing, not Intel's highly watered down version of it) that was clockable in the terahertz range. To my mind, people aren't actually building real supercomputers these days... Seymour Cray, this guy is not.
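
The disagreement in this subthread is essentially Amdahl's law: stacks of loosely coupled boxes only help with the parallel fraction of a problem. A minimal sketch in Python, with made-up serial fractions chosen purely for illustration:

```python
def amdahl_speedup(parallel_fraction: float, n_processors: int) -> float:
    """Best-case speedup when only part of the work can be spread across nodes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# An "embarrassingly parallel" job (search, indexing) keeps scaling on commodity clusters...
print(amdahl_speedup(parallel_fraction=0.999, n_processors=100_000))   # ~990x

# ...but a tightly coupled simulation with even 5% serial/dependent work
# tops out near 20x, no matter how many pizza boxes you stack.
print(amdahl_speedup(parallel_fraction=0.95, n_processors=100_000))    # ~20x
```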

    • by jsepeta ( 412566 )

      Oh the humanity! The alarmist writer fails to recognize that HUMANS must be part of the chain of command, and it's actually important not just to the survival of America but to the destiny of humankind that PEOPLE come before machines. Because eventually the machines will come, and they won't care about puny humans.

      No, I'm not against research and cool computer scaling, but wtf? Don't we have some bridges to fix and kids to educate?

  • A new business to allow the military-industrial complex [msu.edu] to suck the marrow!
  • by hawguy ( 1600213 ) on Friday June 21, 2013 @06:09PM (#44074847)

    The NSA has a secret budget believed to be around $10B annually (out of a total intelligence budget of about $75B), and we know that they are spending billions of dollars on new datacenters, so how does anyone know that the USA is falling behind in computers that can be used as weapons?

    Even China's new Tianhe-2 supercomputer is reported to have "only" cost $390 million [wikipedia.org] so the NSA could be building 10 of those a year and no one would know.

      The NSA has a secret budget believed to be around $10B annually (out of a total intelligence budget of about $75B), and we know that they are spending billions of dollars on new datacenters, so how does anyone know that the USA is falling behind in computers that can be used as weapons?

      "Used as weapons" requires further clarification. It doesn't mean weapons used against Americans.

  • by Anonymous Coward

    "There is another system."

  • Something I'm not seeing in the thread regarding the "weapons" implications of having the fastest computer:

    I don't think the purpose of having the most flops is about "designing" new weapons, I think it's directly linked to strategic warfare. I would imagine inter-continental missiles probably employ some sophisticated evasion methods. Being able to reverse engineer measurements of an erratically moving nuclear missile in real-time and then adjusting the erratic behavior of your own missiles in real-time

    • by slew ( 2918 )

      What excites me about this is that exascale is around what is required to simulate a human brain in its entirety. Who's taking bets on what the first uploaded organism will be?

      If we take the human genome as historical precedent (Craig Venter followed by James Watson), it will likely be Selmer Bringsjord (followed by Gordon Bell, because Seymour Cray was killed in a car crash). Dark horse would be Ray Kurzweil if somehow Google beats everyone to the punch...

      • Hmm...I was thinking lobsters or lab rats. I think they've already got the motor strip of the rat down, that's part of the way there at least - and lobsters are probably low-hanging fruit.
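
The "exascale is around what is required to simulate a human brain" claim above rests on an order-of-magnitude estimate roughly like the following; the neuron, synapse, and firing-rate numbers are common ballpark figures and the per-event cost is an assumption, so treat the result as hand-waving, not a spec.

```python
# Order-of-magnitude estimate of the compute needed for whole-brain simulation.
# All constants are ballpark figures / assumptions, not measurements.
neurons = 8.6e10               # ~86 billion neurons
synapses_per_neuron = 1e4      # ~10,000 synapses each
firing_rate_hz = 10            # average spikes per second
flops_per_synaptic_event = 10  # a few operations to update each synapse (assumption)

flops_needed = neurons * synapses_per_neuron * firing_rate_hz * flops_per_synaptic_event
print(f"{flops_needed:.1e} FLOPS needed")   # ~8.6e16, i.e. within ~10x of 1 exaflops
```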

  • by gweihir ( 88907 ) on Friday June 21, 2013 @06:35PM (#44075055)

    And it is completely unclear how to change that and if it is even possible. It is pretty clear however, that more CPU power is _not_ going to do it. This is just a transparent call for having money thrown at them.

  • He's determined that Sandia and Livermore's new strategic direction is Muslim outreach! Problem solved.

  • by jimmydigital ( 267697 ) on Friday June 21, 2013 @09:02PM (#44076023) Homepage Journal

    What we need is a computer that thinks about world war 3 all day, every day, 24 hours a day. Constantly fighting the battles.. trying different strategies and optimizing for the maximum enemy casualties. We might call such a computer the War Operation Plan Response machine... or WOPR for short. Yea... yea that's the ticket.

    • by Max_W ( 812974 )
      Actually, World War 3 is going on already. Nearly 3,400 people die on the world's roads every day. By 2020 it will be about 5,200 every day (1.9 million per year). http://www.who.int/mediacentre/factsheets/fs358/en/index.html [who.int]
      • by gl4ss ( 559668 )

        that's like saying that you're having a war on flu.

        how about you dig up some ww1 and ww2 documentaries and watch through them, ok? maybe you'll get why your statement sounds totally unrelated to what anyone sensible would refer to as ww3.

  • The idea that Europe or China could be a threat to US security is odd. What is the rationale behind this? I thought it was settled for a long time that no nation state would want to fight a country that has nuclear weapons.
  • Will do an unprecedented number of super-fast calculations and spit out the answer that our greatest threats are home-grown American software engineers becoming an endangered species due to offshoring/H1B, and that most of the computers we would rely on in a war with China would be made in China. It would go on to conclude that building super-fast computers is a waste of money if these simple problems aren't solved first.

  • Yes they do.
    Does anybody have strategic BMD yet, or anything approaching it?
    No they don't.
    Does any nation have the remotest intention of attacking the territory of the United States?
    No they don't.
    Can we go back to sleep now without giving this guy enough money to fund thousands of postdocs doing more useful things with their time?
    Yes we can.
    • You bring up a good point and part of the reason why Livermore, Sandia and Los Alamos have those nice big supercomputers testing decay rates and doing simulations on warheads.

      There's an interesting device in the Bradbury Science Museum [lanl.gov] aka the Atomic Museum in Los Alamos, It's a phone..

      Anyway, from this: http://www.nationaltlcservice.us/2013/05/report-from-the-hilltop-highlights-of-the-los-alamos-bradbury-science-museum-museum-profile-1/ [nationaltlcservice.us]

      A phone analogy inaugurated the display: Adjacent to a clear-plastic telephone (which reminded me of those see-through Swatch phones of the 80s), a placard explains: “Like many of the weapons currently in the nuclear arsenal, this phone was manufactured in the late 1960s and was designed to last about 15 years. You were asked to verify that this phone will work—but you weren’t allowed to make or receive a call to fully test it.” Nearby, the question “What does this phone have to do with nuclear weapons?” is answered with the motto: “safe, secure, and reliable nuclear weapons.” The exhibit further explains the connection to LANL’s mission: “We are asked to verify that the weapons in the stockpile are safe and reliable—but without performing underground nuclear tests. Instead, we use an integrated set of scientific tools to inspect and evaluate individual parts and subsystems. The military counts on us to guarantee that US nuclear weapons will perform as designed if they are ever needed. That’s our mission, and that’s a call we can make.”

      The DOE still has quite a few on the Top 500 List.. [top500.org]

  • One thing that bugs the hell out of me about these things is that invariably someone, when asked about safety, says 'We can predict what it will think'. If you build an AI and it achieves the singularity, then by definition it's more intelligent than humans. Saying you can understand it is like saying that you can teach a dog quantum mechanics.

    It's so clearly insanely dangerous that I cannot understand how any person who is even remotely intelligent can believe such a thing is remotely safe.

    God knows what i

  • Just sayin.. Build Skynet.. Massively distributed and with the added bonus of viral tendencies it'll take over the other Exascale computing resources in no time.

  • Fear seems to be pushing the US into doing a lot of things lately.
  • If you were playing Civilization and found out your rivals were ahead of you in developing an exascale sentient supertech, you could either start the race from behind and hope to catch up, or nuke them here and now.
  • Exascale... is that bigger than MongoDB? MongoDB is webscale...
  • I don't think anyone is worried about a 'thinking' war machine. They're worried about an unthinking one; one with just enough intelligence to track down and kill people, strategize and so on, but not enough to go 'hey, these orders were issued by a complete madman.'

  • This is completely, factually wrong. I'm funded by DoE exascale work. I mean, it's *exactly wrong*.

  • saw the movie Terminator. We can't let that completely fictional and plot hole ridden story come to pass! We should have a ban on this scary technology! Oooh Scary!
