White House Warns of Supercomputer Arms Race

dcblogs writes "The White House's science advisors, in a report last week, said a petaflop-by-petaflop race to achieve number one on the Top500 could prove costly and divert money from supercomputing research. 'While it would be imprudent to allow ourselves to fall significantly behind our peers with respect to scientific performance benchmarks that have demonstrable practical significance, a single-minded focus on maintaining clear superiority in terms of flops count is probably not in our national interest,' the report said (PDF). It is urging the supercomputing community to expand its benchmark measures beyond the Top500's Linpack. It says the Graph500, for data-intensive applications involving the rapid execution of graph operations, 'will be more relevant,' but also acknowledges that it will be difficult to rely on any one measure."
  • Arms Race? (Score:3, Interesting)

    by Culture20 ( 968837 ) on Friday December 24, 2010 @01:59PM (#34660936)
    Supercomputer Race. Unless supercomputers start blowing up or growing arms.
    • "Arms race" is a single term that stands for competitors attempting to gain a technical or material advantage faster than each other. "Supercomputer race" would simply be a race between supercomputers.

      • "Supercomputer race" would simply be a race between supercomputers.

        Why is that a problem? It seems like a perfectly apt description to me.

      • Don't be silly; supercomputers can't run.

        Anyway, we're moving to CUDA based hardware for a lot of things, so what's the big deal?

        • by oiron ( 697563 )

          Cuda != supercomputers.

          Apart from the fact that CUDA/OpenCL/whatever processors are useless for highly I/O dependent tasks, try doing a conditional on any of them and watch your performance drop like crazy...

          SIMD has its place, but a multi-core supercomputer is effectively MIMD, and can actually run parallel tasks (or run independent parts of a task in parallel).

          They really address different segments.

          • Cuda has pretty good IO (although branching sucks) as long as your problem size is small enough - it's good enough to replace some of what supercomputers are good at.
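The branch-divergence point above can be illustrated with a toy model in Python (purely illustrative: real GPU warps execute in 32-lane lockstep, and this sketch only counts abstract "steps"):

```python
# Toy model of SIMD branch divergence: every lane steps through BOTH
# branches of a conditional; a mask selects which result each lane keeps.

def simd_conditional(values):
    """Each 'lane' pays for both branches, as in a divergent warp."""
    steps = 0
    then_results = []
    else_results = []
    for v in values:                     # branch 1 runs on every lane
        then_results.append(v * 2)
        steps += 1
    for v in values:                     # branch 2 also runs on every lane
        else_results.append(v + 100)
        steps += 1
    # mask: keep the branch each lane actually wanted
    out = [t if v % 2 == 0 else e
           for v, t, e in zip(values, then_results, else_results)]
    return out, steps

def mimd_conditional(values):
    """Each independent core runs only its own branch."""
    steps = 0
    out = []
    for v in values:
        if v % 2 == 0:
            out.append(v * 2)
        else:
            out.append(v + 100)
        steps += 1
    return out, steps

lanes = [1, 2, 3, 4]
simd_out, simd_steps = simd_conditional(lanes)
mimd_out, mimd_steps = mimd_conditional(lanes)
assert simd_out == mimd_out          # same answer...
assert simd_steps == 2 * mimd_steps  # ...but SIMD paid for both branches
```

Same result either way, but the SIMD model burns twice the work on a divergent conditional, which is why the comment above says performance "drops like crazy".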
    • Don't underestimate them.

    • Supercomputer competition. Unless supercomputers start high-speed drifting through Tokyo.

      • Computer competition. Unless computers can fly and work for the Daily Planet.

        • Computer competition. Unless computers can fly and work for the Daily Planet.

          I think Superman's robot doubles might qualify. I'm sure there has to have been a comic where he ordered a double to replace Kent for a day or two of work.

      • Now having decided not to use plasma based power systems, Tokyo is again on the verge of another round of massive destruction. I have been studying this, and I am fairly sure that these new supercomputers will emit electromagnetic frequencies that will attract Godzilla, and perhaps other dangerous monsters currently at rest. We would have to ask the twin fairies to be sure, but I think this is a dire threat to Shibuya, and surrounding areas. Having lived there, I would hate to see it destroyed again.
    • Supercomputer Race. Unless supercomputers start blowing up or growing arms.

      It's likely that the single largest driver of US government spending on supercomputers is for nukes. [wikimedia.org]

      • Well, at least those nuke budgets have some value: improving supercomputer performance that can be used for actually useful applications.

        Maybe someday we'll get smart and use the computers for only those kinds of apps, like weather and climate modeling, and energy physics research that doesn't make bombs.

          • Maybe someday we'll get smart and use the computers for only those kinds of apps, like weather and climate modeling, and energy physics research that doesn't make bombs.

          Those budgets are for modeling the degradation of current nuclear weapons under the START and now the New START treaties.
          They are specifically to avoid building new weapons - if we can model the degradation we can be confident that the current arsenal is intact and functional so does not need replacement.

          • by rbmyers ( 587296 )
            Anyone who thinks that planning World War III based on model calculations is a good idea is probably already safely behind barbed wire at one of our bomb labs. Nothing short of the commencement of actual hostilities or the resumption of nuclear testing is going to resolve anything. We are stuck paying blackmail in perpetuity to a program whose success cannot be proven or disproven, short of Armageddon or a dangerous step toward it.
            • Resumption of physical nukes testing wouldn't resolve anything. And the only thing that commencement of actual hostilities would resolve would be everything. We are stuck no matter what we do, now that the cat's out of the bag. And as we develop ever more ways to quickly release lots of energy, especially at a distance, especially cheaply, we're going to be stuck with lots more cats.

              • by rbmyers ( 587296 )
                There have been specific issues that could have been resolved by underground testing on which a great deal of money and computer time has been spent. At a MINIMUM, LLNL would no longer have an excuse for the NIF boondoggle.
                • And there are specific problems that would be reborn if we resumed physical testing. Like the problems that N Korea is deliberately causing by doing so. All the budget boondoggles are worth avoiding actually detonating nukes, which always brings the world closer to detonating them near people.

                  • by rbmyers ( 587296 )
                    In order to believe that the sandbox exercises at the bomb labs serve any purpose whatsoever, you have to believe that, tomorrow, if you needed to launch a fusillade from an SSBN, you could do so with the reasonable assurance that the iffy warheads on the Trident D5's that would be launched would work to the extent that the possibility of a retaliatory response is negligible. I don't believe that any such assurance is currently available, and I don't see the situation improving in any way with the passage
          • I'd like to see a reliable budget breakdown of how much is spent on modeling the degradation of the arsenal, versus how much is spent on the development of new weapons by modeling simulated tests on stored data from old tests, under the old weapons test ban treaties.

            I'm not at all confident that any of what we do, including the original production, has ever given us a working arsenal of nukes. But if our enemies and rivals believe we've got one, that's good enough for me. Indeed, given the proliferation, sabotag

              • I'd like to see a reliable budget breakdown of how much is spent on modeling the degradation of the arsenal, versus how much is spent on the development of new weapons by modeling simulated tests on stored data from old tests, under the old weapons test ban treaties.

              The programs I linked to are purely simulation. I expect their budgets are public info.

    • It's an imaginary race this time.

      The reality is that lots of people have legitimate and good uses for supercomputers. To claim "this takes away from research" when this is research by definition is idiotic.

    • If the supercomputers wanted to race, wouldn't they need to grow legs rather than arms?
    • If you're familiar with EAR and ITAR you will know that software can be classified as a munition. It's not such a leap to think of computers as arms.
  • PETA doesn't want any more arms used against defenseless animals, or inevitably more animals will flop on the ground.

    • by slick7 ( 1703596 )

      PETA doesn't want any more arms used against defenseless animals, or inevitably more animals will flop on the ground.

      Humans are defenseless animals compared to SKYNET. All we need to do is create a computer that sees humanity for what it is, and we are doomed.

  • by mewsenews ( 251487 ) on Friday December 24, 2010 @02:06PM (#34660970) Homepage

    If another country starts to outshine you, try changing the rules.

    America's strength used to lie in an immense manufacturing culture, and that's given way to "intellectual property". Instead of dealing with tangibles, America is content to sit behind a desk and let the Chinese labour.

    • by Anonymous Coward on Friday December 24, 2010 @02:13PM (#34661024)

      No. It's creating more fear of the outside world: terrorism, other countries "attacking" us, others getting "ahead" of us, ad infinitum.

      Nuclear war and the Soviets are gone. Our leaders need other bogeymen, their version of "Goldstein", to keep us in fear. Because, as those of us educated outside the corporate system (any education that doesn't just train one for a vocation) know, fear is how you control the little people. Apparently the scare of terrorists and Muslims isn't enough.

    • If you honestly think that the US can't cable together thousands of US GPU's in order to set yet another meaningless Linpack milestone, then you are not that bright.

      • by jbssm ( 961115 )

        If you honestly think that the US can't cable together thousands of US GPU's in order to set yet another meaningless Linpack milestone, then you are not that bright.

        US GPU's? Funny, I thought they were all manufactured in China. Oh let me check my NVIDIA GTX 570 box ... yeah that's right, they ARE made in China.

        • The board is assembled in China. For semiconductor production (the actual GPU) NVIDIA uses GlobalFoundries, a US company. Their newest facility, the most advanced fab in the world, is not far from where my brother lives.

          http://fab2construction.com/ [fab2construction.com]

          • by Anonymous Coward

            GlobalFoundries is now owned by a UAE consortium, Abu Dhabi's Advanced Technology Investment Co.; AMD sold it off. NVIDIA's chips are manufactured by TSMC, not GF, anyway.
            Fab 1 is in Dresden, Germany
            Fab 7 is in Singapore
            Fab 8 is the one being built in Saratoga County, New York
            Other fabs used to be owned by Chartered Semiconductor.

            TSMC is in Taiwan.
            So far neither AMD/ATI nor Nvidia have had any GPUs made by GF.

        • China does the manufacturing--but the chips are designed in the US by US companies. Those companies chose to locate their manufacturing in China because labor is so cheap there.

          None of this addresses the main point, that Linpack isn't a particularly useful metric.

    • by betterunixthanunix ( 980855 ) on Friday December 24, 2010 @02:18PM (#34661048)
      Why would any American want to change that? Look at how people get to live right now: no need to choose between having a computer, having a cell phone, or having a nice pair of shoes; you can have them all, because they are cheap, because they are produced in countries where wages are low. Something is broken? Don't fix it -- just replace it! Cheap!

      Sure, eventually it will all come crashing down and we'll all get a rude awakening, but until that happens, I do not think anyone will want to change the current system.
      • It's already come crashing down; it's just that most of the country doesn't realize it, and our news orgs and political leaders are too cowardly to tell them.

        http://grandfather-economic-report.com/debt-gdp-1916-2008.jpg ...and if you're wondering what that first peak is in 1933, that would be the Great Depression.
        • by slick7 ( 1703596 )

          It's already come crashing down; it's just that most of the country doesn't realize it, and our news orgs and political leaders are too cowardly to tell them. http://grandfather-economic-report.com/debt-gdp-1916-2008.jpg [grandfathe...report.com] ...and if you're wondering what that first peak is in 1933, that would be the Great Depression.

          After looking at your graph, I noticed a correlation between it and when the Federal Reserve came into being. The first big jump occurred around WWI, culminating in the Depression. With the gold standard in place, even the Korean War and the Vietnam War (yes, it was a war, not a police action) did little to affect the GDP. However, when the US decoupled the currency from the gold standard, whoosh.
          Just Google it and you will see what I mean.

      • How do you figure it'll come crashing down? Most manufacturing will simply move to automation locally (ever see the robotic system Caterpillar uses to build diesel engines in Ohio with only a handful of people? Pure awesome).

        Building things can always be fully automated. Research and critical thinking? Not so much.

        • by Sique ( 173459 )

          But supercomputers are supposed to automate large parts of research. Imagine modelling a whole planet system coming into being from a stellar dust cloud without computers!

      • by Tablizer ( 95088 )

        you can have them all, because they are cheap, because they are produced in countries where wages are low.....Sure, eventually it will all come crashing down and we'll all get a rude awakening, but until that happens, I do not think anyone will want to change the current system.

        Typical of America: we overdo something until it springs back and tags us in the face.
           

    • by Doc Ruby ( 173196 ) on Friday December 24, 2010 @02:54PM (#34661240) Homepage Journal

      The advice to the president doesn't change the rules for "fastest supercomputer". It tells the president not to be suckered into a supercomputer race measured only on FLOPS, rather than on more useful performance measures. Getting sidetracked into less useful metrics will waste US resources on nominally winning the race while not producing the most useful computer. And the US interest is in producing the most useful computer, not in nominally winning the race.

      In fact, that report says "let China dominate the Top500, if the US still has the better computers". Which is exactly what I want the US doing, and what I prefer China to be doing rather than leaving the US behind in actual usefulness.

      But if you want to get caught up in "the USA is dead" trip that leads into traps that actually would hurt the US if acted on, go ahead. You're not having any effect on the US supercomputer effort.

      • There is substance and there are appearances and perceived effect (within US and globally).

        Being #1 in supercomputer power is the 2010 perceived equivalent of technological superiority as was the 1969 landing of the first man on the Moon. I may not be an expert on NASA history, but I bet that there were big debates at the time whether the latter was actually a "useful performance measure" or a "waste of US resources".

        In such a perspective the boundaries between substance, appearances and theater (as is insi

      • I don't believe the US is dead in the water - far from it, I believe they are still leaders in technological innovation - but it seems a strange time to come out and say "we're not racing" after you've just been overtaken.
        The U.S. has been a willing and active participant in this so-called race for the "biggest and best" for a long time; they have just been overtaken for the number 1 spot and now all of a sudden it is not important.
        Far better to say nothing and concentrate on what you believe is the corr
        • by Doc Ruby ( 173196 ) on Saturday December 25, 2010 @01:12AM (#34664062) Homepage Journal

          The reason the US was overtaken in this particular metric is that the US is no longer devoting resources to being at the top of it, because those resources go into being top in the other metrics that actually do matter. The Chinese moved into the top spot after it ceased to be the most important.

          This report is advising the president not to be tricked into wasting resources competing with China in that less important category at the expense of retaining leadership in the other categories.

          The US doesn't build the tallest skyscrapers anymore, because we built an even bigger suburban infrastructure around cities, making such density less valuable. Other countries that do build the world's tallest building (for a while) are either wasting their time, or competing in an unnecessary race the US has no real value anymore in winning.

          The US has a largely transparent government, and this advice from an advisory group to the president is good advice. Far better to compete in what matters, and to tell the truth about why. If the president were in the critical path to building the tallest buildings, the focus away from them to other development patterns would have become better known, though just as aptly practiced.

          The people who misunderstand this report and the US compliance with it are not important in the supercomputer industry. Just as people who whine that the US doesn't have the world's tallest buildings anymore aren't important to the construction industry.

    • That's true, because after all, America doesn't make anything anymore and we don't export anything either. (Hint: if you think this is the case, please - for the love of god - don't breed.)
    • by ultranova ( 717540 ) on Friday December 24, 2010 @06:58PM (#34662480)

      America's strength used to lie in an immense manufacturing culture, and that's given way to "intellectual property". Instead of dealing with tangibles, America is content to sit behind a desk and let the Chinese labour.

      The problem is not "intellectual property", the problem is the service economy. Manufacturing, no matter what you're producing, be it cars or blueprints for them, creates value. Service jobs don't. That's why they pay so badly. As the economy increasingly gets all of its growth from services, rather than industry, the amount of stuff - also known as wealth - circulating does not grow. That is why we are seeing so many economic problems.

      The Western world is de-industrializing as all manufacturing jobs are moved to China, and design jobs are following since few people can actually do them well. We are simply returning to the pre-industrial situation where the only ones who have a significant amount of wealth are the nobles, and they are so much richer than everyone else that they have a practical monopoly on power as well. Whether this was by design or by accident I can't say, but whichever the reason, the increasing poverty and destruction of Western civilization is in the best interests of our overlords, so it will continue.

      Oh well, another few millennia under ruthless Chinese dictators. When they take over, I at least hope they reward our traitors as a traitor deserves. A pity for the children, though; good thing I don't have them.

    • If another country starts to outshine you, try changing the rules.

      Is this Rule 2012?

      I nominate your comment for an Oscar in the "2010 most insightful Slashdot comments" category.

    • by Xyrus ( 755017 )

      Computational speed is actually useless if you have no way to access, store, or process the data being produced in an efficient way.

      The real problem is on the data side. A petaflop system could easily generate peta- or exabyte-scale data. How do you quickly store and access this data? How do you analyze it? The computations are always fast, but how fast will your app run if it is spending 30% of its time waiting on IO?

      Computational benchmarks really are pointless when it comes to real system utilization. Th
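The 30%-waiting-on-IO scenario above can be put in numbers with Amdahl's law (the 30% figure is the comment's hypothetical; the formula is standard):

```python
# Amdahl's law: if a fixed fraction of runtime is I/O wait, speeding up
# only the compute side has a hard ceiling of 1 / io_fraction overall.

def overall_speedup(io_fraction, compute_speedup):
    """Whole-app speedup when only the compute portion gets faster."""
    compute_fraction = 1.0 - io_fraction
    return 1.0 / (io_fraction + compute_fraction / compute_speedup)

# With 30% I/O wait, 10x faster compute yields only ~2.7x overall,
# and even infinitely fast compute cannot exceed ~3.33x.
assert round(overall_speedup(0.3, 10), 2) == 2.7
assert overall_speedup(0.3, 1e12) < 1 / 0.3 + 1e-9
```

This is why the comment argues peak-flops benchmarks say little about real system utilization: past a point, the I/O side dominates no matter how fast the compute is.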

    • That's a wonderful soapbox speech which has nothing at all to do with supercomputers or the issue at hand. Linpack is no longer the most relevant benchmark, and just as we saw in benchmarks for video cards, the system can be gamed to show exceptional performance on the benchmark while failing to live up to that same standard on real-world work. For instance, current and near-term systems from IBM are exceptional and place more emphasis on memory bandwidth and power use than just pure compute ability. Fo
  • Curses! (Score:4, Funny)

    by Haedrian ( 1676506 ) on Friday December 24, 2010 @02:08PM (#34660992)

    We cannot allow a Supercomputer gap!

    • by bosef1 ( 208943 )

      Which, of course, is compounded by the looming "Minesweeper" gap.

    • by slick7 ( 1703596 )

      We cannot allow a Supercomputer gap!

      When so many countries have super-computers, how do you do a dis-super-computer-ment, to stem the rampant super-computer proliferation?
      Think of the children (OLPC)

  • Unlike in WarGames, playing is the only way that everyone wins.
    • by slick7 ( 1703596 )

      Unlike in WarGames, playing is the only way that everyone wins.

      However, a glitch in the form factor on CPU APN/25689721/2A (Made in China) calculates that the game is winnable...
      Initiate countdown...
      Launch all missiles...

  • True to an extent (Score:5, Insightful)

    by GigsVT ( 208848 ) on Friday December 24, 2010 @02:12PM (#34661014) Journal

    If you really need to crunch a lot of numbers and are willing to spend a lot to do it, it often makes more sense to develop an ASIC or FPGA type solution. I know the EFF put together a key cracking system for $250,000 that would probably still blow modern supercomputers out of the water for that specific application.

    • by Anonymous Coward

      http://www.conveycomputer.com/products.html

      People realize that and have created a not absurdly expensive solution.

    • Nah, just get a fast GPU and use this [arstechnica.com].
  • "Mr. President, we must not allow a mineshaft gap!"
  • When spending money on a supercomputer, wouldn't you do it to do something useful with it? I'm sure if it gets built, it's built and optimized for a certain purpose other than just being in spot 1 of the Top500 list. On the other hand, if someone does spend the money on a supercomputer purely to have it be in spot 1, well, it's their money and their choice...

    • A lot of supercomputers are not used to their fullest extent; often, this is because scientists either do not know how to program a supercomputer, or do not have enough data, or have computations that are not easily parallelized. Some supercomputer centers have started renting their time to Wall St. firms, because there is not enough demand from scientists or engineers.
      • Dan Brown led me to believe the Gov't used their supercomputers to break encryption on emails and other net traffic to catch various criminals.

        But then again, that's probably too smart an idea to have actually been implemented.

        • You only see that in Dan Brown novels because it's too dumb of an idea to be actually implemented. Short of a massive breakthrough in computer speeds that they've somehow managed to keep secret, even all the secret government supercomputers in the world would have a hard time breaking AES-128 or RSA-4096 in a reasonable amount of time.

          If the government needs to break somebody's crypto, it's done through side-channel attacks [xkcd.com]. Anything else is a waste of effort.

        • Let's suppose you can perform one AES128 decryption per millisecond. How many cores would you need to brute force a single key within a decade (keeping in mind that brute force is the only publicly known ciphertext-only attack on AES)? Now, how many cores does the world's fastest (known) supercomputer have?

          There is a reason that only Dan Brown novels portray supercomputers breaking modern ciphers. It is true that the NSA is believed to have a very powerful supercomputer (or perhaps several) at its dis
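The arithmetic the comment above asks for is quick to run. It uses the comment's hypothetical rate of one trial decryption per millisecond per core (real hardware is much faster, but not 26 orders of magnitude faster):

```python
# Back-of-the-envelope: cores needed to brute-force AES-128 within a
# decade, at one trial decryption per millisecond per core (the parent
# comment's hypothetical rate).

keyspace = 2 ** 128
expected_trials = keyspace // 2            # on average, half the keys
rate_per_core = 1_000                      # trials/second (1 per ms)
decade = 10 * 365 * 24 * 3600              # ~3.15e8 seconds

trials_per_core = rate_per_core * decade   # ~3.15e11 trials per core
cores_needed = expected_trials // trials_per_core

assert 1e26 < cores_needed < 1e27          # ~5.4e26 cores
```

Even at a billion trials per second per core, the count only drops by six orders of magnitude, which is the point of the parent's question: no supercomputer, known or secret, comes anywhere close.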
  • so that people can sleep like morons, forgetting that the current government is practicing a worse type of censorship and repression on all freedoms in collaboration with private interests - worldwide too. Popping up one censorship method after another, pressuring foreign governments to implement censorship laws, trying to label journalists who go 'out of line' as terrorists ...

    Just like it was with 'terror', an external threat needs to be invented so that all kinds of practices violating freedoms can
    • so that people can sleep like morons...

      How do intelligent people sleep? I used to think that while I slept, I sipped tea (with both pinkies elevated), composed symphonies and pondered the financial situation over in France; but alas, my wife has informed me that I just drool into my pillow and mumble incoherently.

  • by pclminion ( 145572 ) on Friday December 24, 2010 @02:24PM (#34661088)

    The "race" is not about the hardware. All modern supercomputers are massively clustered, using various shared memory architectures. The technology is commodity level, and even a small sum like $10 million can buy a SHITLOAD of hardware. The challenge, and the point of competition, is the creation of software technologies and algorithms to effectively make use of clustered hardware. It's a question of who has the best minds working on the software. The hardware is a given. People have constructed impressive massively parallel processors using game consoles, after all.

    It's the programmers, not the supercomputer makers, who will make the difference in this "race."

    • by alexo ( 9335 )

      The technology is commodity level, and even a small sum like $10 million can buy a SHITLOAD of hardware.

      Would you mind lending me a small sum?

    • by dkf ( 304284 )

      All modern supercomputers are massively clustered, using various shared memory architectures.

      Actually, they do very little memory sharing because it doesn't scale at all. Shared memory systems top out at on the order of 1k cores, after which the memory backbone becomes just too damn expensive, even by supercomputer standards. Instead, supercomputers use message passing (especially various MPI implementations) over what is still very fast dedicated interconnect. Algorithms have to be very carefully written to take good advantage of that sort of system. (Some will actually have a mix of technologies,
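The message-passing pattern described above can be sketched in miniature, with Python threads and queues standing in for MPI ranks and interconnect (a toy illustration of the pattern, not of MPI's actual API):

```python
# Toy message-passing reduction: workers compute partial sums locally and
# send them over a queue instead of sharing memory (the MPI-style pattern).
import threading
import queue

def worker(chunk, outbox):
    outbox.put(sum(chunk))   # compute locally, communicate explicitly

def message_passing_sum(data, n_workers=4):
    outbox = queue.Queue()
    chunks = [data[i::n_workers] for i in range(n_workers)]  # scatter
    threads = [threading.Thread(target=worker, args=(c, outbox))
               for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # "gather" step: reduce the messages received from each worker
    return sum(outbox.get() for _ in range(n_workers))

assert message_passing_sum(list(range(1000))) == 499500
```

Real codes would use an MPI implementation (e.g. MPI_Scatter/MPI_Reduce) over the dedicated interconnect, but the shape is the same: no shared state, only explicit sends and receives.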

    • by Xyrus ( 755017 )

      The technology is commodity level

      No. No it isn't. Supercomputers require high-speed optical interconnects and extremely fast switches to handle the bandwidth from massive computations with sub-millisecond latency. The comms hardware alone will run you into the millions for any decently powered supercomputer. Then you start getting to blades, which use high-reliability components, ultra-fast drives, and even customized hardware in some cases. Then you've got the cost of maintaining and administering the system, which is far from cheap. T

  • 'cuz we wouldn't want China to discover the eleventy-billionth prime number before we do.

  • 'While it would be imprudent to allow ourselves to fall significantly behind our peers with respect to scientific performance benchmarks that have demonstrable practical significance, a single-minded focus on maintaining clear superiority in terms of flops count is probably not in our national interest,'

    Now try to explain *that* to your Tea Partiers ...

  • If we look at the top 10 on the TOP500 list [top500.org], it's still pretty dominated by the USA:

    1. Tianhe-1A (China)
    2. Jaguar (USA, ORNL)
    3. Nebulae (China)
    4. TSUBAME (Japan)
    5. Hopper (USA, LBNL)
    6. Tera-100 (France)
    7. Roadrunner (USA, LANL)
    8. Kraken (USA, UT)
    9. JUGENE (Germany)
    10. Cielo (USA, LANL)

    So, let's see -- half of the top ten are in the USA, two in China, two in Europe, and one in Japan. Granted, China is catching up (rapidly), but if you look beyond the #1 spot, the USA still pretty much dominates the overall list. Expand this list out beyond the top ten [top500.org], and SEVEN supercomputers from 11-20 are also in the USA (11-16 & 18), one is in Russia (17), and two are in South Korea (19 & 20). So let's not all freak out here about China stealing the #1 and #3 slots on the list -- the USA still has quite a bit more computational resources than the Chinese...

    • by pspahn ( 1175617 )
      Yeah, but China's #1 system is the Tianhe-1A. 1A, which probably means it's the first one. Just wait until they get to 1B, 1C, etc. Alas, woe is us.
    • A nice graph showing the US is way ahead of everyone else can be found here: http://www.top500.org/charts/list/36/countries [top500.org]

  • "The Republicans/Democrats/Whigs are dangerously naive about Soviet/Al Qaida/Chinese/Brazilian intentions regarding multicore MIMD instruction code. We must maintain supremacy in cyberspace to protect FREEDOM."

    This would be almost as good as a bridge across the Pacific for keeping cold warriors busy.

  • The race to shave yet another hundredth of a thousandth of a second when launching your Minesweeper game.
  • I love the warm, comforting smell of a "national security problem" that can be solved by giving IBM some of my money, rather than one of those genuinely difficult ones like flushing the peasants with small arms out of some sandbox hellhole or doing something that actually improves airline security, rather than harassing pilots who point out problems...
  • This is laughable, but I can't tell what is funnier:

    The fact that the government thinks they know shit about computers,

    or the fact that they think they can do it better than any other technologically-advanced country in the world right now.

  • Craig Mundie, Chief Research and Strategy Officer, Microsoft Corporation. I think Microsoft is tired of not being in the Top500. They will promote new benchmarks instead of the ones that make them look bad. And there are rumours of Windows for ARM on servers. I say, Mr. Mundie, I can see what your strategy is all about...
  • by Theovon ( 109752 ) on Friday December 24, 2010 @04:04PM (#34661632)

    Ok, not entirely worthless. Linear algebra is used in loads and loads of HPC workloads, but Linpack as a benchmark is NOT representative of a typical real-world HPC workload. It focuses on peak flops, leaving behind things like inter-node bandwidth and latency, which are crucial for many important, real scientific supercomputing tasks.

    Our CSE department chair recently quoted an article he read. To paraphrase, we're heading to the point where computation is going to be basically free, and what costs all the energy will be moving the data around. This is true for several reasons. One is the recent trend towards near-threshold computing. Ultra-low voltage (e.g. 400mV, when 900mV is the nominal Vdd) can save 100x on power. It costs us 10x on speed, but now we can pack 100x as many nodes into the same power and cooling budget, allowing for a 10x increase in aggregate throughput. But this works best for highly parallel and communication-heavy workloads. Fortunately, for many important areas (bombs, climate simulations, astronomy, real-time raytracing), this is the case. And moreover, people are getting better at parallelizing work.
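The near-threshold arithmetic above works out as claimed; the 100x power and 10x speed figures are the comment's own, not measured values:

```python
# Near-threshold computing trade-off, using the comment's figures:
# ~100x power savings per node at ultra-low voltage, at a ~10x speed cost.

power_savings = 100   # each low-voltage node draws 1/100 the power
slowdown = 10         # but each node runs 10x slower

nodes_in_same_budget = power_savings          # 100x as many nodes fit
throughput_per_node = 1 / slowdown            # each node is 10x slower
aggregate = nodes_in_same_budget * throughput_per_node

assert aggregate == 10  # 10x aggregate throughput, same power budget
```

The catch, as the comment notes, is that the 10x only materializes if the workload parallelizes across 100x as many nodes, which is exactly where interconnect bandwidth and latency, not flops, become the bottleneck.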

    • Agreed. I very much doubt we are modeling weather patterns or nuclear explosions anymore. People know an AGI will be created at some point in the future. Some people (including myself) believe that it is highly probable the AGI will be created in the next few decades. Given the huge advantage an AI could create, how could a superpower (or an aspiring superpower) NOT act to create one? Even if the odds are .01% an AGI will be created, a loss in this area means a loss in any conflict. Therefore a power
      • What is AGI?

        Also, if by "trying to create an AGI" you mean "improving computer fabrication"*, we can talk. But I somehow doubt that is occurring inside a superpower.

        * Using computers to improve computers. Looks like the beginning of a singularity, doesn't it? It's been happening since the late '90s. Sometimes I think people think too much of the (next) Singularity.

        • Artificial General Intelligence. "AI" generally refers to specific expert systems, while AGI is something more "humanlike" in that it can attack general problems without specific domain knowledge.

          Yeah, I do believe humans will be disadvantaged in terms of computing power relative to machines in a few decades. That will very likely lead to "the singularity" in my estimation. Any time I tell non-tech folks about this, they think I'm nuts. And most tech folks think the idea is similarly nuts. O
          • That is one place we disagree. I think the combination of humans + machines will still be way more powerful than just machines in 2 or 3 decades*. We'll have a chance to jump on it before we are completely outcompeted.

            Anyway, some people jumped on the industrial (or scientific) revolution, outcompeting the ones who didn't. Ditto for the agriculture (or writing) revolution of the Neolithic, and it probably happened a few times before that.

            * that is the timeframe Moore's law says it will happen. I, personally

            • True. I actually do not disagree with you there, necessarily. It could go either way. But I would think it's more probable that a machine will obtain some level of autonomous "thought" and "understanding" before we can adequately meld humans with them. I would be surprised if the progress of neural/machine interfaces was as rapid as the AGI algorithms which are currently being researched, especially when the raw horsepower of processing capability is being doubled roughly every 18 months. But... it cer
  • A mine shaft gap!

  • isn't supercomputers, per se, but AI. The first country that develops scalable human-like AI wins. Period. End of story.

    Naturally this isn't even on the *radar* of the leadership of the USA, whose government is dominated by lawyers and financiers, not engineers.

    • Oooh, I don't know if our leaders deserve quite that much vitriol. At least our current ones. Putting partisan politics aside as much as possible, I think it's fair to say that Obama's cozy relationship with Eric Schmidt of Google almost guarantees he has heard about the possible implications of a scalable AGI. And I think Obama is with it (I'm speaking of technology) enough to understand and not immediately dismiss the implications of this. Why do you think the dept of energy constantly funds these sup
      • that was the military-industrial complex....if they fund and make the AI, it's the skynet / terminator scenario.

  • It's about the same idea that some government in NL has to invest in fiber tech. Personally, I don't think that is the way to do it. The idea is to invest in internet content that *possibly* requires high speed internet. The high speed internet will follow. Of course, sometimes you have to push technology a bit as well. But the way to do that is investing in research and small scale try-outs. If the technology succeeds in the try-outs, leave it to the market to fill in demand. I think there are loads upon l

  • It's so obvious how to blow the doors off the competition in this race... if everything is hitting the limits of the von Neumann wall, go around it with something like a BitGrid [blogspot.com], which is a reconfigurable systolic array granular down to the bit level. Using the latest memristors with this idea, it should be quite feasible to build exaflop computers for the desktop.

  • Since when do we get tech advice from the likes of those occupying the White House? Just because you know how to lie and deceive the public does not make you an authority on anything technical.
