Earth

Climate Change Research Gets Petascale Supercomputer 121

dcblogs writes "The National Center for Atmospheric Research (NCAR) has begun using a 1.5-petaflop IBM system called Yellowstone. For NCAR researchers it is an enormous leap in compute capability — a roughly 30x improvement over its existing 77-teraflop supercomputer. Yellowstone is capable of 1.5 quadrillion calculations per second using 72,288 Intel Xeon cores. The supercomputer gives researchers new capabilities: they can run more experiments, with increased complexity and at higher resolution. The new system may be able to reduce model grid spacing to as little as 10 km (6.2 miles), giving scientists the ability to examine climate impacts in greater detail. Increased complexity allows researchers to add more conditions to their models, such as the effect of methane released from thawing tundra on polar sea ice. NCAR believes it is the world's most powerful computer dedicated to the geosciences."
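As a back-of-the-envelope check (the per-core figure below is derived arithmetic, not a number from the article), 1.5 petaflops spread across 72,288 cores works out to roughly 21 gigaflops per core:

```python
# Sanity-check Yellowstone's headline numbers
# (inputs from the article; the per-core rate is derived).
total_flops = 1.5e15   # 1.5 petaflops = 1.5 quadrillion calculations/second
cores = 72_288

per_core = total_flops / cores
print(f"{per_core / 1e9:.1f} gigaflops per core")  # ~20.8
```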
Moon

A Supercomputer On the Moon To Direct Deep Space Traffic 166

Hugh Pickens writes "NASA currently controls its deep-space missions through the Deep Space Network (DSN), a network of 13 giant antennas in California, Spain, and Australia, but the network is obsolete and just not up to the job of transmitting the growing volume of data from deep-space missions. That's why Ouliang Chang has proposed building a massive supercomputer in a deep, dark crater on the side of the moon facing away from Earth and all of its electromagnetic chatter. Nuclear-powered, it would accept signals from space, store them, process them if needed, and then relay the data back to Earth as time and bandwidth allow. The supercomputer would run in a frigid region near one of the moon's poles, where cold temperatures would make cooling it easier, and would communicate with spaceships and Earth using a system of inflatable, steerable antennas suspended over moon craters, giving the Deep Space Network a second focal point away from Earth. As well as boosting humanity's space-borne communication abilities, Chang's presentation at a space conference (PDF) in Pasadena, California suggests that the moon-based dishes could work in unison with those on Earth to perform very-long-baseline interferometry, in which multiple telescopes are combined to emulate one huge telescope. Best of all, the project has the potential to excite the imagination of future spacegoers and get people back on the moon."
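To see why an Earth-Moon baseline is attractive for interferometry, consider the textbook diffraction-limited resolution of an interferometer, roughly wavelength divided by baseline (the 1 cm observing wavelength below is an illustrative choice on my part, not a figure from Chang's proposal):

```python
import math

# Angular resolution of an interferometer: theta ~ wavelength / baseline.
wavelength = 0.01          # 1 cm radio wavelength (illustrative assumption)
earth_only = 1.2742e7      # Earth's diameter: the longest ground-only baseline, m
earth_moon = 3.844e8       # mean Earth-Moon distance, m

rad_to_uas = 180 / math.pi * 3600 * 1e6   # radians -> microarcseconds

for name, baseline in [("ground-only", earth_only), ("Earth-Moon", earth_moon)]:
    theta = wavelength / baseline
    print(f"{name}: {theta * rad_to_uas:.1f} microarcseconds")
```

The Earth-Moon baseline is about 30 times longer than any possible ground-only baseline, so it sharpens the synthesized telescope's resolution by the same factor.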
Supercomputing

Parallella: an Open Multi-Core CPU Architecture 103

First time accepted submitter thrae writes "Adapteva has just released the architecture and software reference manuals for their many-core Epiphany processors. Adapteva's goal is to bring massively parallel programming to the masses with a sub-$100 16-core system and a sub-$200 64-core system. The architecture has advantages over GPUs in terms of future scaling and ease of use. Adapteva is planning to make the products open source. Ars Technica has a nice overview of the project."
AI

Google Puts Souped-Up Neural Networks To Work 95

holy_calamity writes "A machine learning breakthrough from Google researchers that grabbed headlines this summer is now being put to work improving the company's products. The company revealed in June that it had built neural networks that run on 16,000 processors simultaneously, enough power that they could learn to recognize cats just by watching YouTube. Those neural nets have now made Google's speech recognition for U.S. English 25 percent better, and are set to be used in other products, such as image search."
Supercomputing

India Plans To Build Fastest Supercomputer By 2017 135

First time accepted submitter darkstar019 writes "India is planning to build a computer at least 61 times faster than the current fastest supercomputer, IBM's Sequoia. Right now the most powerful supercomputer in India ranks 58th on the Top500 list. From the article: 'Telecom and IT Minister Kapil Sibal is understood to have written to Prime Minister Manmohan Singh sharing the roadmap to develop "petaflop and exaflop range of supercomputers" at an estimated cost of Rs 4,700 crore over 5 years.'"
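The "61 times faster" figure lines up with the exaflop goal mentioned in the quote; taking Sequoia's sustained Linpack result as the baseline (my arithmetic, not the article's):

```python
# 61x the reigning champion is almost exactly one exaflop.
sequoia_pflops = 16.32            # IBM Sequoia's sustained Top500 result
target_pflops = 61 * sequoia_pflops

print(f"{target_pflops:.0f} petaflops")  # ~996, i.e. roughly one exaflop
```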
Intel

TACC "Stampede" Supercomputer To Go Live In January 67

Nerval's Lobster writes "The Texas Advanced Computing Center plans to go live on January 7 with "Stampede," a ten-petaflop supercomputer predicted to be the most powerful Intel-based supercomputer in the world once it launches. Stampede should also be among the top five supercomputers on the TOP500 list when it goes live, Jay Boisseau, TACC's director, said at the Intel Developer Forum on Sept. 11. Stampede was announced a bit more than two years ago. Specs include 272 terabytes of total memory and 14 petabytes of disk storage. TACC said the compute nodes would include "several thousand" Dell Stallion servers, each boasting dual 8-core Intel E5-2680 processors and 32 gigabytes of memory. In addition, TACC will include a special pre-release version of the Intel MIC, or "Knights Corner," architecture, which has been formally branded as Xeon Phi. Interestingly, the thousands of Xeon compute nodes should generate just 2 petaflops of performance, with the remaining 8 petaflops coming from the Xeon Phi chips, which provide highly parallelized computational power for specialized workloads."
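Taking the split in petaflops (consistent with the system's ten-petaflop headline figure), the coprocessors dominate the machine's throughput:

```python
# How Stampede's ten petaflops divide between hosts and accelerators.
total_pflops = 10.0       # headline figure
xeon_hosts = 2.0          # contribution of the dual E5-2680 compute nodes
xeon_phi = total_pflops - xeon_hosts

print(f"Xeon Phi share: {xeon_phi / total_pflops:.0%}")  # 80%
```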
Supercomputing

A Look Inside Oak Ridge Lab's Supercomputing Facility 59

1sockchuck writes "Three of the world's most powerful supercomputers live in adjacent aisles within a single data center at Oak Ridge National Laboratory in Tennessee. Inside this facility, technicians are busy installing new GPUs into the Jaguar supercomputer, the final step in its transformation into a more powerful system that will be known as Titan. The Oak Ridge team expects the GPU-accelerated machine to reach 20 petaflops, which should make it the fastest supercomputer in the Top 500. Data Center Knowledge has a story and photos looking at this unique facility, which also houses the Kraken machine from the University of Tennessee and NOAA's Gaea supercomputer."
Space

GPU Supercomputer Could Crunch Exabyte of Data Daily For Square Kilometer Array 40

An anonymous reader writes "Researchers on the Square Kilometer Array project to build the world's largest radio telescope believe that a GPU cluster could be suited to stitching together the more than one exabyte of data the telescope will gather each day after its completion in 2024. One of the project heads said that graphics cards could be cut out for the job because of their high I/O and core counts, adding that a conventional CPU-based supercomputer lacks the necessary I/O bandwidth to do the work."
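An exabyte per day translates into a sustained ingest rate that makes the I/O argument concrete (my conversion, not a project figure):

```python
# What "an exabyte a day" means as a sustained data rate.
exabyte = 1e18             # bytes
seconds_per_day = 86_400

rate = exabyte / seconds_per_day
print(f"{rate / 1e12:.1f} TB/s sustained")  # ~11.6
```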
Supercomputing

World's Most Powerful x86 Supercomputer Boots Up in Germany 151

Nerval's Lobster writes "Europe's most powerful supercomputer — and the fourth most powerful in the world — has been officially inaugurated. The SuperMUC, ranked fourth in the June TOP500 supercomputing list, contains 147,456 cores using 2.7-GHz, 8-core Intel Xeon E5-2680 chips. IBM, which built the supercomputer, stated in a recent press release that it actually includes more than 155,000 processor cores. It is located at the Leibniz-Rechenzentrum (Leibniz Supercomputing Centre) in Garching, Germany, near Munich. According to the TOP500 list, the SuperMUC is the world's most powerful x86-based supercomputer. The Department of Energy's 'Sequoia' supercomputer at the Lawrence Livermore National Laboratory in Livermore, Calif., the world's [overall] most powerful, relies on 16-core, 1.6-GHz POWER BQC chips."
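The core count and clock imply a peak of roughly 3.2 petaflops if each Sandy Bridge core retires eight double-precision flops per cycle with AVX (the flops-per-cycle figure is my assumption, not stated in the article):

```python
# Estimate SuperMUC's peak from its published core count and clock.
cores = 147_456
clock_hz = 2.7e9
flops_per_cycle = 8   # assumed: 4-wide AVX double-precision add + multiply

peak = cores * clock_hz * flops_per_cycle
print(f"{peak / 1e15:.2f} petaflops peak")  # ~3.19
```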
Earth

More Hot Weather For Southern California, Says UCLA Study 218

The L.A. Times reports on a study by UCLA climate researchers who conclude, based on supercomputer analysis of a model "2,500 times more precise than previous climate models for the region," that the area around L.A. will experience more (and more extreme) hot spells in the decades to come. From the article: "The study, released Thursday, is the first to model the Southland's complex geography of meandering coastlines, mountain ranges and dense urban centers in high enough resolution to predict temperatures down to the level of microclimate zones, each measuring 2 1/4 square miles. The projections are for 2041 to 2060. Not only will the number of hot days increase, but the study found that the hottest of those days will break records, said Alex Hall, lead researcher on the study by UCLA's Institute of the Environment and Sustainability."
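The two quoted precision figures are consistent with each other: a 2,500-fold gain in areal precision corresponds to a 50-fold refinement of the grid spacing (my arithmetic, assuming square grid cells):

```python
import math

# Relate the 2.25-sq-mi microclimate zones to the "2,500 times more
# precise" claim, assuming square cells.
cell_area_sqmi = 2.25
spacing = math.sqrt(cell_area_sqmi)            # 1.5 miles on a side
coarse_spacing = spacing * math.sqrt(2500)     # previous models' implied spacing

print(f"new grid: {spacing:.1f} mi; previous models: ~{coarse_spacing:.0f} mi")
```

A ~75-mile cell is in line with the coarse regional resolution of earlier climate models, which is why they could not resolve L.A.'s coastlines and mountain ranges.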
Intel

Intel To Ship Xeon Phi For "Exascale" Computing This Year 77

MojoKid writes "At the International Supercomputing Conference today, Intel announced that Knights Corner, the company's first commercial Many Integrated Core (MIC) product, will ship commercially in 2012. The descendant of the processor formerly known as Larrabee also gets a new brand name — Xeon Phi. The idea behind Intel's new push is that the highly efficient Xeon E5 architecture (eight-core, 32nm Sandy Bridge) fuels the basic x86 cluster, while the Many Integrated Core CPUs that grew out of the failed Larrabee GPU offer unparalleled performance scaling and break new ground. The challenges Intel is trying to surmount are considerable. We've successfully pushed from teraflops to petaflops, but exaflops (or exascale computing) currently demand more processors and power than it is feasible to provide in the next 5-7 years. Intel's MIC is targeted at hammering away at that barrier and creating new opportunities for supercomputing deployments."
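The power side of the exascale barrier is easy to quantify. At roughly 2 gigaflops per watt, about the efficiency of the best systems of this era (my approximation, not a figure from the article), an exaflop machine needs its own power plant:

```python
# Power draw of an exaflop machine at circa-2012 efficiency levels.
exaflop = 1e18            # flops
flops_per_watt = 2e9      # assumed: ~2 gigaflops/W, top-of-class for the period

watts = exaflop / flops_per_watt
print(f"{watts / 1e6:.0f} megawatts")  # 500
```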
IBM

IBM Deploys Hot-Water Cooled Supercomputer 112

MrSeb writes "With the ISC (International Supercomputing Conference) kicking off this week, there's been a flurry of announcements around new supercomputer buildouts. One of the more interesting systems debuting this week is SuperMUC — IBM's new supercomputer at the Leibniz Supercomputing Centre in Germany. IBM is billing SuperMUC as the first 'hot-water cooled supercomputer,' an advance it claims cuts power consumption by 40%. The cooling system, dubbed Aquasar, works like any standard water-cooling loop: water is pumped into one side of the blade, circulates through the system, and is pumped out. The difference, according to IBM, is the microchannels etched into the copper heatblock above the CPU cores. Rather than simply being dumped, SuperMUC's waste heat is designed to be converted into building heat during winter. Presumably it is mostly radiated away in summer, rather than being dumped into the offices of angry German scientists."
Supercomputing

US Regains Supercomputing Crown, Besting China and Japan 132

dcblogs writes "The U.S. is once again home to the world's most powerful supercomputer, after being knocked off the top spot by China two years ago and Japan last year. The top computer, an IBM system at the Department of Energy's Lawrence Livermore National Laboratory, is capable of 16.32 sustained petaflops, according to the Top 500 list, a global, twice-yearly ranking released Monday. Despite the continuing strength of U.S. vendors globally, when China's supercomputer took the top position in June 2010, it seemed to hit a national nerve. President Barack Obama mentioned China's top-ranked supercomputer in two separate speeches, including his State of the Union address last year."
Space

Hawking Is First User of "Big Brain" Supercomputer 93

miller60 writes "Calling your product the 'Big Brain Computer' is a heady claim. It helps if you have Dr. Stephen Hawking say that the product can help unlock the secrets of the universe. SGI says its UV2 can scale to 4,096 cores and 64 terabytes of memory, offers a peak I/O rate of four terabytes per second, and runs off-the-shelf Linux software. Hawking says the UV2 'will ensure that UK researchers remain at the forefront of fundamental and observational cosmology.'"
Google

Is Siri Smarter Than Google? 366

storagedude writes "Google could go the way of the dodo if ultra-intelligent electronic agents (UIEA) make their way into the mainstream, according to technology prognosticator Daniel Burrus. Siri is just the first example of how a UIEA could end search as we know it. By leveraging cloud and supercomputing capabilities, Siri uses natural-language search to circumvent the entire Google process. If Burrus is right, we'll no longer have to wade through '30,000,000 returns in .0013 milliseconds' of irrelevant search results."
Space

First Full Observable-Universe Simulation 95

First time accepted submitter slashmatteo writes "The goal of the DEUS project (Dark Energy Universe Simulation) is to investigate the imprints of dark energy on cosmic structure formation through high-performance numerical simulations. To do so, the project has conducted a simulation of the structuring of the entire observable universe, from the Big Bang to the present day. Thanks to the Curie supercomputer, the simulation has made it possible to follow the evolution of 550 billion particles. Two other complementary runs are scheduled by the end of May. More details in the press release."
Math

Rybka Solves the King's Gambit Chess Opening 206

New submitter smarq2 writes "Chessbase reports that chess programmer IM Vasik Rajlich has solved the King's Gambit chess opening by technical means: 3,000 processor cores, running for over four months, exhaustively analyzed all lines that follow after 1.e4 e5 2.f4 exf4 and came to some extraordinary conclusions." Update: 04/02 22:11 GMT by U L : Skuto points out that this is the same person who was found guilty of plagiarizing GNU Chess and Crafty.
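"Solving" an opening means exhaustively searching the game tree until every leaf has a known verdict. The idea can be shown on a game small enough to solve completely, such as normal-play Nim; this sketch illustrates exhaustive game solving in general and has nothing to do with Rybka's chess-specific search:

```python
from functools import lru_cache

def first_player_wins(heaps):
    """Exhaustively solve normal-play Nim: True if the side to move wins."""
    @lru_cache(maxsize=None)
    def win(state):
        # A move shrinks one heap; a player with no legal move loses.
        moves = [tuple(sorted(state[:i] + (k,) + state[i + 1:]))
                 for i, h in enumerate(state) for k in range(h)]
        return any(not win(m) for m in moves)
    return win(tuple(sorted(heaps)))

# Matches the classical theory: the mover wins iff the XOR of heap sizes is nonzero.
print(first_player_wins((1, 2, 3)))  # False  (1 ^ 2 ^ 3 == 0)
print(first_player_wins((1, 2, 4)))  # True
```

Rajlich's computation is the same principle at a vastly larger scale, with chess-engine evaluation standing in for the trivial win/loss test at the leaves.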
IBM

IBM Optical Chip Moves Data At 1Tbps 127

snydeq writes "IBM researchers have developed a prototype optical chip that can transfer data at 1Tbps (the equivalent of downloading 500 high-definition movies) using light pulses, the company said Thursday. The chip, called Holey Optochip, is a parallel optical transceiver consisting of both a transmitter and a receiver, and is designed to handle the large amounts of data created and transmitted over corporate and consumer networks as a result of new applications and services. It is expected to power future supercomputer and data center applications, an area where IBM already uses optical technology." User judgecorp links to more coverage, writing "The record was achieved because 24 holes in the chip allow direct access to lasers connected to the chip."
Earth

Texas Supercomputer Upgrading the Hurricane Forecast 31

aarondubrow writes "Researchers used the Ranger supercomputer to test a new, high-resolution hurricane forecasting system that incorporates Doppler radar data from planes flying into the storm. The forecasts were shown to improve intensity predictions by an average of 20 to 40 percent over the official forecasts of the National Hurricane Center (NHC). The prediction system is being hailed as a breakthrough and is one of a handful being assessed by the NHC to become part of the operational forecasting system used in emergency situations."
IBM

Installation of Blue Waters Petaflop Supercomputer Begins 86

An anonymous reader writes "The National Center for Supercomputing Applications at the University of Illinois is finally getting the troubled Blue Waters supercomputer installed. After IBM walked away following three years of planning, Cray stepped in to pick up the $188 million contract. Now, in around nine months' time, Blue Waters should be fully operational and achieve performance of 1 petaflop or more. As for the hardware: who wouldn't want access to 235 Cray XE6 cabinets using 16-core AMD Opteron 6200-series processors, with access to 1.5 petabytes of memory (4GB per chip) and 500 petabytes of local storage?"
