Technology

Opening Quantum Computing To the Public 191

director_mr writes "Tom's Hardware is running a story with an interesting description of a 28-qubit quantum computer developed by D-Wave Systems. They intend to open up use of their quantum computer to the public. It is particularly good at pattern recognition, operates at 10 millikelvin, and is shielded to limit electromagnetic interference to one nanotesla in three dimensions across the whole chip. Could this be the first successful commercial quantum computer?"
Programming

The Father of Multi-Core Chips Talks Shop 90

pacopico writes "Stanford professor Kunle Olukotun designed the first mainstream multi-core chip, crafting what would become Sun Microsystems's Niagara product. Now he's heading up Stanford's Pervasive Parallelism Lab, where researchers are looking at systems with hundreds of cores that might power robots, 3-D virtual worlds, and insanely big server applications. The Register just interviewed Olukotun about this work and the future of multi-core chips. Weird and interesting stuff."
Software

BOINC Now Available For GPU/CUDA 20

GDI Lord writes "BOINC, open-source software for volunteer computing and grid computing, has posted news that GPU computing has arrived! The GPUGRID.net project from the Barcelona Biomedical Research Park uses CUDA-capable NVIDIA chips to create an infrastructure for biomolecular simulations. (Currently available for Linux64; other platforms to follow soon. To participate, follow the instructions on the web site.) I think this is great news, as GPUs have shown amazing potential for parallel computing."
Supercomputing

IBM's Eight-Core, 4-GHz Power7 Chip 425

pacopico writes "The first details on IBM's upcoming Power7 chip have emerged. The Register is reporting that IBM will ship an eight-core chip running at 4.0 GHz. The chip will support four threads per core and fit into some huge systems. For example, the University of Illinois is going to house a 300,000-core machine that can hit 10 petaflops. It'll have 620 TB of memory and support 5 PB/s of memory bandwidth. Optical interconnects, anyone?"
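The quoted system figures can be sanity-checked with simple arithmetic; the per-core and per-cycle numbers below are derived from the summary's own values, not from IBM:

```python
# Back-of-envelope check of the reported Power7 system figures.
cores = 300_000          # planned University of Illinois machine
peak_flops = 10e15       # 10 petaflops
clock_hz = 4.0e9         # 4.0 GHz per core

flops_per_core = peak_flops / cores          # ~33.3 GF/s per core
flops_per_cycle = flops_per_core / clock_hz  # ~8.3 flops per clock cycle

print(f"{flops_per_core / 1e9:.1f} GF/s per core")
print(f"{flops_per_cycle:.1f} flops per cycle")
```

Roughly eight floating-point operations per cycle per core is plausible for a core with multiple floating-point units, so the quoted peak hangs together.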
Supercomputing

Simple Mod Turns Diodes Into Photon Counters 118

KentuckyFC writes "The standard way to detect single photons is to use an avalanche photodiode in which a single photon can trigger an avalanche of current. These devices have an important drawback, however. They cannot distinguish the arrival of a single photon from the simultaneous arrival of two or more. But a team of physicists in the UK has found a simple mod that turns avalanche photodiodes into photon counters. They say that in the first instants after the avalanche forms, its current is proportional to the number of photons that have struck. All you have to do is measure it at this early stage. That's like turning a Fiat 500 into a Ferrari. Photon counting is one of the enabling technologies behind optical quantum computing. A number of schemes are known in which it is necessary to count the arrival of 0, 1 or 2 photons at specific detectors (abstract). With such a cheap detector now available (as well as decent photon guns), we could see dramatic progress in this field in the coming months."
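As an illustration of the idea (not the UK team's actual analysis), here is a toy model in which the early avalanche current scales with photon number and simple thresholding separates 0, 1, and 2 photons; the unit current and noise level are invented for illustration:

```python
import random

random.seed(42)
UNIT_CURRENT = 1.0   # current contributed by one photon's avalanche (arbitrary units)
NOISE_SIGMA = 0.2    # assumed electronic noise on the early measurement

def early_current(n_photons):
    """Simulated early-stage avalanche current for n simultaneous photons."""
    return n_photons * UNIT_CURRENT + random.gauss(0, NOISE_SIGMA)

def count_photons(current):
    """Estimate photon number by rounding the measured current."""
    return max(0, round(current / UNIT_CURRENT))

trials = 10_000
correct = sum(count_photons(early_current(n)) == n
              for _ in range(trials)
              for n in (0, 1, 2)) / (3 * trials)
print(f"correct discrimination rate: {correct:.3f}")
```

With the noise well below the one-photon current step, the toy detector distinguishes 0, 1, and 2 photons almost every time; the real experiment's challenge is making the early-stage measurement fast and clean enough for that separation to hold.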
Supercomputing

Cool/Weird Stuff To Do On a Cluster? 608

Gori writes "I'm a researcher at a university. Our group mainly does Agent Based Modeling of interdisciplinary problems (think massive simulations where technology, policy, and economics meet). Recently, we managed to get a bunch of money for a High Performance Cluster to run our stuff on. The code is mostly written in Java. Our IT support people are very capable of setting up a stable cluster that will run Java perfectly. But where's the fun in that? What I'm trying to figure out are other, more far-out and interesting things to do with this machine — think 500+ Opteron cores, 2 GB RAM per core, a gigabit interconnect with some badass switches, a massive storage array, plus a bunch of UltraSPARC boxes. So at times when there's no stuff to crunch, I'd like to boot the thing up with a 'weird' system image and geek around in the name of science. Try fancy ways of building models, dynamically adding all sorts of hardware to it, etc. Have different schedulers compete for resources. Imagine a MATLAB vs. BOINC vs. ProActive shootout. Maybe run Plan 9 on it? Most of us are not CE/CS people, but we are geeky enough. So, what would be the coolest and most far-out thing you would do with this kind of hardware?"
Microsoft

Fastest-Ever Windows HPC Cluster 216

An anonymous reader links to an eWeek story which says that Microsoft's "fastest-yet homegrown supercomputer, running the U.S. company's new Windows HPC Server 2008, debuted in the top 25 of the world's top 500 fastest supercomputers, as tested and operated by the National Center for Supercomputing Applications. ... Most of the cores were made up of Intel Xeon quad-core chips. Storage for the system was about 6 terabytes," and asks "I wonder how the uptime compares? When machines scale to this size, they tend to quirk out in weird ways."
Supercomputing

"Intrepid" Supercomputer Fastest In the World 122

Stony Stevenson writes "The US Department of Energy's (DoE) high performance computing system is now the fastest supercomputer in the world for open science, according to the Top 500 list of the world's fastest computers. The list was announced this week during the International Supercomputing Conference in Dresden, Germany. IBM's Blue Gene/P, known as 'Intrepid,' is located at the Argonne Leadership Computing Facility and is also ranked third fastest overall. The supercomputer has a peak performance of 557 teraflops and achieved a speed of 450.3 teraflops on the Linpack application used to measure speed for the Top 500 rankings. According to the list, 74.8 percent of the world's supercomputers (some 374 systems) use Intel processors, a rise of 4 percent in six months. This represents the biggest slice of the supercomputer cake for the firm ever."
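The quoted figures are internally consistent, as a quick check shows (both calculations use only numbers from the summary):

```python
# Quick checks on the figures quoted from the Top 500 list.
linpack = 450.3   # teraflops, measured on the Linpack benchmark
peak = 557.0      # teraflops, theoretical peak
efficiency = linpack / peak
print(f"Linpack efficiency: {efficiency:.1%}")   # ~80.8% of peak

intel_systems = 374
total_systems = 500
share = intel_systems / total_systems
print(f"Intel share: {share:.1%}")               # exactly 74.8%
```

An 80 percent Linpack efficiency is typical for a well-tuned Blue Gene installation, and the 374-of-500 count reproduces the 74.8 percent Intel share exactly.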
Supercomputing

Supercomputer Simulates Human Visual System 244

An anonymous reader writes "What cool things can be done with the 100,000+ cores of the first petaflop supercomputer, the Roadrunner, that were impossible to do before? Because our brain is massively parallel, with a relatively small amount of communication over long distances, and is made of unreliable, imprecise components, it's quite easy to simulate large chunks of it on supercomputers. The Roadrunner has been up only for about a week, and researchers from Los Alamos National Lab are already reporting inaugural simulations of the human visual system, aiming to produce a machine that can see and interpret as well as a human. After examining the results, the researchers 'believe they can study in real time the entire human visual cortex.' How long until we can simulate the entire brain?"
Supercomputing

Cell-based "Roadrunner" Tops Elusive Petaflop Mark 269

prunedude writes "The NY Times is reporting that an American military supercomputer, assembled from components originally designed for video game machines, is more than twice as fast as the previous fastest supercomputer, the I.B.M. BlueGene/L. To put the performance of the machine in perspective, Thomas P. D'Agostino, the administrator of the National Nuclear Security Administration, said that if all six billion people on Earth used hand calculators and performed calculations 24 hours a day, seven days a week, it would take them 46 years to do what the Roadrunner can do in one day."
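D'Agostino's comparison checks out as back-of-envelope arithmetic if each person performs roughly ten calculations per second, so it is order-of-magnitude rhetoric rather than a literal hand-calculator scenario:

```python
# Sanity check of the hand-calculator comparison for a petaflop machine.
roadrunner_flops = 1.0e15                 # petaflop-class machine
ops_per_day = roadrunner_flops * 86_400   # operations Roadrunner does in one day

people = 6e9
years = 46
seconds = years * 365.25 * 24 * 3600      # 46 years of round-the-clock punching
implied_rate = ops_per_day / (people * seconds)
print(f"implied rate: {implied_rate:.1f} calculations per person per second")  # ~10
```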
Supercomputing

Linux Cluster Supercomputer Performs Surgery on Dog 68

An anonymous reader writes "In April, the Lonestar supercomputer, a Dell Linux Cluster with 5,840 processors at the Texas Advanced Computing Center in Austin, performed laser surgery on a dog in Houston without the intervention of a surgeon. The article describes the process: 'The treatment itself is broken into four stages: 1) Lonestar instructs the laser to heat the domain with a non-damaging calibration pulse; 2) the thermal MRI acquires baseline images of the heating and cooling of the patient's tissue for model calibration; 3) Lonestar inputs this patient-specific information and recomputes the optimal power profile for the rest of the treatments; and 4) surgery begins, with remote visualizations and evolving predictions continuing throughout the procedure.'"
Supercomputing

Diamonds Key To Quantum Computing 92

I Don't Believe in Imaginary Property writes "Scientists P. Neumann, N. Mizuochi & co. have advanced quantum computing by finding a new method to get two-way and three-way, high quality quantum correlations that persist for hundreds or thousands of microseconds, even at room temperature. Their paper (subscription required) describes how they manipulated electrons from nitrogen vacancies in diamond using microwaves to entangle adjacent carbon-13 nuclei. Even better, this builds on previous results which indicate that diamonds with nitrogen impurities may be the key to creating useful quantum computing devices. The article provides a good description of what nitrogen vacancies are and why they prove useful."
Supercomputing

IBM Water-Cools 3D Multi-Core Chip Stacks 170

An anonymous reader writes "Water cooling will enable multi-core processors to be stacked into 3D cubes, according to IBM's Zurich Research Laboratory, which is demonstrating three-dimensional chip stacks. By stacking memory chips between processor cores, IBM plans to multiply interconnections by 100 times while reducing their feature size tenfold. To cool the stack at a rate of 180 watts per layer, water flows down 50-micron channels between the stacked chips. Earlier this year, the same group described a copper-plate water-cooling method for IBM's Hydro-Cluster supercomputer. The Zurich team predicts high-end IBM multicore computers will migrate from copper-plate water cooling to 3D chip stacks in five to 10 years." Reader Lilith's Heart-shape adds a link to the BBC's article on these internally-cooled chips.
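A rough heat balance shows the scale of water flow involved per layer; the 15 K coolant temperature rise assumed below is illustrative, not a figure from IBM:

```python
# Rough estimate of the water flow needed to remove 180 W from one layer.
power_w = 180.0        # heat per layer (from the article)
c_p = 4186.0           # specific heat of water, J/(kg*K)
delta_t = 15.0         # ASSUMED inlet-to-outlet temperature rise, K

mass_flow = power_w / (c_p * delta_t)   # kg/s, from P = m_dot * c_p * delta_T
print(f"required flow: {mass_flow * 1000:.2f} g/s per layer")  # ~2.87 g/s
```

A few grams per second per layer is a modest flow, which is why splitting it across thousands of 50-micron microchannels is plausible despite their tiny cross-section.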
Supercomputing

Supercomputer Built With 8 GPUs 232

FnH writes "Researchers at the University of Antwerp in Belgium have created a new supercomputer with standard gaming hardware. The system uses four NVIDIA GeForce 9800 GX2 graphics cards, costs less than €4,000 to build, and delivers roughly the same performance as a supercomputer cluster consisting of hundreds of PCs. This new system is used by the ASTRA research group, part of the Vision Lab of the University of Antwerp, to develop new computational methods for tomography. The guys explain the eight NVIDIA GPUs deliver the same performance for their work as more than 300 Intel Core 2 Duo 2.4GHz processors. On a normal desktop PC their tomography tasks would take several weeks but on this NVIDIA-based supercomputer it only takes a couple of hours. The NVIDIA graphics cards do the job very efficiently and consume a lot less power than a supercomputer cluster."
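The group's numbers hang together, as a quick consistency check shows; "several weeks" is taken here as three weeks, which is an assumption:

```python
# Checking the Antwerp group's claims for internal consistency.
gpus = 8                   # four GeForce 9800 GX2 cards, two GPUs each
cpu_equivalent = 300       # stated: 8 GPUs ~ 300 Core 2 Duo 2.4 GHz CPUs
desktop_runtime_h = 3 * 7 * 24   # ASSUMED: "several weeks" = 3 weeks on one PC

per_gpu = cpu_equivalent / gpus
gpu_runtime_h = desktop_runtime_h / cpu_equivalent
print(f"~{per_gpu:.1f} CPU-equivalents per GPU for this tomography code")
print(f"predicted GPU-box runtime: {gpu_runtime_h:.1f} hours")
```

Under that assumption the predicted runtime comes out under two hours, consistent with the "couple of hours" the group reports.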
Supercomputing

U.S. Plan For "Thinking Machines" Repository 148

An anonymous reader writes "Information scientists organized by the US's NIST say they will create a 'concept bank' that programmers can use to build thinking machines that reason about complex problems at the frontiers of knowledge — from advanced manufacturing to biomedicine. The agreement by ontologists — experts in word meanings and in using appropriate words to build actionable machine commands — outlines the critical functions of the Open Ontology Repository (OOR). More on the summit that produced the agreement here."
Supercomputing

IBM Touts Supercomputers for Enterprise 94

Stony Stevenson writes "IBM has announced an initiative to offer smaller versions of its high-performance computers to enterprise customers. The first new machine is a QS22 BladeCenter server powered by a Cell processor. Developed to power gaming systems, the Cell chip has also garnered interest from the supercomputing community owing to its ability to handle large amounts of floating point calculations. IBM hopes that the chips, which currently power climate modeling and other traditional supercomputing tasks, will also appeal to customers ranging from financial analysis firms to animation studios."
Handhelds

Researcher Discusses iPod Supercomputer 108

schliz writes to mention that, in a recent interview with ITNews, researcher John Shalf explained the purpose and some of the technical details of the newly announced "iPod supercomputer." "Microprocessors from portable electronics like iPods could yield low-cost, low-power supercomputers for specialized scientific applications, according to computer scientist John Shalf. Along with a research team from the US Department of Energy's Lawrence Berkeley National Laboratory, Shalf is designing a supercomputer based on low-power embedded microprocessors, which has the sole purpose of improving global climate change predictions."
Supercomputing

Modeling Supernovae With a Supercomputer 64

A team of scientists at the University of Chicago will be using 22 million processor-hours to simulate the physics of exploding stars. The team will make use of the Blue Gene/P supercomputer at Argonne National Laboratory to analyze four different scenarios for type Ia supernovae. Included in the link is a video simulation of a thermonuclear flame busting its way out of a white dwarf. The processing time was made possible by the Department of Energy's INCITE program. "Burning in a white dwarf can occur as a deflagration or as a detonation. 'Imagine a pool of gasoline and throw a match on it. That kind of burning across the pool of gasoline is a deflagration,' Jordan said. 'A detonation is simply if you were to light a stick of dynamite and allow it to explode.' In the Flash Center scenario, deflagration starts off-center of the star's core. The burning creates a hot bubble of less dense ash that pops out the side due to buoyancy, like a piece of Styrofoam submerged in water."
Supercomputing

Purdue Plans a 1-Day Supercomputer "Barnraising" 97

An anonymous reader points out an article which says that "Purdue University says it will need only one day to install the largest supercomputer on a Big Ten campus. The so-called 'electronic barn-raising' will take place May 5 and involve more than 200 employees. The computer will be about the size of a semi trailer. Purdue Vice President for Information Technology Gerry McCartney says it will be built in a single day to keep science and engineering researchers from facing a lengthy downtime." Another anonymous reader adds "To generate interest on campus, the organizers created a spoof movie trailer called 'Installation Day.'"
Supercomputing

Cray, Intel To Partner On Hybrid Supercomputer 106

An anonymous reader writes "Intel convinced Cray to collaborate on what many believe will be the next generation of supercomputers — CPUs complemented by floating-point acceleration units. NVIDIA successfully placed its Tesla cards in an upcoming Bull supercomputer, and today we learn that Cray will be using Intel's x86 Larrabee accelerators in a supercomputer that is expected to be unveiled by 2011. It's a new chapter in the Intel-NVIDIA battle and a glimpse at the future of supercomputers operating in the petaflop range. The deal has also got to be a blow to AMD, which has been Cray's main chip supplier."
