Supercomputing

Breakthrough Toward Quantum Computing

redwolfe7707 writes "Qubit registers have been hard to construct; this looks to be a substantial advance in creating the many entangled states they require. Quoting: 'Olivier Pfister, a professor of physics in the University of Virginia's College of Arts & Sciences, has just published findings in the journal Physical Review Letters demonstrating a breakthrough in the creation of massive numbers of entangled qubits, more precisely a multilevel variant thereof called Qmodes. ... Pfister and researchers in his lab used sophisticated lasers to engineer 15 groups of four entangled Qmodes each, for a total of 60 measurable Qmodes, the most ever created. They believe they may have created as many as 150 groups, or 600 Qmodes, but could measure only 60 with the techniques they used.'" In related news, research published in the New Journal of Physics (abstract) shows "how quantum and classical data can be interlaced in a real-world fiber optics network, taking a step toward distributing quantum information to the home, and with it a quantum internet."
The Almighty Buck

Banks' Big Upgrade: Meet Real-Time Processing

CWmike writes "It has been years since the banking industry made any large investments in core IT systems, but some of the largest financial services firms in the U.S. are now in the midst of rolling out multi-million-dollar projects, say industry experts. About a decade ago, they began replacing decades-old COBOL-based core systems with open, Web-enabled apps. Now they are spending more than $100 million to replace aging systems, converting to real-time mobile applications for retail services such as savings and checking accounts, and for lending systems. The idea behind going real-time: grab more business — and money — from customers. 'Five of the top 20 banks are engaged in some sort of core banking replacement, and we expect to see another three or four in the next 12 months,' said Fiaz Sindhu, who leads Accenture's North American core banking practice. 'They're looking at those upgrades as a path to growth.'"
Supercomputing

JPMorgan Rolls Out FPGA Supercomputer

An anonymous reader writes "As heterogeneous computing starts to take off, JP Morgan have revealed they are using an FPGA based supercomputer to process risk on their credit portfolio. 'Prior to the implementation, JP Morgan would take eight hours to do a complete risk run, and an hour to run a present value, on its entire book. If anything went wrong with the analysis, there was no time to re-run it. It has now reduced that to about 238 seconds, with an FPGA time of 12 seconds.' Also mentioned is a Stanford talk given in May."
Supercomputing

A Million-Node Supercomputer

An anonymous reader writes "Veteran of microcomputing Steve Furber, in his role as ICL Professor of Computer Engineering in the School of Computer Science at the University of Manchester, has called upon some old friends for his latest project: a brain-simulating supercomputer based on more than a million ARM processors." More detailed information can be found in the research paper.
AMD

AMD Gains In the TOP500 List

MojoKid writes "AMD recently announced that its share of the TOP500 supercomputer list has grown 15 percent in the past six months. The company credits industry trends, upgrade paths, and competitive pricing for the increase. Of the 68 Opteron-based systems on the list, more than half use Opteron 6100 series processors. The inflection point was AMD's launch of its Magny-Cours architecture more than a year ago, a line that includes the twelve-core Opteron 6180 SE at 2.5GHz at one end and two low-power parts at the other. Magny-Cours adoption is important: companies typically don't upgrade HPC clusters with new CPUs, but AMD is billing its next-gen Interlagos architecture as a drop-in option for Magny-Cours. As such, it'll offer up to twice the cores at equal or faster clock speeds."
Supercomputing

Could Wikipedia Become a Supercomputer?

An anonymous reader writes "Large websites represent an enormous resource of untapped computational power. This short post explains how a large website like Wikipedia could give a tremendous contribution to science, by harnessing the computational power of its readers' CPUs and help solve difficult computational problems." It's an interesting thought experiment, at least — if such a system were practical to implement, what kind of problems would you want it chugging away at?
Supercomputing

Intel Aims For Exaflops Supercomputer By 2018

siliconbits writes "Intel has laid down its roadmap in terms of computing performance for the next seven years in a press release; in addition, it revealed its expectations through 2027 in a slide deck shown last week. The semiconductor maker wants a supercomputer capable of reaching 1000 petaflops (one exaflops) to be unveiled by the end of 2018 (just in time for the company's 50th anniversary), with four exaflops as the upper-end target by the end of the decade. The slides also show that Intel wants to smash the zettaflops barrier — that's one million petaflops — sometime before 2030. This, Intel expects, will allow for significant strides in genomics research, as well as much more accurate weather prediction (assuming Skynet or the Matrix hasn't taken over the world)."
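
The prefix arithmetic in the summary checks out, since flops counts scale by exact SI prefixes. A trivial Python sanity check (the labels are just shorthand for the roadmap points above):

    # Flops counts scale by SI prefixes, so these conversions are exact.
    PETA, EXA, ZETTA = 10**15, 10**18, 10**21

    assert EXA == 1000 * PETA      # 1 exaflops = 1000 petaflops
    assert ZETTA == 10**6 * PETA   # 1 zettaflops = one million petaflops

    # The roadmap milestones quoted above, expressed in petaflops:
    targets = {
        "2018 (1 exaflops)": 1 * EXA,
        "end of decade (4 exaflops)": 4 * EXA,
        "before 2030 (1 zettaflops)": 1 * ZETTA,
    }
    for label, flops in targets.items():
        print(f"{label}: {flops / PETA:,.0f} petaflops")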
Supercomputing

Japan's 8-Petaflop K Computer Is Fastest On Earth

Stoobalou writes "An eight-petaflop Japanese supercomputer has grabbed the title of fastest computer on Earth in the new TOP500 supercomputing list, officially unveiled at the International Supercomputing Conference in Hamburg today. The K Computer is based at the RIKEN Advanced Institute for Computational Science in Kobe, Japan, and smashes the previous supercomputing record with a processing power of more than 8 petaflop/s (quadrillion calculations per second) — three times that of its nearest rival."
IBM

IBM Turns 100

adeelarshad82 writes "On this day in 1911, IBM started as the Computing-Tabulating-Recording Company (C-T-R). It wasn't until 1924 that the company changed its name to IBM. Needless to say, a 100-year milestone is quite the feat. While some of us might know IBM for its recent "Jeopardy"-playing Watson computer, a look back shows that IBM has a long history of innovation, from cheese slicers (yes, really) and the tech behind Social Security to the UPC bar code and the floppy disk. One of the most notable leaps of faith IBM took was in 1964 with the introduction of System/360, a family of computers that started the era of computer compatibility. To date the company has invested nearly $30 billion in technology."
China

Chinese Tianhe-1A Supercomputer Starts Churning Out the Science

gupg writes "When China built the world's fastest supercomputer based on NVIDIA GPUs last year, a lot of naysayers said it was just a stunt machine. Well, guess what — here comes the science! Researchers are using it to develop better materials for solar panels, and they ran the world's fastest simulation ever. NVIDIA (whose GPUs accelerate these applications as co-processors) blogged about this a while ago, talking about how the US really needs to up its investment in high-performance computing."
Data Storage

Too Much Data? Then 'Good Enough' Is Good Enough

ChelleChelle writes "While classic systems could offer crisp answers due to the relatively small amount of data they contained, today's systems hold humongous amounts of data — thus, the data quality and meaning are often fuzzy. In this article, Microsoft's Pat Helland examines the ways in which today's answers differ from what we used to expect, before moving on to state the criteria for a new theory and taxonomy of data."
Supercomputing

Simulations Show Quantum Error Not As Bad As Believed

aarondubrow writes "Because quantum systems become unstable quickly, their error threshold is an important factor: how many qubits can 'break' before the system stops working? An international team of researchers used supercomputers at the Texas Advanced Computing Center to simulate the error threshold of a topological quantum system and found a much higher threshold than had previously been reported."
Red Hat Software

Ask Slashdot: Best Linux Distro For Computational Cluster?

DrKnark writes "I am not an IT professional, but even so I am one of the more knowledgeable people in such matters at my department. We are now planning to build a new cluster (smallish, ~128 cores). The old cluster (built before my time) used Red Hat's Fedora, which is also used in the larger centralized clusters around here, so most people have some experience with it. My question is: are there better choices? Why are they better? What would be recommended if we need it to be fairly user friendly? It has to support X11, since we use it remotely from our Windows (yeah, yeah, I know) workstations."
Supercomputing

Cray Unveils Its First GPU Supercomputer

An anonymous reader writes "Supercomputer giant Cray has lifted the lid on its first GPU offering, bringing it into the realm of top supers like the Chinese Tianhe-1A" The machine consists of racks of blades, each with eight GPU and CPU pairs (that can even be installed into older machines). It looks like Cray delayed the release of hardware using GPUs to work on a higher level programming environment than is available from other vendors.
Biotech

16-Year-Old Discovers Potential Treatment For Cystic Fibrosis

Bob the Super Hamste writes "According to a story at LiveScience, a 16-year-old Canadian 11th-grader has discovered a possible treatment for cystic fibrosis. The treatment is a combination of two drugs which, in a computer simulation on Canada's SciNet supercomputing network, did not interfere with each other while interacting with the defective protein responsible for the disorder. He has also tested the drug combination on living cells, with results that 'exceeded his expectations.'"
Cloud

Simulating Societies At the Global Scale

An anonymous reader writes "Teams of European researchers are vying to create a distributed supercomputer of unprecedented scale to analyze the data that streams in from hundreds of devices and feeds (mobile, social data, market data, medical input, etc) and use it to 'run global-scale simulations of social systems.'"
Math

Blue Gene/P Reaches Sixty-Trillionth Digit of Pi Squared

Reader Dr.Who notes that an Australian research team using IBM's Blue Gene/P supercomputer has calculated the sixty-trillionth binary digit of Pi squared, a task which took several months of processing. Snipping from the article, the Dr. writes: "'A value of Pi to 40 digits would be more than enough to compute the circumference of the Milky Way galaxy to an error less than the size of a proton.' The article goes on to cite the use of computationally complex algorithms to detect errors in computer hardware. The article references a blog which has more background. Disclaimers: I attended graduate school at U.C. Berkeley. I am presently employed by a software company that sells an infrastructure product named PI."
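
Computing a single digit that deep without computing all the digits before it relies on a BBP-type digit-extraction formula. As an illustration of the idea, here is a minimal Python sketch using the classic BBP formula for Pi itself (each hex digit encodes four binary digits); the team used a related but different BBP-type formula for Pi squared, so this shows the technique, not their exact computation.

    # Digit extraction via the Bailey-Borwein-Plouffe formula:
    #   pi = sum_k 1/16^k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))
    # Multiplying by 16^n and keeping only fractional parts lets us read
    # off the hex digit at position n+1 without the digits before it.

    def pi_hex_digit(n):
        """Hex digit of pi at position n (0-indexed) after the point."""
        def series(j):
            # Fractional part of sum_k 16^(n-k) / (8k+j).
            s = 0.0
            for k in range(n + 1):
                # Modular exponentiation keeps the integer part small.
                s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
            k = n + 1
            while True:                  # a few tail terms with k > n
                term = 16.0 ** (n - k) / (8 * k + j)
                if term < 1e-17:
                    return s % 1.0
                s += term
                k += 1

        frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
        return int(frac * 16)

    # pi = 3.243F6A88... in hex, so this prints "243F6A88".
    print("".join(f"{pi_hex_digit(n):X}" for n in range(8)))

Double precision limits this toy version to modest positions; the record-setting runs do the same bookkeeping with far more careful arithmetic, and running the computation twice at offset positions is what provides the hardware-error check the article mentions.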
