Supercomputing

Is Parallelism the New New Thing?

astwon sends us to a blog post by parallel computing pioneer Bill McColl speculating that, with Web 2.0 cooling off, parallelism may be the next hot area for entrepreneurs and investors. (Take it with the requisite grains of salt, as he is the founder of a Silicon Valley company in this area.) McColl suggests a few other upcoming "new things," such as SaaS as an appliance and massive-memory systems. Worth a read.
  • by Nursie ( 632944 ) on Friday March 28, 2008 @10:50AM (#22893630)
    Oh yes, here it is [slashdot.org].

    And the conclusion?

    It's been around for years, numbnuts: in commercial and server applications, middle tiers, databases, and a million and one other things worked on by serious software developers (i.e., not web-programming dweebs).

    Parallelism has been around for ages and has been used commercially for a couple of decades. Get over it.
  • by 1sockchuck ( 826398 ) on Friday March 28, 2008 @10:53AM (#22893670) Homepage
    This sure looks like a growth area for qualified developers. An audience poll at the Gartner Data Center conference in Las Vegas in November found that just 17 percent of attendees [datacenterknowledge.com] felt their developers were prepared to code multi-core applications, while 64 percent said they would need to train or hire developers for parallel processing. "We believe a minority of developers have the skills to write parallel code," said Gartner analyst Carl Claunch. I take the Gartner stuff with a grain of salt, but the audience poll was interesting.

    McColl's blog is pretty interesting. He only recently started writing regularly again. High Scalability [highscalability.com] is another worthwhile resource in this area.
  • by david.emery ( 127135 ) on Friday March 28, 2008 @10:57AM (#22893720)
    So all of a sudden people have discovered parallelism? Gee, one of the really interesting things about Ada in the late '80s was its use on multiprocessor systems such as those produced by Sequent and Encore. There was a lot of work on the language itself (which went into Ada95) and on compiler technology to support 'safe parallelism'. "Safe" here means a 'correct implementation' against the language standard, considering things like cache consistency as parts of a program get executed on different CPUs, each with its own cache.

    Here are a couple of lessons learned from that Ada experience:
    1. Sometimes you want synchronization, and sometimes you want avoidance. Ada83 tasking/rendezvous provided synchronization but was hard to use for avoidance; Ada95 added protected objects to handle avoidance. (See the sketch after this list.)
    2. In Ada83, aliasing was forbidden by default, which made it a lot easier for the compiler to reason about things like cache consistency. Ada95 added more pragmas, etc., to provide additional control over aliasing and atomic operations.
    3. A lot of early experience with concurrency and parallelism in Ada taught (usually the hard way) that there's a 'sweet spot' in the number of concurrent actions. Too many, and the machine bogs down in scheduling and synchronization; too few, and you don't keep all of the processors busy. One of the interesting things Karl Nyberg worked on in his Sun T1000 contest review was the tuning necessary to keep as many cores as possible running (http://www.grebyn.com/t1000/ [grebyn.com]). (Disclosure: I don't work for Grebyn, but I do have an account on grebyn.com as a legacy of the old days when they were in the ISP business in the '80s, and Karl is an old friend of very long standing....)
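
    Here is a rough sketch of lesson 1, translated out of Ada into C with POSIX threads (a hypothetical illustration; the names and numbers are invented): "avoidance" is plain mutual exclusion around shared state, roughly what Ada95 protected objects buy you, while "synchronization" is a rendezvous-style handoff where a task blocks until its partner shows up. Build with cc -pthread sketch.c.

        /* Hypothetical sketch: Ada's "synchronization vs. avoidance"
           distinction, rebuilt with POSIX threads. */
        #include <pthread.h>
        #include <stdio.h>

        /* "Avoidance" (roughly an Ada95 protected object): callers never
           wait for a partner; they just exclude each other around state. */
        typedef struct {
            pthread_mutex_t lock;
            long count;
        } protected_counter;

        static protected_counter counter = { PTHREAD_MUTEX_INITIALIZER, 0 };

        static void counter_increment(protected_counter *c) {
            pthread_mutex_lock(&c->lock);
            c->count++;                          /* critical section, no partner needed */
            pthread_mutex_unlock(&c->lock);
        }

        /* "Synchronization" (roughly an Ada83 rendezvous): the sender
           blocks until a receiver takes the value, and vice versa. */
        static pthread_mutex_t rv_lock  = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t  rv_full  = PTHREAD_COND_INITIALIZER;
        static pthread_cond_t  rv_empty = PTHREAD_COND_INITIALIZER;
        static int  rv_slot_full = 0;
        static long rv_value;

        static void rendezvous_send(long v) {
            pthread_mutex_lock(&rv_lock);
            while (rv_slot_full)                 /* wait for a receiver to drain the slot */
                pthread_cond_wait(&rv_empty, &rv_lock);
            rv_value = v;
            rv_slot_full = 1;
            pthread_cond_signal(&rv_full);
            pthread_mutex_unlock(&rv_lock);
        }

        static long rendezvous_receive(void) {
            pthread_mutex_lock(&rv_lock);
            while (!rv_slot_full)                /* wait for a sender to arrive */
                pthread_cond_wait(&rv_full, &rv_lock);
            long v = rv_value;
            rv_slot_full = 0;
            pthread_cond_signal(&rv_empty);
            pthread_mutex_unlock(&rv_lock);
            return v;
        }

        static void *worker(void *arg) {
            (void)arg;
            for (int i = 0; i < 1000; i++)
                counter_increment(&counter);     /* avoidance path */
            rendezvous_send(42);                 /* synchronization path */
            return NULL;
        }

        #define WORKERS 2                        /* lesson 3: this count is what you tune */

        int main(void) {
            pthread_t t[WORKERS];
            for (int i = 0; i < WORKERS; i++)
                pthread_create(&t[i], NULL, worker, NULL);
            for (int i = 0; i < WORKERS; i++)
                printf("rendezvous got %ld\n", rendezvous_receive());
            for (int i = 0; i < WORKERS; i++)
                pthread_join(t[i], NULL);
            printf("counter = %ld\n", counter.count);  /* 2000 */
            return 0;
        }

    A true rendezvous also holds the sender until the value has actually been accepted; this one-slot version only approximates that, which hints at why compilers and runtimes had to work so hard to get 'safe parallelism' right.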

    All this reminds me of a story from Tracy Kidder's Soul of a New Machine http://en.wikipedia.org/wiki/The_Soul_of_a_New_Machine [wikipedia.org]. There was an article in the trade press about a new IBM minicomputer, with the title "IBM legitimizes minicomputers". Data General proposed (or ran, I forget which) an ad that built on that article, saying "The bastards say, 'welcome'."

    dave

  • by kiyoshilionz ( 977589 ) on Friday March 28, 2008 @11:35AM (#22894158)

    You think that nobody has a real interest in parallel computing? Intel and Microsoft have already put their money on it: they've allotted $20 million between UC Berkeley [berkeley.edu] and the University of Illinois [uiuc.edu] to research parallel computing, in both hardware and software.

    I am an EECS student at Cal right now, and I have heard talks by the UC Berkeley PARLab [berkeley.edu] professors (Krste Asanovic and David Patterson, the man who brought us RAID and RISC), and all of them say that the computing industry is going to change radically unless we figure out how to use parallelism efficiently. This is the first time in history that software performance is beginning to lag behind how fast we can make our hardware. The failure of frequency scaling to keep improving system performance showed in the failure of the NetBurst microarchitecture: remember Prescott? And the cancelled Tejas and Jayhawk [wikipedia.org]? Building faster chips is over; it's now a thermal engineering problem, since we can already make chips put out more heat per unit area than the surface of the sun. Quoting Professor Hennessy of Stanford:

    "...when we start talking about parallelism and ease of use of truly parallel computers, we're talking about a problem that's as hard as any that computer science has faced. ... I would be panicked if I were in industry. ... you've got a very difficult situation."

    To whoever is saying that parallelism is just a fad: you're really missing a lot of what's going on in the computing world. We've already switched to dual- and quad-core CPUs, and it doesn't look like the core counts are going to stop growing any time soon.
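
    A back-of-the-envelope illustration of why more cores don't automatically mean more speed (my own addition, not from the talks): Amdahl's law gives the speedup on n cores as 1 / ((1 - p) + p/n), where p is the fraction of the program that can run in parallel. Even at p = 0.95 the cap is 20x:

        /* Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n).
           The parallel fraction p = 0.95 is assumed, for illustration. */
        #include <stdio.h>

        static double amdahl_speedup(double p, int n) {
            return 1.0 / ((1.0 - p) + p / (double)n);
        }

        int main(void) {
            const double p = 0.95;
            const int cores[] = { 1, 2, 4, 8, 64, 1024 };
            for (int i = 0; i < 6; i++)
                printf("%5d cores -> %6.2fx speedup\n",
                       cores[i], amdahl_speedup(p, cores[i]));
            return 0;
        }

    This prints 1.90x for 2 cores, 5.93x for 8, and only about 19.6x for 1024; the serial 5 percent dominates, and shrinking it is exactly the hard software problem that research money is chasing.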

  • Re:About time (Score:5, Informative)

    by pleappleappleap ( 1182301 ) on Friday March 28, 2008 @11:37AM (#22894174) Homepage

    As a user of Linux, I have to say that parallelism is the 'old thing', as Linux has supported parallel operations for over a decade. Compare this to closed-source, proprietary operating systems, such as Windows, where this sort of thing is relatively new.

    Windows is not the only closed-source proprietary operating system out there. AIX and Solaris have supported parallel operation for many years, various IBM mainframe operating systems have had it since the '70s, and there are architectures that had it in the '60s.

    Proprietary closed-source operating systems had these functions FIRST, before Linux was a twinkle in Linus Torvalds's shorts.

  • by Anonymous Coward on Friday March 28, 2008 @12:14PM (#22894590)
    Unfortunately for them, they could never get it to work.

    Hyperbole much? Parallel systems such as MPI have been a staple of high-performance computing since the mid-'90s (a minimal example is sketched below), and there are plenty of developers (myself included) who can write multi-threaded code without breaking a sweat, and get it right.

    At what point did parallel and concurrent programming "fail"? I really must have missed that memo.
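
    To make that concrete, here's the shape of the MPI code I mean: a minimal sketch (the file name and problem are invented for illustration) which, assuming an MPI installation, builds and runs with mpicc sum.c -o sum && mpirun -np 4 ./sum.

        /* Minimal MPI sketch: each rank sums its own slice of 1..N,
           then MPI_Reduce combines the partial sums on rank 0. */
        #include <mpi.h>
        #include <stdio.h>

        int main(int argc, char **argv) {
            int rank, size;
            MPI_Init(&argc, &argv);
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);

            /* Carve 1..N into disjoint per-rank slices. */
            const long N = 1000000;
            long lo = rank * (N / size) + 1;
            long hi = (rank == size - 1) ? N : lo + N / size - 1;

            double local = 0.0, total = 0.0;
            for (long i = lo; i <= hi; i++)
                local += (double)i;

            MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
            if (rank == 0)
                printf("sum = %.0f (expected %.0f)\n",
                       total, (double)N * (N + 1) / 2);

            MPI_Finalize();
            return 0;
        }

    Each rank owns a disjoint slice, so there's no shared mutable state to race on; the only coordination point is the final reduce. That discipline is a big part of why this style has scaled in HPC for over a decade.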
  • by MOBE2001 ( 263700 ) on Friday March 28, 2008 @12:23PM (#22894700) Homepage Journal
    Parallel systems such as MPI have been a staple of high-performance computing since the mid-'90s, and there are plenty of developers (myself included) who can write multi-threaded code without breaking a sweat, and get it right.

    In that case, you should hurry and tell Microsoft and Intel to refrain from giving that $20 million they want to give to UC Berkeley and UI Urbana-Champaign to find a solution to the parallel programming problem. According to you, Microsoft, Intel, AMD, and all the others are wasting hundreds of millions of dollars in research labs around the world trying to make it easy to build apps with threads. After all, you've already found the solution, right? And you found an easy way to build threaded programs, right?

    Sure.
