
Swarm — a New Approach To Distributed Computation

An anonymous reader writes "Ian Clarke, creator of Freenet, has been working on a new open source project called Swarm. The concept is to allow a computer program to be distributed across multiple computers in a manner almost completely transparent to the programmer. The system observes the program executing and figures out how the workload should be distributed for maximum efficiency. Swarm is implemented in Scala. It's at an early prototype stage, and Ian has created a good 36-minute video explaining the concept and the current implementation."
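The core idea in the summary — moving the running computation to the machine that holds the data, rather than moving the data to the computation — can be sketched as a toy model. This is plain Java with invented names, not Swarm's actual Scala/continuation-based mechanism, which does the migration transparently:

```java
import java.util.function.Function;

// Toy model of "move the computation to the data": each value is tagged
// with the node it lives on, and applying a function to it conceptually
// "runs" on that node. Illustration only -- Swarm does this implicitly
// by serializing and shipping the program's continuation.
public class MoveToData {
    record Located<T>(String node, T value) {}

    static <T, R> Located<R> at(Located<T> data, Function<T, R> f) {
        // A real system would ship the function (or the continuation)
        // to data.node; here we just tag the result with where it "ran".
        return new Located<>(data.node(), f.apply(data.value()));
    }

    public static void main(String[] args) {
        Located<Integer> x = new Located<>("nodeA", 21);
        Located<Integer> y = at(x, v -> v * 2); // "runs" on nodeA
        System.out.println(y.node() + ": " + y.value()); // nodeA: 42
    }
}
```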
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Sunday October 11, 2009 @02:15PM (#29712209)

    Just saying it out loud so we can start working on countermeasures.

  • by Darkness404 ( 1287218 ) on Sunday October 11, 2009 @02:32PM (#29712309)
    You know though, most people don't ever check that. They think that over time Windows just "gets slow" because hardware "goes obsolete". So when that happens they think they have to buy a new computer.
  • looks intriguing (Score:5, Insightful)

    by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Sunday October 11, 2009 @02:45PM (#29712397)

    The thing that's always killed this idea (along with automatic parallelization even on the same machine) is that the overhead of figuring out what's worth distributing, and the additional overhead from mistakes (accidentally distribute trivial computations), often swamps the gains from the multiple processors banging away on it simultaneously. Determining statically what's worth distributing is very hard, since solving it properly is undecidable (basically equivalent to the halting problem), and even solving it in a significant enough subset of cases to be useful has proved difficult. It looks like this project is monitoring dynamically to determine what to distribute, which seems likely to be more fruitful, although historically that approach has suffered from the overhead of the monitoring (like always running your code with debugging instrumentation turned on).

    I certainly hope he has a breakthrough vs. past approaches, or it could just be that advances in a lot of areas of technology have given him a better substrate on which to build things that naturally mitigates lots of the problems these things used to have (automatic parallelization research started probably ahead of its time, back in the 1970s, so that most academic stuff was killed off by the 1990s after no really knock-down results emerged). It's not entirely clear to me what the killer advance is, though. The particular variety of portable continuations? A good way of easily monitoring computations? Something that makes the data-dependency analysis particularly easy?
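The cost-model concern in the parent comment can be made concrete with a toy heuristic (in Java rather than Swarm's Scala, and with invented numbers): only ship a task to another node when its estimated running time outweighs the fixed overhead of distributing it.

```java
import java.util.concurrent.Callable;

// Toy cost model: distribute a task only when its estimated compute
// time exceeds the fixed cost of shipping it elsewhere. The overhead
// figure and the API are illustrative, not from Swarm.
public class DistributionHeuristic {
    // Assumed serialization + network round-trip cost, in milliseconds.
    static final double OVERHEAD_MS = 5.0;

    static boolean shouldDistribute(double estimatedMs) {
        return estimatedMs > OVERHEAD_MS;
    }

    static <T> T run(Callable<T> task, double estimatedMs) throws Exception {
        if (shouldDistribute(estimatedMs)) {
            // A real system would serialize the task and send it to a
            // remote worker here; this sketch only records the decision.
            System.out.println("distribute (est " + estimatedMs + " ms)");
        } else {
            System.out.println("run locally (est " + estimatedMs + " ms)");
        }
        return task.call(); // always executed locally in this sketch
    }

    public static void main(String[] args) throws Exception {
        run(() -> 2 + 2, 0.001); // trivial computation: keep it local
        run(() -> 6 * 7, 200.0); // expensive (pretend): worth distributing
    }
}
```

The hard part, as the comment notes, is that `estimatedMs` is exactly what static analysis cannot reliably give you, which is why dynamic monitoring looks more promising despite its own overhead.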

  • by hazem ( 472289 ) on Sunday October 11, 2009 @03:58PM (#29712771) Journal

    I'm just getting into Agent Based Modeling myself and I had exactly the same thought: why would they use the name of an established tool, especially when there are similarities in the concepts? This seems like a recipe for confusion.

    A good first step when starting an open project is to check proposedprojectname.org and see if there's anything active there. Or even just Google it - if another project shows up near the top with the same name, it's probably a good idea to pick another name.

    I'm sure there are plenty of synonyms for "swarm" that capture the idea, if not an alternate spelling.

    But like you said, it does sound like an interesting project.

  • by FlyingBishop ( 1293238 ) on Sunday October 11, 2009 @04:41PM (#29713013)

    Depending on how many cores you have access to, distributing trivial computations may not matter. If we ever start seeing 32 core desktop machines, for example, you start to get to the point where forking could create a realtime speedup even though in absolute terms you've wasted 5 times as many cycles.
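The trade-off in the parent comment can be checked with back-of-the-envelope numbers (idealized, assuming work splits perfectly across cores):

```java
// Wall-clock time when parallelization inflates total work by
// wasteFactor but spreads it over the given number of cores.
// Perfect splitting is assumed -- an idealization.
public class SpeedupMath {
    static double wallTime(double serialSeconds, double wasteFactor, int cores) {
        return serialSeconds * wasteFactor / cores;
    }

    public static void main(String[] args) {
        double serial = 10.0;                        // 10 s on one core
        double parallel = wallTime(serial, 5.0, 32); // 5x wasted cycles, 32 cores
        System.out.println(parallel);                // 1.5625 s: still a 6.4x win
    }
}
```

So even at a 5x cycle penalty, 32 cores turn 10 seconds of serial work into about 1.6 seconds of wall-clock time, which is the comment's point.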

  • by Anonymous Coward on Sunday October 11, 2009 @04:41PM (#29713015)

    Mod parent up. This is exactly what Ian did with Freenet.

    He cobbled together an overly-simplistic prototype to address a set of very difficult unsolved problems in anonymous communication and then farmed out the actual real-world legwork on those problems to interested open source developers while Ian himself effectively abandoned Freenet for other (paying) gigs. To this day he is credited, somewhat ironically, as "the creator of Freenet," and a decade later the Freenet project still hasn't solved the problems it set out to solve, even after changing the fundamental network architecture several times.

    Great career strategy though. Get credit for the shiny things and pass the shame of failure off on others. He's CEO material all the way.

  • by BikeHelmet ( 1437881 ) on Sunday October 11, 2009 @05:39PM (#29713427) Journal

    In my experience, Java is not the reason people buy new computers.

    Their computers slow down from viruses, or virus-like Antivirus, and then they think they need to upgrade.

    Lately commercially made programs (AIM? Windows Live stuff? Most printer software? Most shareware?) seem to consume as much memory as a whole JVM, despite being written in C. This has led me to conclude that companies really don't give a shit how much memory their software uses. This is quite ironically pushing Java closer and closer to C in actual memory and CPU usage.

    Disclaimer: I know C is amazing when used properly - but it seems like only small FOSS projects and apps destined for phones have any sort of optimization work done. I've seen daemons use 200KB on a tiny Linux handheld, but multiple megabytes is the norm on any desktop.

  • by Anonymous Coward on Sunday October 11, 2009 @06:00PM (#29713607)
    "Writing free operating systems is hard, really hard. It's so hard that nobody ever did it properly. But Linux will change this! How? Well, I've produced some code that works, but there is a lot left to do, and you can help!" - Linus around 1991 (paraphrased)

    Clarke never said he doesn't know how to solve the remaining problems (of which, he freely admits in the video, there are many). Would you prefer that no open source project was released to the world until it was 100% finished? Good luck with that.
