Social Networks | The Internet | Science

Citizen Science and Grid Computing (69 comments)

japonicus writes "The Economist has an article summarizing the current state of distributed computing (think SETI@home and its ilk), which suggests that distributed-human projects are going to be the next big thing. (We discussed one such project, the Galaxy Zoo, a few months back.) The distributed-computing platform BOINC is about to expand to human processing. Distributed Proofreaders has been a longstanding success (yet inexplicably failed to get even a mention in the article); but there are a lot of other projects waiting in the wings."
This discussion has been archived. No new comments can be posted.

  • Wiki? (Score:3, Interesting)

    by ttapper04 ( 955370 ) on Tuesday December 11, 2007 @05:09PM (#21662357) Journal
    I wonder if they could borrow ideas from the wiki community.
  • That'll have to be first. Not impossible to do, but given the state of the IT infrastructures I've seen, unlikely for a while.

     
  • Wow (Score:4, Funny)

    by moogied ( 1175879 ) on Tuesday December 11, 2007 @05:15PM (#21662477)
    The Economist, a magazine respected all around the world, has just published an article that concludes: "Two heads are better than one."

    Hmph..

  • games (Score:5, Informative)

    by enjahova ( 812395 ) on Tuesday December 11, 2007 @05:20PM (#21662579) Homepage
  • by east coast ( 590680 ) on Tuesday December 11, 2007 @05:21PM (#21662599)
    The Economist has an article summarizing the current state of distributed computing (think SETI@home and its ilk), which suggests that distributed-human projects are going to be the next big thing
     
    After all, just look at botnets. How much more insight do we need than that?
     
    If only Joe Sixpack (who leaves his computer on 24/7 even though he only uses it about half an hour per day) would understand that every clock cycle is sacred, every clock cycle is great...
     
    If only.
    • If only Joe Sixpack (who leaves his computer on 24/7 even though he only uses it about half an hour per day) would understand that every clock cycle is sacred, every clock cycle is great...

      Don't worry: Joe Sixpack is taking part in distributed computing, mainly distributed spamming and distributed DoSing. Thanks to Microsoft's legendary security and modern zombie worms, all those computers ARE indeed being used.

      Storm Botnet: bringing grid computing within the average Joe's reach (tm).

  • by compumike ( 454538 ) on Tuesday December 11, 2007 @05:24PM (#21662627) Homepage
    Sure, there are tasks that computers can't do very well at the moment, where parceling the work out to humans would make the most sense. But can you imagine what micropayments might allow? It would enable a consistent set of trained, motivated workers to be stable over time, and dependable enough to use this kind of network for important activities.

    Ultimately, humans get bored and computers don't. But financial compensation can hold off boredom for quite a while.

    --
    Educational microcontroller kits for the digital generation. [nerdkits.com]
    • But can you imagine what micropayments might allow?

      Abuse, fraud and theft?

      It would enable a consistent set of trained, motivated workers to be stable over time, and dependable enough to use this kind of network for important activities.

      I tend to agree with you, but you do have to figure out how to combat fraudulent activity. After all, most of these tasks are like "pick the picture that best matches foo" or whatever. But what if someone writes a bot to randomly click on a picture to get micropayments? Not so good, because not only were you cheated, but now you have a bunch of wrong data. How do you detect fraud in such a system?

      • Re: (Score:3, Informative)

        by Yetihehe ( 971185 )

        [...] But what if someone writes a bot to randomly click on a picture to get micropayments? Not so good, because not only were you cheated, but now you have a bunch of wrong data. How do you detect fraud in such a system?
        Did you RTFA? It's obvious: with redundancy. When ten users agree and one keeps missing that consensus, he is considered untrustworthy and is therefore ignored and not paid.
        • Exactly. Amazon's Mechanical Turk system already does this.
        • by galoise ( 977950 )

          Actually, at least in statistics, it's a bit more precise: you normally disregard data points more than three standard deviations from the population mean as "aberrant" cases. The problem is that random values in a vector normally do not deviate enough from valid cases to be detectable, so the noise produced by a cheating bot could very well cripple the whole project (a rough sketch of both checks follows below).

          Probably only after a lot of rounds, when tendencies are well known and researched, could you devise more precise tests to check the validity o
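      A minimal sketch of the two checks described above: the consensus/redundancy test and the three-standard-deviation outlier test. The data shape ({task: {worker: answer}}), the 80% agreement threshold, and the function names are illustrative assumptions, not the actual mechanism of BOINC or Mechanical Turk.

      # Sketch of redundancy-based trust scoring and outlier filtering.
      # Data shapes, names, and thresholds are illustrative assumptions.
      from collections import Counter
      from statistics import mean, stdev

      def majority_answers(responses):
          """responses: {task_id: {worker_id: answer}} -> consensus answer per task."""
          return {task: Counter(answers.values()).most_common(1)[0][0]
                  for task, answers in responses.items()}

      def worker_agreement(responses):
          """Fraction of tasks on which each worker matched the consensus."""
          consensus = majority_answers(responses)
          hits, totals = Counter(), Counter()
          for task, answers in responses.items():
              for worker, answer in answers.items():
                  totals[worker] += 1
                  hits[worker] += (answer == consensus[task])
          return {w: hits[w] / totals[w] for w in totals}

      def untrusted_workers(responses, min_agreement=0.8):
          """Workers who miss the consensus too often are ignored and not paid."""
          return {w for w, rate in worker_agreement(responses).items()
                  if rate < min_agreement}

      def aberrant_points(values, k=3.0):
          """Numeric variant: flag values more than k standard deviations from the mean."""
          if len(values) < 2:
              return []
          m, s = mean(values), stdev(values)
          return [] if s == 0 else [v for v in values if abs(v - m) > k * s]

      As the statistics reply notes, a bot that answers close to the population mean slips past the deviation test, which is why the consensus check is arguably the more robust of the two.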

  • by schnikies79 ( 788746 ) on Tuesday December 11, 2007 @05:25PM (#21662653)
    The only problem I have with the current approach to grid-based computing is that it costs me a decent amount over a year. I have to leave my PC(s) on, which burns power that could otherwise be saved.

    I know several slashdotters leave their computers on 24/7, but I don't. It's akin to leaving a light bulb on overnight, or leaving the fridge door open. I do have a computer I leave on overnight when it's downloading, but it's a headless 500MHz P3 with 256MB of RAM, and it's promptly shut down until I need to download again.
    • by Charcharodon ( 611187 ) on Tuesday December 11, 2007 @05:34PM (#21662805)
      It's funny you mention power usage, because that's exactly how one of the IT guys got found out by management. He was running SETI@home during the night on all the workstations and servers. Finance noticed a jump in the power bill around the same time this guy was brought in to work in their IT section. He was racking up quite a few points for the three months or so he got away with it.
    • by gQuigs ( 913879 )
      I have a crazy idea for you: install it, and don't change your habits. It is fine to turn a computer running BOINC off at night. Most applications checkpoint every five minutes or so, which means you might lose five minutes of work by turning it off (see the sketch below). Hardly anything to be upset about.

      Is there another reason you think you need to leave your PCs on to run BOINC?
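      For what it's worth, the "lose at most five minutes" point is just a property of periodic checkpointing. Here is a rough sketch of the pattern, not BOINC's actual API; the state-file name, interval, and work-unit structure are assumptions for illustration.

      # Generic periodic-checkpoint loop: if the machine is powered off,
      # at most one checkpoint interval of work is lost.
      # Not BOINC's real API; file name and interval are illustrative.
      import json, os, time

      STATE_FILE = "work_state.json"
      CHECKPOINT_INTERVAL = 5 * 60  # seconds

      def load_state():
          if os.path.exists(STATE_FILE):
              with open(STATE_FILE) as f:
                  return json.load(f)
          return {"next_unit": 0, "results": []}

      def save_state(state):
          tmp = STATE_FILE + ".tmp"
          with open(tmp, "w") as f:
              json.dump(state, f)
          os.replace(tmp, STATE_FILE)  # atomic swap, so a power cut can't corrupt the file

      def run(total_units, do_unit):
          state = load_state()
          last = time.monotonic()
          for unit in range(state["next_unit"], total_units):
              state["results"].append(do_unit(unit))
              state["next_unit"] = unit + 1
              if time.monotonic() - last >= CHECKPOINT_INTERVAL:
                  save_state(state)
                  last = time.monotonic()
          save_state(state)
          return state["results"]

      Restarting simply resumes from the last saved state, so shutting down overnight costs at most one interval of recomputation.
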
    • by EMeta ( 860558 ) on Tuesday December 11, 2007 @05:59PM (#21663233)
      But now it is winter, so my computer is at worst a badly directed space heater.
      • Depending on how you heat your house and how your electricity is generated, it's also an inefficient space heater. I'd much rather use a primary energy source to heat my home than incur the 50% efficiency loss by converting the primary source into electricity first.
      • by itof500 ( 239202 )
        This is my approach as well. During the air-conditioning season, the computer is on only when I am in front of it. From mid-autumn through winter to mid-spring I have it on 24/7 running climateprediction.net. I figure the excess heat just keeps my apartment warm.

        duke out
    • Re: (Score:1, Insightful)

      by Anonymous Coward
      Say it costs 10 cents per kilowatt-hour for energy, your PC draws 200 watts on average, and it's on eight hours per day.

      That's about 58 dollars a year, and turning it off at night saves roughly 117 dollars compared with running it 24/7 (the arithmetic is sketched below).

      The expansion and contraction from the heating and cooling cycles ruin hardware.

      I imagine that thermal cycling it every day will, over a long enough time frame, cost more money in destroyed hardware than the electricity you saved by powering it down.

      • I've been running PCs for 12 years. The only part I have ever had fail is a hard drive. I keep hardware an average of 4 years.
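      The numbers in the AC's comment above do check out; here is the same back-of-the-envelope arithmetic as a sketch (the 200 W draw, 10 cents/kWh rate, and hours are just the figures assumed in that comment).

      # Back-of-the-envelope yearly power cost, using the figures assumed above.
      RATE_PER_KWH = 0.10  # dollars
      WATTS = 200
      DAYS = 365

      def yearly_cost(hours_per_day):
          kwh = WATTS / 1000 * hours_per_day * DAYS
          return kwh * RATE_PER_KWH

      cost_8h = yearly_cost(8)      # ~ $58 per year
      cost_24h = yearly_cost(24)    # ~ $175 per year
      savings = cost_24h - cost_8h  # ~ $117 saved by shutting down overnight
      print(f"8h/day: ${cost_8h:.0f}, 24/7: ${cost_24h:.0f}, saved: ${savings:.0f}")

      Whether the thermal-cycling wear outweighs that saving is an empirical question the thread leaves open.
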
    • by caferace ( 442 )
      I know several slashdotters leave their computers on 24/7, but I don't. It's akin to leaving a light bulb on overnight, or leaving the fridge door open. I do have a computer I leave on overnight when it's downloading, but it's a headless 500MHz P3 with 256MB of RAM, and it's promptly shut down until I need to download again.

      I understand your enthusiastic if misplaced green-ism, but I do hope you know your inefficient 500MHz P3 with an aging mobo, RAM and PSU is likely 2-3 times as watt-hungry as a modern

      • I'm well aware of this; that's why it's never on except when downloading a torrent. It's only fired up once or twice a month, and only for a few hours. I really don't download much.

        This has nothing to do with my being green; it's about saving money.
        • by galoise ( 977950 )

          Aw, come on, you mentioned the P3 as an example of a computer that sucks up less juice, presumably because it's slower. But if you really want to save bucks, and GP is right, you should really STOP using that P3 and use your main box, or any newer computer, for the download. Don't take it personally, in any case: I'm just pointing out that if GP is right, your strategy of using the P3 to download is not energy efficient, and you should review it. *hint hint*

          By the way, I just moved to a new apartment, an

  • The Economist coined that out of their ass? Seriously, the current accepted meaning of 'citizen' is a person subject to the laws of a specific government. What does *that* have to do with voluntary distributed computing? Nothing! They just assume voluntary distributed computing = virtuous, virtuous = good citizen, and there, bingo, 'citizen' becomes synonymous with 'virtuous'. Participation in a common project becomes not a personal contribution, but a contribution from us, *as subjects of a government*.

    I'm not
    • My thought exactly. Let's assume they mean more of a societal science, or social science, in the sense that society is contributing to science. I'm sorry, all you SETI@home-loving people, but it isn't "citizen science" in the sense that you have a claim on the science. You didn't help develop the algorithms, test the model, etc. You are nothing but someone donating a high-powered calculator. The calculator has no claim on the scientific results, nor do you.

      That said, you can feel good that you have contributed so

  • As soon as you see some asshat saying, in print or especially on the internet, that something is "the next big thing," you can bet your left nut it isn't.

    -mcgrew [slashdot.org]
  • You can always pick up a jiggit at http://www.thesheepmarket.com/ [thesheepmarket.com].

    About : http://users.design.ucla.edu/~akoblin/work/thesheepmarket/ [ucla.edu]
    Created with : http://www.processing.org/ [processing.org], http://www.mturk.com/mturk/ [mturk.com]
  • is listed on my site: http://distributedcomputing.info/ [distribute...uting.info] . If you leave your computer on all the time and it isn't doing anything useful when you aren't using it, please look through these projects and pick one or more to contribute to.
    • by cp.tar ( 871488 )

      Recently, I've started thinking about a distributed computing project for language analysis... some statistical analyses and machine learning could very well be implemented in this way, especially if we use Google (with a limited number of searches per day) as a corpus...

      The idea occurred to me when I saw a presentation of a bootstrapping system that used Google, but the author said the access was severely limited -- he couldn't get access to more professional APIs without paying quite a lot of money, and a

  • by fm6 ( 162816 )

    Distributed Proofreaders has been a longstanding success (yet inexplicably failed to get even a mention in the article)

    Maybe because it's a totally amateur effort?

    I volunteered for DP for a few months. I got buggy TIFFs that my web browser couldn't deal with, so I sometimes had to work outside the DP proofing environment, which was a pain. (My suggestion that they switch to a more portable format, such as PNG, fell on deaf ears.) And they're still stuck on the idea that plain text is a universal format.

    • Re: (Score:2, Informative)

      Maybe because it's a totally amateur effort?

      "All-volunteer" is not the same thing as "totally amateur." A number of our volunteers work in library science, proofreading, or other directly related fields.

      I volunteered for DP for a few months. I got buggy TIFFs that my web browser couldn't deal with, so I sometimes had to work outside the DP proofing environment, which was a pain. (My suggestion that they switch to a more portable format, such as PNG, fell on deaf ears.) And they're still stuck on the idea that plain text is a universal format. There was no good way to indicate marginal notes. Both boldface and italic are indicated by all caps. And equations were managed with a subset of LaTeX, which I'm sure I mangled because I didn't have a LaTeX interpreter to test it on; in fact, the DP instructions didn't even mention that it was LaTeX.

      It sounds like you last visited DP a long time ago. DP has standardized on PNG as its page image format almost since the site's inception 7 years ago, though we do allow jpg as an alternative. TIFF has never been an official format there. DP has also been producing HTML, DjVu, and LaTeX editions of projects (including illustrations) for m

      • by fm6 ( 162816 )

        "All-volunteer" is not the same thing as "totally amateur." A number of our volunteers work in library science, proofreading, or other directly related fields.

        Never said it was. In this kind of context, I think you'll find "amateur" usually means the opposite of "professional". And in this context "professional" doesn't mean "paid", it means "knows what they're doing".

        It sounds like you last visited DP a long time ago. DP has standardized on PNG as its page image format almost since the site's inception 7 years ago, though we do allow jpg as an alternative. TIFF has never been an official format there.

        I don't know what to tell you. I was involved in 2003, and at that time I used a sort of web proofreading tool that used TIFF. Perhaps that was a feature of the particular tool.

        Markup for bold and italics is the same as HTML, and markups exist for, and are used to indicate, marginal notes, footnotes, and the like. You are welcome to argue that a more complex markup is necessary, but considering the amount of outdated information in your comments here, you may wish to stop by and update your knowledge of the state of the site. We'll happily welcome you back if you do.

        I just did stop by. All the "recently finished" links on the front page are broken — not the best way to persuade

        • I don't know what to tell you. I was involved in 2003, and at that time I used a sort of web proofreading tool that used TIFF. Perhaps that was a feature of the particular tool.

          Ah, that may have been the long-obsolete Windows-based client "PRTK."

          All the "recently finished" links on the front page are broken not the best way to persuade folks you're not amateurs.

          Those offsite links are valid, but not until after PG does its nightly cataloging run which places files in the correct locations on their server(s). Why they don't move files into place immediately on posting a text is beyond me, since it should be trivial from a technical standpoint, but since I don't volunteer for them directly, I can't respond to that. The downside is, as you've noted, that the offsite links we present don't immed

          • by fm6 ( 162816 )

            Those offsite links are valid, but not until after PG does its nightly cataloging run which places files in the correct locations on their server(s). Why they don't move files into place immediately on posting a text is beyond me, since it should be trivial from a technical standpoint, but since I don't volunteer for them directly, I can't respond to that.

            Neither "it's not our fault" or "they should work" is more than a silly excuse. If this were my website, I'd work with the other website to make sure the links worked. If that didn't work out, I'd take down the links. Proudly displaying links that don't work, for whatever reason, makes you look like idiots.

            Your suggestions would work better in a "professional" environment, but in a volunteer environment, they would fail because the learning curve is too high...

            In other words, you're not going to use the right tools because you don't think your volunteers could be bothered to learn to use them. Well, here's one volunteer who's lost interest because you insist

    • Re: (Score:2, Informative)

      by bgalbrecht ( 920100 )

      As others have mentioned, you must have volunteered at DP a very long time ago because ALL of your objections to our work are no longer valid. The only complaint of yours that was valid when I started volunteering there 3.5 years ago was that DP's final versions submitted to Project Gutenberg were plain-text.

      At the time you were volunteering, PG was primarily a repository of only plain-text documents. These days, in large part due to the influence of volunteers at DP, nearly every new text submitted to

      • by fm6 ( 162816 )
        I just took a look at the DP site. As I've noted in another post, not that much has changed.
        • Saying so doesn't make it true. For example, take a look at the HTML editions of a section of the 1911 Encyclopedia Britannica http://www.gutenberg.org/etext/19699 [gutenberg.org], Music Notation and Terminology by Karl Wilson Gehrkens http://www.gutenberg.org/etext/19499 [gutenberg.org], and Elements of Structural and Systematic Botany by Douglas Houghton Campbell http://www.gutenberg.org/etext/20390 [gutenberg.org], all recent productions from Distributed Proofreaders and top 100 downloads from Project Gutenberg. It is true that we're still using LaT
          • by fm6 ( 162816 )
            "Saying it's so doesn't make it true?" How does that even apply here? I didn't just make a claim, I pointed out some severe limitations in the "guidelines" that haven't changed since I was a volunteer. True there have been some improvements (not using ALL CAPS for italics removes a major eyesore) but it's still pretty much a mess.

            I'm glad you linked the 1911 EB, since that's the DP project I care most about. Now, suppose I want to read the article on Sir Thomas Bromley. I have to figure out which file has h
  • Maybe once these projects pick up speed, people will get together and start working out of a common workplace to increase efficiency!
    Hmmm... but then of course they'll need a big building to fit everyone, some form of financing, cubicles....

    Hey... wait a minute! This sounds familiar...
    Nope, false alarm. What a new & radical concept! This could change everything!
  • by edsousa ( 1201831 ) on Tuesday December 11, 2007 @06:59PM (#21664103) Journal
    Grid computing is when you request resources to run your app. Projects like SETI@home use a different approach: you pull a task, instead of arbitrarily offering your computing resources.

    IBM defines grid computing as "the ability, using a set of open standards and protocols, to gain access to applications and data, processing power, storage capacity and a vast array of other computing resources over the Internet."
    in http://en.wikipedia.org/wiki/Grid_Computing [wikipedia.org]
    • by dkf ( 304284 )
      Grid computing and distributed computing are related, but not the same. With distributed computing, the focus is mainly on getting large numbers of machines that are functionally the same in some way (e.g. large numbers of SETI@home processing units). With grid computing, the focus is mainly on dealing with heterogeneity and varying access rules between different organizations. The two approaches deal with different things, but can be (and often are) used together.
  • Hi,

    This is something that I have had an interest in for the last few years. As such, a large part of my thesis has been developing "CompTorrent". It is a computing platform that has borrowed some ideas from BitTorrent and combined them with distributed computing.

    The focus has been on making distributed computing projects as easy to start as a BitTorrent swarm. After spending some quality time with both BOINC and Condor, I can assure you that getting a project going from scratch can be a non-trivial exerc

  • http://inchorus.com/ [inchorus.com] was a startup that tried to do exactly this (think Amazon Mechanical Turk, but distributed). Great fun, lots of projects... no seed funding. :)

    http://www.webware.com/8301-1_109-9676000-2.html [webware.com] says what we tried to do.
  • TANSTAAFL. It's ironic that people running, e.g., ClimatePrediction are simultaneously helping to change the climate. Each PC does not generate much heat, but several million of them certainly do, especially if left on 24 hours a day, 7 days a week, as many enthusiasts tend to. See for example this rough analysis: http://hardforum.com/showthread.php?t=1240015 [hardforum.com]

    We have to figure in both the heat generated and the power consumed (much of which is derived from fossil fuels). Even if you use green electricity, that j
  • I strongly believe the power of volunteer computing projects will revolutionize science and technology. That is why I spend all my free time learning and developing a project that attempts to engineer clean energy technology. My work attempts to identify efficient catalysts for hydrogen production using Quantum Monte Carlo and docking simulations. There is a great deal of development ahead of me, and I work on it with a shoestring budget since it is a hobby. Thanks to the friendly community I am actually ma

"If it ain't broke, don't fix it." - Bert Lantz

Working...