The thinking seems to be that abusive "digging" to promote a story is less harmful than abusive "burying", and this has the ring of plausibility — that a creative effort is better than a destructive one. After all, Alternet had previously highlighted several artificial right-wing "digg brigades" (Diggs And Buries, theliberalheretic, etc.), but they didn't blow the lid off of the situation until their report on the Digg Patriots bury brigade, as if to say, "Now we've found something really scandalous!" Annalee Newitz cheekily reported on how she bought votes to boost a story to the front page of Digg, but she probably would have felt guilty if she'd hired a service to bury someone else's story. And when a Digg user organized an effort to bury Ron Paul stories that he thought were "spamming" the system, Ron Paul supporters protested that they were merely organizing to vote up stories they agreed with — the clear implication being that this was more honorable than organizing to vote stories down.
But this, I think, is a fallacy. If a story's ranking is artificially inflated, then the extra eyeballs for that story have to come from somewhere, and they come from users paying less attention to the other stories that the phony up-and-comer pushed out of the way. Artificially bumping a story up is just as harmful as artificially burying a story, but the harm is distributed among many innocent victims, not just one. (By the same reasoning, in fact, you could argue that burying a story does no net harm to other users of the Digg site, because the harm done to one story is cancelled out by the benefit to all the other stories that rise in prominence when the victimized story is pushed out of the way. So by strict economic logic, recruiting friends to boost your own story at the expense of everyone else's is actually more harmful than organizing a bury brigade!)
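The zero-sum nature of that argument can be made concrete with a toy model. This sketch assumes (my assumption, not Digg's actual formula) that a story at rank r gets attention proportional to 1/r; under any such fixed attention budget, one story's artificial gain is exactly the summed loss of the stories it displaces:

```python
def attention(rank):
    # Hypothetical model: a story at rank r gets 1/r units of attention.
    # The specific 1/r curve is an illustrative assumption; any fixed
    # attention budget gives the same zero-sum conclusion.
    return 1.0 / rank

stories = list("ABCDEFGHIJ")  # ten stories, ranked 1 through 10

def shares(order):
    return {s: attention(r) for r, s in enumerate(order, start=1)}

fair = shares(stories)

# Artificially boost story J from rank 10 to rank 1, pushing the rest down:
boosted = shares(["J"] + stories[:-1])

# Every other story loses a little attention; J's gain equals their total loss.
harm = {s: fair[s] - boosted[s] for s in stories if s != "J"}
gain = boosted["J"] - fair["J"]
print(gain, sum(harm.values()))  # the two quantities are equal
```

The point of the sketch is only that the harm of artificial promotion is real but diffuse: each displaced story loses a little, and the losses sum to exactly what the boosted story gained.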
So I don't think that Digg's replacing the "bury" button with a "report" button will fix the problem. For one thing, groups could obviously abuse the "report" button in the same way — issuing calls to action to report a story for violating the TOU. Since a flurry of bona fide abuse reports is presumably what Digg uses to identify and remove truly abusive stories like MLM spam, how are they going to tell the difference between these cases and cases of abusive "reporting"? (My suggestion: watch for a sudden change in the percentage of a story's viewers who file an abuse report. For stories that are genuine TOU violations, the percentage of viewers who "report" it should remain steady; for stories victimized by a "report brigade," you'll see a sudden spike both in viewers and in the percentage of those viewers who report the story for abuse. This might have worked for detecting and stopping the bury brigades as well, although we'll never know now.)
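That detection heuristic can be sketched in a few lines. This is my own illustrative implementation, not anything Digg actually ran; the function name, the time-window representation, and the thresholds are all guesses:

```python
def is_report_brigade(windows, factor=3.0, min_reports=10):
    """
    windows: chronological list of (views, reports) tuples, one per time slice.
    Flags the latest window if its report rate is a large multiple of the
    baseline rate from the earlier windows. The factor and min_reports
    thresholds are illustrative guesses, not tuned values.
    """
    *history, (views, reports) = windows
    base_views = sum(v for v, _ in history) or 1
    base_reports = sum(r for _, r in history)
    base_rate = base_reports / base_views
    current_rate = reports / max(views, 1)
    # A genuine TOU violation draws reports at a steady rate over time;
    # a "report brigade" produces a sudden spike in report rate.
    return reports >= min_reports and current_rate > factor * max(base_rate, 1e-6)

# Steady report rate (likely a genuine violation): not flagged
print(is_report_brigade([(1000, 20), (1000, 22), (1000, 21)]))  # False
# Sudden spike in viewers and report rate: flagged
print(is_report_brigade([(1000, 2), (1000, 3), (5000, 400)]))   # True
```

A real deployment would need smarter baselines (new stories have no history), but the core signal — report rate jumping rather than holding steady — is the one described above.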
But more fundamentally, even if this change does stop the "bury/report brigades" from killing stories at will, that only fixes the most obvious symptom of the underlying problem, which is that the system can be gamed by recruiting your friends to vote either way. It won't stop "brigades" from artificially promoting shallow stories that agree with their opinions, which does the same net harm overall.
Indeed, the greatest long-term harm that the Digg Patriots Yahoo Group might have done is that their cheating was so egregious that it makes other examples of cheating look benign by comparison, and might prevent people from realizing that "benign cheating" is just as harmful. As detailed in the Alternet report, the Digg Patriots group talked openly about cycling through different Digg accounts and circumventing bans on their IP addresses. The welcome message to the Yahoo Group told new users that the group was operating "under the radar." The group leader, a woman with the handle "bettverboten," talked about how to prevent Digg from monitoring their actions. And of course the vast majority of posts were calls to bury stories. But what if all of that had been inverted? If the group had operated in the open, while still focusing on recruiting conservative members? If each user had limited themselves to only one Digg account, as they were supposed to? And if they had focused not on burying stories, but on digging stories that promoted their viewpoints? Just as bad. It just doesn't sound as bad.
I still think the only way to make Digg a true meritocracy would be to use some version of an algorithm I outlined in an earlier article, inauspiciously titled "How to Stop Digg-cheating, Forever." The gist of it is that in addition to collecting votes from friends, stories should be shown to a random subset of users on the site (perhaps in a box that occasionally appears at the top of the screen when they're logged in), who are asked to vote it up or down. The votes of a random sampling of users would be more representative of how much value the story would have to the Digg community as a whole. Even if most users who are asked to vote on a "random story" simply ignore the request, all you need is to show the story to a large enough sample that you can measure the difference in responses to a truly good story vs. one that has been promoted by digg-cheaters. You don't necessarily have to run this procedure for every story, only the ones that are about to gain some benefit from a large number of diggs (such as being pushed to the front page), when you need to decide whether the story really deserves that big boost. The only way to game that system would be to organize a group of dedicated Digg users so enormous that they constituted a significant percentage of all users on the system — something pretty hard to do without getting caught.
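The paragraph above can be sketched as a gatekeeping check. Everything here — the class, function names, sample size, and approval threshold — is my own illustrative scaffolding for the idea, not code from the earlier article:

```python
import random

class User:
    """Simulated Digg user; probabilities are illustrative assumptions."""
    def __init__(self, likes_story_prob, response_prob=0.2):
        self.likes = likes_story_prob
        self.responds = response_prob
    def rate(self, story):
        if random.random() > self.responds:
            return 0  # most sampled users simply ignore the request
        return 1 if random.random() < self.likes else -1

def passes_random_sample(story, all_users, sample_size=500, threshold=0.6):
    """Before a heavily-dugg story gets its front-page boost, require
    approval from a random sample of users, not just its recruited fans."""
    sample = random.sample(all_users, min(sample_size, len(all_users)))
    responses = [v for v in (u.rate(story) for u in sample) if v != 0]
    if len(responses) < 20:
        return False  # too little signal to grant the boost
    approval = sum(v > 0 for v in responses) / len(responses)
    return approval >= threshold

random.seed(1)
genuine_fans = [User(likes_story_prob=0.9) for _ in range(10000)]
mixed_crowd  = [User(likes_story_prob=0.3) for _ in range(10000)]
print(passes_random_sample("story", genuine_fans))  # a broadly liked story passes
print(passes_random_sample("story", mixed_crowd))   # a brigade-boosted one doesn't
```

Because a brigade can stack the friend votes but not the random sample, the sample size is the security parameter: gaming it requires controlling a meaningful fraction of all accounts on the site.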
Still, the only site I know of that uses a version of this "random sampling" algorithm is HotOrNot.com, which lets you recruit your friends to vote on the "hotness" of your picture on a scale of 1 to 10 (by sending them a link to that specific picture), but also shows a stream of random pictures to visitors, so that your picture can collect votes from strangers. If the votes from the users who visit your picture via the link are significantly different from the votes from users who see your picture via the random stream, then HotOrNot discounts the votes from users who view your page via the link. This prevents digg-style gaming from people who want all their friends to give them a 10. (Note that if you think about it, this is essentially the same as always throwing out the votes from people who visit your picture via the link. If you collect votes from group A and group B, but you only count the votes from group A if they agree with the votes from group B, then you're really only counting votes from group B! All the extra votes really give you is the ability to brag that X many people voted on your picture.)
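The discounting rule described above might look something like this. HotOrNot has not published its actual formula, so the tolerance value and the exact agreement test are my guesses:

```python
def effective_score(link_votes, random_votes, tolerance=1.0):
    """
    link_votes: 1-10 ratings from visitors who arrived via a shared link.
    random_votes: 1-10 ratings from visitors who arrived via the random stream.
    If the link-referred average strays more than `tolerance` from the
    random-stream average, the link-referred votes are thrown out.
    (The tolerance value is an illustrative assumption.)
    """
    avg = lambda xs: sum(xs) / len(xs)
    if not random_votes:
        return avg(link_votes) if link_votes else None
    if link_votes and abs(avg(link_votes) - avg(random_votes)) <= tolerance:
        return avg(link_votes + random_votes)
    return avg(random_votes)  # discount the recruited votes

# Friends all voting 10 while strangers average ~5: friends' votes are discounted
print(effective_score([10, 10, 10, 10], [5, 6, 4, 5]))  # 5.0
```

Note how the code makes the parenthetical point above concrete: whenever the two groups disagree, the score is just the random-stream average, so the strangers' votes are the only ones that ever really decide the outcome.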
This seems like the simplest way to prevent Digg-cheating, although there may be others. Still unresolved is how to solve the general problem of "gaming" in traditional media and the blogosphere. For the foreseeable future, it's going to be the simple truth that if a major media outlet wants to run a story, it will be heard, and if no media outlet wants to run it, it won't be heard, regardless of how many viewers or readers would have voted in some hypothetical poll that, yes, they want to read that story, and yes, they liked it afterward. That's true for Internet articles as well, except to the extent that a deserving article might be rescued from obscurity by Digg — but the more that system can be gamed, the less it will reward articles that really deserve it. Digg is gameable because power users can recruit votes from their friends; the media and the blogosphere are so obviously "gameable" that we don't even call them "gameable," because their "power users" — media outlets and A-list bloggers — can run whatever they want. Right now, the only way I can think of to change this situation that is even logically possible would be for a site like Digg to adopt some version of the random-sampling algorithm, and to continue growing in power until a significant percentage of the public (not just Internet users, but everybody) relied on it for information. Then, if you had something important to say, people would hear it, but you wouldn't be able to cheat your way to the top.
The ultimate irony is that Alternet's story might never have seen the light of day if it hadn't been the beneficiary of the same gameable, non-meritocratic inefficiencies that exist in the media-blogo-outrage-o-sphere, just as they exist on Digg. Yes, the Alternet story deserved to be heard, but you don't get the publicity you deserve — you get the publicity that you organize, and Alternet had the organizational publicity structure in place to get their voice heard. If a kid blogging from his bedroom had infiltrated the Digg Patriots group and made essentially the same discovery, would anybody ever have heard about it? (Well, maybe, because of the political hot-button factor — but even then, only after the story had been picked up by a major site like Alternet.) A truly meritocratic Digg algorithm could make it possible to get a good story out without a lot of organizational support behind it — and to ensure that an organized effort can't kill a good story either.