Google TrustRank

Philipp Lenssen writes "Google registered a trademark for the word "TrustRank", as Search Engine Watch reveals. Is this a sign we can expect a follow-up to Google's PageRank? An earlier, possibly related paper on TrustRank is available; it proposes techniques to semi-automatically separate good pages from spam by the use of a small selection of reputable seed pages."
  • with the newly proposed AdSense plans?
  • so when google decides what's trusted for us, what is good content and what isn't, are they still not being "evil"? additionally, how are the pages separated? on what criteria? man or machine (potential for flaws on either side)?
    • by penguinoid ( 724646 ) on Tuesday April 26, 2005 @07:26AM (#12346843) Homepage Journal
      It's not censorship. Google couldn't censor even if they wanted to. Rather than explaining to you what censorship means, let me just tell you that Google is simply doing its job better. I don't want to find spam when searching for anything, and neither does anyone else. Ergo, eliminating spam from the search results makes everyone (except spammers) happier.
      • What is SPAM? Pages in the search result that are merely ads for what you were trying to find?

        Well, if Google will be able to filter these out on popular demand (because nobody wants to see these pages show up in their search results) the google ads will come out better as well.
      • by Anonymous Coward
        "Google couldn't censor even if they wanted to."

        Since when? Google is a privately owned corporation. They've got stockholders to answer to now, but it still stands that they can do what they want with what they own. They're not obligated to give you unfiltered results on their free, privately owned service.
      • it's censorship in the same way that excluding undesirable content from television or radio is censorship.

        I don't want to find spam when searching for anything, and neither does anyone else. Ergo, eliminating spam from the search results makes everyone (except spammers) happier.
        an anon has replied, "what is spam?" and i pose the same question. "spam" or unwanted content is far too complex an issue to be derived by a script. i could have a mood swing (or multiple personality disorder, or any number of othe
    • by Molly Lipton ( 865392 ) <molly.lipton@gmail.com> on Tuesday April 26, 2005 @07:26AM (#12346844)
      Yes, this is always a problem. How can you possibly know whether or not a site is spam just by looking at who's linked to it? A lot of great sites have very few external links to them and often they're from blogs and other sites that will likely be identified as spam prone.

      This is a basic problem of filtering web-content. How do you avoid throwing out the baby with the bath water? I'm running into that problem in designing a custom filter to keep my son from inadvertently seeing pornography as he looks for his "r0mz," but that's peanuts compared to Google's dilemma.

      The fact is, spam filtering is inherently censorship. This kind of interference will always have a negative impact on the marketplace of ideas that is the modern internet. On the other hand, as a side effect, removing blogs from search results (as this trust metric very likely will) may increase the usability of Google overall. I suspect there will be some people who are not as happy about that as I am.
      • by telecsan ( 170227 ) on Tuesday April 26, 2005 @07:33AM (#12346891)
        You fail to understand that google is incapable of actually censoring anything. Them not displaying a webpage in their results does not, in fact, remove it from the web.

        Google's primary responsibility now is to its shareholders, which means increasing the chance that you and I find exactly what we are looking for, and not unabashedly displaying every peddler that serves up content over http.
        • by Anonymous Coward
          "You fail to understand that google is incapable of actually censoring anything."

          Yes, they can. Their search results.
        • by generic-man ( 33649 ) on Tuesday April 26, 2005 @07:53AM (#12347022) Homepage Journal
          Considering how much market share Google has, them not displaying a web page in their results (or dropping it a few hundred places) effectively removes it from the web.

          Google's primary responsibility now is to its shareholders. Google makes money from advertising. If Google can encourage you to patronize its advertisers instead of trusting its index for everything (which right now is pretty easily gamed), then Google makes more ad revenue and shareholders are happy.
          • Google makes money from advertising

            So then wouldn't it make more sense to better target the advertisements rather than propagate more? I think this is generally the problem with the Internet as a whole currently; websites are running advertisements that aren't appropriate to their content, and an intelligent search engine like Google's PageRank is simply being confused into thinking that the ads are 100% to the point, and that the links are 100% focused.

            TrustRank fixes this by bringing you better ad links
            • "Click here, you may have already won" are just as pointless, as any intellegent user would avoid them like the plague.

              The average and median IQ is 100.

              As a population statistic, this means that half (or more) of the population is IQ 100 or under, and that means that "any intelligent user" is by no means the population that those popups and banners address. They're not for intelligent users.

              Remember that banner ads, popups etc. all cost money to generate. Why do you think they have not gone away? I

              • You're using IQ to argue gullibility. Just because you're smart doesn't mean you're not gullible, and just because you are stupid (as to IQ score) doesn't make you gullible.

                Secondly, banner ads and popups haven't gone away because we as a society are trying to beat the problem where it lies; every modern web browser (and I use that term liberally; IE is NOT modern) has some kind of integrated popup blocker. Many have optionally integrated banner-ad blockers. Email has spam blockers. We're not fi
                • You're using IQ to argue gullibility. Just because you're smart doesn't mean you're not gullible, and just because you are stupid (as to IQ score) doesn't make you gullible.

                  I disagree with your assertion. It is my impression that there is a very, very high correlation between low intelligence and gullibility. Leaving religion out of it for the moment, because that has a socialization factor added to the mix, I think intelligent people are significantly less likely to buy "Herbal Viagra" or "0v3r 7h3 c0

        • Google's primary responsibility now is to its shareholders, which means increasing the chance that you and I find exactly what we are looking for

          I think what you meant was "stakeholders". Modern business schools now teach that you're responsible to stakeholders beyond just shareholders. And yes, in Google's case, this mostly means QoS.
      • Well, trust metrics work both ways, not just downstream. So, a reputable site A that links to a seedy site X, where X is also linked by a lot of other seedy sites, will reduce the trust metric of A, not increase the one of X. The problem of course will arise when you have walled-garden-type systems, where a good content-producing site has an exclusive partnership with a seedy one.
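
        (Editorial aside: a small Python sketch of the two-way effect the parent describes, where endorsing a low-trust site drags your own score down. The scores, the one-link graph, and the penalty factor are all invented for illustration; nothing here comes from the TrustRank paper.)

        # Sketch: linking out to a low-trust site reduces your own trust.
        trust = {"A": 0.9, "X": 0.05, "Y": 0.8}   # invented base scores
        outlinks = {"A": ["X", "Y"]}              # A endorses X and Y
        PENALTY = 0.5   # how strongly bad outlinks count against you

        def adjusted_trust(site):
            outs = outlinks.get(site, [])
            if not outs:
                return trust[site]
            worst = min(trust[o] for o in outs)
            # Blend the site's own score toward the worst site it endorses.
            return trust[site] * (1 - PENALTY) + worst * PENALTY

        print(adjusted_trust("A"))   # 0.9 falls to 0.475 because A links to X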
      • by bhsx ( 458600 )
        I find it funny that I didn't see OP's post.
        Slashdot censorship at its finest!
        "DAYTOOK'RFREEDUMASPEECH!"
        Oh man, that was bad. I feel dirty.
    • by ciroknight ( 601098 ) on Tuesday April 26, 2005 @07:29AM (#12346861)
      Two points: 1) Any new system Google implements will run alongside PageRank; they've invested too much to switch Google over to TrustRank completely. The system might even augment current PageRank by running an algorithm over the data that PageRank returns. We can only speculate as of now. But I can assure you that one will not replace the other, and there will probably be a way to use both systems in the future if you like. Hell, using your Gmail account, you may even be able to tune PageRank specifically, making pages more relevant to you appear higher in search results.

      2) You have the option of not using Google. Yahoo is a completely independent search engine now.
    • so when google decides what's trusted for us, what is good content and what isn't, are they still not being "evil"? additionally, how are the pages separated? on what criteria? man or machine (potential for flaws on either side)?

      It's not necessarily censorship. They could just present the "trustworthy" pages first. You could always skip to the later pages if you wanted, just like you can browse /. at -1 if you want.

      And yes, this means that the system could be abused, just like PageRank and /. moderation.

    • by Anonymous Coward

      so when google decides what's trusted for us, what is good content and what isn't, are they still not being "evil"?

      Are you fucking kidding me? This is just another mechanism for deciding whether particular pages should be shown for queries. Show me a search engine that doesn't do that.

      If you use a search engine, then by definition you are trusting them to show you relevant results. If you don't want to trust Google, then use another search engine. If you don't want to trust another search engine,

    • by ajs ( 35943 ) <{moc.sja} {ta} {sja}> on Tuesday April 26, 2005 @08:09AM (#12347149) Homepage Journal
      "so when google desides what's trusted for us, what is good content and what isnt, are they still not being "evil"?

      Yes.

      Why is it that everyone is constantly striving to find Google's evil? Ranking the relevancy of pages to a search is Google's job. By ranking spam as relevant to my search they have failed. Using the concept of a web of trust to establish relevancy is a fairly obvious solution and has well established analogs in other fields (e.g. PKI).

      If you're looking for evil, try GE, GM, or Unilever. Google doesn't even begin to rank on the evil-o-meter.
    • Censorship would be making the content unavailable. They're simply bringing more relevant content to the top of the search, which is what a search engine is supposed to do in the first place. If you want what's considered to be spam, hit next a few times.
    • To explain it to you, it's sort of like what's going to happen to your troll post... it will be modded down, and the people who tell you what google is really doing will be modded up.

      google has been trying to get rid of spam for ages... this is just one of SEVERAL techniques that go into evaluating a page.
  • Conjunction? (Score:2, Interesting)

    by tyleroar ( 614054 )
    Are these going to be used in conjunction? It would be very nice to be able to sort out those pages that have nothing but a long list of keywords on them. It's probably all in vain, as someone will sooner or later find a way to get around this as well.
    • Re:Conjunction? (Score:3, Insightful)

      by Slashcrap ( 869349 )
      It would be very nice to be able to sort out those pages that have nothing but a long list of keywords on them.

      And thanks to fuckers like you, pages like THIS are full of "Free IPod" links.

      You can't complain about spam of any sort when you are a spammer yourself.
      • People like that put the mods in a tough position: On the one hand, you want to mod up insightful comments, but you don't want to reward spammy free Ipod links.

        I've got an idea: Anytime you see an informative and/or insightful post whose contents you would like to see modded up, but which has a spam-o-licious free [product] link in the sig, just copy the informative content into a new Anonymous Coward post, which the mods can then moderate higher, while the spammy parent can be modded down into oblivion.

      • Re:Conjunction? (Score:5, Informative)

        by camusflage ( 65105 ) on Tuesday April 26, 2005 @01:53PM (#12350610)
        It should be noted that the Slashdot user No More Free Stuff [slashdot.org] catalogs such links, and by adding this user as a friend and assigning a negative bonus to foes of friends, you can lower the moderated value of any such posts.
  • Potential abuse? (Score:4, Interesting)

    by mferrier ( 878754 ) on Tuesday April 26, 2005 @07:24AM (#12346819)
    This is a step in the right direction conceptually, but giving a smaller number of "seed sites" more rank influence increases the potential fallout from any rank cheats that may be found in the future (see Google Bomb [outer-court.com] and the Google 302 exploit [searchenginewatch.com]).

    Google may be better off as they are currently, leaving all sites initially equal in influence before the PageRank calculation.

    Then again, Google has a great track record for testing their ideas before committing them to general service...
    • Re:Potential abuse? (Score:5, Interesting)

      by ciroknight ( 601098 ) on Tuesday April 26, 2005 @07:49AM (#12347001)
      With Google's "portal system" they're developing, the trust comes from within; the company trusts its users because they are clicking into an agreement of terms. That being said, hacks that would make this new TrustRank unreliable would probably just lead to the termination of services of the account.

      This signals to me that Google's trying to become a more involved company; instead of just sitting back, caching and searching the internet, they are now trying to serve you best and give you the results you are looking for. I would imagine with TrustRank, you will see a little star or something near a link on Google's home page, and the star would indicate if it is something in your field that you would be looking for. For example, if you were a biologist and searched for a certain kind of fish, say "Blue Tuna", it would put stars next to sites with the fish's breeding habits, etc., but if you were a general consumer, it would provide links to the local fishery.

      The internet is an extremely powerful tool, and search engines have simply evolved to the point that they are now "dumb technology". Without more user intervention (and not simply by throwing in more keywords and praying), they will continue to be as they are now. Once the company better knows what we'll be looking for, they can better serve us. And that's all I see this new tech as being.
      • The internet is an extremely powerful tool, and search engines have simply evolved to the point that they are now "dumb technology". Without more user intervention (and not simply by throwing in more keywords and praying), they will continue to be as they are now.

        Yeah, they'll continue falling for users abusing their ranking system.
        They'll continue falling for users like you and your sig.
    • Then again, Google has a great track record for testing their ideas before committing them to general service...

      Is that why everything google is constantly in beta?! :)
    • The fact that seeds are chosen manually restricts the use of such a scheme to seeds that never change or are known for a fact to be reliable (such as Fortune 1000 companies, government sites, or media outlets). Otherwise, a site can start as one that has terrific content only to switch to a spam site the moment it gets a good trust ranking (among other abuses).

      I don't favor this type of scheme because it is not adaptive enough.

      A much better manner for achieving the goals that Google is reaching for wou
  • by Anonymous Coward
    I've got a TR7 site with four links available...
  • Questions (Score:3, Interesting)

    by tyroneking ( 258793 ) on Tuesday April 26, 2005 @07:31AM (#12346877)
    How is this different from applying a weighting to PageRank?
    Will the owners of the pages / sites deemed to fall within the set of trusted seed sites get any money for all their hard work (i.e. hand-maintaining pages of links)?
    What if such an owner decides to link to a page of commercial or spam links - will they get any money from the owner of the linked site? Is this a possible method of abuse?
    Will that cool poster of links between websites now become 3D to give trusted links more prominence?
    • Re:Questions (Score:3, Insightful)

      by ciroknight ( 601098 )
      How is this different from applying a weighting to PageRank? Will the owners of the pages / sites deemed to fall within the set of trusted seed sites get any money for all their hard work (i.e. hand-maintaining pages of links)?

      Lemme give it a try;

      It's probably exactly that: applying a weight to PageRank. But the question is "Where will the weight be applied?" Before the PageRank calculation (as in giving links a higher rank because they come from a more legitimate website) or after the PageRank calculation (as in
    • Re:Questions (Score:5, Informative)

      by pjrc ( 134994 ) <paul@pjrc.com> on Tuesday April 26, 2005 @09:58AM (#12348121) Homepage Journal
      I just finished reading the paper. All these questions are pretty well answered by the text. To save you and others the trouble of reading it, I'm gonna take a stab at these. Feel free to actually read the paper and tell me if I misunderstood.

      How is this different from applying a weighting to PageRank?

      It attempts to detect clusters of pages which have few inbound links, while also propagating "trust" scores to all other sites via their linking structure. For sites that have many inbound links (high scoring in PageRank), the authors claim this modification tends to classify spam and reputable sites differently.

      Will the owners of the pages / sites deemed to fall within the set of trusted seed sites get any money for all their hard work (i.e. hand-maintaining pages of links)?

      No.

      However, they will get better search engine visibility, which is quite valuable.

      What if such an owner decides to link to a page of commercial or spam links - will they get any money from the owner of the linked site?

      The paper suggests using only highly reputable organizations with long-term stability for the seed pages. Government organizations, universities, very well known companies.

      The analysis in the paper is based on a per-site graph, not per-page, by the way. They lacked the resources to try these computations on such a large data set.

      Is this a possible method of abuse?

      Presumably, the small set of seed pages/sites will need to be monitored by staff employed by the search engine company. If one of the trusted seed sites "went bad", they would need to be removed from the list.

      Will that cool poster of links between websites now become 3D to give trusted links more prominence?

      Probably not.

  • by Vo0k ( 760020 ) on Tuesday April 26, 2005 @07:35AM (#12346902) Journal
    So, links from pages of bad reputation give your page bad reputation?
    I can see this already....

    This page contains very objectionable content.
    If you are easily offended, don't enter.
    Blah, blah, blah.
    Blah, blah, blah.

    Do you agree to these conditions?
    Yes [goatse.cx] No [disney.com]
  • by OblongPlatypus ( 233746 ) on Tuesday April 26, 2005 @07:36AM (#12346909)
    This sounds very similar to Advogato's trust metric [advogato.org], which uses a "seed" of trusted accounts to filter out trolls/spammers. The difference might be that it should be even easier to implement in the case of web pages, because they already have links to each other, avoiding the reliance on users to manually "certify" other user accounts in order to build the graph.
  • by mferrier ( 878754 ) on Tuesday April 26, 2005 @07:40AM (#12346938)
    To see Google's TrustRank trademark info on the USPTO site, click here [uspto.gov], click "New User Form Search (Basic)", and search for "TrustRank".
  • A good sign (Score:4, Insightful)

    by treff89 ( 874098 ) on Tuesday April 26, 2005 @07:41AM (#12346943)
    Google, as we all know, is a reputable service provider; they get the job done efficiently and innovatively. Now they are continuing their attack on the ills of the internet, which began with Gmail's spam filtering. By developing this tool, Google is helping to clean the Internet up and enable it to become the massive source of pure information it has such potential to be. The "negative" sites on the Internet, such as keyword sites with no real content that invade search results, are a bane to the community, and by helping get rid of them Google is yet again doing us all a favour. Google, I salute you.
    • Re:A good sign (Score:5, Insightful)

      by pjrc ( 134994 ) <paul@pjrc.com> on Tuesday April 26, 2005 @10:25AM (#12348399) Homepage Journal
      Yes, this is a good thing. It might result in wiping out search engine spam, maybe. If the "search engine optimizers" don't find creative ways to cheat.

      Let's not get overly optimistic about what this is going to do for the web... such as:

      By developing this tool, Google is helping to clean the Internet up and enable it to become the massive source of pure information it has such potential to be.

      What exactly is "pure information" anyway?

      Consider my little website [pjrc.com]. Lots of pages about how to design electronic stuff. But we sell components that support those activities, so it's not 100% "pure", is it? You could consider all those pages as a giant ad for the stuff on the store section of the site. But most people would consider my pages on the more informational side (and the vast majority really are).

      About once every 2 or 3 weeks, I get a call from one of these search engine optimization companies. Not sure if it's the same couple companies... I usually just say "no" and ask to be on their do-not-call list. They're mostly a bunch of slimy people and probably don't honour such requests.

      But sometimes, the idea is tempting. I resist because I believe it's unethical, and ultimately a bad long-term investment. Still, to anyone selling via the web, even a tiny little 2-person company like me, the sales pitch is quite compelling. Pay some fee, traffic goes up, more sales, increase in revenue offsets the cost for the SEO's work. Maybe it's not so bad if they don't stoop to cheating.

      Still, I resist because I know it's not a black and white distinction. It's a fuzzy line between the obviously good techniques (improving site structure, rewording page titles, etc) and the obviously bad (cloaked pages). I also just don't trust them.

      But even the distinction between "pure information" and "spam" is fuzzy. I'd like to think I'm leaning towards the "pure information" side, but we do indeed sell products. It wasn't always that way... in the mid-90's, the site was smaller and hosted at a university and no products were sold. I had several people begging me to sell them a few of the parts needed for a project. Eventually, a friend started selling some stuff (prices were high, service poor), and so I took it over. Satisfaction with the site has improved dramatically since then!

      Still, it's a fuzzy area between pure information and purely commercial, or advertising or spam.

      I can tell you it's a lot more work crafting really good web pages than just writing a check to a seedy SEO company. But if these ranking algorithms really do improve to perfection, the response is probably going to be more and more pages appearing in that gray region. Increasing sales can pay for a lot of man hours to author more material that's compelling for visitors and truly does help them solve their problems (especially if they buy the described products).

      So, in a best case scenario, these algorithms reaching perfection (seems unlikely) is probably going to lead to a lot more very good content, but content that revolves around pitching products (e.g., infomercials), and not "pure information".

      • If it helps, I found your stuff a few years ago just by searching for mp3 player boards. Neat stuff.

        And no need for the slimesucking SEO companies.

        (No, haven't bought one yet since the car I want to use it in has 6V electricals and I don't know enough about sparks to know if it would work, etc., but I do keep you bookmarked.)
      • Your site is pure information, in the sense that it's not a page full of links trying to make other pages rank better. That's what I got: good information can be commercial. If I want to buy a car, Google should be able to help me and give me interesting car-buying pages (maybe I can order it over UPS). :-D
  • by protoshoggoth ( 588994 ) on Tuesday April 26, 2005 @07:41AM (#12346949)
    Given that one of the authors of the referenced paper is an employee of Yahoo, I have to wonder if whatever Google has in mind has anything whatsoever to do with the trustrank scheme we're talking about here. I mean, all we know is they trademarked the word, nothing more.
  • Trustrank explained (Score:4, Informative)

    by broothal ( 186066 ) <christian@fabel.dk> on Tuesday April 26, 2005 @07:42AM (#12346957) Homepage Journal
    TrustRank is basically the same as resetting PageRank.

    What happens is that humans select some webpages which they trust. The idea is that these trustworthy webpages only link to good sites. So the trustworthy webpages are used as the seed for a regular webcrawler.

    At first glance, this looks like a low-pass filter to me, i.e. the same result could be achieved by cutting all PR 5 sites.
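
    (Editorial aside: a minimal Python sketch of that seed-and-propagate idea, in the spirit of biased PageRank. The toy graph, the seed list, the iteration count, and the 0.85 damping factor are illustrative assumptions, not values taken from Google.)

    # site -> list of sites it links to (invented toy graph)
    links = {
        "gov.example":   ["good.example", "news.example"],
        "good.example":  ["news.example"],
        "news.example":  ["good.example"],
        "spam.example":  ["spam2.example"],
        "spam2.example": ["spam.example"],
    }
    seeds = ["gov.example"]    # hand-vetted trustworthy sites
    ALPHA = 0.85               # fraction of trust passed along links

    # Start with all trust massed on the seeds (the "reset" distribution).
    seed_dist = {s: (1.0 / len(seeds) if s in seeds else 0.0) for s in links}
    trust = dict(seed_dist)

    for _ in range(20):
        nxt = {s: (1 - ALPHA) * seed_dist[s] for s in links}
        for src, outs in links.items():
            for dst in outs:
                nxt[dst] += ALPHA * trust[src] / len(outs)  # split over outlinks
        trust = nxt

    # The spam cluster, unreachable from any seed, ends up with zero trust.
    for site in sorted(trust, key=trust.get, reverse=True):
        print(f"{site:14s} {trust[site]:.4f}")

    Because the reset distribution sits on the hand-picked seeds rather than on all pages equally, clusters that only link among themselves never accumulate trust, which is the filtering effect described above.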
  • bah, if they could include a green-orange-red light in their toolbar... Go to your bank's website to "verify" your password and a little red light starts flashing in your toolbar? Could be good.
  • by Underholdning ( 758194 ) on Tuesday April 26, 2005 @07:44AM (#12346968) Homepage Journal
    The funny thing is that one of the authors of the TrustRank paper is from Yahoo.
  • by AndroidCat ( 229562 ) on Tuesday April 26, 2005 @07:49AM (#12346998) Homepage
    Since an entire industry of sleazeballs has grown up around tweaking* Google PageRank, I expect that we'll see quite a few lawsuits over Google changing how they figure out what order to present search results. (There have been a few over previous adjustments. [pandia.com]) Whine and cheese, sleazeballs. I'll stick with the popcorn and beer, thanks!

    * When I say sleazeballs and tweaking, I mean the people who will try outrageous stunts to game the system, rather than the consultants who will help you increase rank by the stunning tactic of actually improving your site. Radical, but sometimes it works.

  • Gmail spam filter? (Score:3, Interesting)

    by thegnu ( 557446 ) <(moc.liamg) (ta) (ungeht)> on Tuesday April 26, 2005 @07:52AM (#12347021) Journal
    I read another post speculating that gmail users could be used as voters to choose trusted sites. Something that would probably actually work would be tagging domains that are received by a certain percentage of the gmail population and NOT marked as junk, and then giving them weight according to their percentage.

    Because we gmailers are picky.

    It would probably have to be integrated with something else, because I bet there are a few pr0n mailing lists that lots of people have.
  • by thbb ( 200684 ) on Tuesday April 26, 2005 @07:56AM (#12347047) Homepage
    The google-watch page on PageRank [google-watch.org] already mentions how PageRank, over the years, has switched from an actual score of popularity (number of links to a page) to a TrustRank-like index based on the reputability of the links to a page. This makes it much harder for a newbie to get a good PageRank, and gives far too much power to the owners of old websites and corporate pages.

    Even though it contains way too much rant for my taste, google watch [google-watch.org] is worth a full read by all /.ers.
  • by jonr ( 1130 )
    Are we now reporting Google news from the future?
  • I read a very interesting article on the possible outcomes of a semantic web, and a google "trust rank" actually appeared in it.

    If "Google trusts fooPage" becomes a standard, recognised triplet, I see no reason why this won't be extended to "Google trusts userX", which becomes "ebay trusts userX" etc.

    It's very possible they're looking to the future, and have more in mind than "there's probably no pr0n on this page"...
  • Question. (Score:2, Interesting)

    by ceeam ( 39911 )
    If I search for "stoned whores" what sites should be considered trusted?
  • by mathmatt ( 851301 ) on Tuesday April 26, 2005 @08:06AM (#12347124) Homepage
    This [64.233.187.104] is weird. It is the 19th hit (on the second page) of a Google search for "trustrank" [google.com]. It requires a login from Google's results page, but Google's cache reveals a directory including the paper linked to by /.

    I guess we weren't supposed to read this. And you shouldn't have read *this*!
  • by tdvaughan ( 582870 ) on Tuesday April 26, 2005 @08:09AM (#12347152) Homepage
    It would be amazing if Google gave us the ability to assign trust values to sites that we ourselves trust. This way, for example, I might give Wikipedia or the BBC a 10/10 trust rating for all their off-site links (and set it so that links off the linked sites are at 50% of their parent trust rating etc.). If we could also subscribe to someone else's trust ratings then technically illiterate people could hand over the responsibility of managing their trust database to someone else. From first thoughts, this looks like it could solve the problem of malicious SEO.
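
    (Editorial aside: a toy Python sketch of the per-user scheme above, with hand-assigned ratings that decay by half per link hop. The sites and link graph are invented; the 50% decay comes from the comment.)

    my_ratings = {"bbc.example": 1.0, "wikipedia.example": 1.0}
    outlinks = {                       # invented link graph
        "bbc.example": ["news.example"],
        "news.example": ["blog.example"],
        "wikipedia.example": ["blog.example"],
    }
    DECAY = 0.5                        # trust halves at each hop

    def personal_trust(ratings, hops=3):
        trust = dict(ratings)
        frontier = dict(ratings)
        for _ in range(hops):
            nxt = {}
            for site, t in frontier.items():
                for dst in outlinks.get(site, []):
                    passed = t * DECAY
                    if passed > trust.get(dst, 0.0):  # keep the best path
                        trust[dst] = passed
                        nxt[dst] = passed
            frontier = nxt
        return trust

    print(personal_trust(my_ratings))
    # {'bbc.example': 1.0, 'wikipedia.example': 1.0,
    #  'news.example': 0.5, 'blog.example': 0.5}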
    • There is. It's called a bookmark. Look for Google to buy out del.icio.us (or however you put the "." in there).
    • One way they could do this would be to compare how often a link at each ranking position gets clicked against the average for that position. A lot of people can tell a spam site just from reading the google description; those sites won't be clicked on as much, even if they show up early in the rankings.

      Say the first listing is normally clicked 70% of the time, the second 20% of the time, the third 10% of the time. If you have a set of search results with click rates of 30%, 50%, and 5%, then you could say that the fi
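
      (Editorial aside: a tiny Python sketch of that anomaly test. The baseline click shares follow the 70/20/10 example above; the tolerance knob is invented.)

      baseline = [0.70, 0.20, 0.10]      # expected click share by position

      def suspicious(observed, tolerance=0.6):
          """Flag positions clicked far less often than their slot predicts."""
          flags = []
          for pos, (expect, actual) in enumerate(zip(baseline, observed), 1):
              if actual < expect * tolerance:
                  flags.append(pos)
          return flags

      print(suspicious([0.30, 0.50, 0.05]))   # -> [1, 3]: users skip #1 and #3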
  • A possible system? (Score:2, Informative)

    by chrima ( 879051 )
    Couldn't they just look for links in gmail messages and use those as weights in a trust system?

    Links in messages identified as spam could be given a negative weight. That weight could be determined by the number of people identifying messages with that link as spam. Links from those sites would be given less trust than a completely unknown page, unless they are positively weighted themselves or linked to by a positively weighted site. Links found in non-spam messages could be given positive weights
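
    (Editorial aside: a rough Python sketch of folding those spam/non-spam votes into per-domain weights. The message records and the scoring rule are invented for illustration.)

    from collections import defaultdict
    from urllib.parse import urlparse

    # (url found in a message, whether users marked the message as spam)
    observations = [
        ("http://pills.example/buy",  True),
        ("http://news.example/story", False),
        ("http://pills.example/now",  True),
        ("http://news.example/other", False),
    ]

    votes = defaultdict(lambda: [0, 0])   # domain -> [spam, non-spam]
    for url, is_spam in observations:
        votes[urlparse(url).netloc][0 if is_spam else 1] += 1

    def domain_weight(domain):
        """Negative for spam-heavy domains, positive for clean ones,
        zero for a completely unknown page, as suggested above."""
        spam, ham = votes.get(domain, (0, 0))
        total = spam + ham
        return 0.0 if total == 0 else (ham - spam) / total

    for d in ("pills.example", "news.example", "unknown.example"):
        print(d, domain_weight(d))   # -1.0, 1.0, 0.0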
  • This isn't an original idea, but I can't remember where I most recently read about the concept, so I'll go ahead and say it's mine:

    Trust for things like email senders and web sites shouldn't be centralized. My web of trusted entities, which should be easy to maintain (unlike, say, blacklists or whitelists) and should evolve semi-automatically, should be based on the interaction of my trusted sites/entities, and, in turn, their trusted sites/entities. Sort of like TrustRank, but where each person determin
    • This isn't an original idea, but I can't remember where I most recently read about the concept, so I'll go ahead and say it's mine:

      Well... in that case, I don't trust you. trustrank=0. Next message, please.

      • There you go. See how well that works? Now your searches won't return all of my "How to steal ideas for fun and profit" sites [sco.com].

        Now it occurs to me that you may *want*, for research purposes, say, to search for untrusted sites, so that should be a search parameter.
  • by NigelJohnstone ( 242811 ) on Tuesday April 26, 2005 @09:27AM (#12347741)
    I've read it, but it sounds mixed up. Isn't the ideal result from a search engine:

    (Matches - spam - off-topic), sorted by relevance

    not

    Matches sorted by f(pagerank, trustrank)

    Google used PageRank + on-page text as a measure of how relevant a page is, but that's not reliable anymore because the set contains spam pages.

    The 'trusted' value tells you nothing about relevance; it only gives the likelihood of the page being spam or not spam. If it's spam you want it removed; if it's not spam, then its PageRank determines its relevance, not some function of PageRank and TrustRank.

    I.e., they should not promote or demote pages based on TrustRank; they should simply define a cutoff value K: if the trust is less than K, then it's likely spam and should be removed.

    Since spam follows money terms, they should have K(keyphrase), so they can change the value of K for each keyphrase to remove the spam. Otherwise they will filter non-money terms where no spam exists, and there their algorithm can only do harm!
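
    (Editorial aside: a toy Python version of that cutoff rule, where trust acts only as a spam filter with a per-keyphrase threshold K and never blends into the ranking. All sites, scores, and terms are invented.)

    trust = {"a.example": 0.9, "b.example": 0.4, "c.example": 0.02}

    def cutoff(keyphrase):
        """Higher K for commercial "money" terms, zero elsewhere."""
        money_terms = {"viagra", "casino", "mortgage"}
        return 0.1 if set(keyphrase.split()) & money_terms else 0.0

    def filter_results(results, keyphrase):
        """results: (site, relevance) pairs already sorted by relevance."""
        k = cutoff(keyphrase)
        return [(s, r) for s, r in results if trust.get(s, 0.0) >= k]

    hits = [("a.example", 0.8), ("c.example", 0.7), ("b.example", 0.5)]
    print(filter_results(hits, "cheap viagra"))   # drops c.example
    print(filter_results(hits, "fish breeding"))  # keeps everything, same order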

    • Suppose you had the perfect Oracle that could check every search result and clean it of spam.

      Ranking by on-page text, links, etc. (the signals that make a page relevant or not) gives you:

      A. 1st most relevant.
      B. 2nd most relevant
      C. Spam
      D. 3rd most relevant
      E. 4th most relevant.
      F. Spam

      After your Oracle has hand checked every site you get:

      A. 1st most relevant.
      B. 2nd most relevant
      C. 3rd most relevant
      D. 4th most relevant.

      Not:

      A. 10th most relevant
      B. 2nd most relevant
      C. 8th most relevant
      D. 5th most relevant

      Rankin
  • maybe for Gmail (Score:2, Informative)

    by C_Lo_Fresh ( 700907 )
    I think TrustRank would be more useful in Gmail to give a reading on how "spammy" an email is. They already have something like it, where a box shows up warning you that the sender may have spoofed their address.
  • hmmm ...... (Score:2, Interesting)

    by thempstead ( 30898 )
    ... would be nice if you could use adblock-style filtering on Google search results; then if you wanted to get rid of certain results (e.g. from blog or "sales" sites), you could block their domains.

    Probably wouldn't be that difficult to get around it, but might help a bit

    t
