Hits or Misses: Who is Your Website's Audience?

securitas writes "The Christian Science Monitor's Gregory M. Lamb wrote a story of interest to anyone who runs a website: How do you accurately and reliably measure the audience for your website? From the article: 'Most websites have no idea how many people view their content. This inherent fuzziness is causing problems for commercial websites, especially online publications desperate to make money from Internet advertising... How can you charge for ads when it's nearly impossible to tell advertisers how many people will see them?' The article discusses the problems with Nielsen/NetRatings and comScore Media Metrix - both grossly undersample workplace users - and the rise in the number of sites requiring user registration."
  • Not too hard (Score:1, Interesting)

    by Anonymous Coward on Monday June 21, 2004 @09:51AM (#9483680)
    1. Cookies
    2. IP addresses
    3. Mandatory registration
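    A minimal sketch of the first of those, a persistent visitor-ID cookie, assuming a plain Python WSGI app; the cookie name "visitor_id" and the one-year lifetime are illustrative choices, not any particular site's implementation:

        # Count a browser once by assigning it a persistent visitor ID.
        import uuid
        from http import cookies

        def app(environ, start_response):
            jar = cookies.SimpleCookie(environ.get("HTTP_COOKIE", ""))
            if "visitor_id" in jar:
                visitor = jar["visitor_id"].value  # returning browser
                headers = [("Content-Type", "text/plain")]
            else:
                visitor = uuid.uuid4().hex  # first visit: mint an ID
                headers = [
                    ("Content-Type", "text/plain"),
                    ("Set-Cookie",
                     f"visitor_id={visitor}; Max-Age=31536000; Path=/"),
                ]
            start_response("200 OK", headers)
            return [f"hello, visitor {visitor}\n".encode()]

    Counting distinct visitor_id values in the logs then approximates unique visitors, at least until users clear their cookies.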
  • Holds true for me (Score:5, Interesting)

    by tuomasr ( 721846 ) on Monday June 21, 2004 @10:15AM (#9483938)

    I found this article rather insightful. I run a small IT/science news site (in Finnish) and I'm really having a hard time figuring out who its visitors are. Of course I can get some data from log-analysis software (the site uses awstats and webalizer), but it doesn't really tell me what I want to know, and the logs don't always tell the truth. For example, I'm getting about 20-30 hits a day with a referrer pointing to a blog search engine (${god} knows why my site has been tagged as a blog), but browsing the actual logs reveals that those hits belong to that site's indexing robot, which is a little too enthusiastic.
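    One way to sanity-check what awstats or webalizer report is to split the raw hits into likely-robot and likely-human traffic yourself. A rough sketch, assuming Apache "combined" log format; the log path and the BOT_HINTS keyword list are illustrative assumptions:

        # Bucket access-log hits by user-agent: likely robots vs. humans.
        import re
        from collections import Counter

        COMBINED = re.compile(
            r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
            r'(?P<status>\d{3}) (?P<size>\S+) '
            r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
        )
        BOT_HINTS = ("bot", "crawler", "spider", "slurp")  # crude heuristic

        human, robot = Counter(), Counter()
        with open("access_log") as fh:
            for line in fh:
                m = COMBINED.match(line)
                if not m:
                    continue
                is_bot = any(h in m["agent"].lower() for h in BOT_HINTS)
                (robot if is_bot else human)[m["referrer"]] += 1

        print("top referrers (humans):", human.most_common(5))
        print("top referrers (robots):", robot.most_common(5))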

    The most reliable way to find out about a site's visitors would be a user survey. It wouldn't be complete, since not everyone would fill it out, but it would give an idea of the habits of your most frequent visitors. If I were an advertiser, I would want more than raw hit and visit counts; most advertisers would be baffled by statements like "we got XXXYYYZZZ HTTP requests last month". Personally I would prefer to advertise on sites with a well-built sense of community and an active userbase that's keen to interact with the website. When I browse a site for the first time, or one I visit infrequently, I rarely click on banners or ads. I'm more prone to clicking ads on sites I visit daily; it gives me a feeling of supporting a site I like, and I just might buy something from the advertiser if they offer something I need. Focused advertising is the key, and for that, once again, you need to know your users.

    Logs tell you numbers, but you need the visitors themselves to tell you who they really are and how often they visit your site.

  • by Moryath ( 553296 ) on Monday June 21, 2004 @10:18AM (#9483975)
    Alexa's model is interesting - they hand out a "free" toolbar that gives you Google search while pinging Alexa and showing you every page's Alexa rank.

    Unfortunately, the toolbar also slows down your browsing (especially if you're on dialup). And the more tech-savvy a user is, the less likely they are to want that toolbar on their system, so tech sites will always be depressed in those rankings.

    Alexa also can't tell a subdomain from a regular domain - so subpages of IGN.com or UGO wind up just increasing IGN or UGO's rank, and blogs hosted at X.BlogHost.Com just raise BlogHost.com's rank without being able to tell what the particular blog's rank might be.

    Finally, the biggest flaw in Alexa's ranking system is that it's based on voluntary input; rather than finding 'Net users and trying to get a representative sample (which is the goal of the Nielsen TV setup), they take anyone who'll put in their toolbar. Sure, they can get a pretty large number of idiots to install the thing, but they're still idiots - there are demographics that the toolbar just won't get adopted by in that fashion.

    The other sad thing is, there are companies that use Alexa's page rankings to decide how much they'll pay for ads. Go figure.
  • by Lord Zerrr ( 237123 ) on Monday June 21, 2004 @10:18AM (#9483976)
    I use webalizer, cookies, and two stats packages for my CMS system (geeklog). One stats package only the admin has privileges to; it gives me very detailed, accurate info such as time, IP, which page was viewed, referrers, UID (user ID), links followed, country, browser, platform, etc. All open source. Does the job for me.
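    For illustration, the kind of per-hit record such a stats package keeps can be sketched in a few lines. This is a hypothetical schema using SQLite, not geeklog's actual tables; every name in it is an assumption:

        # Record one row per page view with the fields mentioned above.
        import sqlite3
        import time

        db = sqlite3.connect("hits.db")
        db.execute("""CREATE TABLE IF NOT EXISTS hits (
            ts       REAL,  -- unix timestamp of the request
            ip       TEXT,  -- client address
            page     TEXT,  -- which page was viewed
            referrer TEXT,  -- where the visitor came from
            uid      TEXT,  -- logged-in user ID, if any
            agent    TEXT   -- raw user-agent (browser / platform)
        )""")

        def record_hit(ip, page, referrer, uid, agent):
            db.execute("INSERT INTO hits VALUES (?, ?, ?, ?, ?, ?)",
                       (time.time(), ip, page, referrer, uid, agent))
            db.commit()

        record_hit("192.0.2.1", "/article/42", "http://example.com/",
                   "anon", "Mozilla/5.0")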
  • Re:Easy (Score:5, Interesting)

    by blowdart ( 31458 ) on Monday June 21, 2004 @10:23AM (#9484012) Homepage
    While that's almost an amusing troll, I've noticed a trend recently where fake referrals are sent to random pages. I would guess this is to boost Google page rankings, since some people publish lists of referring sites on a crawlable page. In the last two weeks a certain Canadian IP sent fake referrals for various pages on
    • www.spankarchive.com
    • www.spanking-adult.com
    • www.spanking-porn.com
    • www.spanking-punishment.com
    • www.spankingstories.us
    • www.spankphotos.com
    • www.spankpics.net

    Their ISP killed their account after 3 reported strikes.

    Then there's em3.net, a scumware site that tried this last year. Following the links triggered attempted spyware downloads.

    (If anyone is truly interested I have a partial list at http://idunno.org/misc/referralSpammers.aspx [idunno.org])
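    One heuristic for catching this sort of referral spam is to flag a single client IP that claims many unrelated referring domains: a real visitor arrives from only one or two places, while a spammer cycles through the whole list of sites it wants to promote. A hedged sketch; the threshold of 5 and the (ip, referrer) input format are assumptions:

        # Flag IPs whose hits claim suspiciously many referring domains.
        from collections import defaultdict
        from urllib.parse import urlparse

        def suspicious_ips(entries, threshold=5):
            """entries: iterable of (ip, referrer_url) pairs from a log."""
            domains_by_ip = defaultdict(set)
            for ip, ref in entries:
                host = urlparse(ref).hostname
                if host:
                    domains_by_ip[ip].add(host)
            return {ip: hosts for ip, hosts in domains_by_ip.items()
                    if len(hosts) >= threshold}

        sample = [("203.0.113.9", f"http://spam{i}.example/")
                  for i in range(7)]
        print(suspicious_ips(sample))  # flags the single spamming IP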

  • by yppiz ( 574466 ) on Monday June 21, 2004 @11:34AM (#9484718) Homepage
    Moryath writes:
    Alexa also can't tell a subdomain from a regular domain - so subpages of IGN.com or UGO wind up just increasing IGN or UGO's rank, and blogs hosted at X.BlogHost.Com just raise BlogHost.com's rank without being able to tell what the particular blog's rank might be.
    I wrote much of Alexa's early traffic counting software (I worked there in the late 1990s).

    Your description is partly right. Alexa "rolls up" clicks on subdomains into the domain, so clicks on www1.foo.com, www2.foo.com, and www3.foo.com all count towards foo.com.

    Alexa does this primarily to deal with site mirrors, but also because some sites create subdomains for various functions related to serving pages. So someone interested in Google's overall popularity might prefer to see gmail.google.com, news.google.com, and www.google.com as one site, and not three.

    That said, the site counting software has (or at least had, I don't know if this is still true) rules for detecting home pages as stats-worthy sites independent of their domains. For instance, any URL with a tilde after the domain, like www.foo.com/~bar, has its own statistics. Similarly, there are special rules for recognizing "home pages" on domains like AOL and other big ISPs.
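    A rough reconstruction of those two rules, purely for illustration: subdomains collapse into the parent domain, while tilde home pages get their own bucket. The two-label domain heuristic here is an assumption (real registrable-domain logic needs a public-suffix list):

        # Map a URL to the "site" it should be counted under.
        from urllib.parse import urlparse

        def site_key(url):
            parts = urlparse(url)
            host = (parts.hostname or "").lower()
            domain = ".".join(host.split(".")[-2:])  # www1.foo.com -> foo.com
            if parts.path.startswith("/~"):          # www.foo.com/~bar
                user = parts.path.split("/")[1]      # "~bar"
                return f"{domain}/{user}"            # its own statistics
            return domain

        for u in ("http://www1.foo.com/a", "http://news.google.com/",
                  "http://www.foo.com/~bar/page.html"):
            print(u, "->", site_key(u))
        # prints foo.com, google.com, foo.com/~bar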

    It's a huge problem deciding what people consider to be websites - it borders on serious AI. For instance, is each Sourceforge project a separate site? How about several subdirectories off of someone's home page, each with a very different focus?

    If you think that your favorite domain should be divided into sites, and that it isn't happening correctly in the Alexa toolbar, you might try sending email to Alexa and asking them to take a look.

    Finally, the biggest flaw in Alexa's ranking system is that it's based on voluntary input; rather than finding 'Net users and trying to get a representative sample (which is the goal of the Nielsen TV setup), they take anyone who'll put in their toolbar. Sure, they can get a pretty large number of idiots to install the thing, but they're still idiots - there are demographics that the toolbar just won't get adopted by in that fashion.
    I am not familiar with Nielsen's current methodology, but I was unimpressed by their marketing claims when they first started their web metrics. At the time (late 1990s) I believe they were claiming a representative sample of the Internet, even though their sample was 1) tiny, and 2) made up of volunteers. I can't say what goes on inside Nielsen, or any other web ratings company, today; companies may have very careful statisticians on the inside, but the caveats and possible biases often get stripped out by the marketing department. The moral of this story: assume that any web rating (or television rating, for that matter) is biased, and understand those biases as well as you can.

    --Pat / zippy@cs.brandeis.edu
