
New Method of Tracking UIP Hits?

smurray writes "iMediaConnection has an interesting article on a new approach to web analysis. The author claims that he is describing 'new, cutting edge methodologies for identifying people, methodologies that -- at this point -- no web analytics product supports.' What's more interesting, the new technology doesn't seem to be privacy intrusive." Many companies seem unhappy with the accepted norms of tracking UIP results. Another approach to solving this problem was also previously covered on Slashdot.
  • uhm, what? (Score:3, Funny)

    by Prophetic_Truth ( 822032 ) on Tuesday August 23, 2005 @01:31AM (#13377417)
    new, cutting edge methodologies for identifying people....the new technology doesn't seem to be privacy intrusive

    The Wookie defense in action!
    • ...the new technology doesn't seem to be privacy intrusive...

      Give me a break. How can this be possible when the approach suggests using multiple tests rather than one, ranging from dated cookies to IP addresses and Flash Shared Objects?

      Their approach seems to be common sense. I believe most sites worth their salt do not use just one metric. Maybe if someone can get hold of the research paper and post it, then we can see if their implementation is really revolutionary. Another problem is that

      • Re:uhm, what? (Score:5, Insightful)

        by Shaper_pmp ( 825142 ) on Tuesday August 23, 2005 @06:13AM (#13378119)
        "Their approach seems to be common-sense."

        Their suggestion may be common-sense, but their approach borders on messianic:

        "This article is going to ask you to make a paradigm shift... new, cutting edge methodologies... no web analytics product supports... a journey from first generation web analytics to second."

        Followed by a lengthy paragraph on "paradigm shifts". In fact, the article takes three pages to basically say:

        "In a nut-shell: To determine a web metric we should apply multiple tests, not just count one thing."

        Here's a clue, Brandt Dainow - It's a common-sense way of counting visitors, not a new fucking religion.

        The basic approach is to use a selection of criteria to assess visitor numbers - cookies first, then use different IPs/userAgents with close access-times to differentiate again, etc.

        The good news is there are only three problems with this approach. The bad news is that they make it effectively useless, or certainly not much more useful than the normal method of user-counting:

        Problem 1
        There is no information returned to a web server that isn't trivially gameable, and absolutely no way to tie any kind of computer access to a particular human:

        "1. If the same cookie is present on multiple visits, it's the same person."

        Non-techie friends always want to buy things from Amazon as a one-off, so I let them use my account. Boom - that's up to twenty people represented by one cookie, right there.

        "2. We next sort our visits by cookie ID and look at the cookie life spans. Different cookies that overlap in time are different users. In other words, one person can't have two cookies at the same time."

        Except that I habitually leave my GMail account (for example) logged in both at work and at home. Many people I know use two or more "personal" computers, and don't bother logging out of their webmail between uses. That's a minimum of two cookies with overlapping timestamps right there, and only one person.

        "3. This leaves us with sets of cookie IDs that could belong to the same person because they occur at different times, so we now look at IP addresses."

        This isn't actually an operative step, or a test of any kind. It's just a numbered paragraph.

        "4. We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour."

        FFS, has this guy ever touched a computer? For someone writing on technology he's pretty fucking out of touch. As an example, what about people who routinely telnet+lynx, VMware or PCAnywhere right across the world, hundreds of times in their workday? Sure, maybe most normal users don't (yet), but for some sites (eg, nerd-heavy sites like /.), it's likely enough to start skewing results.

        "5. This leaves us with those IP addresses that can't be eliminated on the basis of geography. We now switch emphasis. Instead of looking for proof of difference, we now look for combinations which indicate it's the same person. These are IP addresses we know to be owned by the same ISP or company."

        Except that one ISP can serve as many as hundreds of thousands of users. And proxy gateways often report one IP for all the users connected to them. For example, NTL reports one "gateway" IP for all the people in my town on cable-modems - that's thousands, minimum. So, we're looking at a potential error magnitude of 100-100,000. That's no better than the existing system for assessing unique visitors.

        "6. We can refine this test by going back over the IP address/Cookie combination. We can look at all the IP addresses that a cookie had. Do we see one of those addresses used on a new cookie? Do both cookies have the same User Agent? If we get the same pool
      • ranging from dated cookies to IP addresses and Flash Shared Objects?

        What about those of us who kill our cookies at the end of every session and who don't use Flash? How are they going to find out if it's me or someone else?

        No cookies, no information. To them I'm a unique individual every single time. The only thing they could possibly track down would be information from cookies which already exist on my system from other sites and try to decipher that information.

    • Re:uhm, what? (Score:5, Insightful)

      by mwvdlee ( 775178 ) on Tuesday August 23, 2005 @03:25AM (#13377721) Homepage
      Since their "cutting edge methodology" is basically all the previous methods botched together, how can it ever be LESS privacy intrusive than the methods it's made up of?
  • CPUID (Score:4, Funny)

    by frinkacheese ( 790787 ) * on Tuesday August 23, 2005 @01:36AM (#13377434) Journal

    Sending your PC's unique CPUID along with every HTTP request would be ideal for this. You could also group up websites and use this to track people across websites. It would be great for marketing and for law enforcement.

    Oh, you all disabled your nice Intel CPUID? Why ever would you want to do that?
    • I don't even think there is a MB manufacturer that ships with the CPUID turned on anymore...
    • Re:CPUID (Score:4, Interesting)

      by KillShill ( 877105 ) on Tuesday August 23, 2005 @01:45AM (#13377458)
      Treacherous/Insidious Computing to the rescue.

      no need for cpu id's when your entire system and its OS will generate a 128bit id for you. and give them out to "trusted" "partners".

      remote attestation never sounded so good.
      • no need for cpu id's when your entire system and its OS will generate a 128bit id for you. and give them out to "trusted" "partners".

        Which Linux distro does this? I'd like to avoid them.
        • Which Linux distro does this? I'd like to avoid them.

          What happens when the Linux distros that support custom HTTP happen to be the only Linux distros supported by your ISP's DHCP server? Once the major cable and telephone companies begin to require support for "Trusted" Computing before they'll give you an IP address, will you go back to dial-up to escape Trusted Network Connect [slashdot.org]?

        • The Linux kernel has had driver support for Trusted Platform Module (TPM) chips since 2.6.12.

          Gentoo [gentoo.org] appears to be the distro leading the charge.

    • Sending your PC's unique CPUID along with every HTTP request would be ideal for this.

      I understand that you are being ironic, but in fact, CPUID won't be a silver bullet either. These researchers are trying to count the number of different people visiting the site, not the number of different CPUs.

      • Indeed, but generally I would say that 1 person = 1 cpu, apart from shared cpus such as in schools, web cafes and such. But I guess that a combination of IP address and browser information can pretty much do that already.

        OK, so what is really needed is an RFID implant - take yer CPUID with you, then software can really be licensed to a PERSON rather than a processor. Pay Amazon every time you click(tm) on a link(tm).
        • Re:CPUID (Score:4, Insightful)

          by aussie_a ( 778472 ) on Tuesday August 23, 2005 @03:10AM (#13377685) Journal
          Indeed, but generally I would say that 1 person = 1 cpu

          Not really. I surf the internet at home and at school. I imagine I'm not alone. So I would be registered as two different people.

          Indeed, but generally I would say that 1 person = 1 cpu, apart from shared cpus such as in schools, web cafes and such

          You forgot "pretty much anyone who doesn't alive alone and has a computer with internet access at home." Let's not forget that tiny percentage of people (I know, most slashdotters visit slashdot while avoiding work, but there are people out there who have families that have more then one person using a single computer. It's crazy I know).
    • You will never know for 100% sure, unless your site DISALLOWS all unauthenticated users, and shows zero content except a signup page.

      So a rough guess of

      UNIQUE USERS = (UNIQUE IPS - REAL UNIQUE LOGINS) /2

      will be spot on across the average of the whole planet for all 2 billion websites.

      Mmmm gota luv statistics. Averages are your friend.
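
      The parent's formula is trivial to run. A toy rendering, with input numbers invented purely for illustration:

```python
# The parent's back-of-the-envelope formula, verbatim:
#   UNIQUE USERS = (UNIQUE IPS - REAL UNIQUE LOGINS) / 2
# The input numbers below are invented for illustration.
def rough_unique_users(unique_ips: int, unique_logins: int) -> float:
    # Halve the non-logged-in IPs as a crude allowance for shared
    # and dynamic addresses.
    return (unique_ips - unique_logins) / 2

print(rough_unique_users(10_000, 1_200))  # -> 4400.0
```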

  • UIP? (Score:5, Funny)

    by XanC ( 644172 ) on Tuesday August 23, 2005 @01:36AM (#13377436)
    I tried to find out for myself, I really did. I can't figure out if any of these dictionary.com results apply. This is the complete list, and none of them seemed to fit. There's one kind of humorous one...

    International Union of Private Wagons
    Quimper, France - Pluguffan (Airport Code)
    Ultimate Irrigation Potential
    Uncovered Interest Parity
    Undegraded Intake Protein
    United International Pictures
    Universidad Interamericana de Panamá
    Unusual Interstitial Pneumonitis
    Upgrade Improvement Program
    Urinating In Public
    User Interface Program
    USIGS Interoperability Profile
    Usual Interstitial Pneumonia of Liebow
    Utilities Infrastructure Plan

    • Unique IP? I think? Just a guess from context...
    • International Union of Phlebology
      Paraguayan Industrial Union
      UCAR Intellectual Property
      Unintended Pregnancies
      Union Interparlementaire
      Universal Immunization Program
      University Interaction Program
      Update In Progress
      Urban Indicators Program
      Utility Interface Panel...

      ...and that's enough for now. Bedtime for John.
    • Re:UIP? (Score:3, Informative)

      by 1u3hr ( 530656 )
      And strangely enough, this acronym isn't used in TFA at all. In fact, if the submitter did mean "Unique IP" that's not at all what the article is about (after all, that's trivial to record). They're looking for the number of unique individuals, and trying to deduce that from Cookies, IP, and other data.

      Unique Individual? P???

    • Could be "Unique Individual People" I suppose, but this is a classic example of the rule that all acronyms (other than those in universal use) should be explained on first use.
    • User Identification Persistence? Something that allows you to track users (à la a cookie), but is persistent in some way?
    • I was learning communications in a class with a professor doing a pretty poor job. He mentioned "CTS" without saying what it meant. I had a lappy in front of me to view the lecture slides, so I did a quick search. Finally I knew that when a communication needs to occur, a computer sends out a Cattle Tracking System.
  • Step 4. . . (Score:5, Insightful)

    by SpaceAdmiral ( 869318 ) on Tuesday August 23, 2005 @01:36AM (#13377438) Homepage
    We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

    If my company had computers in New York and Tokyo, I could ssh between them in much less than 60 minutes. . .
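
    For reference, the "impossible travel" test in step 4 is easy to sketch. A minimal Python version, assuming you already have IP-to-coordinate lookups; the 900 km/h airliner ceiling is an arbitrary choice here:

```python
from math import radians, sin, cos, asin, sqrt

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance via the standard haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))  # Earth radius ~6371 km

def same_person_possible(p1, p2, hours_apart, max_kmh=900):
    """False only when the trip would outrun an airliner. As the parent
    notes, ssh/VPNs/proxies defeat this test entirely."""
    return km_between(*p1, *p2) <= max_kmh * max(hours_apart, 0)

# New York -> Tokyo is ~10,850 km; doing it in 1 hour fails the test.
print(same_person_possible((40.71, -74.0), (35.68, 139.69), 1))  # False
```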

    • What percentage of people do you think do that?

    • After some thought, I'd probably agree that step 4 is valid for the vast majority of web users.

      The only way this might break is if a large number of people are sitting behind a proxy/cache. But if that is the case, they have fallbacks.
    • If my company had computers in New York and Tokyo, I could ssh between them in much less than 60 minutes. . .

      The point is, most people wouldn't do that, and those who do wouldn't be significant enough to skew the metrics too badly.

      However, having said that, it is quite possible to have a network configured for high availability such that if you lose your local internet link, traffic gets routed via your internal network out another internet link in another office. Frequently this office is in another country

      • However, having said that, it is quite possible to have a network configured for high availability such that if you lose your local internet link, traffic gets routed via your internal network out another internet link in another office. Frequently this office is in another country...

        Even more frequently your "internal" link to the remote office is via VPN over your local link, so this isn't really an option. Redundant local links are more likely.

    • A corporate WAN with multiple routes to the internet and load-balanced http proxies would do it, too.
    • So if only 1% of users are like you, then we will take all hits from your 'case', divide 'em by 100, and add that to our unique users count. There you will be counted, but fairly. It's all about having 2 decision trees, counting both totals, and using a percentage of each for your final tally. A great accurate result at a low resolution of time (think audio kHz; 1 hour = ~0.00028 Hz). So just as a photo of mars showing the whole thing - vs a photo of a one tiny rock. A wider low res gives us a new 'glob
  • Field test (Score:2, Funny)

    by enoraM ( 749327 ) *
    iMediaConnection starts a huge field test of tracking unique slashdot readers with their cutting edge technologies.
  • by elronxenu ( 117773 ) on Tuesday August 23, 2005 @01:42AM (#13377449) Homepage
    He fails to consider the possibility of the same user using different browsers (and hence the same IP address, but different cookies, and a different browser identification string).

    So you can use probabilistic means to identify unique visitors. That's not a paradigm shift, except for those whose paradigms are already very small.

    Somehow I don't think this research is worthy of an NDA.

    • Mod this parent up.

      I don't mean to be a poo poo here, but this isn't as huge a deal as the author has made it sound (i.e. it certainly is not a "paradigm shift").

      Instead, what we have here is an evolutionary suggestion in how we can track users more accurately. Kudos.

      As with all solutions in CS, there are problems. As the parent has correctly observed, this doesn't solve the "multiple browsers, same user" problem (which is common -- you probably use a different computer at work than at home). I am not cer
      • While I agree this is hardly a paradigm shift, I think the poster is grasping at straws with his/her example. How many people surf between two browsers? I switch browsers when FF can't handle something. I migrate to a new browser every time something compelling comes along. How many people switch browsers in the same month?

        Computers, that might be a larger percentage. But even then more tests could be done. Message boards you distract yourself with at work that have a login system which sets an everlast
  • by Anonymous Coward on Tuesday August 23, 2005 @01:44AM (#13377454)
    "This way Flash can report to the system all the cookies a machine has held. In addition to identifying users, you can use this information to understand the cookie behavior of your flash users"

    I'm not sure what the Flash is, but to me, scanning all the cookies your computer has had IS privacy intrusive.

    • Not to mention a security flaw.

      When you visit my site, you agree to download and run a Flash/ActiveX control that grabs all your cookies for slashdot.org and sends them to me, so that I can now present false credentials to slashdot.org to make it think that I have auto-login privileges.

      Awesome design flaw there, but I highly doubt anyone is THAT stupid to put THAT big of a security flaw into a system.
      • by Moraelin ( 679338 ) on Tuesday August 23, 2005 @04:42AM (#13377889) Journal
        "I highly doubt anyone is THAT stupid to put THAT big of a security flaw into a system."

        Read the article, and the guy is proposing to build exactly that kind of a security flaw into the system.

        Flash can use, basically, some local shared storage on your hard drive. This isn't really designed as cookie storage, and doesn't have even the meager safeguards that cookies have (e.g., being tied only to a domain). It's really a space that _any_ Flash applet can read and write, and currently no one (with half a clue) puts any important data there.

        This guy's idea? Basically, "I know, let's store cookies there, precisely _because_ any other flash applet, e.g., our own again from a different page, can read that back again."

        Caveat: so can everyone else. I could make a simple Flash game that grabs everything stored there, just as you described, and sends it back to me. Including, yes, your session id (so, yes, I can take over your session on any site you were logged into, including any e-commerce sites or your bank) and anything else they stored there.

        Since it's used to track your movements through sites, depending how cluelessly that's programmed, I may (or may not) also be able to gather all sorts of other information about you.

        So in a nutshell his miracle solution is to build _exactly_ that kind of a vulnerability (not to mention privacy leak) into the system.

        So, well, that's the problem with assuming that "no one could be THAT stupid". Invariably when I say that, someone kindly offers himself as living proof that I'm wrong. Someone CAN be that stupid.
    • "I'm not sure what the Flash is"

      In this case I think the "Flash" being referred to is Macromedia's Flash plugin. He's not very clear though, is he?
    • by buro9 ( 633210 ) <david@buro9 . c om> on Tuesday August 23, 2005 @04:20AM (#13377839) Homepage
      Macromedia have a page that allows you to modify what sites can do on your computer in regards to Flash:
      http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager02.html#118539 [macromedia.com]
      • http://flashblock.mozdev.org/ [mozdev.org]

        Get it because it'll make you cool like everyone else (Go Go Gadget Peer Pressure!), keep it because you don't miss the ads and just one click brings up any content you do want, as well as whitelist features.

  • What's so new about this? How is this news? Very little substance to the article, plus I've been using IPs, Cookies and Logins to track people for a long time.
  • Paradigm shift ? (Score:3, Insightful)

    by l3v1 ( 787564 ) on Tuesday August 23, 2005 @01:49AM (#13377472)
    No single test is perfectly reliable, so we have to apply multiple tests.

    No kidding. This guy probably needs a wake up call.

    We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

    OK, so this is what's normally called a really stupid argument. I don't say that it can't be accounted for, but stating such a thing is nothing more than plain stupidity. Has this guy ever heard about that Internet thing?

    Flash can report to the system all the cookies a machine has held.

    Uhmm, not a great argument to make people use it.

    No one wants to know.

    I don't think they don't want to know. They just don't want to see a sudden drop of ~50% in their user count from one day to the next. And it really doesn't matter if it's the truth or not. A drop is a drop.

    • I don't think they don't want to know. They just don't want to see a sudden drop of ~50% in their user count from one day to the next. And it really doesn't matter if it's the truth or not. A drop is a drop.

      Replace the word "they" with "companies that are deriving revenue from web traffic." This guy makes his money from selling analytics software so that companies can track the success of their web sites and, based on tracking, make modifications, sell advertising, marketing, and so forth.

      Seeing a 30-50%

  • by JohnGrahamCumming ( 684871 ) * <slashdot.jgc@org> on Tuesday August 23, 2005 @01:51AM (#13377476) Homepage Journal
    What's more interesting, the new technology doesn't seem to be privacy intrusive

    The only mention of the word "privacy" on the linked web page is the term "Privacy Policy" at the bottom of the page.

    John.
  • crap again. (Score:4, Insightful)

    by gunix ( 547717 ) on Tuesday August 23, 2005 @01:52AM (#13377483)
    From the article:

    " We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour."

    Ever heard of ssh and similar tools to make that travel?
    And they put this on slashdot. Ignorance, just pure ignorance...
    • The majority of people using computers will never use ssh in their lives. It's not the perfect solution, but it's not complete crap either.
    • Not SSH, but web caches. If corporate insist that you browse via a cache setup that fails over from the New York link to the Tokyo link whenever the internal network conditions make it worthwhile, you'll merrily surf from all sorts of addresses.
  • by mattso ( 578394 ) on Tuesday August 23, 2005 @01:53AM (#13377488)
    They make some silly assumptions that I don't think work with users using proxy agents, but in the end it still boils down to the existence of cookies. Which would be ok, if the problem they are trying to solve wasn't that users are deleting and not storing cookies at all. They do mention using Flash to store cookies, which I suspect will have to be the next area users will have to start cleaning up. But just because cookies don't overlap in time and the IP address is the same doesn't mean it's the same person. A bunch of users that use the same browser and share an IP address that always delete their cookies with this system will look like one user. Vastly under counting. Which I don't think web sites are interested in. Vast over counting is profitable. Under counting, not so much.

    In the end there is no way they can even mostly recognize repeat web site visitors if the VISITOR DOESN'T WANT THEM TO.

    The big problem is stated at the top of the article:

    "We need to identify unique users on the web. It's fundamental. We need to know how many people visit, what they read, for how long, how often they return, and at what frequency. These are the 'atoms' of our metrics. Without this knowledge we really can't do much."

    If knowing who unique users are is that important they need to create a reason for the user to correctly identify themselves. Some form of incentive that makes it worth giving up an identification for.
    • AFAIK (someone correct me as I don't have a test machine right here) these programs don't delete ALL cookies; they delete _ad-based_ cookies. So, say, /. and Amazon will still have their cookies while known ad companies' cookies will be gone.
      The less effort it takes to make an account or log in, the less incentive it requires. Please go through as few steps as possible, with log-in and account creation on the same page as a reply box. Having that reply box on the same page is nice too. Go and read up on [wikipedia.org]
    • "We need to identify unique users on the web. It's fundamental. We need to know how many people visit, what they read, for how long, how often they return, and at what frequency. These are the 'atoms' of our metrics. Without this knowledge we really can't do much."

      So... radio doesn't exist?
  • Tragically flawed (Score:5, Insightful)

    by tangledweb ( 134818 ) on Tuesday August 23, 2005 @01:54AM (#13377490)
    The article's "Sky is Falling" tone rests on a single factoid. "30 to 55% of users delete cookies" therefore current analytics products are out by "at least 30 percent, maybe more".

    That is of course complete nonsense. Let's say we accept the author's assertion that different studies have given cookie deletion rates across that range. I can accept that a significant number of users might delete cookies at some point, but what percentage of normal, non-geek, non-tinfoil-hat-wearing users are deleting cookies between page requests to a single site in a single session? If it is 30%, then I will eat my hat.

    Most cookie deletion among the general populace is done automatically by anti-spyware software, and is not done in real time.

    The author clearly knows that even the most primitive of tools also use other metrics to group page requests into sessions, so even if 30% of users were deleting cookies, it would not result in a 30% inaccuracy.

    Of course "researchers propose more complex heuristic that looks to be slightly more accurate than current pracice" does not make as good a story as "paradigm shift" blah blah "blows out of the water" blah blah "We've been off by at least 30 percent, maybe more." blah blah.
    • If it's true that "30 to 55% of users delete cookies" and therefore current analytics products are out by "at least 30 percent, maybe more", then all they need to do is add 42.5% to their numbers and they'll be at most 12.5% out. Did I just shift a paradigm?
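
      The arithmetic behind the quip, taking the quoted range at face value: if the true error $e$ lies in the stated interval, correcting by the midpoint bounds the residual:

      $$ e \in [30\%,\,55\%] \;\Longrightarrow\; \lvert e - 42.5\% \rvert \le \tfrac{55\% - 30\%}{2} = 12.5\% $$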
    • by Stephen ( 20676 )

      The article's "Sky is Falling" tone rests on a single factoid. "30 to 55% of users delete cookies" therefore current analytics products are out by "at least 30 percent, maybe more".

      That is of course complete nonsense. [...] I can accept that a significant number of users might delete cookies at some point, but what percentage of [...] users are deleting cookies between page requests to a single site in a single session? If it is 30%, then I will eat my hat.

      The author clearly knows that even the most pr

      • Most conversion processes occur in under 14 days. While 3rd-party permanent cookies are an easy way to track this, there are methods that are as reliable over this time frame.

        For conversions taking longer than 7 days, you are generally looking at products with high 'consideration' (to use marketing speak), such as expensive consumer products or travel. These people do have problems when relying on cookies. Not to downplay their pain, but they hardly make up the majority (in number, not dollars) of online

      1. The Jupiter report stating that 37% of cookies are being deleted has not really been accepted wholly by the web analytics community. See the recent NY Times article that was linked from Slashdot a few weeks back.
      2. The main reason that companies are not willing to try this new paradigm shift UIP technology is that most people in the industry are already doing it.
      3. The paradigm shift is simply using a bundle of already known tricks and throwing them in a big soup. There is nothing amazing here.
      4. The big pro
  • by DroopyStonx ( 683090 ) on Tuesday August 23, 2005 @02:00AM (#13377505)
    I develop web analytic software for a living.

    There's only so much you can do to track users.

    IP address, user agent, some javascript stuff for cookieless tracking... those are the only real "unique" identifiers for any one visitor. It stops there.

    Of course, using exploits in flash doesn't count, but supposedly this new method is "not intrusive."

    I call BS because it simply can't happen.

    If a user doesn't wanna be tracked, they won't be tracked. This story is just press, free advertisement, and hype for this particular company.
  • Why do I have this feeling like this "cutting edge technology" involves the entrails of an animal and some form of divination?
  • Paradigm shift ?!? (Score:5, Insightful)

    by rduke15 ( 721841 ) <(rduke15) (at) (gmail.com)> on Tuesday August 23, 2005 @02:38AM (#13377618)
    When I read "paradigm shift" in the very first paragraph, my bullshit sensor sound such a loud alarm that it's hard to continue reading...
    • by fbg111 ( 529550 )
      And the fact that he actually felt the need to explain what a "paradigm shift" is to his audience - undoubtedly consisting of cynical techies - as if we'd never been (over)exposed to the concept before, quadrupled the BS meter. Honestly, was he born yesterday?

      Oblig Dr. Evil Quote: [about his new "laser"] You see, I've turned the moon into what I like to call a... "Death Star".

  • and are already tired of explaining to customers why the unique visitors differ from their built-in log-file analysis.

    See CheckEffect [checkeffect.at] for details.
  • by Saggi ( 462624 ) on Tuesday August 23, 2005 @02:58AM (#13377659) Homepage
    The article spends a lot of time establishing that this is a paradigm shift, when it's actually not. I do believe their idea is good, but basically it's just applying a lot of "possible" user identifiers and merging them together to form a unified result.

    One identifier they haven't used is linkage on the site. If one page links to another, it might be the same user, if the pages are called in sequence.

    On top of links, "time" might be applied. Some links are expected to be clicked fast, others after some reading on the page.

    Some may argue that linkage is what you want to determine in the following analysis, and therefore can't be used to identify the user in advance, but this is not true. The determination of user uniqueness looks to see if it's possible for the user to get from one page to another, while the analysis wants to determine whether they did.
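
    A minimal sketch of that linkage-plus-timing idea, with the link graph and dwell-time bounds invented purely for illustration:

```python
# Two requests can only belong to the same visit if the site's link
# graph lets you get from one page to the next, and the time between
# requests is plausible for that hop. All data here is invented.
LINKS = {
    "/home": {"/products", "/about"},
    "/products": {"/products/widget", "/home"},
}
DWELL_SECONDS = {("/home", "/products"): (2, 600)}  # (min read time, max)

def plausible_same_visitor(page_a, page_b, seconds_apart):
    """Could one user have navigated from page_a to page_b in this time?"""
    if page_b not in LINKS.get(page_a, set()):
        return False        # no link from A to B: not one navigation step
    lo, hi = DWELL_SECONDS.get((page_a, page_b), (1, 1800))
    return lo <= seconds_apart <= hi

print(plausible_same_visitor("/home", "/products", 30))   # True
print(plausible_same_visitor("/home", "/products", 0.5))  # False: too fast
```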
  • Excuse me, but "proactive" and "paradigm"? Aren't these just buzzwords that dumb people use to sound important?

    I mean, seriously folks-- there is a reason why these things are mocked.

  • by wranlon ( 540319 ) on Tuesday August 23, 2005 @03:01AM (#13377665) Homepage

    ROI is mentioned, along with the 'atoms' of their metrics: page hit count, popular URL count, URL dwell time, and returning visitors. When these metrics are used to produce reports, how valuable are these reports in ascertaining how ROI is affected by said metrics? For example, getting a neat funnel report of the path people take through a site and where the traffic drops off offers insight into popular paths and locations where people bail out, but apart from listening for errors, there is no further insight into why a person bailed.

    What seems to be missing is gathering insightful information into what transpires while someone is on a particular page. I'd like to know the general trends in behavior [whitefrost.com], not just the server requests. I've found it more useful to be able to see the interactions with the content than reporting where people enter, traverse, and exit a site.

  • Cutting edge? ha! (Score:3, Insightful)

    by ZeroExistenZ ( 721849 ) on Tuesday August 23, 2005 @03:14AM (#13377693)

    "If the same cookie is present on multiple visits, it's the same person. We next sort our visits by cookie ID"

    Only after that do they seem to continue the analysis ("We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible", etc.)

    Thus turning off or regularly removing cookies will render their bleeding cutting-edge technology useless? And how are cookies a "breakthrough"? Their only alternative to this seems to be:
    You can also throw Flash Shared Objects (FSO) into the mix. FSOs can't replace cookies, but if someone does support FSO you can use FSOs to record cookie IDs.

    I don't know what the fuss is about

    This is just basic logic, which any decent programmer should be able to come up with, even the M$ certified ones.

  • by RAMMS+EIN ( 578166 ) on Tuesday August 23, 2005 @03:17AM (#13377701) Homepage Journal
    For those who can't be bothered to read through all the buzzwords, here's the actual method used:

    Each of these steps is applied in order:

          1. If the same cookie is present on multiple visits, it's the same person.

          2. We next sort our visits by cookie ID and look at the cookie life spans. Different cookies that overlap in time are different users. In other words, one person can't have two cookies at the same time.

          3. This leaves us with sets of cookie IDs that could belong to the same person because they occur at different times, so we now look at IP addresses.

          4. We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

          5. This leaves us with those IP addresses that can't be eliminated on the basis of geography. We now switch emphasis. Instead of looking for proof of difference, we now look for combinations which indicate it's the same person. These are IP addresses we know to be owned by the same ISP or company.

          6. We can refine this test by going back over the IP address/Cookie combination. We can look at all the IP addresses that a cookie had. Do we see one of those addresses used on a new cookie? Do both cookies have the same User Agent? If we get the same pool of IP addresses showing up on multiple cookies over time, with the same User Agent, this probably indicates the same person.

          7. You can also throw Flash Shared Objects (FSO) into the mix. FSOs can't replace cookies, but if someone does support FSO you can use FSOs to record cookie IDs. This way Flash can report to the system all the cookies a machine has held. In addition to identifying users, you can use this information to understand the cookie behavior of your Flash users and extrapolate to the rest of your visitor population.
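
    Stripped of the rhetoric, steps 1 through 6 amount to clustering (cookie, IP, user agent, time) tuples. A condensed sketch of how they might be wired together; every threshold and the ISP-pool test are invented here, since the article gives no concrete parameters:

```python
from collections import defaultdict

def estimate_unique_people(visits, same_pool):
    """Condensed sketch of steps 1-6. A visit is a tuple
    (cookie_id, ip, user_agent, start, end), times in hours.
    same_pool(ip_a, ip_b) decides whether two addresses sit in one
    ISP/company pool; a real version would use WHOIS/geo data.
    Everything here is illustrative -- the article gives no rules."""
    # Steps 1-2: one candidate person per cookie; cookies whose
    # lifespans overlap in time never merge (one person, one cookie).
    by_cookie = defaultdict(list)
    for v in visits:
        by_cookie[v[0]].append(v)
    span = {c: (min(v[3] for v in vs), max(v[4] for v in vs))
            for c, vs in by_cookie.items()}

    def overlap(c1, c2):
        (s1, e1), (s2, e2) = span[c1], span[c2]
        return s1 <= e2 and s2 <= e1

    def agents(c):
        return {v[2] for v in by_cookie[c]}

    def ips(c):
        return {v[1] for v in by_cookie[c]}

    # Steps 5-6: merge non-overlapping cookies sharing a user agent
    # and an IP pool. (Step 4's travel-speed test is omitted here;
    # see the haversine sketch earlier in the thread.)
    cookies = list(by_cookie)
    parent = {c: c for c in cookies}

    def find(c):  # union-find with path halving
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c

    for i, c1 in enumerate(cookies):
        for c2 in cookies[i + 1:]:
            if overlap(c1, c2):
                continue
            if agents(c1) & agents(c2) and any(
                    same_pool(a, b) for a in ips(c1) for b in ips(c2)):
                parent[find(c1)] = find(c2)

    return len({find(c) for c in cookies})

# Invented data: ck1/ck2 merge (same UA, same pool, no time overlap);
# ck3 overlaps ck1 in time, so it stays a separate "person".
visits = [
    ("ck1", "10.0.0.5", "FF/1.0", 0.0, 1.0),
    ("ck2", "10.0.0.9", "FF/1.0", 5.0, 6.0),
    ("ck3", "10.9.9.9", "IE/6.0", 0.5, 2.0),
]
same_24 = lambda a, b: a.rsplit(".", 1)[0] == b.rsplit(".", 1)[0]
print(estimate_unique_people(visits, same_24))  # -> 2
```

    Note that the step 2 rule is exactly what the GMail-at-home-and-work objection upthread breaks: two simultaneous cookies belonging to one person can never merge, so the heuristic will overcount such users.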
    • 4. We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

      I don't know. Switching proxies from one in New York to one in Tokyo certainly takes me less than 60 minutes...

      But I guess no method can be perfect.
  • by Sinner ( 3398 ) on Tuesday August 23, 2005 @03:18AM (#13377702)
    About 20% of my time on my last job was spent doing web analysis. It drove me insane.

    The problem is with the word "accurate". To management, "accurate statistics" means knowing exactly how many conscious human beings looked at the site during a given period. However, the computer cannot measure this. What it can measure, accurately, is the number of HTML requests during a given period.

    You can use the latter number to estimate the former number. But because this estimate is affected by a multitude of factors like spiders, proxies, bugs, etc., management will say "these stats are clearly not accurate!". You can try to filter out the various "undesirable" requests, but the results you'll get will vary chaotically with the filters you use. The closer you get to "accurate" stats from the point of view of management, the further you'll be from "accurate" stats from a technical point of view.

    Makers of web analysis software and services address these problems by the simple technique of "lying". In fact, a whole industry has built up based on the shared delusion that we can accurately measure distinct users.

    Which is where this article comes in. The author has discovered the shocking, shocking fact that the standard means of measuring distinct users are total bollocks. He's discovered that another technique produces dramatically different results. He's shocked, shocked, appalled in fact, that the makers of web analysis software are not interested in this new, highly computationally-intensive technique that spits out lower numbers.

    My advice? Instead of doing costly probability analysis on your log files, just multiply your existing user counts by 0.7. The results will be just as meaningful and you can go home earlier.
    • 10+5/2

      Seriously, what's REALLY important is not the current statistic total for NOW, or TODAY,

      it's.... yes... TRENDS!!!

      that PAGE X is increasing by 6% weekly.

      or that page y is dropping in interest.

      It's just like TV ratings: everyone knows it's all CRAP and nonsense, except the DELTAS, the changes

      if TV show X is going up 30% a week, you know it's HOT.

      Think of it like quantum physics: you don't really know the location of the electron, just its DIRECTION. TIME CANNOT STAND STILL.

      DIRECTION of MOTION is what you want whi
      • You seem to assume that they want to improve their site. In which case, yes, anonymous trends and anonymous user movement grouped by session id suffice. (E.g., to see if users give up and leave your site half-way through the marketing bullshit pages, before even reaching the product pages.)

        But that's not the problem.

        Whenever you see someone going on about how the _need_ to track and identify each user, and they _need_ accurate numbers and even personal details... that's your clue that it's purely an ad mo
    • Otherwise your production app would have used some kind of persistence and your job would have been a lot nicer. That's what I would have explained to them anyway. "Stupid" as an excuse can only really go so far, and even managers in such technical fields should be able to get that.
  • ``The author claims that he is describing 'new, cutting edge methodologies for identifying people, methodologies that -- at this point -- no web analytics product supports.''

    And when you read down to how these "new, cutting edge methodologies" actually work, it comes down to: plant cookies, if that doesn't tell you what you need to know, look at the IP address. Then take into account that different cookies and different IP addresses can still be the same user, if they occur at different times.

    It's clever, b
  • Why? (Score:3, Insightful)

    by RAMMS+EIN ( 578166 ) on Tuesday August 23, 2005 @03:32AM (#13377736) Homepage Journal
    Somebody please explain to me: why would you go to all this trouble to get a close estimate of how many unique visitors your site draws?

    I'm personally always more interested in how many pages get requested, and which ones. The first gives me an impression of how popular the site is*, the second tells me which pages people particularly like, so I can add more like that.

    The only reason I see for really wanting to track people is if your site is actually an app that has state. In those cases, you have to use a more bullet-proof system than the one presented in TFA.

    * Some people object that it counts many people who visit once, then never again; but I consider it a success that they got there in the first place - they were probably referred by someone, followed a link that someone made, or the page ranks high in a search engine.
    • Somebody please explain to me: why would you go to all this trouble to get a close estimate of how many unique visitors your site draws?

      Tracking the success of an advertising campaign. Ad buyers want to get their message out to as many eyeballs as often as possible for their target demographic. Picking sites in their demographic is easy and they know how to do that. Picking a site that drives enough unique traffic is much more difficult.

      I would expect, though I don

  • One single method that would reliably allow a site to track its users would be that each user needs to log in, and then needs the "session cookie" on each page they visit; if they delete it, hard luck, log in again. This method is just a step away from another one: make the pages password-protected and give the password to nobody. Users tracked: 0. Pages visited: 0. Tracking reliability: 100%.
  • While web usage stats may indeed be inaccurate, they are inaccurate across the board. This means everything that relies on them carries the same inaccuracy... which in turn makes them accurate in the marketplace.

    For instance, considering everything else to be equal, an ad buyer wanting to pay $1 for one thousand unique eyeballs won't care whether it's spent at site A or site B, as long as they are using equivalent methods to measure traffic.

    Another example. Say Google puts out a press release saying they have X
  • If I request page A, then request page B, and then go back to page A and grab it with a conditional request (and the server returns 304 Not Modified), wouldn't this obviously indicate I had been to page A fairly recently? (assuming you have set cache headers so as to only allow private, non-shared ISP proxies to store them)

    What about people following a link with a referer from page A to page C when they haven't (according to your logs) been on page A? Doesn't this likely indicate page A has been cache
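
    Both of those cache-based signals are easy to check on the log side. A sketch with invented log fields; real log parsing and the bookkeeping of what was served to each client are left out:

```python
# A 304 (Not Modified) response means the client revalidated a copy it
# already had cached, i.e. it has been here before. A Referer naming a
# page we never served this client suggests that page came out of a
# cache. The field names and helper below are invented for illustration.
def caching_evidence(entry, pages_served_to_client):
    if entry["status"] == 304:
        return "return visitor: revalidated a cached copy"
    referer = entry.get("referer")
    if referer and referer not in pages_served_to_client:
        return "likely return visitor: referring page served from a cache"
    return "no caching evidence"

print(caching_evidence({"status": 304}, set()))
print(caching_evidence({"status": 200, "referer": "/pageA"}, {"/pageB"}))
```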
  • Magdalena and Thomas have run some preliminary tests on three large sites that indicate the number of unique visitors is really around half what existing metrics tell us. Both they and I are anxious to run more detailed tests to validate this methodology.

    So how do you determine if your methodology is accurate? The fact that preliminary tests give you different answers than traditional methods doesn't really tell us anything. It just informs us that two different methods present two different results.

  • I was debunking the poor logic, inappropriate assumptions and overall lack of fundamental understanding held by these researchers. After debunking the first four points, I changed gears. I'm tired of all these marketing bullshit artists trying to track my every page view and metric on what I do at their site. I'm especially tired of having to manage cookies and delete them on a regular basis. Sure each site only sends 1-10 cookies of a few bytes each, but that starts to add up when you don't stick to your
  • Many times, what is important isn't the existence of a piece of information, but the lack of it. If a particular object is referenced and that object is flagged as cacheable at the browser (cache-control: private), and the reference wasn't an If-Modified-Since request, then you could consider it a new visit. If, however, a user references the page the object is embedded in, but the object itself isn't referenced, then it is cached, and could be considered a return visit. This would
  • If you want to measure the success of your web site, look at the net income it generates. If you want to identify problem areas, use the available data intelligently, with full understanding of its limitations, and perform a well reasoned statistical analysis of that data.

    The only thing gained by uniquely identifying users outside of financial transactions is the opportunity to violate their privacy.

    I defy the "new" methodology to uniquely identify me on jrandomwebsite.com -- I block cookies until I know t

"Imitation is the sincerest form of television." -- The New Mighty Mouse

Working...