Google Starts Testing Its Replacement for Third-Party Cookies for Chrome (engadget.com) 72

"Google has taken one step closer to banishing third-party cookies from Chrome," reports Engadget. The internet giant has started testing its trust tokens with developers, with promises that more would move to live tests "soon." As before, the company hoped to phase out third-party cookies in Chrome once it could meet the needs of both users and advertisers.

Trust tokens are meant to foster user trust across sites without relying on persistent identifying data like third-party cookies. They theoretically prevent bot-based ad fraud without tying data to individuals. This would be one framework as part of a larger Privacy Sandbox including multiple open standards.

The company still hopes to eliminate third-party cookies by 2022.

Comments Filter:
  • Trust? (Score:2, Insightful)

    by Anonymous Coward

    Somehow I doubt Alphabet is going to give up dipping its hand into the privacy cookie jar; after all, they are an advertising company.

    • Re:Trust? (Score:4, Insightful)

      by Joce640k ( 829181 ) on Sunday August 02, 2020 @03:23AM (#60356947) Homepage

      Yep, I think they're much more interested in the "prevent bot-based ad fraud" part of this.

      • by Anonymous Coward
        Google said:

        once it could meet the needs of both users and advertisers

        The needs of (sane) users and the "needs" of advertisers conflict. They just want a monopoly on spyware.

        • s/on spyware//

          They just want a monopoly, or at least its little brother, lock-in.
          Like any for-profit business, driven by the mad-laughing-in-a-base-under-a-volcano insane requirement of infinite exponential profit growth. Requiring either bully tactics and abuse of power, criminal behavior, or self-writing/buying regulation, to destroy the competition, in the long run.
          Something fundamentally incompatible with a free market.

          Which is precisely why there is no such thing as self-regulating / self-balancing by

      • Yes, Google wants to reduce click fraud because it undermines their business. Google also wants to be able to serve us ads for an ESP32 or whatever we're most likely to buy. That's what Google wants. (Check out the ESP32 if you haven't).

        Pesky little folks like the US Senate are poised to make it harder for Google to do what they need to do unless they can do it in a way that better protects privacy. Ad blockers are already causing them problems. Third-party cookies are not a reliable feature that Google ca

        • I recently based a hobby project on an ESP32. I recently had to get a pacemaker, so I made a personal EKG Holter monitor that logs to SD with real-time display over wifi. SD was straightforward. Wifi was straightforward. Creating tasks was as straightforward as FreeRTOS makes it. A lot of example code is exceptionally crufty and would be confusing to someone without a lot of experience. The internal ADC is a steaming pile. In software-triggered mode, it intermittently acts as if the external pin isn't elec
          • > probably spent 30 hours battling that ADC.

            Sounds like what I would do, before spending 10 minutes hooking up an external ADC. :)

            • I actually tried that, but I had already finished the compact 3D printed enclosure, and there wasn't much room left. I had a digispark on hand and was going to deadbug the 8 pin AVR from it in the available space, but midway through debugging the firmware I blew it up. So that's when I gave I2S a try and got good enough results.
              • Whenever I hear about someone who actually knows what they are doing dead bugging a component like I do, that makes me feel better about myself.

    • Re:Trust? (Score:4, Interesting)

      by arglebargle_xiv ( 2212710 ) on Sunday August 02, 2020 @04:03AM (#60356987)
      Oh great, instead of easily blockable and manageable third-party cookies we're now going to get Google-controlled evercookies tracking everything we ever do.
      • by AmiMoJo ( 196126 )

        This isn't a cookie or some way of identifying the user. In fact it's designed to be the opposite of that. It's a variation of, or based on, Privacy Pass, which you can read about here: https://medium.com/@alxdavids/... [medium.com]

        Basically, when you solve a captcha, instead of it being a one-time thing it generates a cryptographic token. Then the browser hands out those tokens when requested as proof-of-work so that you don't see captchas again. The tokens are anonymous and can't be traced back to the initial w

        • by Z00L00K ( 682162 )

          It serves the same purpose as the third party cookies but can be a lot worse for privacy than any third party cookie is.

          When something from a major company says "trust" it means that it's quite the opposite unless it can be mathematically proven.

          There is actually no reason, for me as a user, that there should be any cross-site tracking whatsoever.

          • by AmiMoJo ( 196126 )

            In what way does it serve the same purpose as a third party cookie? It doesn't store arbitrary information or allow identifiers to be passed, it's just a proof-of-work token. It specifically does not allow for tracking, it's actually designed to prevent that.

            It's an open standard, you can participate in the ratification process with the W3C. If you have specific criticisms or issues then you should bring them up because apparently everyone else missed those insights.

            • by Alumoi ( 1321661 )

              It specifically does not allow for tracking, it's actually designed to prevent that.

              Yeah, right.
              User u from IP i using browser b solved captcha c. Now let's tell the world + dog about it.
              No way we can track this particular user and build a profile about him. No siree!

              • > User u from IP i using browser b solved captcha c. Now let's tell the world + dog about it.

                That's what third-party cookies do, so as you said there is a privacy concern with cookies.

                With this zero-knowledge proof system, the site can tell only that the current visitor once solved SOME captcha from Google; it can't tell which captcha. The Google ad network openly publishes a public key. The browser shows that it solved a math problem using that published key.
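                If it helps, here's a toy sketch of that issue-then-redeem flow in Python. It uses textbook RSA blind signatures with tiny numbers rather than the elliptic-curve math the actual Trust Token proposal uses, so treat it purely as an illustration of the unlinkability idea, not the real protocol:

```python
# Toy RSA blind signature: the issuer signs a token without ever seeing it,
# so it can't later link issuance to redemption. Illustrative only --
# the real Trust Token design uses elliptic-curve crypto, not tiny RSA.
p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # issuer's private exponent

m = 1234   # the client's token preimage (stays secret from the issuer)
r = 7      # random blinding factor, coprime with n

blinded = (m * pow(r, e, n)) % n       # all the issuer ever sees
blind_sig = pow(blinded, d, n)         # issuer signs the blinded value
sig = (blind_sig * pow(r, -1, n)) % n  # client strips the blinding factor

# The unblinded signature verifies against the original token...
assert pow(sig, e, n) == m % n
# ...even though the issuer only ever saw `blinded`, not `m` or `sig`.
```

                At redemption time a site can check the signature against the published public key, proving the token came from the key holder, while the issuer has no way to say which issuance it corresponds to.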

              • by AmiMoJo ( 196126 )

                They can't tell which captcha you solved, the tokens are designed to be untraceable. There is mathematical proof of that.

            • In what way does it serve the same purpose as a third party cookie? It doesn't store arbitrary information or allow identifiers to be passed, it's just a proof-of-work token. It specifically does not allow for tracking, it's actually designed to prevent that.

              I was trying to find out more information on how it will be used, but still have one question about how it will be implemented. Once a site receives the token and you log in, they can tie the token to you. This would make sense because they would know it is you and be able to tailor information to you, including what ads to serve up. Separate from the ad issue, if the token is unique then there is value in sharing or selling the information to others. Once you know who is associated with the token, you can

              • >"In the end it's still about identifying a user to serve ads."

                Yep, and someone will find ways to abuse it. Certainly it will be used as part of browser fingerprinting. And use will spread beyond just ads.

                And [I assume] this scheme is proprietary, so what happens when other browsers don't follow this non-official "standard" and sites block you for not having it? Welcome to people allowing Google to control the web by using Chrome.

                And how can you trust a binary blob browser to do what they say it will d

              • Re:Trust? (Score:5, Insightful)

                by infolation ( 840436 ) on Sunday August 02, 2020 @07:44AM (#60357261)

                What am I missing?

                What you're missing is any useful information in the summary or linked article about what 'trust tokens' actually are, or how they work from a privacy preserving/destroying perspective in the context of Google's 'privacy sandbox'.

                Fortunately the EFF have written a pretty good summary [eff.org] of trust tokens.

                I'd advise you to read that link, but the summary is:

                Good:
                fewer CAPTCHAs, fighting fingerprinting

                Good:
                "privacy budget" for fingerprinting

                Bad:
                Conversion Measurement API.
                Apple's version stores 6 bits (values 1-64);
                Google's version stores 64 bits (1 to 18 quintillion).
                = Profiling tool.

                Very bad:
                Federated Learning of Cohorts. FLoC has the browser build its own local machine-learning model of your behavior, sharing little bits of information at a time. A 'behavioral credit score'. Incomprehensible to users, reveals incredibly sensitive information to third parties. Trackers will use it to build profiles.

                • What am I missing?

                  What you're missing is any useful information in the summary or linked article about what 'trust tokens' actually are, or how they work from a privacy preserving/destroying perspective in the context of Google's 'privacy sandbox'. Fortunately the EFF have written a pretty good summary [eff.org] of trust tokens.

                  Thanks. The EFF explanation was very informative. Seems like old wine in a new bottle...

                • by AmiMoJo ( 196126 )

                  I would say the Conversion Measurement API is a good thing. At the moment sites use Javascript to do it. Having a single API makes it easier to block both the Conversion Measurement API and any Javascript trying to access 3rd party sites.

                  Also note that clearing site data also stops the conversion being recorded (https://github.com/WICG/conversion-measurement-api#clearing-site-data)

                  I agree Google's maximum 64 bits is too high. However Apple's isn't much better because it's 6 bits per ad server, so you can ju

                  • so you can just have loads of subdomains to increase the number of bits.

                    Well spotted, hadn't thought of that. Feels like eTags [lucb1e.com] all over again.

                  Apple's version stores 6 bits (1-64)
                  Google's version stores 64 bits of information (1 to 18 quintillion).
                  = Profiling tool.

                  Or maybe Google doesn't subscribe to "640K should be enough for everyone" engineering. After all, you can profile every computer on the planet with less than 64 bits. FAR less.

              • by raymorris ( 2726007 ) on Sunday August 02, 2020 @08:01AM (#60357295) Journal

                > What am I missing? Will the token be unique each time it is used but still trusted?

                Yep, that's what you were missing - the returned value is different every time, so the site has no way of knowing if it's the same person from yesterday or not.

                The underlying cryptography is called a zero-knowledge proof.

                One part of the math gets complicated because it's based on elliptic curves, but we can describe and understand the general idea without delving into the hard math part. Elliptic curves are a way to set up a math problem that's arbitrarily hard to figure out, but easy to check, and vice versa. For now, let's just call the solutions "odd numbers".

                The ad network (Google) openly publishes their public key, which is a big number. The client takes a second to compute an "odd number" that is a multiple of the Google key. That's the proof of work. The client keeps that "odd number" secret. The site wishing to check that the captcha has been done then sends over a large arbitrary number, such as "738384641...739462". The client's task is to multiply this challenge number by their secret odd number and send back the last two digits.

                Based on the last two digits, the server can tell if you successfully used an odd number in the multiplication, but can't tell which odd number you used. Each time, the last digits (what the client sends) are different; there is no way to tell what the client's secret number is, but we can tell that they do have a secret number.

                It's more complicated than that, of course, and the client actually chooses two random numbers that go into the calculation, but that's the general idea. Just as if I tell you this:
                ( X * 63826281 ) % 10 = 3
                You can tell X is odd, and that's all you can tell; you can never have any idea what X is, other than that it must be an odd number. Similarly, with zero-knowledge proofs, the receiver can't tell what the secret is; it can only know that the sender does have a secret. Just with much harder math. :)

                Elliptic curves are also the state of the art in TLS (SSL) cryptography, so breaking ECC would require the person to be able to read all TLS traffic.
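                The "( X * 63826281 ) % 10 = 3" example can be played with in a few lines of Python. This is purely a toy for the parity intuition; real zero-knowledge proofs replace "last-digit parity" with elliptic-curve math:

```python
# Toy "zero-knowledge" parity check: the verifier learns only that the
# client's secret X is odd -- never the value of X itself.
CHALLENGE = 63826281  # public challenge number (odd, so it preserves parity)

def respond(secret_x: int) -> int:
    """Client multiplies its secret by the challenge, reveals only the last digit."""
    return (secret_x * CHALLENGE) % 10

def verify(last_digit: int) -> bool:
    """Server accepts iff the revealed digit is odd, i.e. the secret was odd."""
    return last_digit % 2 == 1

# Every odd secret passes, while the single revealed digit can't
# identify which secret (or which client) produced it.
for secret in (3, 17, 991, 123457):
    assert verify(respond(secret))

assert not verify(respond(42))  # an even secret fails the parity check
```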

              • by AmiMoJo ( 196126 )

                Once a site receives the token and you log in, they can tie the token to you.

                They solved that too. The token is sent at a random time a day or two later. Because the timing is random, unless they have an extremely low number of click-throughs they won't be able to tie it to a specific user.

                Will the token be unique each time it is used but still trusted?

                Yes, the tokens can be unique. The trust comes from examining the proof of work.

              • by MassacrE ( 763 )

                Once a site receives the token and you log in, they can tie the token to you. This would make sense because they would know it is you and be able to tailor information to you, including what ads to serve up. Separate from the ad issue, if the token is unique then there is value in sharing or selling the information to others. Once you know who is associated with the token, you can track.

                These tokens are specifically designed so that they can't be correlated from when they were issued to when they were used. That you asked for tokens, or that you redeemed tokens, is something someone could associate with your profile - if they were able to create a profile. Once you eliminate third party cookies, you would have to do this with heuristics (e.g. the browsers with this user agent at this IP range, the traffic coming with this TLS session, etc).

                But the point here isn't to build a system that re

              • by Z00L00K ( 682162 )

                Through the Privacy Sandbox, Google aims to let advertisers still display personalized ads without users giving up too much of their personal data or browser history.

                The personalized ads part tells me that this is a new way of tracking that might be a lot harder to avoid. I think that the true reason is that it's introduced so that it's going to be harder to avoid being tracked.

            • Proof of what work, though? You see, if that can be used to prove you did something - like a captcha, or looked at an advert - then that's sufficient to track you and invade your privacy.

              Sure, it doesn't need to store arbitrary data, it is arbitrary by itself. You just need one per advert (or campaign, or advertiser).

              • > proof of what work though

                Proof that you did a math problem (based on elliptic curves) using Google's public key.

                > if that can be used ot prove you did something - like a captcha, or looked at an advert - then that's sufficient to track you

                That's what makes zero-knowledge proofs so interesting, and to me, just plain cool. They prove you know the answer, without revealing the answer. Here, they show your computer did the work to solve the math problem, without revealing the answer you got.

                If that seem

              • by AmiMoJo ( 196126 )

                Proof that *somebody* did *a* captcha at *some point*. Pretty useless since that happens millions of times a day on different sites.

                Anyway they won't use a captcha in this case, that was just an example of how a similar system works. The browser will do the work internally without user intervention.

            • Thanks for sharing this information, AmiMoJo.

              You're pretty smart for a fuckwit. :)

          • > it means that it's quite the opposite unless it can be mathematically proven.

            Hey, that's a great idea. Such a good idea that it's exactly what the W3C did here.

            The mathematical proof in this instance is that if you can figure out a user's secret token, you can break elliptic curve cryptography, which means you can read most of the TLS (SSL) traffic on the internet.

            For more information, Google "elliptic curve zero-knowledge proof".

        • This isn't a cookie or some way of identifying the user. In fact it's designed to be the opposite of that. It's some kind of variation of or based on Privacy Pass

          They can call it what they want, but it's still a form of a cookie.

          Basically when you solve a captcha instead of it being a one-time thing it generates a cryptographic token

          How about we stop the stupidity of using captchas? 2FA is harassment enough, captchas are just pawning off the work to someone else.

          but for advertising fraud the obvious use-ca

          • by AmiMoJo ( 196126 )

            It's not a cookie.

            Privacy Pass was designed for sites that need a captcha to stop spam, if you have a better method then let's hear it.

            If you use an ad blocker then you won't ever click an ad, so this doesn't affect you at all anyway.

            • No, not to 'stop spam.' To identify all visitors and block non-authenticated contributions. Spam is a bad thing that people get in email. Not user input that is inconvenient to a website operator.

              Nice try at making the job of identifying visitors to a website easier, though.

        • by MrL0G1C ( 867445 )

          The tokens are anonymous and can't be traced back to the initial work you did to create them.

          What's to stop Google or some other 3rd party from taking note of who did the captcha, and then tying that info to the unique captcha 'token'?

          Who can access the token?

          IMO so long as advertisers and ad firms get to run code in the browser then they are not to be trusted.

        • It will be the advertisers that pay the piper next. I guarantee a trust pass will be both portable and forgeable, as the incentive for tricking advertisers is just too great. Truthfully, things like Privacy Badger and NoScript, and to a small extent incognito (if not in the Google alliance), are driving the people flogging stuff you don't need - nuts. Hopefully the new politicians will soon work out these trust certificates will rob their ability to push their spiel out, or garner support/donations. Soon their cam
      • by Zocalo ( 252965 )
        I'm pretty sure that the usual privacy related plugins will figure out and include a method to block, or otherwise manage, your trust tokens really soon. Also, since this is almost certainly 100% driven by ads, it's possibly going to make life much easier as you'll just be able to block them by default, rather than having to selectively whitelist as some sites require you do with non-ad/tracking related "third party" cookies across their various domains.
    • Oh they are, absolutely they are. There will be zero 3rd-party cookies in Chrome soon, "for your privacy and safety".

      Google will then replace them with a different mechanism that does exactly the same thing, but one that cannot be blocked by all the ad and cookie blockers out there.

    • Maybe there's a zero knowledge way to do this instead of putting google in the middle.

  • which is impossible.
    Good for Users = Bad for Advertisers
    and
    Bad for Users = Good for Advertisers
    The relationship is 180 degrees out of phase.
    • I disagree.

      No advertisers = paid websites = bad for users = no/low users = no advertisers

      A middle ground of completely anonymous "trust cookies" that provides the advertisers the info they need (not want) and lets users continue browsing a "free" internet is both good for users and advertisers.

      It is the extremes of this argument that are bad for either or both.

      • Why are paid websites bad for users? People pay for many things. I would argue that the proliferation of ad-based content has had a negative impact.

        With the current advertising model used by Google and Facebook, we are paying with our privacy. TV advertising worked without the ability to deliver ads targeted at the individual level.

        • It used to work better when ads were served by the website industry - eg I'd go to a hifi website, and get ads for hifi stuff.

          Today I get ads for whatever Google thinks they can sell to the advertisers for more money because they're "relevant", and usually that means stuff I have already bought.

          I guess it might work better for Facebook and co that have no discernible target to work with, and perhaps that's why they took off so much, though I think the ability to charge a lot more for "targeted" ads is prob

        • Why are paid websites bad for users?

          Because most paid websites want you to stump up the funbux with real money, rather than some anonymized cryptocurrency micropayment equivalent.

          As soon as the financial transaction is linked to a real human identity (with all the browser-based antifraud measures that entails) an even more perfect method of identifying and tracking users is in place.

            That could be an outcome, but we have the tracking anyway. My thesis is that the privacy-hostile advertising approach is due to the lack of paid web content.

            The balance of power has tipped so much from the consumer to the advertising-based companies (e.g. Google and Facebook) and information brokers, that government regulation is the only solution--and it pains me to say that as a business owner.

      • by radja ( 58949 )

        Advertisers do not need any user info. They just want it to target ads. Targeting ads is a want, not a need.

      • it's the targeted advertising that is so bad because it allows companies (and worse, politicians) such an insanely fine degree of control that they can do things like target ads at extremists meant to appeal to them

        like why have doomsday preppers gotten so big? because an industry sprung up around selling them products for doomsday prepping and that then becomes a self-reinforcing system where the true believers feel like their beliefs are "proven" because they are constantly seeing affirmation of those bel

  • by Anonymous Coward

    More work for the various ad blockers, script blockers, and other filters that have become necessary to get anything reasonable out of the web.

  • Comment removed based on user account deletion
  • "...The company still hopes to eliminate third-party cookies by 2022."

    Google is so large in advertising it answers to no one.

    And I expected to find this confirmation of feel-good-go-nowhere marketing bullshit.

    When it comes to Google and privacy or trust, Nothing will Change. That's not just a campaign slogan.

  • And I'll go ahead and block it.

    "If anyone here is in advertising or marketing, kill yourself" -- Bill Hicks
  • I don't ever recall clicking on an ad that was served to me online. The percentage that generate legitimate clicks that lead to purchases must be infinitesimally small. I guess it's still enough to make it worth their while to pester the other 99.99% of us, though.
    • I don't ever recall clicking on an ad that was served to me online. The percentage that generate legitimate clicks that lead to purchases must be infinitesimally small. I guess it's still enough to make it worth their while to pester the other 99.99% of us, though.

      Online advertising in the 21st Century likely has far more to do with spying/tracking than selling.

      I strongly doubt it's justified by actual product revenue driven by clicks, and the advertising whores will have to prove it for me to believe otherwise.

      And yeah, it's going to get worse.

  • These must be double-plus good. And we sheep will believe that they are any better than third party cookies.

    Let me guess, these will require a signed certificate from an advertising agency (or two or three or four), and they 'promise' not to link your identity to them.
  • I banished third-party cookies in 1995. What is taking them so long?
