Google's Selfish Ledger is an Unsettling Vision of Silicon Valley Social Engineering (theverge.com) 254

An anonymous reader shares a report: Google has built a multibillion-dollar business out of knowing everything about its users. Now, a video produced within Google and obtained by The Verge offers a stunningly ambitious and unsettling look at how some at the company envision using that information in the future. The video was made in late 2016 by Nick Foster, the head of design at X (formerly Google X), and shared internally within Google. It imagines a future of total data collection, where Google helps nudge users into alignment with their goals, custom-prints personalized devices to collect more data, and even guides the behavior of entire populations to solve global problems like poverty and disease.
This discussion has been archived. No new comments can be posted.

  • Fermi's paradox (Score:3, Insightful)

    by olsmeister ( 1488789 ) on Thursday May 17, 2018 @10:02AM (#56626528)
    I believe we've found the answer.
  • by LordHighExecutioner ( 4245243 ) on Thursday May 17, 2018 @10:07AM (#56626566)
    ...why they do not use it to drive Google's development itself ?!?
    • by Barny ( 103770 ) on Thursday May 17, 2018 @10:55AM (#56626942) Journal

      Probably because this was a thought experiment. This was a video that the source reports was released internally with the intention of showing unsettling things they do not plan on doing.

      Slashdot just loves them some controversial headlines and stories, so they conveniently left that out with their blurb.

      Another non-story.

      • by jenningsthecat ( 1525947 ) on Thursday May 17, 2018 @11:38AM (#56627264)

        Probably because this was a thought experiment.

        That's irrelevant. The idea has been conceived and disseminated. The initial dissemination was among people with the power and the resources to make it a real-world experiment. Do you really think Google doesn't have the arrogance, the hubris, and the power-lust to start implementing this?

        This was a video that the source reports was released internally with the intention of showing unsettling things they do not plan on doing.

        They may "not plan on doing", but do they "plan on not doing"? Besides, to hear Google tell it, they planned to not be evil - and look at them now.

        Another non-story.

        Google has a history of at least trying out the wild shit their people dream up. And I'm pretty sure the insularity of Silly Valley's denizens renders many of them immune to the consideration that using the rest of us as lab rats is in any way immoral or inappropriate. Even at that, this would be a non-story only if Google weren't already fully capable of rolling out such a scheme in a short time frame.

        • by hawguy ( 1600213 ) on Thursday May 17, 2018 @01:35PM (#56628030)

          Probably because this was a thought experiment.

          That's irrelevant. The idea has been conceived and disseminated. The initial dissemination was among people with the power and the resources to make it a real-world experiment. Do you really think Google doesn't have the arrogance, the hubris, and the power-lust to start implementing this?

          They have pretty much all the data they need to do this for some people. They have your search history, your email history, your SMS history, your phone calls and voicemails (Google Voice), your detailed location history, your purchase history (Google Wallet), every photo you've taken in the past N years, all of your files in Google Drive, and more.

          They know more about you than Facebook does.

        • by q_e_t ( 5104099 )
          The USA maintains contingency plans for invading Canada. It doesn't mean it intends to do so. Sometimes people just think of things and decide they are bad ideas.
      • Probably because this was a thought experiment. This was a video that the source reports was released internally with the intention of showing unsettling things they do not plan on doing.

        And your confidence in this arises from Google pinky-swearing to that effect after the video was leaked?

    • because of Seeber's Social Placebo Uncertainty Paradox: Letting the people know you are collecting data on them changes their response to the data.

    • Comment removed based on user account deletion
  • 1984 (Score:5, Insightful)

    by Train0987 ( 1059246 ) on Thursday May 17, 2018 @10:08AM (#56626568)

    George Orwell was a visionary.

    • Comment removed (Score:5, Interesting)

      by account_deleted ( 4530225 ) on Thursday May 17, 2018 @10:28AM (#56626714)
      Comment removed based on user account deletion
    • He applied linear thinking and took it to the extreme to make a good story. But societies go through cycles. The tide turned in 2016 -- perhaps as a consequence of the 2008 crash -- when people rejected, democratically speaking, the vision that had been offered to them, of which this is a part. Try as he might have, Schmidt couldn't help Hillary win.

      • by lgw ( 121541 )

        While that's an optimistic thought, 2016 was not a rejection of totalitarianism or post modernism, merely a rejection of the most corrupt presidential candidate in a century. There's little evidence thus far that people are rejecting identity politics, which is the lever by which modern totalitarians move themselves into power.

        Orwell, an ardent socialist, saw well the dangers socialism presented for descent into totalitarianism.

        • I imagine that the fact that such a candidate was even offered as a choice in the first place was a direct reflection of the state we were in -- totalitarian-leaning people thought they could get away with it. If so, then Hillary, endorsed by the likes of Schmidt, was part of the package that was rejected, which maybe includes identity politics as well. That's at least the rationalization for my optimism.

          Also there was Brexit...

    • Re:1984 (Score:5, Interesting)

      by RobinH ( 124750 ) on Thursday May 17, 2018 @01:28PM (#56627984) Homepage
      1984 was about absolute and total control through fear, whilst Brave New World was all about social engineering. In 1984 there's also some control over what people think, but Brave New World is much closer.
  • by Anonymous Coward

    Google and Youtube already "nudge users into alignment with their goals" by manipulating search results, pushing sites/producers with opinions they prefer and hiding those they disagree with.

    I suppose 2018 is the future they were thinking about in 2016.

    • Google and Youtube already "nudge users into alignment with their goals"

      In that context, *their goals* = Google's goals.
      You know the "feelgood" piece at the end about solving poverty, disease and world peace.
      So Google can justify what they're doing by deluding themselves it will lead to "greater good".

      by manipulating search results, pushing sites/producers with opinions they prefer and hiding those they disagree with.

      In that context, *they prefer* and *they disagree* refer to the users:
      - the algorithms are optimized for a single target: bringing more clicks in (because that's what makes more money, by providing more eyeballs to sell to advertisers)
      - and the machine "learning/AI/NN/whatever is p

  • by argStyopa ( 232550 ) on Thursday May 17, 2018 @10:12AM (#56626604) Journal

    Start with "don't be evil"
    end up with a terrifying Big Brother-y quasi police state* 'managing' everyone's behavior "for the public good, of course, Mr. Smith"

    *you might say that Google is merely gathering data and at most 'nudging' behavior. I'd say that when Google can concatenate & save forever EVERYTHING YOU DO to a degree that would make FB and Cambridge Analytica (you know, the guys being publicly lynched for doing exactly this?) blush, and use that data against you in ways ranging from subtle to blatant including simply handing your data over to authorities, then yeah, I'm going to call that a quasi-police state whose 'public/private' partnership borders on Fascism.

    • One of the things that struck me most about all the panic around Cambridge Analytica is that what they did wasn't all that different from what much bigger companies like Google, Facebook and Twitter have been doing for well over a decade and making quite a lot of money on. The only significant differences I can think of are that they were much smaller, didn't actually collect the data themselves, and analyzed the data just for external clients rather than for their own gain.

      Particularly Facebook behaving the way
      • by anegg ( 1390659 ) on Thursday May 17, 2018 @10:52AM (#56626912)

        The only significant differences I can think of are that they were much smaller, didn't actually collect the data themselves, and analyzed the data just for external clients rather than for their own gain.

        You forgot about the part where the data was used to the possible benefit of conservative politicians instead of for liberal objectives. I'm not sure that that wasn't what goaded some folks into being really upset.

        • I'm pretty sure Facebook in particular, which is known to censor all kinds of things for various autocratic governments, has provided their services to conservatives and is just more hush-hush about it than Cambridge Analytica was.
      • I'm not sure you understand the issue then. It wasn't CA that people had a problem with, it was Facebook selling user data wholesale to external parties. This is vastly different than Google's business model and vastly more of a privacy problem. Facebook should have done the analysis for CA internally and only sold them the anonymized results.
    • I'd say that when Google can concatenate & save forever EVERYTHING YOU DO to a degree that would make FB and Cambridge Analytica (you know, the guys being publicly lynched for doing exactly this?) blush, and use that data against you in ways ranging from subtle to blatant including simply handing your data over to authorities, then yeah, I'm going to call that a quasi-police state whose 'public/private' partnership borders on Fascism.

      I'm actually concerned that people's data will be used against them to derive more than just a social score like China's. Imagine if the Nazis had had access to the religion, ethnicity, and political leanings of everyone within their borders, tracked in real time - the damage that could be wrought would be far, far greater. There is the financial havoc you could wreak as well, given the ability to effectively use this data. There needs to be more oversight and counterbalance to this because there is no putt

      • Imagine if the Nazis had had access to the religion, ethnicity, and political leanings of everyone within their borders, tracked in real time - the damage that could be wrought would be far, far greater.

        Don't worry; social justice warriors would have taken to the streets demanding safe spaces for gypsies and homosexuals.

        The Jews would have still been fucked though.

      • Why not? All it would take is some legislation.
        • Why not? All it would take is some legislation.

          No, it wouldn't. First off, the NSA (along with other US and foreign agencies) isn't going to listen; they were doing this before the Patriot Act. But more importantly, companies will continue to do this around the world, outside European or American influence. Data collection, storage, and processing is only getting easier thanks to Moore's law.

    • by alexo ( 9335 ) on Thursday May 17, 2018 @11:04AM (#56627014) Journal

      If it's objectionable when the Russians do it, it should be equally objectionable when Google does it.

  • What is the maximizing stockholder wealth justification for "solving global problems like poverty and disease"?

    As much as tech companies may want to be a new religion, they aren't. The incentive models of capitalism support a limited scope of activity effectively.

    At least Bill Gates is honest in this regard--charitable focus requires an organizational structure supporting that. Corporate virtue signaling and technological handwaving aren't it.

    • Poor diseased people do not buy or watch Google ads. It is in our selfish self-interest to uplift developing countries so as to create new markets in which to sell our stuff, gain new products and services to buy from them, add manpower to the global research effort, or whatever lofty goal you prefer.

      • by Empiric ( 675968 )

        You're conflating "our" abstract interests with Google stockholder interests.

        I'll leave aside the question of how specifically Google's data mining will "uplift" these countries. I'll even leave aside the question of how companies in general will do so.

        At least until you give me a reason not to consider your position summarily refuted by actual history and practice, with one simple link. [nytimes.com]

        The Marshall Plan actually "uplifted" countries devastated by World War Two. Companies did not volunteer to absorb these

        • Fun fact, nobody cares about America. The world is much better off today, with a lot fewer people living in poverty and cushy desk jobs for people who used to farm the land with medieval tools as recently as the 80s and 90s. Most of the world is thrilled about globalization. I myself made a pile of money as an online freelancer when I was younger.
          In Portugal, my day job is launching web pages for the developing countries we once colonized... as part of that, I need to know where to host things and how the ne

          • Fun fact, nobody cares about America.

            Didn't read past that, but you might want to look it up. ;)

            Turns out, we're the most important country in the world, and even our enemies agree. You're just an idiot jousting with a windmill.

            If you ever get internet access, check to see what exists outside your tiny country. There is a whole big world out there, and wherever you're from, your country's significance in the world is very small. Railing at clouds doesn't change that in any way; you probably have no idea what sort of more serious complaints peo

    • by JBMcB ( 73720 )

      What is the maximizing stockholder wealth justification for "solving global problems like poverty and disease"?

      If you can mitigate/fix/whatever a disease that makes workers less productive, and make poor workers more productive, generating more wealth for themselves and others, governments will pay for that technology.

      Heck, simply knowing what the problem is gets you halfway to solving it. If the government is spending tons of money mitigating bird flu, which affects maybe a few dozen people a year, but, in aggregate, people miss hundreds of thousands of days of work from regular flu, then maybe some funds should sh

      • Heavily impoverished countries are generally countries whose governments would declare war on somebody trying to make their citizens' lives better, not places that would pay to make their citizens' lives better.

        I guess this is how they're going to do it; idiots like you will get sites that explain these things at the top of your search results, and they can just increase the volume until you stop saying such stupid shit.

    • by swb ( 14022 )

      Reduced social welfare costs should result in less government growth and reduced or at least not increasing levels of taxation.

      Lower or stable payroll taxes means reduced salary demands and higher profits?

      I'm not saying this isn't a flawed argument (government reducing taxes, etc) but from an economics standpoint, reducing social welfare externalities results in less drag and deadweight losses.

      There are probably other arguments, like turning impoverished people into Google customers which would be an expans

      • Only in a rich country with a strong system of civics.

        In poor countries that is often not the case at all, and the ruling class doesn't care about things like "less drag and deadweight losses," instead their focus is on maximizing the divide between the rich and the poor, by making sure the poor have maximized social welfare externalities. People don't care about money, they care about power, and people from rich countries measure power by their money. Governments in poor countries often measure their power

  • by sinij ( 911942 ) on Thursday May 17, 2018 @10:20AM (#56626672)
    So they finally switched to "Be evil?"
  • Insidious and evil (Score:5, Insightful)

    by Okian Warrior ( 537106 ) on Thursday May 17, 2018 @10:21AM (#56626678) Homepage Journal

    There's an old saying about democracy being "two wolves and a sheep voting on what to have for lunch".

    The point being, the republic was set up to aspire to higher goals than can be achieved by pure democracy alone. We have people in power who are not bound by the will of the people, they can vote their conscience based on what they think is right. We take guidance from a bunch of enlightened people 250 years ago who set up basic guidelines to do this.

    The idea of a bunch of like-minded people getting together and trying to "nudge users into alignment with their goals" is the same thing, it's "two wolves and a sheep" writ large.

    We're seeing this today with the changes in user policy. YouTube used to be a bastion of free speech, everything that wasn't explicitly illegal was allowed... until that changed, and you can no longer talk about guns, or have conservative views, or cast aspersions on certain races or religions. (But it's OK when those races or religions cast aspersions back.)

    Their goals are well-meaning today, so people will get behind the efforts and help; tomorrow their goals may be different.

    Even when you agree with their goals, not everyone agrees with their proposed solutions - and yet they still try to influence public debate. Climate change is one of these issues, where a lot of people would agree that it's a problem and something should be done, if only the solutions weren't politically motivated.

    What they are proposing is control over social thought. Unlike PACs or advertising, it's done without oversight or transparency. We complain about PACs not having enough transparency, and not knowing who pays for political ads - are we going to allow Google to be similarly opaque?

    Next election it won't be "Russians hacked the election", it'll be "Google hacked the election".

    Nudging behaviour like this is insidious and evil.

    • Next election it won't be "Russians hacked the election", it'll be "Google hacked the election".

      Next election?

      What's interesting to me is that Google, Facebook, etc. have already been trying this. It's no secret whose side they have been on, and no doubt that they have been "nudging" (as blatantly as they could have? maybe not, but nudging for sure).

      It may have worked in 2012. It failed in 2016. What interests me is why/how it failed.

      Is Trump just The Mule [wikipedia.org] or something, a one time anomaly? Or are we more reliant against this stuff than previously thought?

      • That was supposed to be resilient, not "reliant".
      • Re: (Score:2, Insightful)

        by Shotgun ( 30919 )

        I personally believe that we are resilient to this sort of manipulation. Humans are a naturally suspicious species, and these manipulations never seem to "feel right". For instance, my "feel" of the global warming debate:

        OMG! The world is getting too hot! It's our fault! Let us have control!
        Wait!? What!? CO2 is a trace element that has barely moved up, and we only have a small amount of accurate data composed of a few years over a limited area. Can we have a look at your data?
        HELL NO!

        • The best propaganda is always that which subtly manipulates the emotions of those exposed to it. "I may not remember what you said, but I remember how you made me feel" is the principle behind it, and your post seems like a fine example of it in action. Strong emotional reactions put up barriers to what would otherwise be reasonable arguments.
    • Their goals are well-meaning today, so people will get behind the efforts and help; tomorrow their goals may be different.

      EXACTLY! This is what the SJWs asking for more government intervention never seem to grasp. Today's noble cause is tomorrow's tool of the oppressor/tyrant. Those who want to empower the government (and/or large corps with the government's help) to effect social change refuse to understand this.

    • by anegg ( 1390659 )

      What they are proposing is control over social thought. Unlike PACs or advertising, it's done without oversight or transparency. We complain about PACs not having enough transparency, and not knowing who pays for political ads - are we going to allow Google to be similarly opaque?

      Much of the problem comes down to the transparency/visibility of the process. Open political debate depends on knowing who is proposing what, and being able to understand their motives. When the "nudging" and other pressures are

    • There's an old saying about democracy being "two wolves and a sheep voting on what to have for lunch".

      That's what representative democracy is. (And it's a huge chunk of the reason why your system in the US is so screwed.)

      Meanwhile, direct democracy works. You should try it sometime.

      • We have it in Oregon, it works great. Oh, BTW, we're one of those United States. Many other States have direct democracy, too.

        There aren't many other places in the world that have it at all; I doubt there is anyplace that does it better than Oregon.

        Direct Democracy is an important idea, I'd encourage you to learn more about it.

      • by q4Fry ( 1322209 )

        There's an old saying about democracy being "two wolves and a sheep voting on what to have for lunch".

        That's what representative democracy is. (And it's a huge chunk of the reason why your system in the US is so screwed.)

        Meanwhile, direct democracy works. You should try it sometime.

        I am pretty sure that direct democracy is 6,563,729 wolves and 5,235,973 sheep voting on what to have for lunch.

    • by alexo ( 9335 )

      We have people in power who are not bound by the will of the people, they can vote their conscience

      People that have a conscience do not rise to positions of power.

    • by TheRaven64 ( 641858 ) on Thursday May 17, 2018 @11:45AM (#56627324) Journal

      There's an old saying about democracy being "two wolves and a sheep voting on what to have for lunch".

      The old saying is from someone who doesn't understand game theory. The outcome of such a vote would be that the stronger wolf would be eaten. The weaker wolf knows that it would be dinner tomorrow if it eats the sheep, the sheep knows that it has a better chance of running away from just the weaker wolf than from either both wolves today or the stronger wolf tomorrow.

      • that people aren't wolves and sheep. The majority of folks aren't looking to devour each other. A small group of us are and they use a simple set of tricks (mostly Bigotry to divide the working class into manageable groups). The way out is for folks to realize this and work together. The way out is reason and science. E.g. the things that make us human.
    • "The point being, the republic was set up to aspire to higher goals than can be achieved by pure democracy alone. We have people in power who are not bound by the will of the people, they can vote their conscience based on what they think is right. We take guidance from a bunch of enlightened people 250 years ago who set up basic guidelines to do this."

      Good post; I would only append that the FF who wrote the constitution were *very* aware of this, and (tried, at least) wrote a constitution which was in ever

    • the constitution and replace our crap system designed by and for wealthy landowners with actual democracy, meaning a parliamentary system. That still won't save us from a total collapse of the country's economic systems (e.g. what's going on in Venezuela right now) but it'll put a stop to the systems we built that are designed intentionally to limit democracy, to wit: The Senate & The Electoral College (there are others, it's a complex topic).

      The problem is if we call a convention to fix the Constit
  • Psychohistory? (Score:5, Insightful)

    by Edweirdo ( 449577 ) on Thursday May 17, 2018 @10:22AM (#56626690)

    Is Hari Seldon running Google now?

  • by Bongo ( 13261 ) on Thursday May 17, 2018 @11:20AM (#56627132)

    It seems kinda scary that a Big Brother org could shape the information environment so as to influence people's behaviour.

    But then I remember that humans are not so simple. To us the world is not a mere stream of information; rather, it is a world of meanings which we create and organise, where meaning is within a context which is within a context and so on. Just think of a famous piece of art, and all its parodies. Consider fashion and how it changes. The way that people's aspirations and goals, their likes and dislikes, their moods and opinions, all flow in an ever-changing, re-created-anew stream of reactions and counter-reactions. Life is change. And the "facts", the "data" which tech people are so enamoured of, are only one half of reality. The other half is inter-subjective, re-creative, re-authored, re-organising meaning-making. Today you love X and feel it is the best person or thing in the world; tomorrow you're bored with X. Show me an AI that can cope with that, and then I'll say you've passed some kind of fancy test. An AI that understands new ironies. What a joke.

  • by gotan ( 60103 ) on Thursday May 17, 2018 @12:03PM (#56627438) Homepage

    It's not 1:1, but some aspects, like exploiting groupthink to "do good", "nudging" people into conformity, etc., are common.

  • It imagines a future of total data collection, where Google helps nudge users into alignment with their goals, custom-prints personalized devices to collect more data, and even guides the behavior of entire populations to solve global problems like poverty and disease.

    Fuck you sideways with a rusty chainsaw, Google. I don't need, don't want, and won't allow you to 'guide my behavior', so how about you go fuck yourselves, you fucking fucks?

    Mad? Yes. If shit like this doesn't make you mad, then there's something wrong with you.

    You want to win this game, people? DON'T PLAY AT ALL. Dump Google, dump so-called 'social media', and take your lives back. You don't need anyone to 'guide your behavior'. Google and others need to stay out of our lives.

  • Don't use their search engine any more, don't use any of their services, don't use any of their products, and block their trackers while web browsing.

    I fail to see how they'll do anything with me.

  • One search engine to rule them all

    one search engine to find them

    One search engine to bring them all

    and in the darkness bind them

  • All about power (Score:5, Insightful)

    by ChatHuant ( 801522 ) on Thursday May 17, 2018 @04:26PM (#56628968)

    Knowledge is power. As Google knows you better and better, they have more and more power over you. This video shows they're already considering how to exercise this power. This is the obvious next step for them (and, FWIW, I had already called it: https://slashdot.org/comments.... [slashdot.org]).

    Google, Facebook and the other data vampires really need to be stopped. The EU GDPR is a step in the right direction (though I personally would prefer both companies, and other privacy infringers like Equifax, to be dismantled or broken up). Unfortunately, the US government is already in Google and Facebook's pockets (it's not for nothing that Google is the largest corporate lobbyist in the USA), so I don't expect any useful legislative action.

  • Comment removed based on user account deletion
  • Narrating the video, Foster acknowledges that the theory may have been discredited when it comes to genetics but says it provides a useful metaphor for user data.

    How can I be reading Robert Sapolsky from 2017 (Behave), who talks as if Lamarckian epigenetics is still a thing, while a narrator from 2016 is saying it's not a thing?

    Because Lamarckian epigenetics is still a thing in nematodes. It just hasn't been much demonstrated in mammals yet.

    Whew! For half a second I was afraid that the hairiest Out of A
