
Google Audits Street View Data Systems

schliz writes "Google's plans to upgrade to high-definition Street View in Australia are on hold until it completes a rigorous internal audit of its data-collection processes, the company announced today. Google is currently being investigated by regulators in several countries over possible privacy breaches, after it emerged that its Street View vehicles were capturing not only publicly available SSIDs and MAC addresses, but also samples of payload data transmitted over these networks."
  • by Tom ( 822 ) on Tuesday May 25, 2010 @08:43AM (#32335294) Homepage Journal

    I'm really looking forward to the comments. When BP lets the oil spill continue day after day, the /. crowd goes asking why we let them handle it at all, after all they're the ones responsible for the mess.

    Now Google has a mess, and is doing an internal audit. I'm curious if we will apply the same reasoning, or a different standard. And what justifications we'll see for it.

    • by shentino ( 1139071 ) <shentino@gmail.com> on Tuesday May 25, 2010 @08:48AM (#32335376)

      Simple.

      We trust Google more than we do BP.

      Personally, I think for a good reason too.

      • I don't trust EITHER of them.

        I just trust the government(s) less (almost 100 million of its own citizens killed in the last century). For example, I don't want the German or EU government demanding copies of Google's hard drives and peering through our private data. Who knows what they would use it for? During WW2, data was used to imprison millions of Americans who had done nothing wrong.

        • by Miros ( 734652 )
          I agree, the government should not be trusted with that kind of data, but how would you feel about government regulation and oversight of data collection and retention practices and policies?
          • Government regulatory agencies sound good in theory, but in reality they are often mere puppets of the corporations that bribe them, so they don't work.

            I'd rather see Google's corporate license revoked. Let them operate as a proprietorship whose owner(s) have full liability for their company's actions, and then I can sue the bastard in court for theft of my data. Or even better - boycott the company and drive them into bankruptcy (as happened to Circuit City).

            • by Miros ( 734652 )
              So you would argue that government regulation is completely useless in all instances? I'm no fan of the government either; it always screws everything up and is totally in the pocket of all of the various lobbyists. But at the same time, the incentives that exist for private corporations (and individuals) can lead them to do things that are simply not in the interests of the larger group. What do you do in those situations if not government regulation?
            • Re: (Score:3, Insightful)

              I'd rather see Google's corporate license revoked. Let them operate as a proprietorship whose owner(s) have full liability for his company's actions

              You are seriously arguing that a single person should be responsible for the actions of twenty thousand other people?

              REALLY?

    • by Miros ( 734652 )
      BP is handling the spill because the government does not have the technology/resources necessary to handle it better [nymag.com]. Google is a totally different situation though. They are acting in an arena where there is little government oversight/regulation at present, so the responsibility falls entirely on them to "do the right thing" from a moral standpoint, and they appear to be failing, once again, to act in the public's best interests. It's my opinion that this is yet another example of why government oversight of privacy standards is not only a good idea, it's a necessity.
      • by LordLimecat ( 1103839 ) on Tuesday May 25, 2010 @09:26AM (#32335940)
        I like how both your comment AND TFS imply that Google got "caught" doing something. You DO realize that they openly disclosed (without coercion or prompting) this whole wireless mess, right? How is disclosing a mistake to those affected, and then working towards a resolution "failing to do the right thing"? What steps would you propose they take?
        • Re: (Score:3, Informative)

          by Miros ( 734652 )
          We don't know what prompted them to disclose the collection in the first place. Corporations have been "coming clean" on things that were on the verge of being exposed _forever_, there is nothing to suggest that such a thing did not happen here. They "failed to do the right thing" in collecting the information in the first place. Even if we take it to be an "accident" there still must have been employees who were aware of what was happening and chose to not act sooner. I don't know if you realize, they
      • >>>It's my opinion that this is yet another example of why government oversight of privacy standards is not only a good idea, it's a necessity.

        It's my opinion that this is yet another example of why government oversight of privacy standards is a BAD idea. Last time the US "overlooked" data they used it to imprison several million innocent Americans during WW2. Then they used it to do radioactivity experiments on blacks without their knowledge. Then they used the data to round-up Americans and thr

        • by Miros ( 734652 )
          I couldn't agree with you more on all of your points here. But I think you're overlooking one of the mechanisms of regulation -- it is typically reactive. The examples you cite are of government misappropriation of the implicit trust that we are all forced to put in it (to keep us safe I guess). Typically regulation of the private sector comes in the wake of various abuses that are eventually found to cause unnecessary harm to the public (pollution is obviously the easiest example, there was a time that t
    • by symes ( 835608 )
      I'm not sure the same logic can be used across the situations BP and Google find themselves in. BP are in a difficult position: drilling for oil is risky, and sometimes things go wrong, unpredictably so. That is the nature of the business. There are, of course, legitimate questions about what they could have done to prevent the accident, but ultimately BP did not want this situation; it is bad for business. Google, on the other hand, created this issue: they actively ignored concerns over privacy and gave no
      • >>>Google seem to think that their bottom line is more important than users rights to privacy.

        Bullshit. It was Google that *voluntarily* told the world what they had done and that they were erasing the data. If they were as you described, the managers would have kept silent and just kept collecting.

    • I'm pretty sure if BP could put the oil pouring out of the well "on hold" while they did their "internal audit" no one would care.

      Seriously, you can't see the difference between something that is outside the control of the company (BP haven't stopped the oil spilling even though they want to) and something that is (Google has stopped collecting said data, for now anyway)?

      But as I've said before, BP is doing all they can to fix the problem; they are drilling a relief well. But people don't want to be told "the

    • Tens of billions of dollars in environmental damages that were going to have to be cleaned up by the taxpayers.

    • I'm really looking forward to the comments. When BP lets the oil spill continue day after day, the /. crowd goes asking why we let them handle it at all, after all they're the ones responsible for the mess.

      Now Google has a mess, and is doing an internal audit. I'm curious if we will apply the same reasoning, or a different standard. And what justifications we'll see for it.

      I'm honestly shocked that you would be comparing Google's little accident to BP's massive catastrophe that could potentially have a long-lasting effect on the entire planet.

    • by Ephemeriis ( 315124 ) on Tuesday May 25, 2010 @09:34AM (#32336038)

      I'm really looking forward to the comments. When BP lets the oil spill continue day after day, the /. crowd goes asking why we let them handle it at all, after all they're the ones responsible for the mess.

      The whole BP thing is simply a giant WTF.

      I have a genuinely hard time wrapping my head around the fact that they're drilling in water this deep with absolutely no ability to deal with problems like this. They aren't just scrambling to deploy a fix, they're scrambling to come up with a fix.

      It doesn't seem like BP should be willing to do something that risky without a disaster plan.

      It doesn't seem like the Government should give them the go-ahead to do something that risky without a disaster plan.

      It doesn't seem like stockholders should allow them to do something that risky without a disaster plan.

      And yet, here we are.

      Now Google has a mess, and is doing an internal audit. I'm curious if we will apply the same reasoning, or a different standard. And what justifications we'll see for it.

      Google's mess isn't going to kill any wildlife or pollute any waterways. It's very unlikely to result in anybody losing their livelihood. They're also conducting the audit before going ahead, rather than after something has gone horribly wrong (at least with the HD thing in Australia).

    • Re: (Score:3, Insightful)

      by Morty ( 32057 )

      BP's oil spill has far greater scope and urgency:

      * The oil spill is a regional environmental catastrophe. It has scope well outside of BP or even the oil industry as a whole -- it's impacting marshlands, the seafood industry, tourism, and other industries. So far, this privacy issue seems to be confined to Google.

      * The oil spill is an emergency. We normally give companies a chance to "make it right". In the case of the oil spill, any unnecessary delay means definite short-term damage/impact to the e

    • by Kaboom13 ( 235759 ) <kaboom108@NOsPAm.bellsouth.net> on Tuesday May 25, 2010 @10:09AM (#32336450)

      Google's data mining is annoying at best; BP's oil spill is an environmental disaster that will harm millions of people (not to mention wildlife) in ways we can't even begin to calculate yet. Applying the same standard is stupid, because it implies the scale of the problem is in any way similar. Furthermore, while it is fairly understandable to make mistakes in software systems that will at worst collect data about unencrypted wifi traffic, it is not understandable to make mistakes in a critical safety device that lives and the economic and environmental prosperity of an entire coastline depend on.

      Google is in the wrong, and so is BP. But to pretend that the seriousness of the way they are wrong is in the same ballpark is ridiculous, and therefore to expect the same reaction is ridiculous. If you do employee background checks and find that one of your employees was fined for littering while another was convicted of theft, manslaughter, criminal negligence, bribing public officials, and destruction of property, you would react in different ways. That's the difference in severity we are talking about.

    • I can see how some angry kid would equate these two incidents, but I'm shocked that something this idiotic would get voted the top comment of 110!!

  • by kyz ( 225372 ) on Tuesday May 25, 2010 @08:46AM (#32335340) Homepage

    I'm also interested in privacy galoshes, privacy longjohns and privacy jodhpurs

    • I don't know about the galoshes, but since the rest keep my junk covered, they'd definitely qualify for the "privacy" label. :)

    • Re: (Score:3, Insightful)

      by Megane ( 129182 )
      I was wondering about privacy trousers myself.
    • by psmears ( 629712 )

      I'm also interested in privacy galoshes, privacy longjohns and privacy jodhpurs

      Another member of the tinfoil trouser brigade?

  • by dward90 ( 1813520 ) on Tuesday May 25, 2010 @08:47AM (#32335356)
    While I'm not an expert on security or privacy, it seems to me like "publicly available" should mean that they didn't gather any data that citizens weren't openly broadcasting anyway. From an ethical perspective, it's shaky at best, but it's probably a huge difference legally.

    I'm not endorsing Google's collection, but aren't people who openly broadcast their data at least *a little* at fault here?
    • Re: (Score:2, Informative)

      by Em Emalb ( 452530 )

      Yes, people should definitely secure their communications.

      That said, just because someone leaves their door open, doesn't mean Google should waltz right in.

      • by cbiltcliffe ( 186293 ) on Tuesday May 25, 2010 @08:55AM (#32335470) Homepage Journal

        Google didn't just "waltz right in."

        They collected it by accident, and when they realized they had it, they publicly stated that they had the information, and were purging it.

        They didn't need to say anything, because nobody knew they had it until they announced it. But in the spirit of openness, they stated what had happened, how it had happened, and their proposed remedy for the situation.

        The fact that various regulators are getting pissy about it isn't their fault.

        • by papasui ( 567265 ) on Tuesday May 25, 2010 @09:00AM (#32335568) Homepage
          **Cough**Bullshit**Cough** There are plenty of WiFi scanners available that collect only SSIDs and MAC addresses; they don't sniff the payload and record it. Google, or the company they contracted, made a decision to gather this data; the only accident was getting caught.
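
          (For the curious, here is a minimal sketch of that difference. It assumes the Python scapy library and a card already in monitor mode on an interface named "wlan0mon"; both are assumptions for illustration, not a claim about what Google's cars ran.)

            from scapy.all import sniff, Dot11, Dot11Beacon, Dot11Elt

            def on_frame(pkt):
                # Beacon frames are broadcast in the clear by every AP and carry
                # only the network name (SSID) and the AP's MAC address (BSSID).
                if pkt.haslayer(Dot11Beacon):
                    bssid = pkt[Dot11].addr2
                    ssid = pkt[Dot11Elt].info.decode(errors="replace")
                    print(bssid, ssid)
                # Capturing payload would be a separate, deliberate choice:
                # keeping frames where pkt[Dot11].type == 2 (data frames)
                # and writing them to disk instead of discarding them.

            sniff(iface="wlan0mon", prn=on_frame, store=False)
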
          • If the story had been "Google accidentally gathers SSIDs and MAC addresses", I would have been alongside you saying "baloney"... mapping that stuff out is exactly the sort of thing Google is into. But sniffing data in a way that is guaranteed to cause legal issues, and THEN announcing it to the world? Google is much more savvy than that; I don't buy that it was intentional.
          • by nedlohs ( 1335013 ) on Tuesday May 25, 2010 @09:44AM (#32336174)

            So you know that their claim, that they reused some software from another Google project without noticing it recorded more than what they actually cared about, is false?

            And you know that the programmer who did so either didn't realize at all, or didn't just think "who cares if it wastes resources grabbing that stuff, it's minuscule and we can just not use it" and use it without mentioning it to anyone?

            Are you omniscient? Or do you just spend your life spying on google?

          • But why? (Score:3, Interesting)

            by Gorimek ( 61128 )

            I've yet to see anyone who accuses Google of lying about this explain why they would want this data.

            It's hard for me to think of anything more useless than tiny random snippets of unidentifiable wifi traffic from German roads. What do the conspiracy theorists think Google is using it for? What would be a possible business plan to monetize it?

        • by Miros ( 734652 )
          You are suggesting that in the entire chain of people who were attached to that project, who knew the processes and methods that were employed, nobody noticed that they were "accidentally" collecting data? Either it was not an accident, or a whole lot of people at Google are completely OK with looking the other way when it comes to accidental user privacy errors. I don't know about you, but I think the latter of those two may actually be worse.
          • I don't work at a big company, but do managers always know the inner details of the settings used in the programs their employees use? Do CEOs know about the compiler options used by their devs?
            • Re: (Score:3, Insightful)

              by Miros ( 734652 )
              No, they don't, which is exactly my point. In order to have an organization that could do something like protect the privacy of the users/customers/public effectively the culture of the corporation has to promote accountability and responsibility all the way down to the lowest levels. People on the bottom, the ones who actually do the acting on the part of the organization, have to have been given a good understanding of what management thinks is valuable from a moral standpoint and encouraged to act on t
        • by papasui ( 567265 ) on Tuesday May 25, 2010 @09:22AM (#32335866) Homepage
          They didn't offer it up; they got caught in Germany. It's spin that they are being the 'good guy' by offering it up in other countries. http://news.bbc.co.uk/2/hi/technology/8684110.stm [bbc.co.uk] Also, as a company, that data would be deemed a record and would need to be handled in compliance with records-management requirements. http://en.wikipedia.org/wiki/Records_management [wikipedia.org]
        • by ljw1004 ( 764174 )

          "Realized they had it"?? They're like the kid who only "realizes" his hand is in the cookie jar after his mother catches him.

      • by HungryHobo ( 1314109 ) on Tuesday May 25, 2010 @09:01AM (#32335582)

        and if someone publishes a web page you shouldn't be able to just waltz right in and view whatever's on it!

        If someone watches you walk around naked while you're in the bathroom that's a violation of your privacy.
        If someone watches you walk around naked in the middle of the street then they have done nothing to violate your privacy.

        People shouldn't be required to secure their communications *effectively*, but some kind of symbolic security should be required before expecting any kind of privacy.

        • by Miros ( 734652 )
          This is assuming that most common users understand that their networks are not properly secured and are making a conscious and informed decision to share their data with anyone in range of their network. That is a stupid assumption from a societal standpoint. A good parallel would probably be analog cell phones, which could be monitored using specialized radio scanners, which were then made effectively illegal to prevent eavesdropping. The argument you just made could have been applied to that same situation.
          • by cynyr ( 703126 )

            Ignorance is no excuse: "Sorry officer, I didn't know that driving under the influence of alcohol was a crime here, it's not in Uzbekistan"... Same thing here: "I didn't know leaving my AP unencrypted would let everyone see my LOLCATS!"

            From everything I have heard about this incident, they only collected from open APs; they did not in fact break any encryption. So as far as I'm concerned the data was "public", with every intention of it being that way, like painting your name, SS, and DoB on your garage door.

            • by Miros ( 734652 )
              This isn't a legal issue (yet); it's an ethical one. So fine, as you have pointed out, many of the people who had their data collected may have been ignorant of the fact that their data could be gathered by any passers-by. Does that mean that they wanted that data to be shared and disclosed? Obviously not; it may even suggest that many of them, if not ignorant, would have chosen to protect that information (as you pointed out with your SS, DoB, garage door analogy, that's information that you would obvious
          • The little lock symbol you see when connecting to a secured wireless network is a clue, as is its conspicuous absence.

            In your world, cheap walkie-talkies would be illegal because someone might be using a pair and be too stupid to understand that anyone else with a similar walkie-talkie could be listening in.

            With the old phones you had no real options;
            the devices couldn't be used otherwise.

            Wireless routers, with the exception of stunningly ancient ones, have a handy little drop-down menu where you can select an open or secured network.

            • by Miros ( 734652 )
              It's not a legal argument, it's a moral argument. The fact that the person you're snooping may or may not know that they can be snooped does not make it right for you to do so.
              • In other words, whatever you think is wrong is wrong, and no consistent or solid justification is needed.
                If you don't like it then it's wrong and should be punished!

                • by Miros ( 734652 )
                  That's a pretty limited perspective, I think. There are commonly held moral beliefs, many of which have been codified and studied in depth by academics and practitioners in a variety of disciplines. I think in this case I don't need to justify my claim that this is a moral issue. It's obviously a big deal for people and a burning question as to the acceptability of what Google did. Their actions were not in any way illegal, but even they appear to believe that it was a serious, serious breach (cue internal audit).
              • On a related note, how do you know I actually intended to share anything on an FTP server I set up?
                It's quite easy to share folders you didn't intend to share, so by that logic browsing any open FTP directory is immoral until you contact the owner and double-check with them that they only shared what they intended to share.

                • by Miros ( 734652 )
                  If it were likely that the vast majority of anonymous FTPs were configured that way accidentally or out of ignorance, employed by a huge portion of all internet users, and by default contained detailed logs of all activity that people engaged in that involved a network connection: then yeah, I would agree with you. Fortunately this is not the case in reality.
                  • The vast majority of people I know who run open wireless networks are fully aware they're open: coffee shop owners, or techies who think it will give them an excuse when they're caught torrenting stuff.

                    Detailed logs?
                    Since when was Google pulling the logs off the routers?

                    • by Miros ( 734652 )
                      Uhm, "Logs" in this case would be the equivalent of sniffed payload packets, which are way worse than router logs from a privacy standpoint. And I don't think any part of this discussion has anything to do with open public WiFi hotspots.
                    • This is all about open public WiFi hotspots because that's what you create when you set your wireless network to "open".

          • In the world I live in, it is called irresponsible (and illegal) to purchase and drive a car with neither the training nor know-how to drive one. Why is hooking up a wireless router any different -- just because our culture has decided to promote irresponsible and reckless behavior?
        • And according to another story on Slashdot today, an employer visiting an employee's public Facebook page is a violation of their privacy. It's amazing how many double standards there are.
          • Where did anyone say they have no right to view a public profile?
            Firing someone for a trivial offhand joke on a public Facebook page, on the other hand, is a different matter.

        • Comment removed based on user account deletion
          • So to protect yourself from your own potential stupidity, cameras and video recorders would be effectively illegal in public for anyone who can't afford a legal team.

            Great.

        • brb. Reading the letters in your mailbox. It wasn't locked or anything, and it's right out there for the public to access, so it's cool.

      • Walk through the door? More like you were standing inside your house and yelling, "My name is ___ and my password is ____ and I'm visiting the following sites: (insert list)." The neighbors are not to blame if they can hear your loud mouth, and neither are any passersby.

      • Yes, people should definitely secure their communications.

        That said, just because someone leaves their door open, doesn't mean Google should waltz right in.

        Nobody waltzed right in... Google drove by on the street and collected what it could see from the road.

        If you leave your front door open and stand in the hallway naked, you can't complain too much about Google snapping a picture of you.

      • Yes, people should definitely secure their communications.

        That said, just because someone leaves their door open, doesn't mean Google should waltz right in.

        On the opposite side though, they are broadcasting that information to the public in clear form.

        To use another analogy:

        What if the noisy neighbor got into shouting matches with another tenant in their apartment and you, unfortunately, became aware of some very personal details? Are you to blame for having those very personal details burned into your memory? Are you to blame for having ears and not being deaf?

        If you actually did sneak into their house and listened while they had a private conversation, then

    • by ATestR ( 1060586 ) on Tuesday May 25, 2010 @09:02AM (#32335596) Homepage

      Look at it another way. Suppose there was a company, call it "Gaggle", that drove up and down the streets and roads of the world making sound recordings for a "Street Sounds" feature in its new mapping program. Would there be such a fuss if it recorded the voices of two people shouting across the street at each other? It's about the same thing.

      • by Miros ( 734652 ) on Tuesday May 25, 2010 @09:16AM (#32335788)
        The people shouting *know* that other people can hear them.
        • by cynyr ( 703126 )

          Operating a radio transmitter should mean you know this. If not, you should consult with people who know this stuff, much as you would with a mechanic or plumber. Ignorance isn't an excuse; to use an analogy I used earlier on this topic:

          "Sorry officer, i didn't know that driving under the influence of alcohol was a crime here, it's not in Uzbekistan"

          You still get your night's accommodations for free.

          • by Miros ( 734652 )
            You're again countering a moral argument with a legal argument. However, what is legal and what is right are not one and the same.
        • The people shouting *know* that other people can hear them.

          And people communicating on a CB know that other people can hear them.

          And people communicating with an unencrypted wireless device should know that other people can hear them.

          The fact that they're ignorant doesn't really make it my fault that I overheard their conversation.

          • by Miros ( 734652 )
            You also don't have a choice when it comes to overhearing someone's conversation. It's a little different if you go through the effort of sniffing traffic from someone's open WiFi. Again, the issue here is not a legal issue; Google didn't do anything illegal. It's an ethical question: is it right to eavesdrop on someone's network traffic, particularly if there is a good chance that they don't even know it's possible? I feel like most people's gut reaction to that is a resounding "no."
            • You also don't have a choice when it comes to overhearing someone's conversation.

              Nor do you have a choice when it comes to overhearing someone's wifi traffic. Normally, when you're intentionally trying to talk to someone else, that's considered noise. It's other traffic cluttering up the spectrum, getting in the way of what you're trying to do. It's always there. If you're listening to wifi traffic, you'll hear it.

              It's a little different if you go through the effort of sniffing traffic from someone's open WiFi.

              No it isn't.

              There's no effort involved, they're simply capturing packets of traffic, not h4x0r1ng teh interwebs.

              If I'm conducting an interview with someone in a public pla

              • by Miros ( 734652 )

                Nor do you have a choice when it comes to overhearing someone's wifi traffic. Normally, when you're intentionally trying to talk to someone else, that's considered noise. It's other traffic cluttering up the spectrum, getting in the way of what you're trying to do. It's always there. If you're listening to wifi traffic, you'll hear it.

                Exactly: if you're listening and deliberately capturing the traffic. Your NIC is not in promiscuous mode by default, and your OS is not logging the packets the card receives to a file somewhere. (A rough sketch of what that deliberate capture involves is at the end of this comment.)

                There's no effort involved, they're simply capturing packets of traffic, not h4x0r1ng teh interwebs.

                I'm sorry, what percentage of the general public do you think casually sniffs their neighbor's WiFi traffic, or would even know the basic principles involved in the process if you stopped them on the street and asked them? No, it's not "h4x0r1ng teh interwebs" but it's not taking out the trash either.

                As far as people not knowing it is possible... Why is their ignorance Google's problem?

                Because it r
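
                (A minimal sketch of that point, again assuming scapy and a monitor-mode interface named "wlan0mon"; these are assumptions for illustration only. None of this happens unless a program explicitly asks for it.)

                  from scapy.all import sniff, wrpcap, Dot11

                  # Step 1: explicitly ask the kernel for raw 802.11 frames.
                  frames = sniff(iface="wlan0mon", count=1000)

                  # Step 2: explicitly decide to keep the data frames (type 2),
                  # i.e. actual payload rather than just management traffic.
                  data_frames = [f for f in frames if f.haslayer(Dot11) and f[Dot11].type == 2]

                  # Step 3: explicitly write them out to disk.
                  wrpcap("payload_sample.pcap", data_frames)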

        • Re: (Score:3, Insightful)

          by rotide ( 1015173 )

          If you buy radio equipment you should also *know* that other people can pick up the signals as well. If you don't want other people listening in on your data, simply "whisper" by using encryption.

          Ignorance is no excuse. RTFM when you purchase your radio transmitter (read: WAP/Wireless Router). Don't just bitch that you had no idea what security was and everyone listening is wrong for doing so.

        • Re: (Score:3, Insightful)

          by skywire ( 469351 ) *

          Precisely. That is why it is the perfect analogy. Just as a person shouting from a window has no reasonable expectation that passersby will somehow "shut their ears", neither does a person broadcasting unencrypted information have a reasonable expectation that the public will not receive that. This is not just a legal technicality; it is practical reality.

          • by Miros ( 734652 )
            "shutting your ears" requires making a conscious decision to wear ear muffs or something. Capturing someone else's WiFi traffic is not something that you just involuntarily do when you are within range, you have to make a decision to sniff the traffic. It's entirely different. And no, the issue is not a legal one. What google did was not illegal. It's a moral/ethical issue: was what they did wrong?
          • Just as a person shouting from a window has no reasonable expectation that passersby will somehow "shut their ears" [...]

            Just as I should have a reasonable expectation that I will not be recorded, and that no such recording would be published without my consent by a passerby, when I talk to a friend on the open street, I should have a reasonable expectation that no large corporation is peeking over my fence into my garden or sniffing my WLAN traffic in order to publish/sell/give away that data.

      • If "Gaggle" used highly sensitive microphones and could record a normal conversation inside your house or in your backyard from the street, would that be a breach of privacy? Should be expected that you need the proverbial "Cone of Silence" because someone might be walking/driving down the street with a sensitive microphone?

    • by Rayonic ( 462789 )

      The best analogy would be if the Street View cars had microphones to record... I dunno, traffic noise level, and they accidentally recorded you and your wife having a shouting match out in your yard. All recorded from public property (the street), and all quite legal.

      If it's not legal, then all those TV shows, filmmakers, and news gatherers who like to wander around with a camcorder are in trouble.

  • Google should wear pants that hide more than they show. Because when your show is public, there's no privacy.

  • They could have gotten away with this scot-free without doing a full internal audit, not to mention temporarily halting data processing. Given the assumption that there's no hidden underlying cause pushing them towards this, it's slightly above-and-beyond in my opinion.
    • by Miros ( 734652 )
      Usually when these kinds of things happen (companies apparently acting against the public interest for their own gain and then getting caught in the process) there is a big backlash and a call for government investigations and regulations. Internal audits are just a classic tactic to try and squelch that knee-jerk reaction. Banks, manufacturing companies, heck, pretty much any kind of company caught in the government/public cross-hairs will do that. It's just a defensive play, it doesn't mean that they d
      • there is a big backlash and a call for government investigations and regulations. Internal audits are just a classic tactic to try and squelch that knee-jerk reaction.

        Didn't they CAUSE that backlash when they chose to disclose the issue in the first place? Are you saying they decided, "Let's cause a massive public PR disaster, and then let's attempt to appease the masses with a phony internal audit"?

        • Re: (Score:3, Informative)

          by Miros ( 734652 )
          The PR disaster could have very well been inevitable. Even if we take the story that they provided as true, that it was an accident, it is still likely that the truth would have come out eventually, in which case it would look far, far worse than it does now. It's always better to come clean in those cases, particularly if discovery appears inevitable (believe me, lots of large corporations sweep all kinds of things under the rug, as long as they know for a fact that they stand little to no chance of being discovered).
  • by Anonymous Coward

    "not only publicly available SSIDs and MAC addresses, but also samples of **publicly available** payload data transmitted over these networks"

    There, fixed it for ya. At least half of the responsibility lies with those owning unsecured networks. If you don't want your data public, learn to secure it. Google is still at fault for breaking a public promise, mind you. However, the news stories seem to miss the crucial piece of information: _anybody_ can listen to these packets (and chances are many people do).

  • Stumble This! (Score:5, Informative)

    by Anonymous Coward on Tuesday May 25, 2010 @09:29AM (#32335972)

    This entire wireless thing is total BS. From what I have read, they were using Kismet for their wireless collection program, and if they were channel-hopping like any good war-driver, I assure you they were not around long enough to get anything useful (DNS, NetBIOS, mDNS packets, etc.). All of it was open to begin with and already up for grabs. Most people know what they are buying now when they get an AP that is not set up properly (big warning stickers printed on the box for setup).
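
    A rough back-of-the-envelope sketch, in Python, of why a channel-hopping drive-by catches so little. Every number below is an assumption chosen for illustration, not a measurement of what the Street View cars actually did:

      # All values are illustrative assumptions, not measurements.
      channels = 11                     # 2.4 GHz channels being hopped over
      speed_mps = 40 * 1000 / 3600      # car doing roughly 40 km/h
      ap_range_m = 100                  # assumed usable radius of a home AP

      in_range_s = (2 * ap_range_m) / speed_mps   # seconds spent driving through the AP's coverage
      on_channel_s = in_range_s / channels        # of that, time actually tuned to the AP's channel

      print(f"in range ~{in_range_s:.0f}s, tuned to the right channel ~{on_channel_s:.1f}s")
      # Roughly 18 s in range and about 1.6 s on-channel, scattered across the hop cycle:
      # fragments of whatever happened to be in the air, at best.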

  • With the promise of HD street view, what are the legal ramifications of Google taking a picture that allows someone to see into your house through a window? What about license plates? Could someone write an application that "walks" down the streets and OCRs all the visible license plates? (A rough sketch of what that might look like is below.)

    Are we to expect that if we want privacy we have to keep our blinds/shades closed at all times?
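
    As a thought experiment only, here is a minimal sketch of the plate-OCR idea. It assumes the OpenCV and pytesseract Python packages plus a local Tesseract install, and "frame.jpg" is a made-up file name standing in for one street-level image:

      import re
      import cv2
      import pytesseract

      img = cv2.imread("frame.jpg")                  # one hypothetical street-level frame
      gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # drop colour to help OCR
      gray = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1]

      text = pytesseract.image_to_string(gray)       # naive OCR over the whole frame
      plates = re.findall(r"\b[A-Z0-9]{2,3}[- ]?[A-Z0-9]{3,4}\b", text)  # crude, made-up plate pattern
      print(plates)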
