
Bad Connections Dog Google's Mountain View Wi-Fi Network

itwbennett writes "Google launched its Mountain View, CA public Wi-Fi network in August 2006. It was one of the first public wireless Internet services in the U.S. and was intended to provide free service across the city. But in 2012, one year after Google signed a 5-year agreement to continue the service, it started a slow decline to the point of being unusable. 'We started noticing it in very large files, things like operating system updates, but now it's on files as small as 500 kilobytes,' said Rajiv Bhushan, chief scientist of pharmaceutical startup Livionex and a long-time user of the network. A recent test by IDG News Service resulted in a total failure to get a working Internet connection at a dozen sites around Mountain View, including in the city's main downtown area and directly in front of Google's headquarters." I've had disappointing results trying to connect to several other public wireless nets around the U.S., both privately sponsored and municipal. Do you know of any that work especially well?
  • by TechyImmigrant ( 175943 ) on Saturday August 10, 2013 @01:19PM (#44531455) Homepage Journal

    Just unplug it and plug it back in again.

    • by Richard_at_work ( 517087 ) on Saturday August 10, 2013 @01:31PM (#44531527)

      No no no, they are holding it wrong!

    • You forgot, they also have to strap it to a blimp.
    • That worked for Kyle https://www.youtube.com/watch?v=ckIMuvumYrg [youtube.com]
    • Not enough. You have to take out the battery and put it back.
    • Re: (Score:3, Insightful)

      "In other news, something made by Google turns out to be a half-assed implementation of a good idea, unfavored by management and consequently determined to be a career-limiting move for Googlers unfortunate enough to be assigned to it. Consequently it is allowed to fall into disrepair, and will be scheduled for decommissioning at a time carefully calculated to maximize user inconvenience. Ric Romero has film at 11, so stay tuned for that."

      • Re:The solution (Score:5, Interesting)

        by AlphaWolf_HK ( 692722 ) on Saturday August 10, 2013 @10:46PM (#44533667)

        Or perhaps it's just oversaturated. Wifi doesn't have unlimited bandwidth you know. After enough people find out that they can stop paying for their regular ISP and just hop on a free wifi you'll start to run into problems.

        • Or perhaps it's just oversaturated. Wifi doesn't have unlimited bandwidth you know. After enough people find out that they can stop paying for their regular ISP and just hop on a free wifi you'll start to run into problems.

          ===
          This network problem is one which I call "teething". At low internet speeds (700-1.2kbits) everything is tuned for arrivals of packets and the queuing of packets for forwarding.

          At higher speeds, the buffering of packets must be much much larger, as the number of queued packets can vary from instant to instant. Ergo, forwarding and resending of packets takes more cushioning. Peaks and valleys in buffering need to be handled at every interchange point. Timeouts and the like are also important.

          In effect

        • by zizzo ( 86200 )

          I can tell you as a bona fide resident of Mountain View that the network is not oversaturated. It is simply unusable. There was a brief time when the secure variant worked passably well, but that doesn't even work now. I honestly suspect the problem is just that the access points are not receiving any physical maintenance and are falling into disrepair. There's enough alive to maintain the visible SSID but that's about it.

    • Re:The solution (Score:4, Informative)

      by AmiMoJo ( 196126 ) * on Saturday August 10, 2013 @05:50PM (#44532847) Homepage Journal

      That's not actually a completely daft suggestion. A lot of wifi gear tries to automatically pick the clearest channel, but usually sucks at doing it while in operation. Rebooting disconnects all clients and gives it a chance to do a full scan of all channels before selecting one, possibly switching to a less congested frequency.
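
      For readers who want to check how crowded the air actually is around a given AP, here is a minimal sketch of a manual channel survey on Linux. The iw tool and the interface name wlan0 are assumptions; adjust for your own hardware:

      # List nearby networks and tally how many of them sit on each
      # frequency; the least-populated frequencies are the ones a decent
      # auto-channel algorithm ought to be picking.
      $ sudo iw dev wlan0 scan | grep "freq:" | sort | uniq -c | sort -rn

      An AP that only runs this kind of survey at boot never notices the picture changing while it is serving clients, which is why a power cycle sometimes "fixes" a congested cell.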

    • Re:The solution (Score:4, Informative)

      by Que_Ball ( 44131 ) on Sunday August 11, 2013 @05:54AM (#44534617)

      Would it shock anyone to know I actually did this reboot to a malfunctioning public Wifi base station recently and it worked?

      I had a client moving into a new commercial location where the local cable company (Shaw) has one of their public Wi-Fi terminals installed.

      They did not have their own network connection yet (booked for a few days later), so we just joined their computer to the public network, but it was horrible. The connection showed moderate to high packet loss, which was strange because the base station was in the roof a few feet away. Even a ping test to the first hop (the base station) showed the packet loss problem. Increasing the packet size on the ping tests showed the problem got worse as the packets got bigger, so anything that wanted a sustained download rather than small transactions suffered the worst of it.
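
      A minimal sketch of that kind of test, for anyone who wants to reproduce it (the gateway address 192.168.0.1 and the payload steps are placeholders, not Shaw's actual setup):

      #!/bin/sh
      # Ping the first hop with growing payload sizes. Loss that gets
      # worse as the payload grows points at the radio link or its
      # buffers rather than anything further upstream.
      for size in 64 256 512 1024 1400; do
          echo "--- payload ${size} bytes ---"
          ping -c 20 -s "$size" 192.168.0.1 | tail -2
      done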

      So I went into the back, found the power injector for the base station and cut the power. Plugged it back in, and after the reboot it was working well. No more packet loss, and a usable connection.

      Maybe Shaw needs to update the firmware on these Cisco base stations they are using.

      • Maybe Shaw needs to update the firmware on these Cisco base stations they are using.

        More likely you need to enable blackhole-route TCP MTU probing by writing a 1 or a 2 to /proc/sys/net/ipv4/tcp_mtu_probing

        to deal with the fact that Shaw blocks ICMP for PMTU autodiscovery, just like Cox and TCI do, and then runs PPPoE, so an MTU of 1500 fails any time you get close to an actual 1500 bytes being sent. Like, oh, say, if you foolishly were using Facebook, which likes to pile cookies into your HTTP header until it gets to the point where you're sending 9 packets back and forth just to get one H
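
        For anyone who wants to try that suggestion, a minimal sketch on Linux (the sysctl is real; whether 1 or 2 suits your link is your call):

        # Check the current setting: 0 = off, 1 = probe only after an
        # ICMP black hole is detected, 2 = always probe.
        $ cat /proc/sys/net/ipv4/tcp_mtu_probing
        # Enable it on the running kernel...
        $ sudo sysctl -w net.ipv4.tcp_mtu_probing=1
        # ...or, equivalently:
        $ echo 1 | sudo tee /proc/sys/net/ipv4/tcp_mtu_probing

        To make it persist across reboots, the usual place is a "net.ipv4.tcp_mtu_probing = 1" line in /etc/sysctl.conf.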

  • by alen ( 225700 ) on Saturday August 10, 2013 @01:20PM (#44531461)

    You transmit into the air, everyone receives the signal, and each receiver has to filter out any traffic that is not meant for it.

    With too much data being transmitted by people in the area, the connection becomes useless. Even my home wifi is almost useless during peak times at night, since I have two dozen or so other people with wifi around me.

    • by deanklear ( 2529024 ) on Saturday August 10, 2013 @02:42PM (#44531993)

      (These are general ideas and may not be technically accurate... feel free to correct me)

      There are several problems with WiFi technology itself. First, there is no collision detection for wireless, only collision avoidance. When you're wired in, collisions are detected quickly, so you can saturate the connection near its theoretical limits without too many errors. (There's a promotional video about this from Meru Networks [youtube.com], but it is fairly educational.) By contrast, WiFi will roll through a larger bit of data and then ask for confirmation of receipt, which can lead to a lot of problems as radios talk all over each other. This is not a problem in regular office environments, where walls, floors, and furniture can provide separation so the radios can "hear" things that are closer. However, get into an open-air environment and add a bunch of devices at once, and everything flatlines as the access points attempt to orchestrate several hundred devices in range, including interference from other radios within "hearing" distance on the same channel.

      The second issue is one of limited channels. Originally WiFi was designed to move a tiny amount of data, and I think you could actually split off 802.11b into 11 discrete channels. As data needs grew, they consolidated 11 channels into 3 discrete channels for 802.11g (4 in the EU, I believe) and that's where it stands: a 3-lane road for 2.4GHz. 5GHz has more channels, depending on where you are in the world, but right now they are unreliable as the requirement for many of them is to be compatible with DFS, which means that if a radar signal is detected, your access points are expected to abandon that channel immediately. I think there are changes in the works from the FCC [fcc.gov] and although it only introduces 30% or so of new spectrum, it happens to cross multiple channels, so it may be like going from 9-12 channels to 20 or so. Combined with the more limited range of the higher frequency, having 20 discrete channels opens up a lot of options for basic broadband in public spaces. (Well, it did until the new ac standard came out, and I haven't even bothered to read it because these massive spectrum widths are going to be a nightmare, and I'm in a different line of work these days.)
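
      On the DFS point, you can see which 5 GHz channels your own hardware treats as radar-detection (DFS) channels; a one-line sketch assuming Linux and the iw tool:

      # Frequencies flagged "radar detection" are DFS channels; an AP must
      # vacate them when it hears anything that looks like a radar pulse.
      $ iw list | grep MHz | grep "radar detection"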

      However, none of this solves the "microcell" design of WiFi, where the client makes the decision on what radio to connect to instead of the access point. Your cell service, for instance, works well because the tower instructs the client so it can perform handoffs, reduce the data rates, and make other adjustments to keep things from choking up. I have sat and watched an iPhone cross over multiple access points and hundreds of feet to connect across a stadium for no explicable reason. (That's true for every wireless device, but I'm picking on iOS because they are notoriously noisy, always flooding the air with probe requests, trying desperately to connect to stored wireless networks even when they aren't around.)

      I have deployed Xirrus, Aruba, Extricom, Unifi, and some other products in dense situations, but as far as I know, the only pseudo non-microcell options available are from Extricom and Meru. Although I haven't used Meru, I can say that Extricom has been the most reliable in very dense environments, since they use some tricks to keep the air quiet, and they do not introduce beacon traffic with the addition of more radios. (Disclaimer: I have worked with the guys from Extricom quite a few times, and I think they are very capable, so take that opinion with a grain of salt.) Xirrus works pretty well in corporate environments, and their reporting interfaces are great, but I was disappointed that their sales staff continued to deny problems in 2.4GHz long after it was obvious that they didn't have a workable solution for super dense deployments. But maybe they just didn't know.

      Anyway, ignoring all of that technical garbage, the

      • Re: (Score:3, Informative)

        by egamma ( 572162 )
        Very informative, nice post. A couple of comments.

        First, there are three nonoverlapping channels for 802.11b/g. So there's 11 channels, but transmitting on channel 1 means that you are putting noise on channels 1-5; transmitting on channel 6 means that you are putting noise on 1-10; and transmitting on channel 11 puts noise on 7-11. If you transmit on channel 3, that means you are disrupting 1-8, so it's best to simply use 1, 6, and 11.

        5Ghz doesn't have as much of a saturation problem because of the small

        • Forget the 802.11b. The b-only devices are for museum or landfill. So you have 4 distinct channels - 1,5,9,13. And you should have a community wide policy to use only them and strongly discourage any use of other ones as well as use of b. You may interpret "strongly discourage" as you like. http://en.wikipedia.org/wiki/List_of_WLAN_channels [wikipedia.org]

          • by AK Marc ( 707885 )
            1-4-8-11 in the US. No 12 and 13 for us.
            • http://en.wikipedia.org/wiki/List_of_WLAN_channels [wikipedia.org]

              In the USA, 802.11 operation in the channels 12 and 13 are actually allowed under low powered conditions. The 2.4 GHz Part 15 band in the US allows spread-spectrum operation as long as the 50-dB bandwidth of the signal is within the range of 2,400–2,483.5 MHz[10] which wholly encompasses both channels 12 and 13. A Federal Communications Commission (FCC) document clarifies that only channel 14 is forbidden and furthermore low-power transmitters with low-gain antennas may legally operate in channels 12 and 13.[11] However, channels 12 and 13 are not normally used in order to avoid any potential interference in the adjacent restricted frequency band, 2,483.5–2,500 MHz,[12] which is subject to strict emission limits set out in 47 CFR 15.205.[13]

              Channels are 5 MHz apart. Channels 1-4 and 8-11 are 15 MHz apart - less than 20 MHz required by 802.11g or 802.11n in 20-MHz mode. It means that your list 1-4-8-11 will generate interference and make both channels unusable. If you cannot use 12 and 13 - use 1-5-9 and leave 13 for low-power applications and the people who can observe restrictions above.
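
              The spacing argument is easy to verify: 2.4 GHz channel n is centered at 2407 + 5*n MHz, so the gaps in either reuse plan can be computed directly (plain POSIX shell; the two channel lists are just the plans being debated here):

              #!/bin/sh
              # Print the center frequency of each channel in a reuse plan
              # and the gap to the previous channel in that plan.
              for plan in "1 5 9 13" "1 4 8 11"; do
                  echo "plan: $plan"
                  prev=""
                  for ch in $plan; do
                      freq=$((2407 + 5 * ch))
                      if [ -n "$prev" ]; then
                          echo "  ch $ch = $freq MHz (gap $((freq - prev)) MHz)"
                      else
                          echo "  ch $ch = $freq MHz"
                      fi
                      prev=$freq
                  done
              done

              The 1-5-9-13 plan comes out at a uniform 20 MHz spacing; 1-4-8-11 gives gaps of 15, 20 and 15 MHz, which is where the adjacent-channel overlap described above comes from.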

              • by AK Marc ( 707885 )

                Channels are 5 MHz apart. Channels 1-4 and 8-11 are 15 MHz apart - less than 20 MHz required by 802.11g or 802.11n in 20-MHz mode. It means that your list 1-4-8-11 will generate interference and make both channels unusable.

                By restricting your comments to g and n, are you indicating that you think I'm right for b? Note, I didn't include n in my comments, and g falls back to b when required, so for something to be called g, it must do DSSS in the presence of interference. So g will work as well, as I describe, but not necessarily at speeds above 11 Mbps.

                That, and because 12/13 are "illegal" for b, and g should fall back to b for compliance, how do you make something fall back to DSSS on a channel where the settings would be il

        • So there's 11 channels, but transmitting on channel 1 means that you are putting noise on channels 1-5; transmitting on channel 6 means that you are putting noise on 1-10; and transmitting on channel 11 puts noise on 7-11.

          Nitpicking here, but I think you meant to say "transmitting on channel 6 means that you are putting noise on 2-10", otherwise 1 & 6 aren't nonoverlapping.

      • by jandrese ( 485 ) <kensama@vt.edu> on Saturday August 10, 2013 @04:06PM (#44532423) Homepage Journal
        One thing you didn't touch on: A lot of Wifi chips are really really bad. Like they'll crash randomly and repeatedly when connected to certain kinds of access points. Sometimes it is the access point that crashes. For the most part the chips reset themselves and continue on, so it's just a momentary interruption, but when it happens over and over you'll really start to notice.
        • Add some portable phone handsets and it gets even worse. I had a client with an older Panasonic 2.4GHz handset that would immediately take any wireless connection within a few hundred feet of it down to a few Kbps. The problem with public wireless is that anyone, either accidentally or maliciously, can make the entire network useless.

      • by AmiMoJo ( 196126 ) * on Saturday August 10, 2013 @06:16PM (#44532923) Homepage Journal

        One major problem I see is that many APs default to channel 11, and their auto-channel-selection code only seems to pick channels 1, 6 or 11. Those kinda made sense for 802.11b, but now they just waste available spectrum.

        802.11g is 20MHz wide, so should use channels 1, 5, 9 and 13. 802.11n can go up to 40MHz, in which case we pretty much just have channels 3 and 11 left...

        There is also channel 14. It is only supposed to be used in Japan and only for 802.11b (10MHz channel width), but I find you can usually activate it for 802.11g in any part of the world with a few simple software tweaks. Probably a bad idea but because channel 14 is spaced 12MHz above channel 13 instead of the usual 5MHz it is usually uncongested, even if channel 11 is flooded.

  • by Animats ( 122034 ) on Saturday August 10, 2013 @01:21PM (#44531469) Homepage

    The vast majority of attempts didn't even get as far as the log-in screen, which requires signing into a Google account to connect.

    That's Google. "Public" WiFi with data mining.

    • +1 ...if they'd throttle back on the mining and auto-completes, it might be useable.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Maybe we need a "free beer"-type quip to help people distinguish public access from public ownership, and probably to understand the difference between them.

    • That's Google. "Public" WiFi with data mining.

      Hey, Google "mined" all the local users and has no more need for them, so grind down the speeds.

  • by doubledown00 ( 2767069 ) on Saturday August 10, 2013 @01:22PM (#44531473)
    This is just one user's opinion, but slow gradual declines seem to be the hallmark of Google projects. They work well when they're shiny and new, but over time the projects are neglected and deteriorate. Similar things have happened with Google Voice and Google Docs.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      As both a Docs and Voice user, I can't relate to your comment about deterioration. However, they don't get much in the way of new development, which I suppose could be considered neglect.

      Though compared to Maps, where there is significant development and "improvement," they also don't make sure a new version has the same feature set as the old. Navigation in particular has really taken a nosedive on Android.

      • by pspahn ( 1175617 )

        Though compared to Maps, where there is significant development and "improvement,"...

        Funny thing is, I still use Voice every single day, but the only times I use Maps is when it runs as the map platform for some other service. If I want to look something up on a map, I use Bing Maps now. Google maps have become too slow and clunky in the last year. Tiles simply do not update fast enough.

        If that's what you call "development and improvement", I'll stick with "neglected" just about every single time.

    • by eclectro ( 227083 ) on Saturday August 10, 2013 @01:58PM (#44531711)

      I wouldn't necessarily agree with you; look at Google Maps, which has improved in quality. But this happened through a dedicated commitment to the project.

      But on the other hand, I have noticed some Google "quality control" issues myself. A few of the patents in Google Patents have bad and unreadable pages (especially the drawings) or are scanned crooked, as an example.

      I suspect that the wireless network is oversubscribed, with people jamming each other (and the router) trying to get onto a local router. This can be a problem with any radio technology when you have too many users on a channel. And Google probably does not make money on this project and hence allocates resources to other things.

      But this is a larger issue with WiFi networks in general. People think it's an infinite resource and can replace traditional wires everywhere. I assure you, it is not and can not.

      • The trick is that marketing teams that should know better argue otherwise and say wifi networks can replace everything a wire can do.

        At my apartment I can see nearly 2 dozen wifi access points on 2.4 GHz. So I adjusted my router to broadcast b/g on 2.4 GHz on an off channel, and on 5 GHz I set up an N-only system. My older smart phone can't connect to the N router but my laptop and shiny new Nexus 7 can. The best part is there are no other 5 GHz routers nearby.
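
        For anyone wanting to set up a similar split on a Linux box, here is a minimal sketch of the 5 GHz, N-only half using hostapd. Every value below (interface name, SSID, channel, passphrase) is a placeholder, not the poster's actual configuration:

        # /etc/hostapd/hostapd-5ghz.conf -- minimal 5 GHz, 802.11n-only AP
        interface=wlan1
        ssid=example-5ghz
        # hw_mode=a selects the 5 GHz band; 36 is a non-DFS channel in most regions
        hw_mode=a
        channel=36
        # N-only: require_ht rejects legacy clients, and WMM must be enabled
        # for HT rates to be used at all
        ieee80211n=1
        require_ht=1
        wmm_enabled=1
        ht_capab=[HT40+][SHORT-GI-40]
        # WPA2-PSK
        wpa=2
        wpa_key_mgmt=WPA-PSK
        rsn_pairwise=CCMP
        wpa_passphrase=changeme

        Start it with something like "sudo hostapd /etc/hostapd/hostapd-5ghz.conf".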

  • by Anonymous Coward

    Why should it be any different than many other high visibility projects? Ribbon cutting at 11.
    Hurry, it's getting closer to 12 now, everybody get on the wagon, move on to the next big thing.

  • "I cannot download a new OS or Gone with the Wind in HD on free wi-fi."

    Shocking!

    • Re:Abuse of tool? (Score:5, Interesting)

      by spire3661 ( 1038968 ) on Saturday August 10, 2013 @01:46PM (#44531633) Journal
      Did you know that HP printer drivers are now up to 160 MB? It's not just movies that are large files anymore.
      • by alen ( 225700 )

        how often do printer drivers change and people have to download them?

      • Did you know that HP printer drivers are now up to 160 MB? It's not just movies that are large files anymore.

        Sorry? My bright and shiny color HP printer easily accepts

        $ sudo sh -c 'cat file.ps > /dev/ulpt0'

        Where are the 160 MB drivers here?

      • I just downloaded and installed the latest version of the display drivers for my on-board Intel HD-3000 GPU on my laptop. Total size: 338MB. For fuck's sake, what the hell in a graphics driver needs to be so big?
  • consumer products (Score:5, Insightful)

    by fermion ( 181285 ) on Saturday August 10, 2013 @01:49PM (#44531655) Homepage Journal
    Pretty much all of Google's consumer products seem never to be improved or maintained. At first the consumer products were critical because they were the reason people like me allowed Google to set cookies, while 2o7 and the like were blocked. Of course we are now in a world where tracking on the web goes beyond cookies, so maybe Google does not think it needs to provide a service beyond search to entice users.

    In any case the decline of the WiFi is not surprising. It is like Google Docs, now Drive, which started off as a really competitive product, but the office applications have never been updated, so the features continue to lag. OpenOffice makes it look like vintage 1990.

    Seriously. If MS were competent they could destroy Google with Bing and MS Windows Phone. But that is how the game works. Google does not have to outrun the bear; it only has to outrun MS, which isn't that hard.

  • It's free. C'mon. (Score:5, Insightful)

    by skidisk ( 994551 ) on Saturday August 10, 2013 @01:56PM (#44531701)

    I live and work in Mountain View (not for Google). Look, the thing is free. What do you expect? I can log in and use it reasonably well. I certainly wouldn't depend on it for my only connectivity, but it works well enough when I need a quick piece of data or need to send something and don't have cell service or am using a wifi device. Just chill.

    • The only people who can reasonably be complained to are those that you are paying.
      • by gl4ss ( 559668 )

        ...they can complain to Google and to the city buying the service from Google, all right.

        To top it off, you need to log in to Google to use the service.

  • What I have seen (Score:4, Interesting)

    by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Saturday August 10, 2013 @01:59PM (#44531723) Homepage

    While I haven't used the Google service yet, I see similar problems in a lot of public areas like airports, where I happen to find myself a lot. It seems to be more of an issue with the non-direct data traffic like the auth services, ads/gateway tasks, and DNS. More often than not it is one of these 'services', unrelated to the actual traffic, that is acting up.

    One example is the wi-fi networks in the Minneapolis or San Fran airports. You can log on, and then getting an IP, getting through the "I agree" screens, the videos you have to watch, etc., are all dog slow to one degree or another. The Delta lounges in the Minneapolis and San Fran airports are very extreme examples of this problem, especially when they were T-Mobile (damn their black souls). You would 'get on' and then get nothing, or something trivial, really slowly.

    Once on, you would have decent ping times and some speed tests would be OK, but anything that needed 'extra services' was a pain. Changing your DNS to something you have, or a known fast provider, helped a lot, which tells me the NAS/Radius/whatever server they use was overwhelmed. Now that I am thinking about it, I should do a traceroute next time I am on to see what is happening in more detail; I am curious.
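
    A quick way to put numbers on the DNS theory: time a lookup through whatever resolver the hotspot hands out against a known outside resolver (8.8.8.8 below is just an example). This assumes dig, from the usual dnsutils/bind-utils package:

    # Lookup latency via the network's own resolver...
    $ dig example.com | grep "Query time"
    # ...versus an outside resolver. A large gap points at the hotspot's
    # DNS/auth infrastructure rather than the radio link or the uplink.
    $ dig @8.8.8.8 example.com | grep "Query time"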

    My first bet is that the majority of these services go through a single auth/security box that is under-CPUd and forces everything out a single overloaded link. If anyone has the time. I also wouldn't be surprised if DPI had a hand in it too, especially from Google.

    • ... [on the] wi-fi networks in the Minneapolis or San Fran airports. [...] the videos you have to watch [when going through authorization/configuration steps] are all dog slow ...

      Which got me thinking...

      Lately (at work with a company-IT-mandated Chrome browser and thus no flashblock/noscript/...) I've noticed that advertisers on many services I look at (typically due to following news links from Slashdot) are feeding multiple, self-starting, full-motion videos per page.

      Videos require ENORMOUSLY more traffi

  • by terminalhype ( 971547 ) on Saturday August 10, 2013 @02:19PM (#44531821)
    What is a "Bad Connections Dog" and why is it Googling Mountain View's wi-fi Network? Possibly it is looking for a Good Connections Cat?
  • Comment removed based on user account deletion
  • This wasn't a story about a dog getting a bad connection because a mountain that google owns was viewing wireless data. So disappointing. Seriously, who wrote that technically correct but stylistically garbage headline?
  • Not Surprising (Score:5, Informative)

    by sigipickl ( 595932 ) on Saturday August 10, 2013 @02:49PM (#44532031)

    There are dozens of reasons why Wi-Fi doesn't scale to the masses, especially outdoors or in large spaces. Here are a few:
    - Wi-Fi is half-duplex. Only one transmitter can broadcast on a channel at any given time. If the transmitting radio is slow (weak connection, older technology, bad driver, etc...), then all other devices must wait for the transmission to end before they can get their airtime to transmit. (A quick way to see this on a Linux AP is sketched after this list.)
    - A Wi-Fi radio that conforms to the Wi-Fi spec must co-operate when on the same channel as other wireless networks near it. This means that the Google APs should be honoring the management traffic and broadcasts from other Wi-Fi radios near them. In a place like Mountain View, there is a *LOT* of Wi-Fi.
    - 802.11n performance is dependent on multi-pathing. An AP on a pole in the middle of a park doesn't give much in the way of surfaces to reflect a signal off of. You end up at my first point- slow transmission, lower cell capacity.
    - While two clients on an AP each can "hear" the APs transmissions, they may not "hear" each others'. Collisions galore.
    - The ISM bands that Wi-Fi operates in are full of non-Wi-Fi interference. Wireless baby monitors are notorious for killing Wi-Fi, as are cheap wireless video cameras. Cordless phones, motion detectors, microwave ovens, and remote control toys all play a part in the general noise within these RF bands.
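
    On the half-duplex point in the first bullet, a Linux-based AP can show the problem directly: the negotiated rate of every associated station is visible with iw, and one station stuck at a legacy rate eats a disproportionate share of airtime for every byte it moves. A sketch, assuming the iw tool and an AP-side interface named wlan0:

    # For each associated station, show signal level and negotiated tx
    # bitrate; anything sitting at 1-11 Mbit/s is a likely airtime hog.
    $ sudo iw dev wlan0 station dump | grep -E "^Station|signal:|tx bitrate"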

    • As a reference for "good" wifi, the new 49ers stadium is claiming some crazy real-world usability. Something like every single person in the park will be able to fully utilize their 802.11n or 802.11g bandwidth. If this actually turns out to be true, it could be the model for how to do large wifi projects right.
      • by AK Marc ( 707885 )
        The only way to get something like that is to turn every AP down to the lowest transmit setting, and pack them in as compactly as needed (and no more) for full coverage. Simply put, for *everyone* to get full 802.11g bandwidth, one radio per person, about 70,000 radios, no less, is needed to get that stated level of throughput (less any "assumed" oversubscription).
    • - 802.11n performance is dependent on multi-pathing. An AP on a pole in the middle of a park doesn't give much in the way of surfaces to reflect a signal off of. You end up at my first point- slow transmission, lower cell capacity.

      http://en.wikipedia.org/wiki/OFDM_system_comparison_table#OFDM_system_comparison_table [wikipedia.org] states that the symbol length of 802.11a is 3.2 µs (which corresponds to about 1 kilometer) and the guard interval is 1/4 of the symbol length. It means that the path length difference should be bigger than 100 meters, at the very least, to create problems. I believe 802.11n has similar parameters. And you can easily use a directional antenna to suppress such bad paths.
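
      A back-of-the-envelope check of those figures, taking radio propagation as roughly 300 meters per microsecond (the numbers themselves are the parent's, not measurements):

      # 3.2 us of flight time is ~960 m (the "about 1 kilometer" above), and
      # the 0.8 us guard interval absorbs ~240 m of path-length difference
      # before symbols start to smear into each other.
      $ awk 'BEGIN { printf "symbol: %.0f m, guard: %.0f m\n", 3.2*300, 0.8*300 }'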

      - While two clients on an AP each can "hear" the APs transmissions, they may not "hear" each others'. Collisions galore.

      Then, you all forget about TDMA profile of WiFi. It's rarely implemented

      • > If you have a single network in your area (as Google has)

        Google does not have a single network in that area. They share with every other 2.4 GHz device in their range: portable phones, baby monitors, users' APs, etc. Also, every device that connected to Google's network would have to support TDMA wi-fi.

    • by tibit ( 1762298 )

      I think that simple ISM band devices that are point-to-point should simply use broadband transmission with a gold code, like GPS does. Heck, they should use a keystream code, so that your neighbor won't be able to listen to your baby monitor. This would allow for very graceful degradation, since such transmissions look to everyone like an increase in the noise floor, not like narrowband interference. If there are two monitors in the household, they'll only have shorter range, there'll be no other signs of i

  • And what's so bad about it googling the Mountain View Wi-Fi Network?

  • They are using 5 GHz Alvarion BreezeAccess VLs at 54 Mbps for their short backhauls. (The diamond-looking things on the telephone poles.) Alvarion is dead and the BreezeAccess is dead. My experience using that hardware is that it can be a little flaky and it takes someone in the know to get them running adequately. I always got a lot of dropouts and could never stream YouTube reliably through an Alvarion pipe. There are a couple hundred of them in Mountain View. Also, the original agreement was back whe
  • by Anonymous Coward

    Disclaimer: I work as a sales engineer for a distributor of several wireless vendors and related products, which include Wi-Fi portal hardware/software. There are many ways to skin a cat in this game and I've seen it done right and wrong. I have the benefit of not being required to drink the vendor kool aid by virtue of one degree of separation and the need to sell what actually works.

    There's a lot of FUD around Wi-Fi from the uneducated (i.e. those that never underwent the education to actually understand

  • Bad Connections, dog, Google's mountain, view Wi-Fi network.

    Bad connections, Dog, Google's Mountain View Wi-Fi network.

    Bad Connections Dog Googles, "Mountain View Wi-Fi Network."

    Bad, Connections Dog, Google's Mountain View Wi-Fi Network!

  • by Brett Buck ( 811747 ) on Saturday August 10, 2013 @09:46PM (#44533509)

    Just down the road from Mountain View, the old (now defunct) Sunnyvale municipal Wi-Fi worked pretty well when it was running. I was a long way away from the nearest transceiver but I never had any significant problems with connecting or with throughput.

        It went defunct because the funding went away, I think, not because it didn't work.

            Brett

  • Do you know of any that work especially well?

    Yep. It's a small post-industrial midwestern town, but its electric meter system is WiFi based: http://www.cityofanderson.com/wifi.aspx [cityofanderson.com]

    Don't expect any technical details in the link, but essentially every household utility meter is WiFi enabled and networked to send the data to a central server downtown.

    The network is slow, obviously, but it works for email and YouTube better than dial-up. It's faster with a better signal, but still only ADSL speeds.

  • Who is "Bad Connections Dog" and why is he interested in the Mountain View Wi-Fi Network?

  • I recall an article here saying that wifi antennas degrade within a few years.

    I've had to replace wifi antennas after three or four years because they no longer had a strong enough signal across the house.

  • http://ask.slashdot.org/story/12/10/21/1335208/ask-slashdot-why-does-wireless-gear-degrade-over-time [slashdot.org]

    Don't think there was a consensus, but some ideas

    (You can thank firefox's nice address bar for me being able to find this in less than a minute. Maybe google would have done it too, though)

  • http://en.wikipedia.org/wiki/Garden_path_sentence [wikipedia.org]

    Who or what is "Bad Connections Dog"? Is that like the Bad Idea Bears?

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...