Networking

Optical Tech Can Boost Wi-Fi Systems' Capacity With LEDs

chasm22 writes: Researchers at Oregon State University have invented a new technology that can increase the bandwidth of WiFi systems by 10 times, using LED lights to transmit information. The system can potentially send data at up to 100 megabits per second. Although some current WiFi systems have similar bandwidth, that bandwidth has to be divided among all connected devices, so each user might receive just 5 to 10 megabits per second, whereas the hybrid system could deliver 50-100 megabits per second to each user.
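
A rough back-of-envelope sketch (in Python) of the sharing math above, using only the round numbers from the summary; the device count is an illustrative assumption, not a figure from the article:

# Back-of-envelope sketch: shared WiFi channel vs. per-user optical cells.
# Uses the round numbers from the summary; the device count is an assumption.

shared_capacity_mbps = 100      # one WiFi channel shared by everyone in range
devices = 15                    # assumed number of active clients

per_user_shared = shared_capacity_mbps / devices
print(f"Shared WiFi: ~{per_user_shared:.1f} Mbps per device")    # ~6.7 Mbps each

optical_cell_mbps = 100         # each 1 m^2 light cone is its own medium
print(f"Optical cell: ~{optical_cell_mbps} Mbps per device")     # no sharing within a cone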

  • by Archangel Michael ( 180766 ) on Tuesday April 21, 2015 @12:12PM (#49521021) Journal

    Drawbacks are huge.

    • Yeah, that's why television remote controls are always so fussy.

      • Yeah, that's why television remote controls are always so fussy.

        It sure is a funny coincidence that television viewers generally have line of sight to the set. OTOH, you also have these computer users that store every file on their "desktop", because it doesn't exist if you can't see it, so that demographic might also benefit from the existing line of sight to teh internets.

        • It sure is a funny coincidence that television viewers generally have line of sight to the set.

          Huh. Growing up, were you one of those kids that never actually played with the remote?

          • It sure is a funny coincidence that television viewers generally have line of sight to the set.

            Huh. Growing up, were you one of those kids that never actually played with the remote?

            Well of course I'm familiar with the back-wall bounce. It probably won't work across too many reflections at weird angles. In fact, it seems that newer remotes have narrower working angles to avoid conflicts with the bazillion other receivers. The point is that light works for remotes for obvious reasons; it's really a bug that turns out to be a feature. It's also well-known physics that higher frequencies are more easily blocked by obstacles.

            • Okie doke, just wondering. One time I had two laptops start talking to each other via IrDA because the signal was bouncing off my white shirt.

    • by mlts ( 1038732 )

      The drawbacks are significant. In fact, this was done before, in the early 1990s, with Macintosh LocalTalk-based NICs: one would aim every NIC in a room (assuming the usual cubicle-based office) at a spot on the ceiling, adjust the aim until all the devices sported a green LED, and then they could all communicate with each other. It wasn't fast (LocalTalk did most of its stuff via broadcasts), but it was a way to network a bunch of machines in a dynamic environment without hardwiring and before the days o

      • Until 2012, MacBooks had this built in... perhaps this might be something useful to put in a spec as a NIC option?

        The late PowerPC and Intel through 2012 consumer Macs did have an infrared receiver, but it is only a receiver for the sometimes optional, sometimes bundled remote and cannot be used for two-way communications. The last Macs to have IrDA were the G3 PowerBook and Bondi Blue iMac. The multicolor iMacs, iBook, and PowerBook G4 all dropped it.

  • by iamwhoiamtoday ( 1177507 ) on Tuesday April 21, 2015 @12:13PM (#49521029)

    Wouldn't this technology be limited to line of sight, in addition to being annoying anywhere outside of a rave?

    If you really need solid bandwidth.... RUN AN ETHERNET CABLE. Please. I keep running into people who insist on running everything over wireless..... no. just no.

    • being annoying anywhere outside of a rave?

      I know it's not fashionable to RTFA at Slashdot, but really....

      The prototype uses LEDs that are beyond the visual spectrum for humans

      • Doesn't negate "line of sight". Infrared is typically line of sight, but also can bounce off walls. And if anyone walks between the remote and the TV, it just doesn't work.

        • If you consider the fact that WiFi in many cases has a better chance of making it to its target by bouncing around, I'd say there's not much difference.

          Considering this from the following wiki: "Thus any obstruction between the transmitting antenna (transmitter) and the receiving antenna (receiver) will block the signal, just like the light that the eye may sense"
          http://en.wikipedia.org/wiki/L... [wikipedia.org]

        • by ceoyoyo ( 59147 )

          The point is that it's line of sight. You have to be directly under the emitter. So, for example, you could stick one over every chair in the airport. Everybody gets their own bandwidth, no interference.

          The trick in the article seems to be a system where you can switch between the optical units and regular wifi if you lose contact.

    • They don't use visible light so it wouldn't be so bad. But there have been a million articles like this in the past. It is a very niche technology.
    • Re: (Score:2, Funny)

      by Anonymous Coward

      I keep running into people who insist on running everything over wireless.

      Then slow down and watch where you are going.

    • Wireless is convenient, even if not entirely reliable. When I rented, the landlords often didn't like the idea of running cable, even if it was professionally installed.

    • by haruchai ( 17472 )

      802.11ac is pretty darn awesome if you design it well.
      I was at Cisco Live in San Francisco last year, where the only connectivity anywhere was through wireless. During a breakout session attended by over 300 people, I updated 2 Android devices to Ice Cream in under 20 min while using my Windows laptop to VPN back to my company's network to troubleshoot an outage.

  • Can they do this without it being visible light?

    I'm pretty sure you could really mess up some epileptics this way.

    Not to mention I can see this giving some people migraines ... I know many many people who can see the flickering of fluorescent lights.

    Cool, awesome, yay progress. But I don't want to be in a place where I am aware of the flashing lights.

    • by Z3n1th ( 1772866 )
      Read the article, it doesn't use visible light.
    • by Jamu ( 852752 )
      I wonder if ultra-violet light would work. This should then provide even more bandwidth than visible.
    • Sheesh, it's only about 10 sentences in.

      The prototype, called WiFO, uses LEDs that are beyond the visual spectrum for humans and creates an invisible cone of light about one meter square in which the data can be received.

    • by hawguy ( 1600213 )

      Can they do this without it being visible light?

      I'm pretty sure you could really mess up some epileptics this way.

      Not to mention I can see this giving some people migraines ... I know many many people who can see the flickering of fluorescent lights.

      Cool, awesome, yay progress. But I don't want to be in a place where I am aware of the flashing lights.

      Even if they used visible light, no one is going to see the flicker of a light modulated at multiple megahertz.

    • by idji ( 984038 )
      The frequencies are millions of times higher than anything a human would notice.
    • by ceoyoyo ( 59147 )

      Even if it were visible (it's not) there's no way you can see or be affected by an LED modulated at 100 MHz. It would look around half as bright as a regular light of the same size though (and use half the power).
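
      A quick sanity check of both claims, assuming simple on-off keying with roughly half the bits being ones (the duty cycle and the flicker threshold are assumptions, not figures from the article):

      # Sanity check, assuming on-off keying with ~50% ones on average.
      flicker_fusion_hz = 100        # generous upper bound on human flicker perception
      modulation_hz = 100e6          # 100 MHz modulation
      print(modulation_hz / flicker_fusion_hz)   # ~1,000,000x faster than anything visible

      duty_cycle = 0.5               # LED is on for roughly half the bits
      print(duty_cycle * 1.0)        # average brightness ~0.5 of a continuously-lit LED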

      • It would look around half as bright as a regular light of the same size though

        I know some people like that.

  • Oh great (Score:5, Informative)

    by jeffmeden ( 135043 ) on Tuesday April 21, 2015 @12:15PM (#49521039) Homepage Journal

    IrDA is back. Hey I have an idea, why not just have an access point that, for each user, drops a little cord out of the ceiling (where all access points are, right) and you plug it in for GIGABIT SPEEEDZZZS!!!1.

    No but seriously, why are we doing this when channels in the 5 GHz spectrum are easy to come by?

    • The lure of non-shared bandwidth is huge: if you have more than a handful of clients on a single radio, they can still easily saturate the device. Clearly this won't be the answer, but it's nice someone is trying to think outside the box.

    • by hawguy ( 1600213 )

      IrDA is back. Hey I have an idea, why not just have an access point that, for each user, drops a little cord out of the ceiling (where all access points are, right) and you plug it in for GIGABIT SPEEEDZZZS!!!1.

      No but seriously, why are we doing this when channels in the 5 GHz spectrum are easy to come by?

      This is as close to IrDA as RS-232 is to ethernet.

      This technology purportedly creates small one-meter hot-zones of light, so instead of an AP having one (or a few) 5 GHz channels shared by everyone in range, it can have dozens of separate hot-zones so each user gets their own, and an AP in one room won't interfere with one in the next room.

      I could see this being very useful in offices -- instead of spending tens of thousands of dollars pulling wire to each desk back to a central wiring closet, a few APs can be hung on the ceiling with receivers on top of everyone's monitor, with much better total throughput and less interference than RF.
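
      To put rough numbers on that (all of the figures below are illustrative assumptions, not measurements or numbers from the article):

      # Illustrative aggregate-capacity comparison for one open-plan room; numbers are assumptions.
      desks = 30

      # One shared 5 GHz channel serving the whole room:
      channel_throughput_mbps = 400                   # optimistic real-world 802.11ac figure
      per_desk_rf = channel_throughput_mbps / desks   # ~13 Mbps each when everyone is active

      # One 1 m^2 optical cell per desk, each its own collision domain:
      per_desk_optical = 100                          # Mbps, the prototype's claimed rate
      aggregate_optical = per_desk_optical * desks    # 3000 Mbps across the room

      print(per_desk_rf, per_desk_optical, aggregate_optical)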

      • by itzly ( 3699663 )

        with receivers on top of everyone's monitor with much better total throughput and less interference than RF.

        What about laptops, tablets and phones?

        • by hawguy ( 1600213 )

          with receivers on top of everyone's monitor with much better total throughput and less interference than RF.

          What about laptops, tablets and phones?

          The same thing they do now -- use RF Wifi.

          Everyone in my office plugs their laptop into their large monitor at their desk (which is why I said to put the receiver on top of the monitor, just as they now get their wired connection through the monitor), though a laptop may still be able to get good optical signal with a receiver built into the top of the display. Tablets and phones tend to have lower bandwidth needs than laptops and desktops (few people are editing uncompressed TIFF files on a tablet), so the

          • by itzly ( 3699663 )

            But if you already have the power cord plugged in, it should be easy to provide a wired network connection right next to it.

            • by hawguy ( 1600213 )

              But if you already have the power cord plugged in, it should be easy to provide a wired network connection right next to it.

              Getting the wired network to the laptop is not a problem -- most laptops in the office get to the wired network through the same cable they use to plug in to the monitor. But that wired network doesn't come for free: my company paid $50,000 to wire up cat-6 for an office that we only plan on being in for 2 years - and it already constrains where we can place desks. This doesn't include the $40 - $50K spent on access switches in the server room.

              We have Wifi, which works well for phones, tablets, and laptops

            • But if you already have the power cord plugged in, it should be easy to provide a wired network connection right next to it.

              But it's not wireless. Wireless is awesome, and I bet if you asked 10 normal people, they'd tell you it's faster than wired. Because, like, it's Wireless!!!

  • As more and more people demand wireless access, yet bandwidth is limited, I figured we were going to need to go to something like this eventually. Fiber as far as absolutely possible, then IR to the device. It won't be as "handy" as a whole-house RF setup, but there are only so many RF channels available, and services like GPS are kind of reluctant to give up a vital resource so we can surf porn.
  • by jandrese ( 485 ) <kensama@vt.edu> on Tuesday April 21, 2015 @12:18PM (#49521065) Homepage Journal
    History is littered with optical networking startups. Ultimately the tech works, but it has caveats: you can't move your machine around without losing connectivity, and you also lose connectivity whenever someone walks in front of the beam. Also, they tend to be expensive, and since the machine ends up having to be basically immobile anyway it usually makes sense to just run cables instead.

    Even for Point to Point links where you can't easily run cables (to a building across the street for example), you end up with a reasonably fast link that still cuts out when there is heavy rain or a bird lands in front of it or something. 100Mbps is really nothing to write home about either. In 2015 you should be pushing more like 1Gbps over an optical link to make it even somewhat attractive compared to plain old WiFi.
    • It's LiFi. Don't believe it.

    • "cuts out when...a bird lands in front of it or something"

      With a sufficiently-powerful light, that situation would only last for a fraction of a second.
      • "cuts out when...a bird lands in front of it or something" With a sufficiently-powerful light, that situation would only last for a fraction of a second.

        And then you get Internet and fried chicken!

      • by jandrese ( 485 )
        I don't know who exactly, but someone will probably complain about the invisible death rays crossing the street. People tend to notice when you start cutting cars in half when they drive in between your buildings.
    • Networking in the '80s and '90s is not comparable with networking today. The hardware has improved significantly, but more so the software that drives it and surrounds it. Your WiFi network has interruptions, but they aren't apparent to you because the layers of software provide padding to deal with the loss of connectivity (anywhere from a few milliseconds up to less than 10 seconds).

      I can already see great applications for this technology as long as it's combined with WiFi.

    • by Lisias ( 447563 )

      If you are talking about light-only networks, you are right.

      But using a hybrid system, light and WiFi, you can use WiFi automatically when you lose the beam, and switch back to another beam as soon as it's in range.

      I would use it here at home: 4 beam spots around the house where people tend to be, and WiFi coverage when in transit between rooms.
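
      Something like the following selection logic is presumably all the hybrid scheme needs for that (a toy sketch; the function name and inputs are made up for illustration, not the WiFO design):

      # Toy sketch of hybrid link selection; names and logic are invented for illustration.
      def pick_link(under_beam: bool, wifi_associated: bool) -> str:
          """Prefer the dedicated optical cell, fall back to shared WiFi when roaming."""
          if under_beam:
              return "optical"       # sitting in a 1 m^2 beam spot
          if wifi_associated:
              return "wifi"          # in transit between rooms
          return "disconnected"

      # Walking from the couch (beam spot) to the kitchen (another beam spot):
      for under_beam in (True, True, False, False, True):
          print(pick_link(under_beam, wifi_associated=True))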

      • by adolf ( 21054 )

        You don't want this at home.

        This technology helps solve density issues. Your home (unless you're into regular LAN parties) isn't a place that has those issues.

        If you've got cat5 to 4 spots in your house where people tend to be, just put a dual-band 802.11 access point in each of those spots, dial down the power output, and done.

        You can do this today.

  • by TechyImmigrant ( 175943 ) on Tuesday April 21, 2015 @12:20PM (#49521079) Homepage Journal

    All schemes that involve knowing which direction to point the EM waves ahead of time are structurally incapable of being a WiFi physical layer.

    • All schemes that involve knowing which direction to point the EM waves ahead of time are structurally incapable of being a WiFi physical layer.

      Ruckus and others have had good luck with beam-forming technologies, so having some directionality on the physical layer doesn't necessarily render it incompatible with Wi-Fi (or its various add-ons, AirMax and their ilk). What would render it incompatible with Wi-Fi, in my opinion, is that it isn't Wi-Fi: your existing Wi-Fi equipment won't work with it (you need a receiver device). It's another physical protocol; they could layer Ethernet or anything else over it since they are going to have to implement driver

      • So it might work in some constructed scenario, but not in the general case, which is what a WiFi physical layer needs to address.

        There are plenty of directional antennas that are manually configured by the installer, but that's different to trying to track a STA with a beamformed ray from an AP. That's crazy talk.

    • I could see this working at a desk, where the user plops down their laptop and, instead of jumping on their company's WiFi, they're linked to the company's optical net by an LED on their desk which transmits at gigabits/second.

  • You know what else would increase the bandwidth of Wi-Fi? We could run wires from the router to each individual device.

    The beauty of Wi-Fi is that it just works as long as you're in range of the signal without having to mess with wires or aiming.

  • At light frequencies we get higher bandwidth than at typical RF frequencies!

    Quick! Let's invent a transparent fiber and build cables with them inside!

    • by Himmy32 ( 650060 )
      I think you took away the wrong lesson. The idea is to keep increasing the frequency and transmit power. I look forward to a future with gamma ray emitting wifi AP death rays.
      • I currently have some problems with wifi through thick concrete walls (bottom of a 15 story building). Your gamma wifi would probably solve that quite nicely.

  • by Aqualung812 ( 959532 ) on Tuesday April 21, 2015 @12:38PM (#49521263)

    First off, this has nothing to do with Wifi in your home or office where there is little line of sight and lots of RF-soaking walls to help isolate your access points.

    When you're dealing with a large area with dense users (airport, lecture hall, arena, etc), wireless becomes really hard. The shared medium and limited number of non-overlapping channels becomes a real issue.

    You can get directional antennas to try to isolate the overlapping channels, but then there is reflection to deal with. It is a constant battle between too little power to work and so much power that you interfere with another access point.

    Are you really going to run Cat6 all over the lecture hall or airport? To everyone's handheld device? No.

    LED lights are far more directional, so even though you still have a shared medium, you're not dealing with the same issues as at gigahertz RF.

    This is a niche, but a very important one.

    • by itzly ( 3699663 )

      When you're dealing with a large area with dense users (airport, lecture hall, arena, etc), wireless becomes really hard

      That's because they insist on using the small (3-4 channels) and crowded 2.4GHz band.

      • That's because they insist on using the small (3-4 channels) and crowded 2.4GHz band.

        First off, there are still a LOT of devices without 5 GHz support. I know many companies that are still ordering 2.4 GHz-only laptops in 2015. Seriously. 2.4 GHz is going to die as slow a death as IPv4.

        Second, 5 GHz gives you 9 channels instead of 3, true. In a room that can hold 500 people, though, that is still 55 people per channel. That is slow.
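
        The per-channel arithmetic, using the channel counts people quote in this thread (3 at 2.4 GHz, 9 non-DFS at 5 GHz in the US, 23 if you count DFS -- see the replies below):

        # People per channel for a 500-person room, under the channel counts debated here.
        attendees = 500
        channel_counts = {"2.4 GHz": 3, "5 GHz (US, non-DFS)": 9, "5 GHz (incl. DFS)": 23}
        for band, channels in channel_counts.items():
            print(f"{band}: ~{attendees / channels:.0f} people per channel")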

        • by itzly ( 3699663 )

          First off, there are still a LOT of devices without 5ghz support.

          There are even fewer with optical network support.

          Second, 5ghz gives you 9 channels instead of 3, true.

          No, there are 23 channels.

          • by afidel ( 530433 )

            And each of those 23 channels can use space-division multiple access (aka beam forming, aka multi-user MIMO), so if you lay things out right you can get as few as 3-4 users per channel per conversation domain, which ends up providing plenty of bandwidth.

          • Beat me all you want Cardassian. There are only 3 channels!

          • No, there are 23 channels.

            In the USA, excluding DFS, it's 9.

            DFS channels are all secondary to other uses, so you can't plan on them.

    • And since we're all dropping LED lights into our homes already, one could imagine a future scenario where the power socket of your bulbs can transmit data from your router to a local sub-room data broadcast (i.e. directly above your location in the home). Uploads could still be WiFi-based.

      • Ohhhh, multicast video over LED? Streaming the live CCTV stream at whatever event you're at to any handheld without impacting your Wifi spectrum!

        Everyone that leaves the main area to get a beer or dispose of one can still stay connected to the event.

        • by itzly ( 3699663 )

          Ohhhh, multicast video over LED?

          That's called a TV.

        • If they all have their own devices and are living in their own little stream-world, what are they all in the main area for? Isn't that what great honkin' TVs are for?
  • From the Article:

    creates an invisible cone of light about one meter square in which the data can be received

    If you are effectively tethered to a 1 sq meter zone, you might as well be literally tethered to a 1 meter Ethernet cable.

  • I am almost certain I saw this kind of thing in a Radio Shack catalog in the '80s, lol. I mean, it was basically an LED and a photodetector, and they sold it as a "wireless communicator" or something, but it was for hobbyists just mucking around, and it was only good for (what I assume was) low-quality voice communications.
    • I am almost certain I saw this kind of thing in a Radio Shack catalog in the 80's ...

      It's as old as infrared LEDs and networking.

      Datapoint did it with Arcnet in the late '70s: both infrared office networking patches (though I don't know if those were productized or just experimental) and the "Arclight" building-to-building cross-town infrared link (which had a pair of lenses, each about the diameter of a coffee can).

      Arcnet was still a going technology when the first portable ("luggable") computer - the Osb

  • by John.Banister ( 1291556 ) * on Tuesday April 21, 2015 @01:16PM (#49521603) Homepage
    I've often thought connectorless optical data would be a nice docking technology for a water resistant smartphone.
    • by adolf ( 21054 )

      20-ish years ago, I saw a Jensen car stereo with that feature: The detachable face had electrical contacts for only power and ground; the rest of the signalling was optical.

      It made sense to me at the time, and I fully expected it to catch on....fast forward 20 years, and I'm still occasionally cleaning the contacts on my JVC car stereo to allow it to work at all.

      And I've mangled enough Micro-USB connectors (just one, but that's enough) that I really don't like dealing with them on a regular basis.

    • by ledow ( 319597 )

      The entire first generation of handheld devices had IrDA, which is basically this.

      Palm pilots etc. used it all the time.

      It died for a reason - Bluetooth took over. Because there's nothing optical data can do that radio data can't, plus radio never requires line-of-sight (it may benefit from it, but that's another matter).

      What makes you think that optical connectors in docking stations are in any way superior to Bluetooth (which has stupendous data rates, more than enough distance, is incredibly low-powere

  • Although some current WiFi systems have similar bandwidth, it has to be divided by the number of devices, so each user might be receiving just 5 to 10 megabits per second...

    Current 80MHz 4x4 WiFi can reach speeds over 1Gbps... Even a 1x1 station can see about 350Mbps of throughput in a clean channel. This comparison is nonsense.

    Next-generation WiFi with MU-MIMO support also won't split the bandwidth as described (I think this is a fair comparison since that is also a technology that is not yet widely adopted).
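
    For reference, the PHY-rate math behind those numbers (standard 802.11ac VHT parameters; actual throughput after MAC overhead is lower, which is roughly where the ~350 Mbps 1x1 figure comes from):

    # 802.11ac peak PHY rate: 80 MHz channel, MCS 9 (256-QAM, rate-5/6 coding), short GI.
    data_subcarriers = 234      # data subcarriers in an 80 MHz VHT channel
    bits_per_symbol = 8         # 256-QAM
    coding_rate = 5 / 6
    symbol_time_us = 3.6        # 3.2 us OFDM symbol + 0.4 us short guard interval

    per_stream_mbps = data_subcarriers * bits_per_symbol * coding_rate / symbol_time_us
    print(per_stream_mbps)      # ~433 Mbps per spatial stream
    print(per_stream_mbps * 4)  # ~1733 Mbps for a 4x4 link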

    • Now split that 1Gbps over 100 devices in an office building. Not so fast anymore, is it?
      Not that I think this light-based stuff is the answer. I think the answer was tested with the spatial streams in wireless AC. We just need more than 8 streams and a higher carrier frequency so we can tune those streams better.
      We may end up with technology close to phased array radar.

      • by ledow ( 319597 )

        Depends what you're using it for.

        1Gbps backbone is pretty standard for offices and schools etc. Not many of them have moved to 10Gbps technology at all. Sure you might want more, but wireless is cheap and ubiquitous and this technology is pie-in-the-sky and has been for decades.
