Power Technology

USB SuperSpeed Power Spec To Leap From 10W To 100W (242 comments)

Lucas123 writes "While news stories have focused on the upcoming jump from 5Gbps to 10Gbps for USB SuperSpeed, less talked about has been the fact that it will also increase charging capabilities from 10W to 100W, meaning you'll be able to charge your laptop, monitor, even a television using a USB cord. Along with USB, the Thunderbolt peripheral interconnect will also be doubling its throughput thanks to a new controller chip, in its case from 10Gbps to 20Gbps. As with USB SuperSpeed, Thunderbolt's bandwidth increase is considered an evolutionary step, but the power transfer increase is being considered revolutionary, according to Jeff Ravencraft, president of the USB Implementers Forum (USB-IF). 'This is going to change the way computers, peripheral devices and even HDTVs will not only consume but deliver power,' Ravencraft said. 'You can have an HDTV with a USB hub built into it where not only can you exchange data and audio/video, but you can charge all your devices from it.'"
  • by account_deleted ( 4530225 ) on Monday April 22, 2013 @02:19PM (#43517459)
    Comment removed based on user account deletion
    • Re:we've had a few (Score:5, Insightful)

      by Anonymous Coward on Monday April 22, 2013 @02:24PM (#43517517)

      because fibre optics can't carry power?

      • What about making a cord that has two "wires". One would be fiber optic to enable the fastest possible data transfers with the lowest interference. The other would be a traditional copper wire that powers the device.
        • Re:we've had a few (Score:5, Interesting)

          by rjr162 ( 69736 ) on Monday April 22, 2013 @03:12PM (#43517975)

          because fibre is much easier to break/snap than copper. It's the same reason the company my friend works for, which installs media distribution systems into Lufthansa aircraft, doesn't spec them out with fiber lines: they use CAT 7 with TERA-style ends, because an over-zealous mechanic is more likely to snap a fibre optic line with his zip tie than a copper line.

          • Re:we've had a few (Score:5, Interesting)

            by Mashiki ( 184564 ) <mashiki&gmail,com> on Monday April 22, 2013 @04:40PM (#43518843) Homepage

            Odd, I thought using zip ties was illegal on aircraft, due to the fact that they can cause vibration damage to cabling and make it wear through exceptionally quickly. While it's been a while since I was last at a fab plant, they were using low-abrasion cloth such as silk to tie cabling together.

            • Re:we've had a few (Score:4, Informative)

              by mirix ( 1649853 ) on Monday April 22, 2013 @06:48PM (#43520229)

              Old military electronics always had wires laced (maybe they still do this, haven't been into any new equipment).

              It's laced with a heavy waxed cloth, similar to extra wide tooth floss. Originally cotton, probably something synthetic now. There would be loops every inch or two down the wire bundle, connected to each other. I'm having a hard time explaining that for some reason.

              Do you mean something like this?
              Here's a picture [wikimedia.org]

        • by Dahamma ( 304068 )

          Sounds like that would make devices and cables larger and more expensive by requiring two completely different interfaces in one connector...

        • That's already been done in some Thunderbolt cables I think, though most of them right now are all copper... but the original idea was a fiber cable plus copper for power.

          • Re:Already done (Score:4, Informative)

            by SuricouRaven ( 1897204 ) on Monday April 22, 2013 @04:32PM (#43518775)

            Thunderbolt cables have part of the interface electronics physically in the connector body - that's why they cost so much. It also means you can swap a thunderbolt copper cable for a thunderbolt fiber cable without having to worry about the equipment at the ends having an exotic fiber interface.

            I don't know if you can even get a thunderbolt fiber cable yet. They don't go any faster than copper, but they do go longer, which could be handy in a few niche applications. I'm thinking supercomputer and cluster interconnects. Could be cheaper than infiniband, and lower latency than ten-gig ethernet.

    • by alen ( 225700 )

      how do you charge your device via fiber?

      i throw my apple lightning port cable into my bag and keep my iphone plugged in all day long to charge it

      • how do you charge your device via fiber?

        [old homeless drunk from Terminator]: Hey, buddy, did you just see a real bright light?

    • Firewire goes to 30GB/s and 45 watts (30v @ 1.5 amps) and you can daisy chain it. Seems like a better idea than inventing a non-backward compatible serial port and pretending it is somehow related to USBs of yore.

      • Re:or firewire? (Score:5, Informative)

        by Baloroth ( 2370816 ) on Monday April 22, 2013 @02:49PM (#43517765)

        Firewire goes to 30GB/s and 45 watts (30v @ 1.5 amps) and you can daisy chain it. Seems like a better idea than inventing a non-backward compatible serial port and pretending it is somehow related to USBs of yore.

        Do you have a source on the non-backwards compatibility thing? Because the USB spec release [usb.org][PDF warning] for the new USB SuperSpeed states it will be.

        I should add that the newest FireWire specs only go up to 800mb/s, so also a source on that would be nice.

        • Didn't firewire have a more-or-less-vapor 1600mb/s flavor that worked over fiber runs and existed pretty much nowhere at all?

          • Even if it did, and even if we assume you meant megabytes (MB) not millibits (mb) per second, 1.6GB/s is hardly anywhere near the 30GB/s that goombah99 claimed. I call BS.

        • My USB connector on my Samsung Droid Charge is on the wiggly-loose fritz. If I plug in a 100 watt cord and wiggle it to get the connection to work, it's not gonna burst into flames is it?

          • The SuperSpeed spec requires that devices specifically request the increase in power, in order to remain backwards compatible with older USB specs. In other words, you'd have to wiggle that cable pretty damn particularly in order for it to happen.

            • The better question to ask is how manufacturers are ever going to be able to offer 100W from a USB port. Do we really want our computer PSUs to have to handle another 20A on the 5V rail just to be able to offer a single 100W-capable port?

              What about your TV or anything else that has a USB port? One of the reasons USB is so popular is that you didn't need to re-engineer your entire power supply and all its rails.

              Ever tried putting 5V @ 20A through a PCB trace?

              • by ozmanjusri ( 601766 ) <aussie_bob@hoMOSCOWtmail.com minus city> on Tuesday April 23, 2013 @01:40AM (#43522421) Journal

                Ever tried putting 5V @ 20A through a PCB trace?

                Yes, but I still pretend I didn't.

                Never speak of it again.

              • by AmiMoJo ( 196126 ) *

                TFA mentions that it will switch up to 20V, which is similar to what most laptop chargers use to reduce the amps required down to 5. Still fairly hefty but workable.

                The device will have to request 100W of power, and the host is then free to refuse if it can't handle it. I expect hosts will come with a sticker that says something like "50W USB power" so you know that your 35W laptop will be fine with it, but your 100W... er... pipe soldering iron won't.

                As an aside, it's a shame USB didn't start out using 3.3V.
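
                A minimal sketch of that request/grant idea, in Python. The 50W budget, the 2.5W fallback and the function name are illustrative assumptions, not the actual USB Power Delivery protocol:

                # Illustrative sketch: device requests a wattage, host may refuse.
                HOST_BUDGET_WATTS = 50.0   # e.g. a hub stickered "50W USB power"
                FALLBACK_WATTS = 2.5       # un-negotiated default: 5 V at 500 mA

                def negotiate(requested_watts, budget=HOST_BUDGET_WATTS):
                    """Return the wattage the host actually grants."""
                    if requested_watts <= budget:
                        return requested_watts   # 35 W laptop: granted in full
                    return FALLBACK_WATTS        # 100 W load: refused, falls back

                print(negotiate(35.0))    # 35.0
                print(negotiate(100.0))   # 2.5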

      • Re:or firewire? (Score:5, Interesting)

        by fuzzyfuzzyfungus ( 1223518 ) on Monday April 22, 2013 @03:14PM (#43517991) Journal

        While I'm not impressed by USB's mutations over the years, Firewire had the major drawback that (at least in practice; not sure if the spec on paper demanded otherwise) there was a very, very heavy emphasis on 'up to' when it came to how much power could be delivered.

        A small minority of actually-well-built workstations and the like wouldn't shrug at providing full specced power. More or less ordinary PCs usually had a floppy or molex connector to supplement PCI bus power, but didn't spring for a DC-DC converter, so (since 30v isn't readily available anywhere on the DC side of an ATX PSU) you generally got 12v, albeit at a decent amperage. Laptops? In practice, "firewire" pretty much meant 'whatever Apple did on the last couple of models of ibook and powerbook', because all the PCs omitted the power pins entirely for "i.link" or similar; what Apple did usually boiled down to ~19v on the adapter, 12-ish on battery.

        The nominal maximum was certainly fairly spacious; but a powered firewire peripheral was essentially always on the hook for a DC-DC converter, and had to deal gracefully with (or simply refuse to work with, ideally in a documented way) substantially inferior power supplies from many devices.

        5v 500ma was always pitiful; but (by virtue of being so pathetic) most devices actually did as well or better than they claimed to, and lots of peripherals could get away with only the cheapest of designs for handling bus power.

        That's my bet for why "100watt USB" will suck. Sure, it'll be cute and all that POS hardware vendors can now have USB printers and things that are 'standards compliant' and will actually work if purchased 100% from approved vendors and plugged in just right; but everyone else will have wildly unpredictable actual power levels.

      • Comment removed based on user account deletion
    • fairly robust fibre optic solutions to date that carry data and are far more energy efficient. im confused as to why our peripherals dont use them

      No power?

    • Re:we've had a few (Score:5, Insightful)

      by fuzzyfuzzyfungus ( 1223518 ) on Monday April 22, 2013 @02:41PM (#43517673) Journal

      fairly robust fibre optic solutions to date that carry data and are far more energy efficient. im confused as to why our peripherals dont use them

      Given what users can do to strain-relief-equipped multistrand copper power cables, they may not be quite ready for optical fiber...

    • Fiber optic is pretty fragile - far more so than a copper cable. Can't bend it past a certain radius, much less kink it. Optical's main benefit is distance, not speed...

      TOSlink and all that jazz worked because you connect stuff and that's it- the cable rarely gets disturbed. Think of your average business traveler - they'd go through optical cables like candy.

      Sure, you could make them heavier-duty since they don't have to stretch that far, but that grade of optical plastic or glass is $$$, and volume goes u

      • TOSlink is sort of a weird one because it was optical; but usually over very short plastic runs, and at a data rate so low that even fairly pitiful copper has no trouble with it (which is why it is now commonly replaced by, or lives alongside, an RCA connector providing the same output in a copper flavor). I'm sure that there is some reason why optical was dragged in in the first place; but it's always a bit jarring to see.

      • by Dahamma ( 304068 )

        This 100w power standard is pretty stupid, though. We're talking power levels where fires will definitely be possible from damaged USB cables.

        As opposed to all of the current laptop chargers, AC power cords, DC converter bricks, etc out there now?

  • by goombah99 ( 560566 ) on Monday April 22, 2013 @02:23PM (#43517501)

    I have an iphone 5, and like newer samsungs and ipads it wants to draw 2.1 amps from USB, which is a no-no for standard USB. There are a number of USB hubs that pretend that they are apple/samsung compatible, promising 2.1 amps. But what they don't tell you is that you can't have 2.1 amps if the hub is connected to a computer. It will only act as a USB high current charger when it is incapable of making a serial connection. It's either a serial port or a high current charger, but not both.

    I'm guessing this is because a lot of devices expect their current overload regulation to come from the USB hub which is limited to 0.5 amps by spec.

    Will this superspeed use the same USB plug and thus have the same limit of either being a charger or a USB port, or will it do both at the same time?

    • They may, but I'd assume that the cables would be very different. The main question I have, though, given that there are so many speed grades (low speed, full speed, high speed, super speed and now a new super speed): would the higher speeds automatically be degraded, since the USB bus controller has to manage both the keyboard buses as well as the drives?
      • Not if you put a USB hub of the higher speed in between. USB 2.0 and higher require that if a hub supports superspeed, then it has to retransmit the incoming lower speed data at superspeed rates to minimize the amount of time it ties up the bus. I assume 3.0 is the same.
    • Comment removed based on user account deletion
      • Unless it's 220V power.

      • by CastrTroy ( 595695 ) on Monday April 22, 2013 @04:14PM (#43518615)
        I'd like to know how this is supposed to work. You are going to have a lot of trouble getting 100W out of a laptop USB port. Are these only going to be available in desktops? Even there, there are probably quite a few desktops that don't have 100 "extra" watts in their power supply to provide to some peripheral. Although you can get a very high wattage power supply, you don't really need that much with modern processors and SSDs, especially if you don't have a particularly fancy video card.
        • by jandrese ( 485 )
          My guess is that the 100W delivery will be optional, and most manufacturers simply won't support it. In fact that's big enough that I don't expect to see it on many devices at all. Even your average desktop has no way to deliver 100W from the motherboard without plugging in supplemental power from the power supply (which we already do, but it would be yet more cables), and you will of course need a bigger power supply to support that.

          What I really hope is that the spec has a negotiation protocol where
  • Dangerous (Score:4, Insightful)

    by DoofusOfDeath ( 636671 ) on Monday April 22, 2013 @02:27PM (#43517537)

    I'm not a fan of a "data" cable that can kill me.

    • by h4rr4r ( 612664 ) on Monday April 22, 2013 @02:30PM (#43517565)

      Then stop wrapping them around your neck.

    • What voltage is being proposed? At 5 V that's 20 amps!!

      • by rjr162 ( 69736 )

        I know.. which depending on the run, using DC.. you're talking around 16 gauge to 18 gauge wire I'd have to guess. They aren't going to be your thin USB cables anymore.. going back more to the size of the original USB cables before they thinned down

      • Re:voltage? (Score:4, Insightful)

        by synapse7 ( 1075571 ) on Monday April 22, 2013 @03:12PM (#43517971)
        And it will require 16 or 14 gauge wire, which will be nice and convenient to carry around. I can't see this adoption being too widespread; only special use cases.
      • by Dahamma ( 304068 )

        From TFA:

        "So with this new specification, you can go from very small devices with 5 volts, 2 amps or 10 watts -- where USB starts -- up to 20 volts 5 amps and 100 watts,"

        It's no worse than a current laptop charger (a bit better, actually; MacBook chargers are only 16.5v).

        • by rjr162 ( 69736 )

          I guess I missed that part.. I figured they HAD to be using higher voltages, but I just went off a worst-case 5v deal

          • by Dahamma ( 304068 )

            Given most household breakers are 15-20A, before I read that bit in the article I was thinking "am I going to have to upgrade my electrical just to charge my phone!?" ;)

    • Re:Dangerous (Score:4, Informative)

      by Anonymous Coward on Monday April 22, 2013 @02:34PM (#43517617)

      It will most certainly not kill you. The voltage supplied by the USB cable is far below what it needs to push enough amps through your body to disrupt any bodily function. People usually say "it's the amps that kill you"; what they should say is "it's the amps that PASS through your body that kill you". If I remember correctly from the specs, it will provide no more than 15 to 20 volts maximum, which is still considered safe.
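
      For scale, the "amps that pass through your body" arithmetic is just Ohm's law. A quick sketch in Python, using rough, commonly quoted skin-resistance ballparks purely for illustration (they are not measurements or safety data):

      # Ohm's law: I = V / R, for the upper-end 20 V discussed here.
      VOLTS = 20.0

      for label, ohms in [("dry skin, ~100 kOhm", 100_000),
                          ("wet skin, ~1 kOhm", 1_000)]:
          milliamps = VOLTS / ohms * 1000
          print(f"{label}: {milliamps:.1f} mA")
      # dry skin, ~100 kOhm: 0.2 mA
      # wet skin, ~1 kOhm: 20.0 mA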

    • by skids ( 119237 )

      Having looked at the PoE specs, it would be very hard to start a fire or shock oneself with that technology. It is very careful about only providing juice when it sees a valid endpoint on the other side and ensuring that the line resistance is not too high. About the only way to defeat it would be to inject a point resistance on a very short patch cable (within a few feet of the switch) which would dissipate the heat budget for a 300-foot cable in a small area without exceeding the resistance budget. That would be hard to do simply by running over a cable with a chair.

      • About the only way to defeat it would be to inject a point resistance on a very short patch cable (within a few feet of the switch) which would dissipate the heat budget for a 300-foot cable in a small area without exceeding the resistance budget. That would be hard to do simply by running over a cable with a chair.

        How about a dog or a cat biting into it?

        Or a small child deciding to lick an open end of an USB extension cable dangling from the desk?

        • by skids ( 119237 )

          Like I said with PoE, there is constant monitoring of the electrical characteristics of the line. A dog biting into a live PoE link that had already completed negotiation would most likely trip the detection, and power would be removed within a tenth of a second. The dog could be especially unlucky and manage to hit it in just the right way to cause itself harm, but the probability of this is low AFAICT. Stray ends of PoE cable do not supply power until they detect a signature using a low voltage, low cu

      • USB is strictly for short distances, due to timing concerns (if I recall correctly). PoE isn't going anywhere, but it's not like manufacturers were interested in adding complexity to their laptops for a very small subset of users (within a niche that's small in its own right) who would pay for such a thing.

        • by skids ( 119237 )

          but it's not like manufacturers were interested in adding complexity to their laptops for a very small subset of users

          You are aware that corporations, not just individual consumers, buy lots of computers and gadgets, no?

          • How many corporations, faced with the question: "Should I pay *non-insignificant amount of money* more so that we don't need power bricks, or should I just buy a couple of extras in case a few fail prematurely?", will decide to spend said amount, multiplied by the sometimes massive number of systems purchased?

            Very few, if any.

      • PoE had the advantage (in terms of safety) of being heavily designed around the "As much power as possible; but Do Not trip any limits that would cause this to be treated as a power cable, rather than a data cable, for regulatory or insurance reasons" constraint.

        Since the whole point was to make it cheaper to operate small ethernet-connected peripherals, it was absolutely necessary that nothing happen that would cause all that twisted-pair running through the walls to suddenly void your fire insurance or req

    • You can't get the full current out of the power supply without going through a negotiation phase. If that doesn't happen the interface defaults to normal current limit of 0.5A. The same thing happens today with the USB 2.0 charging ports capable of delivering up to 2.1A.

  • by Tvingo ( 229109 ) on Monday April 22, 2013 @02:27PM (#43517545)

    So if you have 4 USB SS ports on a motherboard that motherboard is going to have to be able to supply 400W @ 5V? You can't be serious. We'll need dedicated power connections on the motherboard just to supply this.

    The example of using a TV to power multiple devices raises the same concerns. Now the TV power supply will be much more complicated. Rather than supplying just the 60-70W the TV draws, it needs a power supply that can provide hundreds of extra watts?

    The only application I see for this is to use 100W USB SS ports on walls for a common household DC standard interface. That could be interesting, but integrating it into devices is not simple. It adds levels of complexity to the devices that will need to supply the power.

    • by msauve ( 701917 ) on Monday April 22, 2013 @02:37PM (#43517645)
      I doubt the spec will say a device must be able to deliver 100W. It will be allowed, not required. There will be negotiation involved to determine the max power a device will deliver/can draw.

      Really, the only use you can see is a wall outlet? How about standardizing laptop power on this, eliminating all the proprietary "brick on a leash" power supplies, much like has happened with (most) cell phones? How about a single cable connection between a desktop and printer (no separate power cable for the printer)? How about a USB air conditioner, not just a fan (jk :-) )?
    • You beat me to it. Plus, want to bet that even if the devices do claim to be able to supply 100w simultaneously to 'n' ports, it will be via a very inefficient power supply that will, for most people, most of the time, just sit there unused, wasting more power than it would have if it had been designed just to power the unit it was supplying...

      Mind you, if I can throw out the dozens of power supplies I have plugged in around the house for just ONE standard, that's a big win...

    • Comment removed based on user account deletion
      • I'm guessing what we'll see is hubs that support this, with their own power brick. You plug the hub into the PC to deal with data, and the device into the Hub. This is already the case if you want to support multiple devices that add up to more than 500mA. You have to plug them into a powered hub.
    • by c0lo ( 1497653 )

      So if you have 4 USB SS ports on a motherboard that motherboard is going to have to be able to supply 400W @ 5V? You can't be serious. We'll need dedicated power connections on the motherboard just to supply this.

      You'd better believe it.
      For the USB4.0, we've prepared the spec for an integrated mini-generator of at least 1kW: you see, your phone is not mobile if you are not mobile, and you are not mobile unless you can at least ride a scooter. But why stop there? Use 20 USB4.0 hubs and you won't need a Tesla.

    • by tlhIngan ( 30335 )

      So if you have 4 USB SS ports on a motherboard that motherboard is going to have to be able to supply 400W @ 5V? You can't be serious. We'll need dedicated power connections on the motherboard just to supply this.

      Not to mention what voltages are involved.

      At 100W, you're not going to use 5V anymore - you're talking 20A, and your USB cables will be like jumper cables to have the ampacity without losing it all in the cable (cable losses are I²R losses - they go up with the square of the current).

      You're probabl
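
      A quick sketch of that square-of-the-current point, using an arbitrary illustrative round-trip cable resistance (not a figure from any real USB cable):

      # Resistive loss in the cable is I^2 * R: delivering the same 100 W at
      # 20 V instead of 5 V cuts the current 4x and the cable heating 16x.
      CABLE_OHMS = 0.25  # arbitrary round-trip resistance, for comparison only

      for volts, amps in [(5, 20), (20, 5)]:
          loss = amps ** 2 * CABLE_OHMS
          print(f"{volts} V @ {amps} A: {loss:.2f} W dissipated in the cable")
      # 5 V @ 20 A: 100.00 W dissipated in the cable
      # 20 V @ 5 A: 6.25 W dissipated in the cable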

  • 100W hot-swappable? I really don't think the Chinese are up to it. I'll have to double-check the specs. (Will they?)
    • Inrush is still probably limited to a few hundred mA. After negotiation, the device can power up additional components that draw more power. Devices won't be drawing that much power while you're plugging it in.
      • Unplugging a 5-amp 20V connection is still going to produce some arcing. That takes a toll on both connectors, so you could easily end up with a warranty nightmare if there isn't a suitable plug-preemption mechanism going on.
        • Existing USB plugs are designed so that some pins disconnect quicker than others, letting the system know that you are unplugging. There's no reason that couldn't cut the current draw to prevent arcing, and I imagine that such a cut could happen quicker than you could finish pulling the plug.

        • by alannon ( 54117 ) on Monday April 22, 2013 @04:50PM (#43518937)
          Take a look at the conductive "pins" (strips) on the inside of a USB connector (cable side). See how they're not all the same length? When you're pulling out the plug, the shorter pins (that don't carry power, only data) lose contact first, triggering the hub end to cut off the power pins before the power pins break contact. The reverse happens when you plug it in. No power from the hub until the data pins connect. Thus, no arcing. Any connector designed to be hot-swappable has this type of design.
          • by dgatwood ( 11270 )

            I actually took advantage of that design to work around a hardware bug once. The driver set up the device correctly, but then was unable to actually talk to the device. By pulling the USB plug halfway out and putting it back in, the OS reenumerated the bus, discovered the device (now correctly configured), and everything worked. (I later added a reenumeration call upon detecting this bad behavior, which fixed the problem completely, but it was useful as a quick workaround.)

  • FTFA:

    "I think we'll see products in the market by the Christmas season in 2014," Ravencraft said. "The companies have to build silicon - device, host, bridge and hub silicon."

    So it looks to be quite a ways out. Still, I'd love to see a video output spec that doesn't have mandatory DRM. I didn't see any mention of HDMI in the article, so there's a slim chance of this new interface not being broken by design...

  • by markhahn ( 122033 ) on Monday April 22, 2013 @02:42PM (#43517683)

    the current micro-USB connector kinda sucks. if we're going O(100x) more watts, maybe we should take the opportunity to do a better connector, too.

    symmetric would be nice, and less prone to jamming, misalignment and torquing.

    • by steveha ( 103154 )

      I agree 100%: if we are going to mutate the USB standard this much, let's take the opportunity to make a symmetric connector. I don't want to buy Apple products, but I do think that they did a great job on the physical design of the Lightning connector, and I wish I could have something like that on all my devices.

  • I'm looking forward to charging my monitor!

  • by fuzzyfuzzyfungus ( 1223518 ) on Monday April 22, 2013 @02:57PM (#43517845) Journal

    This "100watt USB!!!" nonsense has been floating around for a while, and it just never seems to get any better.

    Uncertainty is Bad. 100watts is a lot of power. Your laptop's brick is almost certainly specced for less than that. Even a desktop PSU will likely be 250-350, outside of gamers and workstations (and often the upper end of the range is... optimistic... at best). Now, if we have this '100watt USB', what are devices going to do? Is your next laptop going to ship with a 265 watt brick, so that it has the same 65 watts for itself as your current one does, and can handle both its ports being used? Is it going to ship with exactly the same brick and simply brown out the USB port at some unpredictable power level? (Extra credit awarded if that unpredictable level depends on whether the battery is charging or not, and the current CPU load...) If "100watts" is actually "anywhere between ~15 watts and 100watts, largely unpredictable to the consumer", what are peripheral manufacturers going to target? What good is theoretical capacity that you can't actually use, because a nontrivial-but-hard-to-predict percentage of your customers can't actually deliver it?

    Bus power is nice because it reduces cabling and complexity. However, if it isn't dependable, you can't rely on it, so you have to fall back on designs that pretend it isn't available. Now you have more expensive USB ports (in some devices) and wall warts or PSUs for your higher power peripherals! What a win!

    This isn't to say that any increase in bus power is bad (given USB's use cases, 'enough power to spin up a 2.5 inch HDD' or 'enough power to charge a smartphone' are pretty useful things). However, you can't just keep pushing the ceiling without limit: the wider the uncertainty, the greater the costs (for devices that actually engineer to spec and include the capability to support the top of the range) and the greater the limits and confusion (for devices that target more realistic real-world output values, and for the poor bastards who think that 'USB' means 'works when plugged into my USB port').

    • Now, if we have this '100watt USB', what are devices going to do? Is your next laptop going to ship with a 265 watt brick, so that it has the same 65 watts for itself as your current one does, and can handle both its ports being used?

      On top of that, power supply efficiency starts to drop off steeply [twimgs.com] when power draw is only a small fraction of the power supply's capacity.

    • by rjr162 ( 69736 )

      "100watts is a lot of power."

      Only in context. Take 120 volts.. that's less than 1 Amp. That's less than the LED replacement bulbs for Halogens use. 220 Volts (for the Euro guys). That's less than half an amp. Current USB Voltage (5 volts)... that's 20 Amps. Now that is a bit much for DC like that.. you'll need around 16-18 gauge wire ideally. But this spec is (after I was informed of this) using 20 volts, so that's 5 amps. That's not that much. You have some fairly small wires in your car fused at 10 to 15
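
      The current figures above are just P = V * I rearranged; a quick check of the amps needed for 100 W at each voltage mentioned:

      # I = P / V for a fixed 100 W load at the voltages discussed above.
      WATTS = 100
      for volts in (120, 220, 20, 5):
          print(f"{volts} V: {WATTS / volts:.2f} A")
      # 120 V: 0.83 A
      # 220 V: 0.45 A
      # 20 V: 5.00 A
      # 5 V: 20.00 A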

  • by WaffleMonster ( 969671 ) on Monday April 22, 2013 @03:13PM (#43517979)

    USB 2.0 section 7.2.1.2.1 says 5 A max, as in: when you hit it, the protection circuit kicks in and limits or shuts down the current.

    To actually pull 5A, the required protection circuit would need to trip above 5A to be useful, which violates this section.

    The more reality-based problem is that 28 gauge wire over 10-16 ft of cable carrying 5 amps is really stretching it... the voltage losses in that scenario will significantly pull down the actual watts delivered, dumping them into heating the wire instead.

    At 10 ft the voltage drop when pulling 5 amps is ~6 volts. At 16 ft the drop is a staggering ~10 volts.

    Unless there is a whole lot of intelligence in the power specification to probe wire losses and take the wire itself into consideration when calculating maximum available current, 100 watts over only 20 volts is really stretching it.
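
    Those drop figures line up with published wire tables. A quick check, assuming the commonly quoted ~65 ohms per 1000 ft for 28 AWG copper and counting both conductors (out and back):

    # Voltage drop = I * R, with R covering the round trip (2x cable length).
    # 65 ohms per 1000 ft is the usual published ballpark for 28 AWG copper.
    OHMS_PER_FOOT = 65 / 1000
    AMPS = 5

    for cable_ft in (10, 16):
        drop = AMPS * 2 * cable_ft * OHMS_PER_FOOT
        print(f"{cable_ft} ft cable: ~{drop:.1f} V dropped at {AMPS} A")
    # 10 ft cable: ~6.5 V dropped at 5 A
    # 16 ft cable: ~10.4 V dropped at 5 A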

  • ...perhaps in 10 years, USB lightbulbs will be the norm and young teenagers will be telling their little siblings about when we used to screw in light bulbs. Oh man, that would mean all those "how many ___ does it take to screw in a lightbulb" jokes will be outdated.
  • Not enough (Score:4, Funny)

    by istartedi ( 132515 ) on Monday April 22, 2013 @04:18PM (#43518649) Journal

    I want my USB controlled and powered Easy Bake oven.
