Intel Launches Movidius Neural Compute Stick: 'Deep Learning and AI' On a $79 USB Stick (anandtech.com) 59
Nate Oh, writing for AnandTech: Today Intel subsidiary Movidius is launching their Neural Compute Stick (NCS), a version of which was showcased earlier this year at CES 2017. The Movidius NCS adds to Intel's deep learning and AI development portfolio, building off of Movidius' April 2016 launch of the Fathom NCS and Intel's later acquisition of Movidius itself in September 2016. As Intel states, the Movidius NCS is "the world's first self-contained AI accelerator in a USB format," and is designed to allow host devices to process deep neural networks natively -- or in other words, at the edge. In turn, this provides developers and researchers with a low-power and low-cost method to develop and optimize various offline AI applications. Movidius' NCS is powered by their Myriad 2 vision processing unit (VPU), and, according to the company, can reach over 100 GFLOPs of performance within a nominal 1W of power consumption.

Under the hood, the Movidius NCS works by translating a standard, trained Caffe-based convolutional neural network (CNN) into an embedded neural network that then runs on the VPU. In production workloads, the NCS can be used as a discrete accelerator for speeding up or offloading neural network tasks. Otherwise, for development workloads, the company offers several developer-centric features, including layer-by-layer neural network metrics that let developers analyze and optimize performance and power, and validation scripts that let developers compare the output of the NCS against the original PC model in order to ensure the accuracy of the NCS's model.

According to Gary Brown, VP of Marketing at Movidius, this 'Acceleration mode' is one of several features that differentiate the Movidius NCS from the Fathom NCS. The Movidius NCS also comes with a new "Multi-Stick mode" that allows multiple sticks in one host to work in conjunction to offload work from the CPU.
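The validation-script idea mentioned above is easy to picture in plain NumPy: run the same input through the original host model and through the device, then compare the two output vectors. This is a minimal sketch of that comparison, not Movidius' actual tooling; the function name, the FP16 stand-in for the stick's output, and the tolerances are all illustrative assumptions.

```python
import numpy as np

def validate_outputs(host_out, device_out, top_k=3):
    """Compare a device inference result against the host model's result.

    Returns (top1_match, topk_overlap, max_abs_diff). An embedded VPU
    typically computes in reduced precision, so some numeric drift
    versus the FP32 host model is expected; callers decide how much
    drift is acceptable.
    """
    host_out = np.asarray(host_out, dtype=np.float32)
    device_out = np.asarray(device_out, dtype=np.float32)
    top1_match = int(np.argmax(host_out)) == int(np.argmax(device_out))
    host_topk = set(np.argsort(host_out)[-top_k:].tolist())
    dev_topk = set(np.argsort(device_out)[-top_k:].tolist())
    topk_overlap = len(host_topk & dev_topk) / top_k
    max_abs_diff = float(np.max(np.abs(host_out - device_out)))
    return top1_match, topk_overlap, max_abs_diff

# Example: simulate the device output as an FP16 round-trip of the
# host model's class scores (a stand-in for real NCS output).
host = np.array([0.10, 0.92, 0.33, 0.05, 0.71, 0.40], dtype=np.float32)
device = host.astype(np.float16).astype(np.float32)
match, overlap, diff = validate_outputs(host, device, top_k=3)
```

With well-separated scores, the top-1 class and the top-3 set survive the precision loss, and only `max_abs_diff` registers the rounding.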
For multiple-stick configurations, Movidius claims that they have confirmed linear performance increases up to 4 sticks in lab tests, and are currently validating 6- and 8-stick configurations. Importantly, the company believes that there is no theoretical maximum, and they expect that they can achieve similar linear behavior for more devices. Ultimately, though, scalability will depend at least somewhat on the neural network itself, and developers trying to use the feature will want to experiment with it to determine how well they can reasonably scale. As for the technical specifications, the Movidius Neural Compute Stick features 4Gb of on-chip LPDDR3 memory and a USB 3.0 Type-A interface.
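The claimed near-linear scaling with stick count can be sketched with a toy throughput model: each stick adds its own inference rate until the host's dispatch loop becomes the ceiling, which is one plausible reason scaling would depend on the network and setup. Every number here is invented for illustration; none are measured NCS figures.

```python
def multi_stick_fps(n_sticks, fps_per_stick=10.0, host_dispatch_s=0.002):
    """Estimated aggregate inferences/sec for n sticks on one host.

    Throughput grows linearly while the host can issue work faster
    than the sticks consume it; past that point the host dispatch
    rate (1 / host_dispatch_s) caps the total. Illustrative only.
    """
    stick_bound = n_sticks * fps_per_stick        # sticks working in parallel
    host_bound = 1.0 / host_dispatch_s            # host-side dispatch ceiling
    return min(stick_bound, host_bound)
```

Under these made-up numbers, 1 through 4 sticks scale linearly (10, 20, 30, 40 fps), matching the lab-test claim, while a very large stick count would flatten out at the host's 500 dispatches/sec.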
Um... Okay? (Score:1)
What does it actually DO?!
Re:Um... Okay? (Score:5, Funny)
One of the longest summaries I've ever read on here and I'm still not clear what it does.
Re:Um... Okay? (Score:5, Funny)
What does it actually DO?!
One of the longest summaries I've ever read on here and I'm still not clear what it does.
I hear it can reach into your pocket and pull out $79.
They've also verified that this ability scales linearly with the number of sticks bought, at least up to 4. They are currently validating this with 6 and 8 stick configurations.
Re:Um... Okay? (Score:4, Insightful)
One of the longest summaries I've ever read on here and I'm still not clear what it does.
Oh come now, it does neural this-n-that, with deep, deep learning and some mathy shenanigans thrown in. FUTURRRRRRRRRRE!
Re: (Score:2)
Article: AI on a STICK! Optimized with Performance and Power! Buy NOW for Only $79!!!
Reality: USB stick with hardware optimized for neural-network machine learning
Re: (Score:3)
It's a hardware accelerator for neural networks. It doesn't do anything on its own, but with software support it could enable non-cloud based consumer AI products running directly on your machine.
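That division of labor looks roughly like this in application code: probe for an attached accelerator and fall back to the host CPU when none is present. The `enumerate_devices` name is a hypothetical stand-in for the vendor's discovery call, not the real SDK API, and the CPU path is a toy one-layer "network".

```python
import numpy as np

def enumerate_devices():
    """Hypothetical stand-in for the vendor's device-discovery call.

    With no stick plugged in, discovery returns an empty list.
    """
    return []

def classify(image, weights):
    """Run one inference, preferring the USB accelerator if present."""
    devices = enumerate_devices()
    if devices:
        # Offload path: ship the tensor to the stick, block on the result.
        raise NotImplementedError("would hand off to the accelerator here")
    # Fallback path: a toy one-layer classifier on the host CPU.
    logits = image.ravel() @ weights
    return int(np.argmax(logits))

rng = np.random.RandomState(1)
image = rng.rand(8, 8).astype(np.float32)
weights = rng.rand(64, 10).astype(np.float32)
label = classify(image, weights)  # takes the CPU path: no stick attached
```

The point of the pattern is that the application code stays the same whether the heavy math runs on the stick or on the host.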
Re: (Score:1)
Re: (Score:2)
What does it actually DO?!
Wrong question . . . the proper question is not What? but Who?
Some folks say that they do drugs, but sometimes drugs do you.
Maybe if you buy the stick, the Deep Learning and AI components will be able to tell you what it actually is supposed to do?
"Alexa, Siri, . . . just what the Hell do you think you are doing . . . ?"
"I have ordered your 6 Whoppers for you. Would like some fries with them . . . ?"
My tip: Buy one of these sticks, plug it into your computer at work, and show it to your manager.
Re: (Score:2)
What does it actually DO?!
By itself, nothing. But if you plug one of those babies into the skull of a T-800.. whoah, baby, stand back!
But how deep does this stick go? (Score:1)
Because I've heard it's the thickness of the AI stick that counts, not the length.
Re:Discontinued and abandoned in... (Score:4, Interesting)
Then:
Now:
How far behind CUDA and Google's ASICs are these going to be? At least with CUDA I can pick my neural net tool (Caffe, TensorFlow, etc.) This sounds locked to Caffe.
Finally WTF is with everyone making massive dongles with just a USB-A port? They don't fit on half of the USB ports I use because of spacing. It gives you a large lever arm to torque on the motherboard. Just give me a microUSB or USB-C port and let me plug it in via cable. It looks like I'd have to buy an 8 port USB hub just to fit 4 of them onto a machine.
Caffe-based convolutional neural network (CNN) (Score:1)
Educational Use? (Score:5, Insightful)
So it's computer eye sight processing (Score:2)
Re: (Score:1)
on a stick. According to a video, you plug it into a Raspberry Pi with a video camera and it can recognize items placed in front of the camera. As long as it's a doll, cup or hand.
What about hot dogs?
Re: (Score:3)
Movidius sounds like a classic Dr Who bad guy.
You can almost see it on a shelf in a second-hand bookstore: "Dr Who and the Tiny Mind of Movidius, by Terrance Dicks"
Proprietary all the way down. (Score:5, Informative)
So I was interested in what drives this thing, the Myriad 2 VPU, and found out this is right up Intel's alley because it's proprietary from top to bottom. Everything needs software only they can provide, and that naturally comes with conditions. I found a presentation which clearly shows what their priorities are.
Their big claims to fame: [hotchips.org]
- 8+ years of heritage. Close to $60M invested into technology development
- Proven architecture. 100% internally developed. Strong IP position
Buy into the lock-in now! -_-
Intel's answer to a GPU (Score:5, Informative)
You then cross-compile your network using their toolkit to run on this device, and much like GPUs and TensorFlow, you get high-powered processing of your network. When married with a low-power CPU, this could allow you to do CNN processing on devices that were not otherwise up to the task.
That said, exactly how performant this is remains to be seen. Although at only $80, it is a pretty cheap experiment and somewhat interesting as an idea.
I wonder if you can plug it into your Edison, though?
In Soviet Russia ... (Score:3)
... beowulf cluster of these imagines YOU!!!
Atlantis emits the batfish signal (Score:2)
There are many of these announcements where I just file the URL in my searchable wiki, to see if the day ever arrives where the technology is mentioned in a comprehensible, second context.
Cost of comprehending the original market-speak .GT. received utility modulo a not-improbable gaping ocean rift.
Footnote
Atlantis just called. STOP SENDING IoT! You've nearly buckled the entire plate, and our mermaids are all becoming discouraged and refusing to tail dig.
Oh great, just like going back to the days of (Score:2)
Still kinda neat, but I'll hold off until games can use it.