AI Has a Compute Dependency Problem, Facebook VP Says (venturebeat.com) 108
In one of his first public speaking appearances since joining Facebook to lead its AI initiatives, VP Jerome Pesenti expressed concern about the growing amount of compute power needed to create powerful AI systems. From a report: "I can tell you this is keeping me up at night," Pesenti said. "The peak compute companies like Facebook and Google can afford for an experiment, we are reaching that already." More software innovation will be required if artificial intelligence is to grow unhindered, he said, and optimization of hardware and software -- rather than brute force compute -- may be critical to AI in years ahead. [...] "We still see gains with increase of compute, but the pressure from the problem is just going to become bigger," Pesenti said. "I think we will still continue to use more compute, you will still net, but it will go slower, because you cannot keep pace with 10 times a year. That's just not possible."
A misnomer (Score:1)
Re: (Score:2)
Re: (Score:2)
Eight, sir; seven, sir; six, sir; five, sir; four, sir; three, sir; two, sir; one!
Re:A misnomer [endless terminology battle] (Score:1)
Oh no, not this again. [slashdot.org]
Re: (Score:2)
yes, this again, because people can't get it into their heads that no form of AI exists yet, and maybe never will. We do have decades-old techniques (nothing fundamentally new since the 1970s) that can be run on modern fast hardware to do sometimes-useful work.
Symbolic AI? Genetic algorithms? Neural nets? If you're under 40 years old, it's older than you are....
Re: (Score:3)
Anyways, it is correct that for the most part modern "A.I." is just the old "A.I." but on much faster hardware and with a lot more data. The faster
Re: (Score:2)
The faster hardware isn't the essential part, either.
Binarized neural networks are around 99.7% as accurate as full-precision ("large-word") neural networks, and you can add more nodes to make up the difference. So e.g. instead of 1,000,000 nodes with 64 SRAM bits per weight, consuming 6 transistors each, you can have 1,500,000 nodes with 1 SDRAM bit per weight representing a value from the set {-1,1}.
The connection count goes up, but that's still 1.5M connections to 9M transistors instead of 1M connections to 64M transistors; and it's sti
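To make the binarized idea concrete, here's a toy sketch (mine, not the parent's, with made-up layer names and shapes): the stored weights only contribute their sign, so a single bit per weight is enough at inference time.

```python
import numpy as np

def binarize(w):
    # Map real-valued weights to {-1, +1} by sign (zero maps to +1 here).
    return np.where(w >= 0, 1.0, -1.0)

def binary_dense(x, w_real, bias):
    # Forward pass of one binarized layer: only the sign of each stored
    # weight is used, so one bit per weight suffices at inference time.
    w_bin = binarize(w_real)
    return np.sign(x @ w_bin + bias)

rng = np.random.default_rng(0)
x = np.sign(rng.standard_normal((4, 8)))   # binary inputs, shape (batch, in)
w = rng.standard_normal((8, 16))           # latent real-valued weights
y = binary_dense(x, w, np.zeros(16))
print(y.shape)                              # (4, 16)
```

In hardware, the {-1, +1} products collapse to XNOR-and-popcount operations, which is roughly where the savings the parent describes come from.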
Re: (Score:1)
How do you know it doesn't exist if you can't define it? I cannot say that Foo doesn't exist if I cannot define Foo; otherwise nobody would know what to look for to verify its existence.
Yes, I will agree that many of the techniques discovered while doing AI research end up being called "AI", but as I mentioned in the linked thread, defining something by what goal existed in people's heads while discovering/inv
Re: (Score:2)
Natural intelligence seems to be some sort of neural network with 86+ billion or so neurons, give or take a few billion. That's for one human brain.
They're pretty slow though. Maximum firing rate is about 1kHz. We can trade off speed with size by letting the same hardware perform calculations in series rather than parallel.
For example, the LC0 project uses a neural net to play chess. The network is only 20x256 big, which is tiny compared to a human grand master brain (who needs a huge part of the brain just to get dressed and sit in a chair). But it compensates for this size by running thousands of times faster, having perfect memory for all the interm
Re: (Score:1)
As I mentioned, there may be multiple ways to make an intelligent machine. We can't assume fully mirroring biological designs is the best or only way. Airplanes would be powered by flapping wings if we always followed just biology.
Re: (Score:2)
AI has been defined simply and easily for a long time; it's just that no computer system meets the criteria (performing the tasks a human can do), so the definition is bad for marketing.
Re: (Score:1)
You mean define it as the essence of something we have yet to achieve? It's like a reverse definition: "X is all matter which is not a unicorn." That's one approach.
Re: (Score:2)
because people can't get it into their heads that any form of AI doesn't yet exist, if ever it will
So you want to reserve the widely used concept of "AI" for something that may never exist, and then all agree on a new term to describe what people are working on right now? For what reason, exactly?
Re: (Score:2)
For the reason that computers still routinely fail at tasks humans can easily do, and we shouldn't let marketing buzzwords deceive people into believing something that isn't true.
Re: (Score:2)
Artificial Intelligence is just that: artificial intelligence. It's not General Intelligence. GI is a different problem, and one that's...let's just say the pieces are all there and nobody's put them together just yet. It's current tech, but it's undiscovered current tech.
You're not going to need to invent e.g. room-temperature superconducting alloys for this.
Re: (Score:2)
Today, in the pages of "The Times" of London (once a reputable if unreliable newspaper of record), I saw a full-page advertisement for *toothbrushes* with "artificial intelligence".
One sees why Tom Lehrer retired from writing satire.
Re: (Score:2)
Oh yeah! You're right, they should totally call it F[ake] P[roblem solving] instead.
Cause, you know, it's not like these systems are ACTUALLY solving problems that could only be solved by people before.
The solutions they provide are fake because they only come from a simulation, even if they work and actually solve the problem - still fake!
Re: (Score:1)
Forget that! They're using compute as a common noun, and my reaction is uncontrollable rage!
Re: (Score:1)
I prefer the term I[diot] S[avant] and that’s being generous.
Re: (Score:2)
If everybody calls this AI, then this is what the term means.
Re: (Score:2)
Yeah, and you are a bot given the intelligence it takes to pick a headline word and repeat, "I do not think it means what you think it means".
This is long overdue. (Score:5, Insightful)
In the early days compute was the expensive resource, so developers wrote efficient code that could make the best use of the available compute. These days development time is the expensive resource, so most products are a tangled mess of inefficient building blocks strung together to minimize development effort, masked by relatively abundant compute. AI forces a compute resource crunch that will require a return to efficient coding, and that is long overdue.
Re: (Score:1)
While this is maybe true for some types of software, in my experience it is not the case for "scientific software" like this. Sure, there may be a lot of bloat in the software packages used etc., but at the end of the day, the actual kernel consuming 99% of the compute is often highly optimized, so the waste is only in the surrounding software used only a fraction of the time. The reason it hasn't been optimized is the very fact that it uses so few resources, so even if it were essentially optimized away to
Re: (Score:2)
Re: (Score:2)
The language you should have brought up is Julia, because that's where the researchers are right now. Hardware acceleration out of the box, core to the language itself.
Re: This is long overdue. (Score:2)
Re: (Score:2)
Transport was also expens, so was store. Than was solve though. The same compute will.
Spotted the AI.
Re: (Score:2)
Do you want The Matrix? Because that's how you get The Matrix.
Re: The biggest problem with AI (Score:2)
Honestly, the Matrix didn't sound that bad to me. In the movie, being ignorant and in the Matrix was far better than the crappy alternative.
Re: (Score:2)
You had to make it weird.
Simple solution (Score:1)
Just tie everything into Skynet's vast network; problem solved!
Re: (Score:1)
I tested putting Pikachu in the Marvel Comics universe, and no kittens died.
CS (Score:4, Insightful)
optimization of hardware and software -- rather than brute force compute -- may be critical to AI in years ahead.
It's almost as if there is an entire field of academic study dedicated to solving this problem. Weird.
Re: (Score:2)
optimization of hardware and software -- rather than brute force compute -- may be critical to AI in years ahead.
It's almost as if there is an entire field of academic study dedicated to solving this problem. Weird.
Is? From what I've seen it's more like "was". At least on the software side. Optimization has been pushed out of the mindset of the majority of devs. Hell, forgoing optimization has even been enshrined by some as a waste of money. Traditional development has been cursed with an over-abundance of hardware resources for a while now. Don't spend time optimizing, we'll just throw more hardware at it! We need to ship!
Re: (Score:2)
The response was of course a hundred eager feature seekers who said "optimization can wait".
Fast forward to 2019: still parsing text files or something, and whatever it's doing, it's still doing it poorly. There shouldn't be enormous load times on such simplistic smal
Re: (Score:2)
Re: (Score:2)
This is cancer and it hurt to read.
Hurt. Like I was programmed by God to recall it every time my heart beats. Every word. Like running a full DRM routine every time a video game character changes position (xyz coords). On the store-shelf release.
I saw "pacman" on my friend's iphone. 150MB. Not cosmetic remake or anything, 40-years-vanilla pacman. Not exactly a runtime remark, but still.
Re: (Score:2)
"Optimization can wait" is the worst thing ever said. No, it can't.
Re: (Score:2)
The kind of optimization you make in Comp-Sci is not the same kind that you do on your average business app to make it 10x faster.
It's the Big O notation stuff where you analyze a combinatorial problem that would take longer than the heat-death of the universe to complete, and find a way to re-express it so that it takes a mere few hours (or weeks) of processor time. That's the kind of thinking that is needed in the "powerful AI systems" that these companies are doing in their backyard for internal con
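As a toy illustration of that kind of rethinking (my example, not from the comment, with a made-up `count_paths` problem): the same recurrence, expressed naively versus with memoization, goes from effectively-never-finishes to instantaneous.

```python
from functools import lru_cache

def count_paths_naive(n, m):
    # Counts monotone lattice paths in an n x m grid.
    # Plain recursion: exponential time, hopeless for even modest n and m.
    if n == 0 or m == 0:
        return 1
    return count_paths_naive(n - 1, m) + count_paths_naive(n, m - 1)

@lru_cache(maxsize=None)
def count_paths_memo(n, m):
    # Same recurrence, memoized: O(n*m) subproblems, each computed once.
    if n == 0 or m == 0:
        return 1
    return count_paths_memo(n - 1, m) + count_paths_memo(n, m - 1)

print(count_paths_memo(30, 30))  # instant; the naive version would take ages
```

The trick isn't faster hardware; it's noticing that the problem only has n*m distinct subproblems.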
Problem? What problem? (Score:4, Interesting)
Maybe they should try using Intel's Pohoiki Beach, a neuromorphic computer capable of simulating 8 million neurons.
incomprehensible (Score:3)
Is he the VP of suspending the laws of physics/economics, and are the terms of his employment such that whether he lives or dies next week depends upon this? If so, he's got a real problem on his hands, and I can see why he's not sleeping well.
Otherwise, as the world turns.
Re: (Score:3)
Yeah, I grew up worrying about nuclear war; he's not sleeping well because good programming is more helpful than brute force for some algorithms? wtf?
Maybe he should discontinue some of his medications instead.
Just run it... (Score:1)
Re: (Score:1)
kwatz!
An Interesting Technique for Some Problems... (Score:3)
The summary is that there is a reservoir (a portion of the system, like a neural network) that evolves its state according to the input and the previous state. The reservoir does require certain attributes, but other than that it can have essentially random connections/weights. Instead of training the entire network, there is a transformation of the input into the reservoir to determine the next state, and then a simple final layer is trained to map the reservoir state to the desired output.
It's apparently a more efficient approach for time-based sequences than training a recurrent neural network.
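For anyone who wants to see the shape of it, here's a minimal echo-state-network style sketch written from the description above, not from any particular paper; the reservoir size, spectral-radius scaling, toy task, and names like run_reservoir are all arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200                      # arbitrary sizes for the sketch

# Fixed random input and reservoir weights -- these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # keep spectral radius < 1

def run_reservoir(inputs):
    # Evolve the reservoir state from the input sequence and previous state.
    states, x = [], np.zeros(n_res)
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: map a sine wave to a phase-shifted copy of itself.
t = np.linspace(0, 8 * np.pi, 1000)
u, y = np.sin(t), np.sin(t + 0.3)
S = run_reservoir(u)

# Only this linear readout is trained (ridge regression), as described above.
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
print("train MSE:", np.mean((S @ W_out - y) ** 2))
```

All the "learning" is a single least-squares solve for the readout, which is why it's cheap compared to backprop-through-time on a recurrent net.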
Not to worry.... (Score:2)
... quantum computing for the consumer is right around the corner, but I'm not saying how close that corner is.
Re: (Score:2)
... quantum computing for the consumer is right around the corner, but I'm not saying how close that corner is.
We can't know how far away the corner is at the same time that we know what angle the turn is, so while I agree it is just around the corner, I don't think that implies that we will get there.
Not even close (Score:2)
AI does not have a compute problem... we, attempting to create AI, have the compute problem.
Any serious AI is years away... likely still a century. AI development is just completely at odds with business operations. We need AI to make money right now, and it just can't do this. Sure, we have lots of knock-off "AI" that is not even close to AI; it does great calculations, but those are just algorithms.
Until AI is able to rewire itself and recode itself we are not going to have an AI close to sentient quality. What
Re: Not even close (Score:2)
Re: (Score:2)
Until AI is able to rewire itself and recode itself we are not going to have an AI close to sentient quality.
Who says that we want that? If the AI is working properly for the task it was designed for, it is generally preferable that it doesn't attempt to rewire itself, and risk making new mistakes.
Re: (Score:2)
Re: (Score:2)
Which AI systems use Fortran for the GEMMs?
TensorFlow uses C++ Eigen under the covers but did FB do something different?
Re: (Score:2)
Which AI systems use Fortran for the GEMMs?
TensorFlow uses BLAS. For binary distributions I presume it's OpenBLAS, which is C, or you could use Intel MKL, but in theory you could use a Fortran implementation. I've never tried that, as it's often enough of a pain to build from source as it is.
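Tangentially, if you ever want to see which BLAS your own NumPy build ended up linked against (OpenBLAS, MKL, whatever), this one call is enough; it's a NumPy-side check, not anything TensorFlow-specific:

```python
import numpy as np
np.show_config()   # prints the BLAS/LAPACK libraries NumPy was linked against
```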
Totally disagree. Hardware is king (Score:2)
At some point, someone (probably at some three letter agency, but maybe not, maybe a medical researcher or public AI researcher) is going to scribble some back of the napkin numbers and realize that it's feasible to build or
Brute force bootstrapping AI (Score:3)
Sounds like a chicken/egg problem (Score:2)
They failed at the first part: convincing us we need either chickens or eggs.
The GPU revolution in recent years is not enough (Score:2)
The GPU revolution in recent years is not enough. It feels like optimization has taken a back seat, given the embarrassment of riches that modern CUDA/OpenCL hardware has provided the industry.
On the other hand, GPU hardware keeps improving in performance even faster than today's anachronistic CISC CPUs from Intel and, to a lesser extent, AMD are.
"Compute" is a verb (Score:3)
"Compute" is a verb, you f*cking nitwit.
Re: (Score:2)
I came here to make the same comment, although maybe not with expletives. Something along the lines of "since when did 'compute' become a noun?" But then I realized it was a VP talking, and the article was on venturebeat.com, and I realized I could immediately disregard the whole thing.
For context, here's IBM's Tom Watson Jr [ibm.com] back in 1970:
Well, duh! (Score:2)
All this fancy computing isn't free.
It's kind of like those who fear that all our jobs are being taken over by robots. They forget that the more sophisticated the robot has to be, the more expensive it is.
So much for the Singularity (Score:2)
I guess that this kinda puts to rest the idea of an artificial intelligence which is exponentially growing more intelligent.
Maybe it doesn't prohibit an AI which is more intelligent than its creators, just not one which is constantly growing....
I'll put money on a general AI also having a problem with latency. As more compute units are added to it, the latency of the entire system will increase unless there's compute redundancy and acceptable errors are allowed to creep into the outputs of the system.
Anyway, it's