AI Google

Data Center With a Brain: Google Using Machine Learning In Server Farms

1sockchuck (826398) writes "Google has begun using machine learning and artificial intelligence to analyze the oceans of data it collects about its server farms and recommend ways to improve them. Google data center executive Joe Kava said the use of neural networks will allow Google to reach new frontiers in efficiency in its server farms, moving beyond what its engineers can see and analyze. Google's data centers aren't yet ready to drive themselves. But the new tools have been able to predict Google's data center performance with 99.96 percent accuracy."
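For readers who want a concrete picture of what "predicting data center performance" looks like in code, here is a minimal sketch of a feed-forward neural network regressing PUE (power usage effectiveness) from a few operational inputs. The feature names, value ranges, and data below are invented stand-ins for illustration; this is not Google's model or its telemetry.

```python
# Illustrative sketch only: a small feed-forward network regressing PUE from a
# few invented operational features. Not Google's model or data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
load = rng.uniform(0.2, 1.0, n)        # fraction of IT capacity in use
outside_temp = rng.uniform(5, 35, n)   # ambient temperature, deg C
setpoint = rng.uniform(16, 24, n)      # chiller water setpoint, deg C

# Synthetic "ground truth": PUE worsens with IT load, ambient heat, and a
# setpoint that drifts away from a sweet spot around 20 C.
pue = (1.08 + 0.10 * load + 0.004 * outside_temp
       + 0.002 * (setpoint - 20) ** 2 + rng.normal(0, 0.005, n))

X = np.column_stack([load, outside_temp, setpoint])
X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)

errors = np.abs(model.predict(X_test) - y_test)
print(f"mean absolute PUE error on held-out data: {errors.mean():.4f}")
```

A production model would draw on far more telemetry signals, but the fit-then-predict workflow has the same shape.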

Comments Filter:
  • ML has already been proposed to improve the performance and resource efficiency of large-scale datacenters. Detailed information on two of the most well-known examples from Stanford and Berkeley can be found below: http://engineering.stanford.ed... [stanford.edu] http://www.eecs.berkeley.edu/P... [berkeley.edu]
  • by m00sh ( 2538182 ) on Wednesday May 28, 2014 @11:56AM (#47110205)

    The article seems top heavy ... meaning it puts all the emphasis on "machine learning in server farms" and way too little on what it actually produces. There's just a fuzzy paragraph on cooling methods when some servers are taken offline.

    You could use "machine learning for peace in the Middle East", or "machine learning for fixing the economy", but unless it produces real results, it's just an experiment.

    • I actually work with Jim Gao. His design doc was already open in another tab when I saw this article. Jim's a really smart guy. Really nice guy too.

      I can't talk too much about it. You have a huge amount of electricity coming into the DC, on the order of a lightning bolt, and it has to be intelligently choreographed to make the best use of it. Then you have to carry away the heat. There is a lot of machinery to do that, and by accurately predicting where and when power is going to be needed, both for s

  • Comment removed based on user account deletion
  • I'm very curious as to why they are using a neural network for this. I'm no machine learning expert, but I was under the impression that neural networks were somewhat outdated. And yet it seems like Google is spending rather a lot of time with them lately.
    • Re: (Score:3, Informative)

      by Anonymous Coward

      Not outdated - just that they work well only with certain use cases. I know that SMART whiteboards use neural networks to process camera input in identifying if a finger or a pen is being used to write something - and they are very good at that. No matter how you hold the pen they will recognise it, and never confuse a finger for a pen. They are fussy enough that I've been unable to duplicate a pen for myself with a 3D printer.

      But they are no miracle solution. There are plenty of cases where you might t
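As a rough illustration of the whiteboard example above, here is a toy classifier separating "pen" from "finger" touches using a few hand-picked features of the contact blob. The features (area, aspect ratio, edge sharpness) and the synthetic data are invented for the sketch; the comment only says the real system runs neural networks on camera input.

```python
# Toy "pen vs finger" classifier in the spirit of the whiteboard comment.
# Contact-blob features and data are invented; not the SMART board pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
n = 2000
is_pen = rng.integers(0, 2, n)  # 1 = pen, 0 = finger

# Pens tend to produce a smaller, sharper, more elongated contact blob.
area = np.where(is_pen == 1, rng.normal(5, 1, n), rng.normal(15, 3, n))
aspect = np.where(is_pen == 1, rng.normal(3.0, 0.5, n), rng.normal(1.2, 0.2, n))
sharpness = np.where(is_pen == 1, rng.normal(0.9, 0.05, n), rng.normal(0.5, 0.1, n))

X = np.column_stack([area, aspect, sharpness])
X_tr, X_te, y_tr, y_te = train_test_split(X, is_pen, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```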

  • by larryjoe ( 135075 ) on Wednesday May 28, 2014 @12:09PM (#47110345)

    Artificial intelligence and neural networks are a hot topic, so this is piggy-backing on that trend. It's not a surprise that Andrew Ng's work is referenced quite a bit.

    While the modeling is interesting, it seems to be just modeling at this point. The main claim of the white paper is high PUE prediction accuracy by the model. While that's academically interesting, the real use is in feedback for optimization. The white paper author realized that and included that optimization problem as one of the examples in the paper. However, the optimization was achieved "through a combination of PUE simulations and local expertise." I'm guessing that the local expertise part was relatively significant, because there is basically no discussion of it even though it is the one application that would make this work practical and really interesting. The paper claimed that this neural network-based optimization reduced PUE "by ~0.02 compared to the previous configuration." But I have no idea how that would have compared to optimization using just local expertise without the benefit of neural network modeling.

    • Hardly new, though. People have been trying to show that niche successes of AI and NNs can translate into major success for years, at least since the 1980s.
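To make the "model as simulator" point above concrete: once a model predicts PUE from operating conditions, you can sweep the controllable knobs and keep the configuration with the lowest predicted PUE. The sketch below fits a stand-in model on synthetic data (the same invented features as the earlier sketch) and then grid-searches a single chiller setpoint; it illustrates the workflow only, not the white paper's actual optimization.

```python
# Hedged sketch of using a PUE model as a simulator: fit a predictor on
# synthetic data, then sweep the one knob we pretend to control and keep
# the configuration with the lowest predicted PUE.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 5000
load = rng.uniform(0.2, 1.0, n)
outside_temp = rng.uniform(5, 35, n)
setpoint = rng.uniform(16, 24, n)
pue = (1.08 + 0.10 * load + 0.004 * outside_temp
       + 0.002 * (setpoint - 20) ** 2 + rng.normal(0, 0.005, n))

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(np.column_stack([load, outside_temp, setpoint]), pue)

# Conditions we cannot control are held fixed; only the setpoint is swept.
current_load, current_temp = 0.7, 22.0
candidates = np.linspace(16, 24, 81)
sweep = np.column_stack([np.full_like(candidates, current_load),
                         np.full_like(candidates, current_temp),
                         candidates])
predicted = model.predict(sweep)
best = candidates[np.argmin(predicted)]
print(f"lowest predicted PUE {predicted.min():.3f} at setpoint {best:.1f} C")
```

In the white paper's framing, the modeled sweep is only half the answer; the "local expertise" the commenter points to is what decides which knobs are safe to move and by how much.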
  • It's probably fairly easy to predict usage. They've been doing it for ages with the electricity power grid.
    But what will happen when a singularity arises?
    • by geekoid ( 135745 )

      Nothing. The singularity requires resources, resources humans need to provide. So while you may have a system that designs smarter systems, assuming it's possible to do that, it's not like they will magically appear everywhere.

      The singularity is a largely overblown issue that fits right into the same meme about religion that infects humans.

  • Google is only using metadata and not actual server data for their analysis to determine threats to server stability, right?
  • Colossus. Need I say more?
