At Google, artificial intelligence isn’t just a means of building cars that drive on their own, smartphone services that respond to the spoken word, and online search engines that instantly recognize digital images. It’s also a way of improving the efficiency of the massive data centers that underpin the company’s entire online empire.
 According to Joe Kava, the man who oversees the design and operation of
 Google’s worldwide network of data centers, the web giant is now using 
artificial neural networks to analyze how these enormous computing 
centers behave, and then hone their operation accordingly. These neural 
networks are essentially computer algorithms that can recognize patterns
and then make decisions based on those patterns. They can’t exactly duplicate the intelligence of the human brain, but in some cases they can work much faster and more comprehensively than the brain. And that’s why Google is applying these algorithms to its data center operations.
“These models can learn by crunching the data over and over again,” Kava tells WIRED.
The effort is part of a recent resurgence in artificial intelligence that
 spans not only Google but Facebook, Microsoft, IBM, and countless other
 tech outfits. This includes several high-profile projects that depend 
on AI, such as Google’s self-driving cars and IBM’s Jeopardy-winning
 Watson supercomputer. But, behind the scenes, many companies are also 
adopting a new kind of artificial intelligence known as “deep learning,” which can significantly advance the state of the art.
 Google’s data center project is the brainchild of a young engineer 
named Jim Gao. According to Kava, Gao is affectionately known as “Boy 
Genius” among those on the company’s data center team. After taking an 
online class with Stanford professor Andrew Ng, a leading artificial intelligence researcher who now works for Chinese web giant Baidu, Gao used his Google “20 percent time”
 to explore whether neural networks could improve the efficiency of the 
company’s data centers. And as it turns out, they could.
 Every few seconds, Google gathers all sorts of information that 
describes the operation of its data centers, including everything from 
how much energy these facilities consume to how much water they use to 
cool their computer hardware to the temperature of the outside air, 
which can directly affect cooling methods.
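To make that concrete, each of those readings can be flattened into a numeric feature vector for a model to consume. The sketch below is purely illustrative: the field names are hypothetical stand-ins, not Google’s actual sensor schema.

```python
from dataclasses import dataclass, astuple

@dataclass
class DataCenterSnapshot:
    """One telemetry reading. Field names are illustrative, not Google's schema."""
    it_load_kw: float            # power drawn by servers and networking gear
    cooling_water_lps: float     # chilled-water flow used for cooling, liters per second
    outside_air_temp_c: float    # outdoor temperature, which shapes the cooling strategy
    outside_humidity_pct: float  # outdoor relative humidity

def to_features(snapshot: DataCenterSnapshot) -> list[float]:
    """Flatten a snapshot into the numeric vector a model can train on."""
    return list(astuple(snapshot))
```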
Gao used all this data to build an AI model that could predict the efficiency of a data center under a given set of conditions, and over the course of about twelve months, he refined the model until its predictions were 99.6 percent accurate. Knowing the model was reliable, the company could then use it to recommend ways of improving efficiency inside its data centers.
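A minimal sketch of that kind of model, assuming scikit-learn’s off-the-shelf MLPRegressor rather than whatever framework Gao actually used, with power usage effectiveness (PUE) standing in as the efficiency number being predicted and random numbers standing in for real telemetry:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# X: one row per telemetry snapshot (e.g. the feature vectors sketched above);
# y: the measured efficiency (PUE) at that moment. Synthetic data stands in here.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))
y = 1.1 + 0.05 * X[:, 0] - 0.03 * X[:, 2] + rng.normal(scale=0.01, size=5000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A small feed-forward neural network that learns to map operating conditions
# to a predicted efficiency figure.
model = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print("held-out R^2:", model.score(X_test, y_test))
```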
 As Kava puts it, the model became a kind of “check-engine light” for 
these computing facilities. If a data center’s efficiency doesn’t match 
the model’s prediction, the company knows it has a problem that needs 
fixing. But Google can also use the model to decide when to make 
particular changes inside the data center, such as when to clean the 
heat exchangers that help cool the facility. Two months ago, the company
 had to take some computer servers offline, and though this would have 
ordinarily caused a drop in energy efficiency, it used Gao’s AI model to
 adjust a data center’s cooling infrastructure so that efficiency stayed
 relatively high. The model can identify things, Kava says, that Google 
engineers can’t necessarily identify on their own.
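In code, the “check-engine light” amounts to comparing measured efficiency against what the model predicts for the current conditions and flagging any gap too large to be noise. The threshold below is an arbitrary placeholder, not a figure from Google:

```python
def check_engine_light(measured_pue: float, predicted_pue: float,
                       tolerance: float = 0.02) -> bool:
    """Return True if the facility is running measurably worse than the model expects."""
    return (measured_pue - predicted_pue) > tolerance

# Example: the model expects a PUE of 1.12 under current conditions,
# but the facility is measuring 1.16 -- worth investigating.
if check_engine_light(measured_pue=1.16, predicted_pue=1.12):
    print("Efficiency below expectation: inspect cooling plant, heat exchangers, etc.")
```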
 Detailed in a white paper published to the web this morning,
Gao’s data center model doesn’t involve deep learning. It relies on an older style of neural network, long used for things like generating product
recommendations on retail websites. But deep learning may eventually be 
used in ways similar to Google’s methods, helping to improve the 
efficiency of our increasingly complex online universe, according to 
Josh Patterson, a former engineer at big data startup Cloudera who’s 
working to bring deep learning techniques to companies beyond the giants
 of the web. Deep learning, he explains, is a “higher grade” 
machine-learning tool that can improve all sorts of AI tasks, from 
product recommendations to image search to, yes, the analysis of complex
 computer networks.
 Today, Google is using AI to improve the efficiency of its data 
centers. But tomorrow, similar techniques could be used to hone the 
operation of the internet as a whole, which will only grow more complex 
as we accommodate the new breed of wearable computers and other smart 
hardware devices. In other words, artificial intelligence could become 
an important cog in the internet of things.