Google’s New Artificially Intelligent Chip and Supercomputer Are Ready to Kick Butt


Google is already a household name in traditional computing, but now the company wants to establish itself as an AI-focused hardware maker as well. At its annual developer conference, CEO Sundar Pichai announced the completion of a new computer processor designed to perform machine learning fast enough to give competitors a run for their money. This is likely the first of many AI announcements from Google as the company works through this transformation.


The new processor is called the Cloud Tensor Processing Unit (Cloud TPU) and is named after TensorFlow, the company’s machine-learning framework. Training is a huge part of machine learning: to create an algorithm that recognizes cars, for example, you would first feed it thousands of car images along with non-car images until it learned to tell the difference and pick out the cars on its own. Training a large model to do a complex task like this can take days or even weeks.
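To make the idea of training by example concrete, here is a deliberately tiny, illustrative sketch in plain Python (not Google's code, and no TPU required): a simple logistic-regression "model" is shown labeled examples repeatedly until it learns to separate two classes. The car/non-car points, learning rate, and epoch count are all invented for illustration; a real image classifier would use a deep network in a framework like TensorFlow, but the learn-from-labeled-examples loop is the same idea at vastly smaller scale.

```python
# Toy sketch of "training": a logistic-regression model repeatedly
# adjusts its weights on labeled examples until it can separate
# two classes. All data and parameters here are illustrative.
import math

def train(examples, labels, epochs=200, lr=0.5):
    """Fit w.x + b pushed through a sigmoid, via per-example gradient steps."""
    w = [0.0] * len(examples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted probability of class 1
            err = p - y                     # gradient of the loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Invented toy data: "car" feature vectors cluster near (2, 2),
# "non-car" vectors near (0, 0).
cars     = [(2.0, 1.8), (2.2, 2.1), (1.9, 2.3)]
non_cars = [(0.1, 0.2), (0.0, -0.1), (-0.2, 0.1)]
X = cars + non_cars
y = [1, 1, 1, 0, 0, 0]

w, b = train(X, y)
preds = [predict(w, b, x) for x in X]
```

After enough passes over the data, the model classifies every training example correctly; the point of hardware like the TPU is that real models run this kind of loop over millions of examples and parameters, which is where the days-to-weeks training times come from.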

Pichai also announced that the company is building machine-learning supercomputers, called Cloud TPU pods, as well as the TensorFlow Research Cloud, which will consist of thousands of TPUs, all accessible via the internet. “We are building what we think of as AI-first data centers,” stated Pichai. “Cloud TPUs are optimized for both training and inference. This lays the foundation for significant progress [in AI].”


A number of AI research initiatives were also announced during Pichai’s speech, including an effort to develop machine-learning algorithms that can take over the time-consuming task of building other machine-learning algorithms. He also confirmed that Google is currently developing AI tools for medical image analysis, molecule discovery, and genomic analysis. It’s no real surprise that Google is heading in this direction, driven in part by the need to speed up its own operations.

Strategically speaking, the push could also stop Nvidia from cornering the market in AI hardware and machine-learning applications, which may be another reason the company has moved so strongly in this direction. As a measure of performance, Google says its algorithms can be trained faster on the new hardware than on anything currently available. “These TPUs deliver a staggering 128 teraflops, and are built for just the kind of number-crunching that drives machine learning today,” said Fei-Fei Li, chief scientist at Google Cloud and director of Stanford’s AI lab.


