
Google unveils tiny new AI chips for on-device machine learning



Two years ago, Google unveiled its Tensor Processing Units, or TPUs: specialized chips that live in the company's data centers and accelerate AI tasks. Now the company is moving its AI expertise out of the cloud with the new Edge TPU, a small AI accelerator that carries out machine learning tasks in IoT devices.

The Edge TPU is designed to do what is known as inference: the part of machine learning where an algorithm actually performs the task it was trained for, such as recognizing an object in an image. Google's server-based TPUs are optimized for the training part of this process, while these new Edge TPUs handle the inference.
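To make that distinction concrete, here is a minimal sketch in TensorFlow (the framework Google ties into this stack, as discussed below) of what inference looks like in code. The model is an untrained stand-in, and the shapes and layers are purely illustrative, not anything Google has published:

```python
import numpy as np
import tensorflow as tf

# Stand-in for a model trained elsewhere (e.g., on Cloud TPUs); its
# weights are untrained here, which doesn't matter for showing the
# mechanics of inference.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Inference: a single forward pass over one image. No gradients, no
# weight updates, just applying what the model already "knows".
image = np.random.rand(1, 224, 224, 3).astype("float32")
probabilities = model.predict(image)
print("Predicted class:", probabilities.argmax(axis=-1)[0])
```

Training, by contrast, is the expensive loop of repeatedly adjusting those weights against labeled data, which is the workload Google's data-center TPUs are built for.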

These new chips are destined for enterprise use, not your next smartphone. That means automating tasks such as quality-control checks in factories. Doing this sort of work on the device has a number of advantages over hardware that has to send data over the internet for analysis: on-device machine learning is generally more secure, suffers less downtime, and delivers faster results. That's the sales pitch, anyway.


The Edge TPU is the little brother of the standard Tensor Processing Unit, which Google uses to power its own AI and which is available to other customers through Google Cloud.
Image: Google

Google is not the only company developing chips for this sort of on-device AI work. ARM, Qualcomm, MediaTek, and others all make their own AI accelerators, while Nvidia's GPUs dominate the market for training algorithms.

However, Google has said its rivals cannot offer control over the entire AI stack. A customer can store their data in Google's cloud, train their algorithms using TPUs, and then carry out on-device inference using the new Edge TPUs. And, more than likely, they will be building their machine learning software with TensorFlow, a coding framework created and operated by Google.
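As a rough illustration of the training leg of that pipeline, the sketch below uses TensorFlow's public Cloud TPU support to train a small model. It reflects TensorFlow's general TPU APIs rather than anything specific to this announcement, the TPU name is a placeholder for one provisioned in your own Google Cloud project, and MNIST stands in for whatever data a customer would actually store in Google's cloud:

```python
import tensorflow as tf

# Connect to a Cloud TPU ("your-tpu-name" is a placeholder for a TPU
# provisioned in your Google Cloud project).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="your-tpu-name")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Build the model inside the TPU strategy scope so that its variables
# and computation are placed on the TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# In practice training data would be read from Cloud Storage; MNIST is
# used here purely as a stand-in.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
model.fit(x_train / 255.0, y_train, epochs=1, batch_size=1024)
```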

The advantages of this sort of vertical integration are obvious. Google can ensure that all these different parts talk to one another as efficiently and smoothly as possible, making it easier (and more tempting) for customers to stay inside the company's ecosystem.

Injong Rhee, Google Cloud's vice president of IoT, described the new hardware as a "purpose-built ASIC chip designed to run TensorFlow Lite ML models at the edge" in a blog post. Said Rhee: "Edge TPUs are designed to complement our Cloud TPU offering, so you can accelerate ML training in the cloud and then have lightning-fast ML inferencing at the edge – your sensors become more than data collectors – they make local, smart decisions in real time."
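The "TensorFlow Lite ML models" Rhee mentions are produced by converting an ordinary TensorFlow model into TensorFlow Lite's compact on-device format. Here is a hedged sketch using TensorFlow's public converter API; the quantization shown is a general TensorFlow Lite option, and the Edge TPU additionally requires fully integer-quantized models plus a separate compilation step that Google provides:

```python
import tensorflow as tf

# Start from any trained Keras model; this untrained stand-in keeps the
# example self-contained.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(64,)),
])

# Convert to the TensorFlow Lite format that edge devices consume.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Quantization shrinks the model and speeds up inference; the Edge TPU
# specifically needs integer operations, so full integer quantization
# (with a representative dataset) would be applied in practice.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```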

Interestingly, Google is also making the Edge TPU available as a development kit, which will make it easier for customers to test the hardware's capabilities and see how it might fit into their products. The devkit includes a system-on-module (SOM) containing the Edge TPU, an NXP CPU, a secure element microchip, and Wi-Fi functionality, and it can be connected to a computer or server via USB or a PCI Express expansion slot. These devkits are only available in beta, though, and potential customers have to apply for access.
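Once a model has been compiled for the Edge TPU, running it from an attached host looks roughly like the sketch below. This is an assumption about the workflow rather than something detailed in the announcement: it uses the tflite_runtime package and the Edge TPU delegate from what later shipped as Google's Coral software stack, and the model filename is hypothetical:

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load a model compiled for the Edge TPU and hand execution to the
# accelerator via the Edge TPU delegate (libedgetpu ships with
# Google's Coral software).
interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",  # hypothetical compiled model
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one input tensor of the shape and type the model expects.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]["index"]))
```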

This may seem like a minor bit of news, but it's notable because Google does not usually offer its AI hardware to the public. If the company wants customers to adopt its technology, though, it needs to make sure they can try it out first, rather than asking them to take a leap of faith into the AI Googlesphere. This development board isn't just a lure for businesses; it's a sign that Google is serious about owning the entire AI stack.

