
Tesla has just bought an AI startup to improve autopilot



<img src="https://cdn.arstechnica.net/wp-content/uploads/2019/10/forrestiandola-800×655.png" alt="Forrest Iandola, CEO of DeepScale, shown here in a 2013 photo, is Tesla's newest machine learning guru.">

Tesla has acquired the machine learning startup DeepScale, CNBC, TechCrunch, and other news outlets have reported. The company's CEO, Forrest Iandola, announced on Monday that he has joined Tesla's Autopilot team.

Iandola described his company's mission to Ars during a phone call shortly after DeepScale raised $15 million from venture capitalists in April 2018. DeepScale developed image-recognition software based on convolutional neural networks.

An important step in any self-driving software stack is detecting the cars, pedestrians, bicycles, and other objects in the vicinity of the vehicle. Detecting these objects accurately is crucial for making sound predictions about where they will move next. Most companies working on this problem use a technique called convolutional neural networks (CNNs). Learn more about how it works in our deep dive on CNNs.
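As a rough illustration (not DeepScale's or Tesla's actual code), the operation at the heart of a CNN, sliding a small filter across an image to produce a feature map, can be sketched in a few lines of Python with NumPy:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D convolution: slide a small filter over an image.

    A CNN stacks many of these operations (plus nonlinearities) to
    detect edges, then shapes, then eventually whole objects."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A vertical-edge filter responds strongly wherever pixel values
# jump from left to right -- e.g. an object's silhouette against sky.
image = np.array([[0, 0, 9, 9],
                  [0, 0, 9, 9],
                  [0, 0, 9, 9]], dtype=float)
edge_filter = np.array([[-1.0, 1.0]])
feature_map = conv2d(image, edge_filter)
# The middle column of the feature map lights up where the edge sits.
```

In a real network the filter weights are not hand-written but learned from labeled data, and there are millions of them, which is exactly why the model-size question discussed below matters.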

DeepScale focuses on improving the speed and efficiency of convolutional neural networks, drawing on Iandola's earlier work as a computer science graduate student. The company's techniques should be particularly helpful to Tesla, which relies heavily on machine learning to achieve full self-driving without the lidar sensors or high-resolution maps that most of Tesla's competitors use.

Making neural networks much smaller

A famous 2012 publication, known as AlexNet after lead author Alex Krizhevsky, demonstrated for the first time the power of neural networks for image recognition. The AlexNet authors figured out how to harness the parallel processing power of GPUs to train much larger convolutional neural networks than had been possible before. This enabled them to perform far better on a standard image-recognition task than any previous algorithm.

One notable drawback of the AlexNet design, however, was its sheer size: 60 million trainable parameters. Prior to founding DeepScale, Iandola was a Ph.D. student at the University of California, Berkeley, where he developed techniques for shrinking neural networks such as AlexNet.

Through various optimizations, Iandola and his co-authors achieved AlexNet-like performance while reducing the number of parameters by a factor of 50. That cut the physical size of a trained network from 240 MB to less than 5 MB. Using additional compression techniques developed by other researchers, including switching from 32-bit to 8-bit parameters, they were able to shrink their model by a further factor of 10, producing convolutional networks with AlexNet-like performance that weighed in at less than half a megabyte.
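The arithmetic behind those figures can be checked with a quick sketch (illustrative numbers from the article only; the real compression pipeline combined several techniques, and the 8-bit quantization alone accounts for just part of the final factor of 10):

```python
# Back-of-the-envelope model sizes from the figures in the article.
BYTES_PER_MB = 1_000_000

# AlexNet: 60 million parameters, stored as 32-bit (4-byte) floats.
alexnet_params = 60_000_000
alexnet_mb = alexnet_params * 4 / BYTES_PER_MB        # 240 MB

# 50x fewer parameters, still 32-bit: under 5 MB.
squeezed_params = alexnet_params / 50
squeezed_mb = squeezed_params * 4 / BYTES_PER_MB      # 4.8 MB

# A further 10x from 8-bit quantization plus other compression
# techniques: under half a megabyte.
compressed_mb = squeezed_mb / 10                      # ~0.48 MB
```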

In his 2018 interview with Ars, Iandola argued that this type of optimization matters to companies trying to bring image-recognition technology to market. Companies like Tesla regularly push new versions of their neural networks to customer vehicles, which often have limited bandwidth. It's much easier to ship half a megabyte of data than 240 megabytes.

Smaller models become particularly important as companies begin to develop custom silicon for machine learning applications. Iandola pointed to this benefit in a 2016 paper: "Using CNNs in application-specific integrated circuits (ASICs), a sufficiently small model can be stored directly on the chip, and smaller models may enable the ASIC to fit on a smaller die." This has obvious cost advantages and can also improve performance, since the chip does not need to fetch model parameters from external memory.

DeepScale sought to commercialize Iandola's research

When Iandola completed his research at Berkeley around 2015, he was looking for a way to commercialize the technology. He quickly realized that the self-driving car boom was an opportunity to apply his research to a practical problem.

"The research focused in particular on producing some of the most efficient neural networks, ones that are energy-efficient and really fast," Iandola told Ars in 2018. "The market for autonomous driving was just picking up speed, and we found a good opportunity there."

"Our solution is to identify things on the road," said Iandola. "We can tell you what kind of objects we see and how far away they are, and we've seen an order of magnitude improvement in object detection."

Iandola noted that industry leader Waymo has impressive technology, but "there is a lot of custom hardware that is expensive." His role at DeepScale was less about being first to a capability than about getting things "to a cost and reliability point where they could be mass-produced."

"We're not building any hardware," he said; the company used "commodity processors and sensors." He added, "Our superpower is reducing the computational burden by a factor of 100."

DeepScale looks like a good fit for Tesla

Three years ago, Elon Musk promised that the hardware the company was shipping at the time would be capable of full self-driving, and earlier this year Tesla tacitly admitted that it was not. Tesla is under great pressure to achieve excellent machine learning performance with a limited computational budget, a particularly difficult problem because Tesla is trying to avoid lidar sensors and high-resolution maps, two resources that most other self-driving companies consider crucial to getting the technology to market in a timely fashion.

Tesla's large fleet provides the company with a large amount of data that can be used to train neural networks. With hundreds of thousands of vehicles on the roads and the ability to query the fleet for "interesting" events, Tesla's engineers can draw on billions of miles of real data to train the neural networks driving autopilot.

Tesla also needs to constantly replenish its Autopilot talent pool, as the company has suffered a steady exodus of top talent over the past three years. For more than three years, Elon Musk has claimed that full autonomy is less than two years away. In 2015, he said that full autonomy is "a much easier problem than people believe."

This attitude caused friction with engineers on the Autopilot team, who considered Musk's aggressive timelines unrealistic. Autopilot chief Sterling Anderson quit in late 2016, shortly after Musk promised that the new hardware would be capable of full autonomy. Two more Autopilot chiefs have since left the company, along with many engineers.

