
Alibaba and Google have put self-developed AI chips into commercial use, and the relationship between technology giants and chip giants has changed.

via: 博客园    time: 2019/9/28 20:31:08

At the 2019 Yunqi Conference in Hangzhou, Alibaba's first AI chip received only a few minutes of introduction, yet it became the news that drew the most outside attention. Internet giants entering the field of self-developed chips is no longer news in itself, but every time one of them launches its own chip and announces commercial use, it still attracts enormous attention.

Many people will wonder whether this marks the beginning of the chip giants being replaced.

The original intention behind Internet giants' chip-making

Among Internet companies making their own chips, Google is at the forefront. As early as 2006, Google began considering deploying GPUs, FPGAs, or application-specific integrated circuits (ASICs) in its data centers. At the time it concluded that few applications needed specialized hardware, since they could run on the spare computing power of Google's large data centers.

The situation changed in 2013, when Google projected that if its users used DNN-based voice search for just three minutes a day, the computing demands on its data centers would double, and meeting that demand with traditional CPUs would be very expensive. Google therefore launched a very high-priority project to quickly produce a custom chip (ASIC) for inference, while buying off-the-shelf GPUs for training.

To that end, it took Google just 15 months to design, verify, build, and deploy the TPU (Tensor Processing Unit) in its data centers, improving its performance-to-price ratio roughly tenfold.

At the Google I/O developer conference in 2016, Google officially unveiled its first TPU. However, the TPU was initially limited to internal use, and it was not until February 2018 that Google announced on its cloud platform blog that the TPU service was open to outside customers, priced at about $6.50 per Cloud TPU per hour and available in limited quantities.

In other words, Google only began commercializing the TPU in 2018. At the I/O conference that same year, Google announced TPU 3.0, with computing performance eight times that of TPU 2.0, reaching up to 100 PFlops (100,000 trillion floating-point operations per second).

Clearly, Google introduced the TPU mainly because no chip on the market met its needs, which pushed it into self-developed chips. And the TPU, as a cloud inference chip, is not sold as a standalone product; instead, its computing power is offered externally through Google Cloud.

Alibaba's path to self-developed chips is similar to Google's. Founded in 1999, Alibaba now has businesses spanning e-commerce, finance, logistics, cloud computing, big data, globalization, and other scenarios, each with different computing requirements. For example, Pailitao (Taobao's image search) adds 1 billion new product images to the Taobao merchandise library every day; recognizing them with traditional GPUs takes one hour, and besides the time, the GPUs' power consumption is huge.

Leifeng.com reported earlier that China's small and medium data centers consumed more electricity in 2017 than the Three Gorges Dam generated that year and emitted twice as much carbon as civil aviation. For technology companies like Alibaba and Google, which run large data centers, power consumption also translates into huge costs.

If Google and Alibaba want to improve efficiency by buying the newest and most powerful Nvidia GPUs, the high price of those GPUs is another problem the technology giants cannot ignore.

Alibaba has therefore also set out on the path of self-developed chips. At the 2017 Yunqi Conference, Alibaba announced the establishment of the DAMO Academy, composed of three parts: independent research centers, joint laboratories with universities, and a global open research program. Its research spans quantum computing, machine learning, basic algorithms, network security, visual computing, natural language processing, next-generation human-computer interaction, chip technology, sensor technology, embedded systems and more, covering industrial fields such as machine intelligence, intelligent networking, and fintech.

At the 2018 Yunqi Conference, Alibaba announced the establishment of an independent chip company, Pingtouge Semiconductor.

In 2019, Alibaba officially launched its first AI cloud inference chip, Hanguang 800. In the industry-standard ResNet-50 test, Hanguang 800's inference performance reaches 78,563 IPS, four times that of the next-best AI chip in the industry, and its energy efficiency of 500 IPS/W is 3.3 times that of the second-place chip.
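
A quick back-of-envelope check of those two figures: dividing the peak throughput by the energy-efficiency ratio gives the chip's implied power draw. This assumes, which the announcement does not state explicitly, that both numbers describe the same peak operating point.

```python
# Back-of-envelope check of the published Hanguang 800 figures.
# Assumption (not stated in the announcement): the peak throughput and the
# energy-efficiency ratio refer to the same operating point.
peak_throughput_ips = 78_563       # ResNet-50 inferences per second
efficiency_ips_per_watt = 500      # inferences per second per watt

implied_power_w = peak_throughput_ips / efficiency_ips_per_watt
print(f"Implied power at peak: ~{implied_power_w:.0f} W")  # ~157 W
```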

The effect of the performance improvement is obvious. To continue the Pailitao example above, Alibaba says that with Hanguang 800, the time to recognize the 1 billion product images added each day drops from 1 hour on GPUs to 5 minutes. In addition, according to an on-site demonstration at the Yunqi Conference, processing traffic video from Hangzhou's main urban area in real time in the City Brain requires 40 traditional GPUs at a latency of 300 ms, but only 4 Hanguang 800 chips at a latency of 150 ms.

A simple conversion therefore puts the computing power of one Hanguang 800 at roughly 10 GPUs.
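
The arithmetic behind that conversion comes from the City Brain demonstration; the Pailitao example gives a wall-clock speed-up but does not say how many chips replaced how many GPUs, so only the first comparison yields a chip-for-chip ratio. A minimal sketch of the two ratios quoted above:

```python
# Ratios implied by the two comparisons quoted above.

# City Brain demo: 40 traditional GPUs replaced by 4 Hanguang 800 chips.
gpus, hanguang_chips = 40, 4
chip_for_chip_ratio = gpus / hanguang_chips          # 10 GPUs per Hanguang 800

# Pailitao example: the same daily recognition job in 5 minutes instead of 60.
gpu_minutes, hanguang_minutes = 60, 5
wall_clock_speedup = gpu_minutes / hanguang_minutes  # 12x faster end to end

print(chip_for_chip_ratio, wall_clock_speedup)       # 10.0 12.0
```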

Unlike Google, which delayed commercializing its TPU, Alibaba said at Hanguang 800's launch that the chip had already been deployed at scale in a number of scenarios within the Alibaba Group, such as video and image recognition, classification, and search, and the City Brain, and it announced that an AI cloud service based on Hanguang 800 was officially launched at the same time.

Clearly, the core goal of Alibaba's self-developed AI chips is likewise to lower costs.

Technology giants and chip giants are no longer just partners

From this point of view, both Alibaba's and Google's commercially deployed self-developed AI chips are currently cloud inference chips. For the traditional chip giants Intel and Nvidia, this will not bring sweeping change, but the relationship between the two sides will shift from close cooperation to competition.

How should this be understood? Take Hanguang 800, an inference chip for visual scenarios: it is an ASIC, positioned mainly to accelerate CNN model inference, and it can be extended to other DNN models. For a broader range of deep learning algorithms, however, such an ASIC can offer only limited support and cannot deliver the same performance and efficiency gains.

So whether it is Alibaba or Google, they still need Intel CPUs for general-purpose computing power, as well as FPGAs to accelerate AI. In addition, Alibaba's powerful Shenlong (X-Dragon) architecture servers still rely on Nvidia GPUs.

This mix of competition and cooperation will not be confined to the cloud. In July 2018, Google launched the Edge TPU, a simplified version of the cloud TPU: a purpose-built accelerator chip designed to run TensorFlow Lite machine learning models at the edge.
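
To make the Edge TPU's role concrete, here is a minimal sketch, under the assumption that Google's Coral Edge TPU runtime (libedgetpu) and the tflite_runtime package are installed, of loading a TensorFlow Lite model compiled for the Edge TPU and running one inference; the file name model_edgetpu.tflite is a placeholder, not a file shipped by Google.

```python
# Minimal sketch: run one inference on an Edge TPU-compiled TensorFlow Lite model.
# Assumes libedgetpu and tflite_runtime are installed; "model_edgetpu.tflite"
# is a placeholder for a model already compiled with the Edge TPU compiler.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy tensor with the shape and dtype the model expects.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]["index"]))
```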

Great minds think alike: Alibaba also has a cloud-edge integration strategy. In July and August 2019, before the release of the Hanguang 800 cloud AI chip, Alibaba launched the high-performance RISC-V processor core Xuantie 910 and an SoC chip platform.

In addition, a major announcement during the 2019 Yunqi Conference was that the smart voice chip TG6100N, custom-developed by Alibaba AI Labs together with Pingtouge, will be used in upcoming smart speaker products.

It is easy to understand why the AI chips developed by technology giants cover both the cloud and terminal devices. Chip giants and technology giants alike believe that data will be as valuable as oil in the future. The giants therefore need to mine the value of data to stay ahead in the data era, and that requires the support of AI chips in both the cloud and the terminal.

When both technology giants and chip giants offer cloud and terminal AI chips, competition is inevitable. Leifeng.com (WeChat public account: Leifeng.com) believes that how fierce the competition becomes will depend more on the technology giants: because they understand their own business and data better, their custom ASICs can more easily reach optimal performance and efficiency. Even if a self-developed AI processor is weaker than the chip giants' products, a technology giant that, for reasons of self-reliance, uses its own business and scenarios to support the iteration and optimization of its chips, at some cost, will eventually be able to produce chips that are highly competitive in specific areas.

Note, however, that technology giants will develop their own AI chips in areas tied to their business and ecosystem. Trying to replace existing mature chips such as Intel's flagship CPUs and Nvidia's GPUs, by contrast, offers little value and carries great risk. In the final analysis, the technology giants' original intention in developing AI chips is to gain greater economic benefit and to maintain technological and ecosystem leadership through chips of their own. By the same token, their chips will serve their own business and ecosystem rather than compete for the chip giants' market.

Leifeng.com's summary

In the AI era, technology giants and chip giants are no longer just close partners; they are also competitors in specific areas. In other words, chip giants will have to work harder to win orders in areas where the technology giants have their own chips.

Conversely, as a technology-, capital-, and talent-intensive industry, chip-making is a long-cycle business quite unlike the rapid-iteration model of the Internet and mobile Internet. Finding the best balance between hardware and software, and competing against the advantages the chip giants have accumulated over many years, are the challenges the technology giants face with their self-developed chips.
