
In the metaverse, the demand for computing power is exploding. How can Chinese chips overtake?

By Industry News
2022-12-12

Huatai Securities, in partnership with the 2022 World Artificial Intelligence Conference, held the "Towards New Intelligence" technology and financial innovation forum to explore the market space opened up when cutting-edge technologies land in financial scenarios. Liang Xiaoyao, professor, doctoral supervisor, and academic leader of the Department of Computer Science and Engineering at Shanghai Jiao Tong University, approached the topic from the perspective of GPU chips, analyzing from the underlying technology the conditions that must be met before the metaverse era can arrive.

The chip computing power required by the metaverse is more than 100 times the current level

The development of science and technology in modern society is, at bottom, a development of computing power, and computing power is built on chips. From the earliest transistors and triodes, through integrated circuits and large-scale integrated circuits, to today's central processing unit (CPU), graphics processing unit (GPU), data processing unit (DPU) and AI-specific chips, the computing power available to human society has advanced by leaps and bounds over the decades. In the future, the carrier of computing power may be quantum chips or biochips.

The emergence of the metaverse has created an almost limitless demand for computing power. "To realize the metaverse, we need to increase existing computing power by more than 100 times; it will be an era of brain-like intelligence." Since the birth of the chip, the computing power of the global AI industry has expanded rapidly alongside the growth of the market. In the early era of Moore's Law, AI generally ran on the von Neumann architecture, which separates computing from storage and is characterized by low computing power and low energy efficiency. At that time, CPU chips represented by Intel, AMD, and ARM were the jewels in the crown of integrated circuits. With the emergence of GPUs and AI acceleration chips, the post-Moore era arrived: computing power began to rise significantly, although energy efficiency remained low.

In Liang Xiaoyao's view, chips will next enter a beyond-Moore stage. Chips will then use AI-native computing to fuse storage and computation, greatly improving operating efficiency and achieving a leap in energy efficiency alongside higher computing power, which in turn will make technologies such as autonomous driving and large-scale AI cloud computing a reality. These expectations are not far off, and some companies are already making similar attempts. "The openness of the metaverse allows each chip to provide part of the metaverse's computing power and gives every chip the opportunity to create value in the metaverse, so the current chip investment boom has a real foundation." Liang Xiaoyao judges that the chip industry is undergoing fundamental change: more and more chip companies will supply surging computing power for the metaverse, and the GPU is the top priority.

GPU: from image processing to a more distant future

As a product of the post-Moore era, the GPU is known for its graphics processing and parallel computing capabilities, and its obvious advantage in ultra-high computing power has won the favor of the capital market. "The GPU has tremendous computing power. It can be used for graphics rendering, and also in scientific computing fields such as medical imaging and financial simulation, where it computes roughly 150 times faster than a CPU," Liang Xiaoyao said. The stock price trends of mainstream chip companies make the market's expectations for different chips clear: over the past five years, the share price of Intel, the giant of the CPU field, has risen by about 65%, while that of the leading GPU maker has risen roughly fourfold.

The GPU itself has not stood still; its computing potential has been tapped continuously for decades. The earliest GPU image rendering was quite crude, but today's GPUs can render a world so realistic that the human eye can barely distinguish it. The GPU has long outgrown the label "graphics processor" and become the ultimate provider of computing power, and it will play the leading role as the computing platform of the future. As its applications have generalized, the GPU has also evolved into the GPGPU (general-purpose GPU). "To realize brain-like intelligence, the metaverse, Web 3.0, and scientific computing, the GPU will become the most critical computing platform base," Liang Xiaoyao said.

Video codecs are another future focus of metaverse development. Codec technology compresses high-bitrate original video into a low-bitrate stream, transmits it to the client, and then restores it to high-quality video. Even in the 5G or 6G era, network bandwidth may not be able to push metaverse content generated in the cloud or on servers directly to the client, so video codec technology will become very important. Major Internet companies, video platforms, and cloud gaming companies are now defining their own video codec standards and even developing their own codec chips, both to give users the best possible experience and to compete for the right to define future standards.
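As a rough illustration of the compress-transmit-restore pipeline described above, the sketch below drives the widely used ffmpeg tool from Python to re-encode a source clip at a much lower bitrate and then decode it back into frames on the receiving side. The file names, the 2 Mbit/s target bitrate, and the choice of the libx265 encoder are illustrative assumptions, not details from Liang Xiaoyao's remarks.

```python
# Illustrative sketch of the "compress -> transmit -> restore" codec pipeline.
# Assumes an ffmpeg binary with libx265 support is on PATH; file names and the
# 2 Mbit/s target bitrate are arbitrary placeholders, not figures from the talk.
import subprocess

SOURCE = "source_4k.mp4"        # hypothetical high-bitrate original
COMPRESSED = "stream_low.mp4"   # what would actually travel over the network
RESTORED = "restored_%05d.png"  # frames reconstructed on the client side


def compress(src: str, dst: str, bitrate: str = "2M") -> None:
    """Encode the original into a low-bitrate HEVC stream for transmission."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx265", "-b:v", bitrate, dst],
        check=True,
    )


def restore(src: str, dst_pattern: str) -> None:
    """Decode the received stream back into full frames for display."""
    subprocess.run(["ffmpeg", "-y", "-i", src, dst_pattern], check=True)


if __name__ == "__main__":
    compress(SOURCE, COMPRESSED)   # server / cloud side
    restore(COMPRESSED, RESTORED)  # client side
```

In a real metaverse pipeline the encode step would run on dedicated codec hardware in the cloud and the decode step on the client device, which is exactly why the companies mentioned above are building their own codec chips.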


Looking ahead to finance in the metaverse, Liang Xiaoyao believes its financial base may rely on the privacy and security provided by distributed ledgers to help users authenticate identity information and transaction records in a highly virtual space. The blockchain and privacy-computing technologies that distributed ledgers require consume a great deal of real-time computing power. At present, no chip can handle the authentication and computation of such a large volume of real-time transactions, so implementation still awaits an explosion in the supply of computing power.
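To make that computing load concrete, here is a minimal, purely illustrative hash-chain sketch (an assumption for exposition, not a system described in the talk): each transaction record commits to the previous one via SHA-256, so verifying the ledger means re-hashing every record in order, and the cost grows directly with transaction volume.

```python
# Minimal hash-chain sketch: each record commits to the previous link, so
# verifying the ledger means re-hashing every record in order. This only
# illustrates why authentication cost scales with real-time transaction volume.
import hashlib
import json


def record_hash(prev_hash: str, record: dict) -> str:
    """Hash a transaction record together with the previous link."""
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


def build_chain(records: list[dict]) -> list[str]:
    """Produce the chain of hashes for a list of transaction records."""
    chain, prev = [], "0" * 64  # genesis placeholder
    for rec in records:
        prev = record_hash(prev, rec)
        chain.append(prev)
    return chain


def verify_chain(records: list[dict], chain: list[str]) -> bool:
    """Re-hash every record; tampering with any field breaks later links."""
    prev = "0" * 64
    for rec, expected in zip(records, chain):
        prev = record_hash(prev, rec)
        if prev != expected:
            return False
    return True


if __name__ == "__main__":
    txs = [{"from": "alice", "to": "bob", "amount": i} for i in range(5)]
    chain = build_chain(txs)
    print(verify_chain(txs, chain))  # True; alter any record and it turns False
```

Scaling this kind of verification, plus the heavier cryptography of real privacy computing, to millions of concurrent transactions is the real-time workload that today's chips cannot yet support.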

Where are domestic metaverse chips headed?

In Liang Xiaoyao's eyes, the chip world has been transformed by the metaverse, and this gives domestic chips an opportunity to overtake on the curve. "In the old world, the mainstream chips were either Intel's or Nvidia's, and a huge number of software companies built applications around their architectures, because back then computing power was sufficient. With the arrival of the metaverse, the situation is completely reversed: computing power has become extremely scarce, and at the same time computing chips can take many more forms, so every chip has the chance to find room to play in the new metaverse world."

Liang Xiaoyao sees at least three paths for domestic metaverse chips. The first is to build a GPU chip fully compatible with Nvidia's, which would require an investment of more than 10 billion US dollars, a team of at least 3,000 people, and 10 years. The second is to build independent, strongly general-purpose chips, such as privacy-computing and blockchain chips, to carry some metaverse functions; this would likely require more than 1 billion US dollars, about 5 years, and a team of about 1,000 people. Many start-ups in China have already reached this scale and are capable of developing such chips. The third path is to build domain-specific chips, the typical example being video codec chips; this requires only about 100 million US dollars, roughly 2 years, and a team of about 200 people. That is why major Internet companies have chosen domain-specific chips as their entry point, and these chips can also contribute to the computing base of the metaverse.

Behind the development of computing platforms is a contest between major manufacturers and even major powers. Facing the incumbents' entrenched advantages, open source may be the better track for overtaking: the operating-system world has open-source Android, and the CPU world has the much-discussed open instruction-set architecture RISC-V. Practice has proved that open source can succeed. So can the GPU be open-sourced? Liang Xiaoyao and his team are building an open-source general intelligent computing chip platform, LOGIC (Launch Open-source GPU In China). The project consists of a professional textbook, a core course, and an open-source platform. Liang Xiaoyao said that LOGIC aims to sow inclusive computing power and build "a GPU everyone can use", with industry talent, intellectual property, industry alliances, and an open ecosystem as its four pillars, producing the first generation of open-source GPU, "Blue and White Porcelain". "We hope to surpass through imitation. What we want is a steady trickle that flows far, a protracted effort." The name of the new architecture carries the ambition of the domestic computing-chip team to chase the industry leaders.