
Meta Announces Details of Its AI Chips for the First Time, with Lower Power Consumption than NVIDIA's

Update time : 2023-05-20 10:14:36
        To better support their artificial intelligence projects, technology giants are racing to develop AI chips in-house. On May 18, Facebook parent company Meta disclosed its self-developed AI chip project for the first time and shared new details of its data center projects.
 
 
        Unlike technology giants such as Google and Microsoft, Meta is not in the business of selling cloud computing services, and it has never publicly discussed its internal data center chip projects. But Meta executives say the global technology community has shown interest in the work, so the company decided it was necessary to communicate with the outside world.
        Meta's stock price rose nearly 2% on the day of the announcement, and the shares have roughly doubled since the beginning of the year.
        In a blog post, Meta said it designed the first generation of these chips as early as 2020 as part of the Meta Training and Inference Accelerator (MTIA) program, primarily to make advertising and other content recommendation models more efficient.
        According to Meta, the first MTIA chip focuses on AI inference. Meta software engineer Joel Coburn said the company initially considered using GPUs for inference tasks, but found that GPUs were not well suited to the work.
        "The efficiency of GPUs in real-world model applications is still very low, and deployment costs are high," Coburn said. "That's why we need MTIA."
        Sheng Linghai, an analyst at research firm Gartner, told First Financial reporters: "NVIDIA's GPUs can also be used for inference, but their efficiency may not be appropriate, because the computing power required for inference varies greatly and much of it is often wasted, just as a heavy truck is not suited to serving as a family minivan. Computing power still needs to be matched to the specific application scenario."
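        To see why, consider a rough roofline-style calculation. This sketch and its approximate A100 figures are illustrative, not from the article: at batch size 1, an inference step is dominated by memory traffic, so only a small fraction of the GPU's peak compute can actually be used.

```python
# Illustrative roofline sketch (not from the article): why a GPU's peak
# throughput is largely wasted on batch-1 inference. Figures are approximate
# public numbers for an NVIDIA A100.

PEAK_FP16_TFLOPS = 312.0     # A100 dense FP16/BF16 peak, approximate
MEM_BANDWIDTH_TBPS = 2.0     # A100 80GB HBM2e bandwidth, approximate

# Batch-1 decoding is dominated by matrix-vector products: each FP16 weight
# (2 bytes) read from memory supports roughly 2 FLOPs, i.e. ~1 FLOP per byte,
# so the achievable rate is capped by memory bandwidth, not by compute.
FLOPS_PER_BYTE = 1.0
achievable_tflops = MEM_BANDWIDTH_TBPS * FLOPS_PER_BYTE   # ~2 TFLOPS

utilization = achievable_tflops / PEAK_FP16_TFLOPS
print(f"Approximate compute utilization at batch size 1: {utilization:.1%}")
```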
        The market believes these AI chips will ultimately support higher-level metaverse-related tasks, such as virtual reality and augmented reality, as well as the emerging field of generative artificial intelligence.
        In February of this year, after Meta released its large language model LLaMA, Zuckerberg announced that the company was creating a top-level team focused on generative AI research and development.
        Meta currently trains the LLaMA language models on a supercomputer consisting of 16,000 NVIDIA A100 GPUs. The company says the largest model, LLaMA 65B, contains 65 billion parameters; by comparison, Google's latest large language model, PaLM 2, reportedly contains 340 billion parameters.
        Meta began large-scale layoffs last year while increasing investment in AI infrastructure, but it did not comment on a deployment schedule for the new MTIA chip.
        Meta acknowledges that the MTIA chip still faces challenges with highly complex AI models, but says it handles low- and medium-complexity models more efficiently than competitors' chips.
        The MTIA chip consumes only 25 watts, far less than chips from manufacturers such as NVIDIA, and it uses the open-source RISC-V chip architecture. "The performance improvement will prove that the investment is reasonable," Meta stated in its blog.
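        To put the 25-watt figure in context, here is a minimal power-envelope comparison; the roughly 400-watt figure for an NVIDIA A100 SXM module is an assumption on our part, not a number from the article, and the comparison says nothing about per-chip throughput.

```python
# Illustrative power-envelope comparison (assumed figures, not from the article).
# 25 W is the MTIA figure from Meta's blog; ~400 W is the approximate TDP of an
# NVIDIA A100 SXM module.

MTIA_WATTS = 25.0
A100_SXM_WATTS = 400.0

chips_per_a100_budget = A100_SXM_WATTS / MTIA_WATTS
print(f"MTIA chips per A100 power budget: {chips_per_a100_budget:.0f}")  # 16

# Note: this compares power envelopes only; overall efficiency also depends on
# per-chip throughput, which Meta has not fully disclosed.
```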
        According to the blog post, the chip is manufactured on TSMC's 7-nanometer process, but the company did not disclose further details.
        Meta also introduced a chip called the Meta Scalable Video Processor (MSVP), which processes and transmits video while reducing energy requirements. The company says it needs to process 4 billion videos per day.

 