Soochow Securities: Assessing the Computing-Power Growth GPT-5 May Bring, From the Three Perspectives of Parameter Count, Timeline, and Impact
DATE: Mar 19, 2025

Soochow Securities released a research report saying that OpenAI has adjusted its large-model product line and the expectations around it. The release of GPT-5 may be brought forward, possibly because DeepSeek's recent blockbuster updates and strong performance threaten OpenAI's product landscape, prompting OpenAI to accelerate the pace of product iteration. The bank estimates that 1) GPT-5 has roughly 18 trillion parameters, and 2) training it would take roughly 203-225 days. The bank expects that if GPT-5 triggers a new round of AI enthusiasm, the construction of more large clusters will be put on the agenda.

The main views of Soochow Securities are as follows:

How does the bank characterize the next-generation GPT model?

The bank judges that OpenAI has adjusted its large-model product line and expectations. In July 2024, OpenAI's then chief technology officer, Mira Murati, said that GPT-5 was expected to launch at the end of 2025 or in early 2026. However, according to Altman's post on social platforms on February 13, 2025, GPT-5 will be released within a few months. The bank judges that GPT-5's release may be brought forward, possibly because DeepSeek's recent blockbuster updates and strong performance threaten OpenAI's product landscape, prompting OpenAI to accelerate product iteration. Since its founding in 2015, OpenAI has continuously expanded its technology and product footprint through multiple rounds of financing, raising a new round roughly every year and a half on average. Looking at large-model iteration milestones, as ChatGPT drives up interest and competitors keep innovating, the market's expectations for the pace of OpenAI's product iteration keep rising.

How can GPT-5's pre-training requirements be estimated?

Based on the commonly used compute supply-and-demand formula, the bank walks through how the formula's core parameters are likely to change and gives its judgment. GPT-4 is reported to have 1.8 trillion parameters and to have been trained on 25,000 A100s for 90-100 days. Given that GPT-4.5 is roughly 10 times more computationally expensive than GPT-4, and assuming that Scaling Law continues to hold, that 30,000-50,000 H100s are available for training, and that other conditions are basically the same as for GPT-4, it can be inferred that 1) GPT-4.5 has roughly 5.7 trillion parameters, and 2) training GPT-4.5 takes roughly 148-247 days. For GPT-5: assuming it uses 100 times the training compute of GPT-4, with the other assumptions unchanged, it can be estimated that 1) GPT-5 has roughly 18 trillion parameters, and 2) training takes roughly 203-225 days.
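The parameter estimates can be reproduced with a short back-of-envelope script. This is a minimal sketch, assuming Chinchilla-style scaling (parameter count grows with the square root of training compute) and an H100/A100 effective-throughput ratio of roughly 3x; neither assumption is stated explicitly in the report, and the exact day counts shift with the throughput ratio used.

```python
from math import sqrt

# Report's stated GPT-4 baseline
GPT4_PARAMS_T = 1.8            # trillions of parameters
GPT4_GPUS = 25_000             # A100s
GPT4_DAYS = (90, 100)          # training duration range, days

# Assumption (not in the report): H100 effective training throughput ~3x A100
H100_SPEEDUP = 3.0

def scaled_params(compute_multiple: float) -> float:
    """If data and parameters scale together (C ~ N*D, N ~ D), N grows ~ sqrt(C)."""
    return GPT4_PARAMS_T * sqrt(compute_multiple)

def training_days(compute_multiple: float, gpus: int, speedup: float) -> tuple:
    """Scale GPT-4's GPU-days by the compute multiple and the new cluster's throughput."""
    return tuple(d * compute_multiple * GPT4_GPUS / (gpus * speedup) for d in GPT4_DAYS)

# GPT-4.5: ~10x GPT-4 compute on an assumed 30k-50k H100 cluster
print(scaled_params(10))                        # ~5.7T parameters
print(training_days(10, 50_000, H100_SPEEDUP))  # ~(150, 167) days on 50k H100s
print(training_days(10, 30_000, H100_SPEEDUP))  # ~(250, 278) days on 30k H100s

# GPT-5: ~100x GPT-4 compute; the parameter estimate follows the same rule
print(scaled_params(100))                       # 18.0T parameters
```

The report's 148-247-day range for GPT-4.5 falls broadly within this band. Its 203-225-day figure for GPT-5, however, implies a much larger cluster than 30,000-50,000 H100s; since the report does not state that cluster assumption, it is left as a free parameter in the sketch.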

What impact would GPT-5's launch have on the AI industry?

Impact #1: Although the industry is still hotly debating the direction of large-model development, the leading model developers have not stopped building ten-thousand-GPU clusters. Focusing on the domestic market, domestic models matching GPT-4's capabilities were generally released about one year after GPT-4. The bank therefore expects that if GPT-5 triggers a new round of AI enthusiasm, the construction of more large clusters will be put on the agenda. Impact #2: In December 2024, ChatGPT surpassed 300 million weekly active users, with a goal of reaching 1 billion users within the next year. Using the formula "inference demand = 2 × parameter count × tokens", other conditions being equal, the inference market in 2025 is expected to be three times that of 2024. If GPT-5 drives a significant increase in parameter count (taking the 18-trillion estimate), and assuming ChatGPT's overall inference consumption in 2026 is twice that of 2025, with only 20% of 2026's tokens attributed to GPT-5 per the 80/20 rule, the combined inference computing demand in 2026 is expected to reach roughly 5.6 times that of 2025.
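The 5.6x figure can be checked directly from the formula. A minimal sketch, assuming the remaining 80% of 2026 tokens run on a GPT-4-class model of 1.8 trillion parameters (an assumption; the report does not name the baseline model):

```python
# Report's rule of thumb: inference compute = 2 x parameters x tokens
def inference_compute(params_t: float, tokens: float) -> float:
    return 2 * params_t * tokens

PARAMS_BASE_T = 1.8    # assumed GPT-4-class baseline, trillions of parameters
PARAMS_GPT5_T = 18.0   # report's GPT-5 estimate

tokens_2025 = 1.0                   # normalize 2025 token volume to 1
tokens_2026 = 2 * tokens_2025       # overall consumption assumed to double

compute_2025 = inference_compute(PARAMS_BASE_T, tokens_2025)
compute_2026 = (inference_compute(PARAMS_BASE_T, 0.8 * tokens_2026)     # 80% on the baseline
                + inference_compute(PARAMS_GPT5_T, 0.2 * tokens_2026))  # 20% on GPT-5

print(compute_2026 / compute_2025)  # 5.6 -- matching the report's estimate
```

Note that the result is insensitive to the normalization: only the 2x volume growth, the 80/20 token split, and the 10x parameter ratio between GPT-5 and the baseline drive the 5.6x multiple.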

Companies related to the industrial chain: Foxconn Industrial Internet (601138.SH), WUS Printed Circuit (002463.SZ), Victory Giant Technology (300476.SZ), Cambricon (688256.SH), Haiguang Information (688041.SH), Loongson Zhongke (688047.SH), Centec Communications (688702.SH).

Risk warning: the risk that AI application progress falls short of expectations; the risk that Scaling Law slows or breaks down; the risk that GPU technology upgrades fall short of expectations; and the risk that ten-thousand-GPU cluster construction falls short of expectations.
