The AI infrastructure boom is in full swing, and storage demand is expanding rapidly! Micron's (MU.US) data center revenue triples.
21/03/2025
GMT Eight
The United States' largest computer storage chip manufacturer, Micron Technology, Inc. (MU.US), has reported strong quarterly results and an upbeat outlook for the current quarter, primarily benefiting from heavy investment in AI infrastructure by large enterprises and government agencies worldwide. Demand for the storage chips closely tied to artificial intelligence training/inference systems remains strong, driving a sharp increase in revenue for Micron's data center business, including HBM storage systems and enterprise SSDs. In addition, demand for storage chips in smartphones and PCs, which has been weak since the Fed's rate-hike cycle began in 2022, is showing signs of recovery amid the rise of AI smartphones and AI PCs.
In the guidance issued Thursday, the company said it expects total revenue for the third quarter of fiscal 2025, ending in May, of about $8.8 billion, plus or minus $200 million, well above the average analyst estimate of $8.55 billion. Excluding certain items, Micron's management expects adjusted diluted earnings per share of about $1.57, plus or minus $0.10, also above the average analyst estimate of roughly $1.48.
For the fiscal second quarter ended in February, Micron Technology, Inc., headquartered in Boise, Idaho, reported a 38% year-over-year increase in overall revenue to $8.05 billion, surpassing the average analyst forecast of $7.91 billion. Revenue from the storage chip business closely tied to AI data center buildouts tripled. Excluding certain items, adjusted diluted earnings per share rose to $1.56, far above the prior year's roughly $0.42 and ahead of the average analyst expectation of $1.43.
On a non-GAAP basis, Micron's gross profit for the second quarter reached $3.053 billion, up sharply from roughly $1.163 billion in the same period last year. Non-GAAP net profit expanded to $1.783 billion, far above the prior year's $476 million.
As of Thursday's U.S. stock market close, Micron's stock had risen about 22% year-to-date, significantly outperforming the S&P 500 index. Storage chips have become a focus for global investors amid the AI trend, with investors broadly optimistic about storage chip demand in 2025, especially as strong growth in NVIDIA Corporation's AI GPUs drives strong demand for the HBM that must be paired with them. Micron is one of the core suppliers of HBM storage systems for NVIDIA Corporation, alongside the other core supplier, SK Hynix. After the financial report was released, Micron's stock rose more than 6% in after-hours trading.
Before Micron announced its strong financial results, Wall Street investment firms were broadly bullish on the stock. Wells Fargo & Company recently set a price target of up to $130 for Micron, reaffirming its rating; Cantor Fitzgerald, Stifel, and Baird all set 12-month price targets of $130, implying potential upside of up to 30% for Micron's stock. Citigroup's outlook is the most optimistic, with a $150 target. As of Thursday's U.S. market close, Micron's stock stood at $103.
AI infrastructure driving storage demand surge, explosive demand for HBM
Micron's management said on the earnings conference call that it is seeing extremely strong demand for the AI infrastructure components used to develop and run artificial intelligence applications, including so-called AI agents. Micron also noted that its traditional chip markets, storage chips for smartphones and PCs, have been weak since 2022, but the "AI + consumer electronics" side of the AI trend is now showing clear signs of recovery.
Driven by strong demand for HBM storage systems, Micron's data center business revenue has grown significantly, tripling from the same period last year. Micron CEO Sanjay Mehrotra said in the earnings release, "The overall revenue of the data center business has tripled compared to the same period last year. We expect to achieve record revenue and significantly improved profitability in fiscal year 2025."
Micron's high-bandwidth memory (HBM) business has become a critical component of global artificial intelligence training/inference systems. The company said that thanks to its "strong execution and strong demand for AI storage," revenue from this technology surpassed the $1 billion mark in the second quarter.
One weak spot is the company's overall gross margin, the percentage of sales remaining after deducting production costs. Micron's adjusted gross margin for the second quarter was 37.9%, below the market's expected 38.4%; guidance for the current quarter implies a gross margin of about 36.5%, also slightly below market expectations. The significant expansion of Micron's HBM capacity has inevitably weighed on gross margin, which is expected to improve once the new capacity is fully deployed.
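As a quick sanity check, the reported 37.9% figure can be reproduced from the non-GAAP gross profit and revenue quoted in this article (a back-of-envelope calculation, not Micron's own reporting methodology):

```python
# Gross margin = gross profit / revenue.
# Figures are the fiscal Q2 non-GAAP numbers quoted in this article, in $bn.
revenue_bn = 8.05
gross_profit_bn = 3.053
gross_margin_pct = gross_profit_bn / revenue_bn * 100
print(f"Implied non-GAAP gross margin: {gross_margin_pct:.1f}%")  # ~37.9%
```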
Manish Bhatia, Micron's Executive Vice President of Global Operations, stated in an interview, "We expect the profit margin to improve in the fourth quarter, as market conditions are clearly improving."
Additionally, edge-side artificial intelligence features are expected to stimulate a broad wave of smartphone and personal computer upgrades, meaning demand for both DRAM and NAND storage will enter a new growth stage, and Micron, focused on the storage sector, stands to benefit fully from this trend. Demand for enterprise-level SSDs (a class of high-end storage products based on NAND flash technology) remained strong throughout 2024, along with high-end DDR5 storage products aimed at enterprise workloads.
Large AI models are typically created by feeding software massive amounts of data and running high-density matrix operations, a process that may involve tens of trillions of parameters and relies heavily on HBM storage systems. AI inference workloads involve massively parallel computing patterns and likewise depend on HBM to provide high-bandwidth, low-latency, energy-efficient storage. To avoid memory bottlenecks and keep expensive GPU processors running at full speed, Micron and its competitors SK Hynix and Samsung have developed HBM storage systems that communicate with other components far faster and more efficiently than traditional storage systems.
Enterprise SSDs refer to solid-state drives specifically designed for data centers, enterprise servers, and critical business applications. These products typically use the highest quality NAND flash memory, offering higher durability, stability, and performance to meet strict requirements for data reliability and continuous performance in enterprise applications.
HBM is a high-bandwidth, low-power memory technology designed for high-performance computing and graphics processing. Using 3D stacking, HBM stacks multiple DRAM dies vertically and connects them with fine through-silicon vias (TSVs) for high-speed, high-bandwidth data transfer. This stacking significantly reduces the storage system's footprint and its data-transfer energy consumption, while the high bandwidth greatly improves data transfer efficiency, allowing large AI models to run more efficiently around the clock.
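To make the bandwidth gap concrete, here is a rough comparison using public JEDEC-class interface widths and per-pin data rates (illustrative spec-level values, not Micron product figures):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) * per-pin rate (Gb/s) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# One HBM3 stack exposes a 1024-bit interface through its TSVs;
# a standard DDR5 module has a 64-bit data bus. Same per-pin rate assumed.
hbm3_stack = peak_bandwidth_gb_s(1024, 6.4)   # ~819 GB/s per stack
ddr5_module = peak_bandwidth_gb_s(64, 6.4)    # ~51 GB/s per module
print(f"HBM3 stack ~{hbm3_stack:.0f} GB/s vs DDR5-6400 module ~{ddr5_module:.0f} GB/s")
```

The ~16x per-device gap comes almost entirely from the much wider interface that 3D stacking makes practical.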
In particular, HBM storage systems also offer low latency, enabling quick responses to data access requests. Generative AI models like GPT-4 frequently access large datasets and carry heavy inference workloads, so low latency greatly improves the overall efficiency and response speed of AI systems. In AI infrastructure, HBM storage systems are tightly integrated with NVIDIA Corporation's H100/H200 AI GPU server systems, as well as the now mass-produced B200 and GB200 AI GPU server systems.
Undoubtedly, artificial intelligence was the core driver that propelled Micron's stock to an all-time high in 2024, and it is likely to be the core catalyst for new highs from 2025 onward. Amid the AI infrastructure frenzy, with global enterprises investing heavily in large-scale new or expanded data centers, including the "Stargate" project, demand for storage chips, especially HBM, is growing rapidly alongside demand for AI GPUs. Bank of America Corp's recent research report predicts that capital expenditures of hyperscale data center operators, after a significant increase in 2024, will grow 34% year-on-year in 2025 to $257 billion.
In 2025, the demand for storage chips is expected to continue to surge! Wall Street is bullish on Micron's future stock price.
Based in Idaho, Micron Technology, Inc. has been favored by Wall Street investment institutions since 2024 because its high-bandwidth memory (HBM) products meet the explosive storage demand created by large-scale data-processing workloads. Micron is also a leader in AI infrastructure amid the wave of artificial intelligence sweeping the global technology industry, which is why many Wall Street institutions have set optimistic price targets for the stock.
Wall Street investment bank Goldman Sachs Group, Inc. released a research report stating that strong enterprise demand for generative artificial intelligence (Gen AI) has driven higher shipments of AI servers and higher HBM density per AI GPU. The bank significantly raised its total addressable market estimate for HBM, expecting the market to expand at roughly a 100% compound annual growth rate (CAGR) from $23 billion in 2023 to $302 billion in 2026. Goldman Sachs predicts the HBM market will remain in short supply in the coming years, benefiting major players SK Hynix, Samsung, and Micron.
The emergence of DeepSeek-R1 and the recent open-sourcing of much of its underlying code, with far-reaching implications for AI training/inference, have sparked an "efficiency revolution" in AI training and inference, pushing future AI model development toward "low cost" and "high performance" rather than training large models by simply burning money. Note, however, that as AI applications such as DeepSeek penetrate industries worldwide, generating massive demand for AI inference computing power, the prospects for AI compute infrastructure, including AI GPUs, HBM storage systems, enterprise-level SSDs, and networking and power infrastructure, remain vast.
HBM storage systems, one of the key AI hardware components Micron sells, along with the broad range of enterprise-grade DRAM and NAND storage products required for AI infrastructure, are benefiting greatly from this unprecedented wave of AI spending. Heavyweight AI applications like ChatGPT and Sora, driven by AI chip leader NVIDIA Corporation's H100/H200/GB200 AI GPUs, all depend on HBM; with market demand for NVIDIA's full line of AI GPUs almost insatiable, NVIDIA Corporation has become the world's most valuable chip company. HBM storage systems are closely integrated with all of its AI GPU products, delivering data quickly to help develop and operate large AI models.
Wall Street giant Citigroup reiterated its "buy" rating on Micron ahead of the results, with a 12-month price target of up to $150, optimistic that Micron will continue to benefit from surging HBM demand under the AI trend as well as a broader recovery in storage chip demand.
According to the latest forecast from World Semiconductor Trade Statistics (WSTS), the global semiconductor market is expected to keep growing in 2025 from its 2024 base, expanding about 11.2% on top of 2024's already strong recovery to a global market size of roughly $697 billion.
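Those two figures imply a 2024 base of roughly $627 billion (a back-of-envelope inference from the numbers quoted here; WSTS's own 2024 figure is not cited in this article):

```python
# Implied 2024 market size from the 2025 forecast and growth rate quoted above.
forecast_2025_bn = 697.0   # forecast 2025 global semiconductor market, $bn
growth_2025 = 0.112        # forecast 2025 year-on-year growth (11.2%)
implied_2024_bn = forecast_2025_bn / (1 + growth_2025)
print(f"Implied 2024 market size: ~${implied_2024_bn:.0f} billion")  # ~$627 billion
```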
WSTS predicts that 2025 semiconductor market growth will be driven mainly by strong demand for enterprise storage chips supporting AI training/inference, along with significant growth in artificial intelligence logic chips. The overall logic chip market, including CPUs, GPUs, and ASIC chips, is expected to grow about 17% in 2025, while the storage chip market, covering HBM and enterprise-level SSDs, is expected to grow more than 13%, following roughly 81% growth in 2024. WSTS also predicts that all other chip segments, such as discrete devices, optoelectronics, sensors, MCUs, and analog chips, will grow at single-digit rates.