NAND rides a super-cycle windfall! The tidal wave of AI agents is driving explosive storage demand, and the market is betting that SanDisk's (SNDK.US) results will surge.

10:01 28/04/2026
GMT Eight
Even war in the Middle East cannot derail the "AI bull market" narrative! GPUs no longer monopolize the computing-power story: the wave of AI agents has ignited demand for CPUs and storage.
As of Monday's U.S. close, SanDisk (SNDK.US), the global leader in SSD storage products, extended its remarkable run since 2025, finishing up more than 8%. The stock has soared over 350% since the start of 2026, far outpacing global benchmark indices, lifting its market value to nearly $160 billion; over the whole of 2025 it gained a staggering 580%. Fueled this year by the near-insatiable demand for storage chips unleashed by the AI infrastructure boom, it continues an epic climb worthy of a "global storage-chip legend," firmly establishing itself as a super-bull stock within the AI computing-infrastructure theme.

SanDisk reports fiscal third-quarter 2026 results after the close on Thursday, April 30. Ahead of the much-anticipated release, several Wall Street heavyweights, including Morgan Stanley, have raised their target prices and earnings-per-share estimates for SanDisk, underscoring the market's confidence that a strong beat could push the stock to new highs.

In a note published Monday, the Morgan Stanley team led by senior analyst Joseph Moore wrote: "Third-party forecasts currently show the overall NAND average selling price (ASP) rising about 90% in the first quarter and an expected 70% to 75% in the second quarter, with large data-center customers, and the eSSD segment in particular, likely to show relatively stronger performance than NAND storage peers." Morgan Stanley also raised its SanDisk target price from $690 to $1,100 and maintained its Overweight rating.
The analysts, including Moore, wrote: "We remain very positive on the stock, but the market fully understands its recent strength; it may take time to gain conviction in how long that strength lasts. For us, DRAM remains the more serious bottleneck for AI growth, but NAND will follow closely behind, and longer-term prepayments from data-center customers should already be addressing the issue. And NAND capital spending remains lower than DRAM's, so supply expansion is not on track to catch up with demand."

On earnings, Morgan Stanley lifted its fiscal 2026 EPS estimate for SanDisk from $41.09 to $53.24, its 2027 estimate from $82.73 to $155.06, and its 2028 estimate from $91.92 to $134.13. For the fiscal third quarter of 2026, consensus calls for adjusted EPS of about $14.55 and GAAP EPS of $13.82, on revenue of roughly $4.72 billion, up from just about $1.7 billion in the same period a year earlier.

Another well-known Wall Street firm, Melius, whose analyst team is led by star analyst Ben Reitzes, initiated coverage of SanDisk with a Buy rating and a $1,350 target price; the stock closed Monday around $1,070. The Melius analysts believe the artificial-intelligence boom will keep driving storage demand through the end of this decade, leaving further upside for both SanDisk and Micron Technology.
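The consensus and target-price figures above can be sanity-checked with simple arithmetic. A minimal sketch, using only the numbers cited in the article as inputs (the results are implied ratios, not forecasts):

```python
# Quick consistency checks on the figures cited above.
# All inputs come from the article; nothing here is a new estimate.

q3_rev_est = 4.72e9   # consensus revenue estimate, fiscal Q3 2026 (USD)
q3_rev_prior = 1.7e9  # same quarter a year earlier (USD)

yoy_growth = q3_rev_est / q3_rev_prior - 1
print(f"Implied YoY revenue growth: {yoy_growth:.0%}")  # roughly 178%

close = 1070.0          # Monday's approximate close (USD)
ms_target = 1100.0      # Morgan Stanley target price
melius_target = 1350.0  # Melius target price

print(f"Upside to Morgan Stanley target: {ms_target / close - 1:.1%}")
print(f"Upside to Melius target: {melius_target / close - 1:.1%}")
```

The spread between the two targets (roughly 3% versus 26% upside from Monday's close) is what makes the Melius initiation notably more bullish than Morgan Stanley's raise.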
According to data from market research firm Counterpoint Research, the storage market has entered a "super-bull" or "super-cycle" phase, with supply-demand dynamics and price trends far surpassing the historical peak of the 2018 cloud-computing boom. DRAM and NAND storage chips have stayed on a steep growth trajectory throughout 2026. TrendForce's latest storage-price survey estimates that overall conventional DRAM prices will rise 58%-63% in the second quarter of 2026, following an estimated 93%-98% increase in the first quarter. The NAND Flash market remains dominated by AI training/inference and broad data-center demand, with price increases across the entire product line showing no sign of fading; overall second-quarter contract prices are expected to rise 70%-75% on top of the previous quarter's increase of nearly 100%.

With the rapid emergence of super AI agents such as Anthropic's Claude Cowork and OpenClaw, which can carry out tasks autonomously, the AI-agent wave is sweeping the globe. The bottleneck in AI computing architecture is shifting from GPU cores, whose job is matrix multiply-add throughput, to data-center CPU cores, which handle control flow, task scheduling, and memory/IO coordination, leaving high-performance CPUs for hyperscale AI data centers in severe shortage. In the eyes of financial giants like Morgan Stanley, the AI investment narrative is shifting from "competition over single-point AI GPU/ASIC compute" to "AI-agent-driven full-stack AI systems." In this transition, data-center CPUs and storage chips may be the biggest winners.
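The quarter-on-quarter increases above compound. A minimal sketch of the cumulative first-half price multiple, using the article's TrendForce ranges as inputs (the "nearly 100%" Q1 NAND figure is taken as exactly 100% for illustration):

```python
# Compound the quarter-on-quarter contract-price increases cited from
# TrendForce. Each range is (low, high) fractional increase per quarter.

def cumulative_range(q1, q2):
    """Multiply low/high QoQ increases into a cumulative price multiple."""
    low = (1 + q1[0]) * (1 + q2[0])
    high = (1 + q1[1]) * (1 + q2[1])
    return low, high

dram = cumulative_range((0.93, 0.98), (0.58, 0.63))
nand = cumulative_range((1.00, 1.00), (0.70, 0.75))  # Q1 taken as +100%

print(f"Conventional DRAM, cumulative H1 2026 multiple: {dram[0]:.2f}x-{dram[1]:.2f}x")
print(f"NAND contract price, cumulative H1 2026 multiple: {nand[0]:.2f}x-{nand[1]:.2f}x")
```

In other words, under these ranges a DRAM contract price would roughly triple, and a NAND contract price would reach about 3.4x-3.5x its level of two quarters earlier, which is what "super cycle" means in concrete terms.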
As the Korean market's benchmark, the KOSPI composite index, heavily weighted toward Samsung and SK Hynix, is hitting record highs despite deteriorating geopolitical tensions, and the Taiwanese market has also reached new highs on the strength of Taiwan Semiconductor Manufacturing Co., the acknowledged king of contract chip manufacturing. The Philadelphia Semiconductor Index, the so-called "chip-stock barometer," has logged 17 consecutive gains, further cementing investors' conviction that the "AI computing-power investment theme" overrides all market noise.

Whether it is the vast TPU compute clusters led by Alphabet or NVIDIA's massive AI GPU clusters, all of them rely on HBM storage systems integrated with the AI chips. Beyond HBM, tech giants like Alphabet and OpenAI must also buy server-grade DDR5 memory and enterprise-grade high-performance SSD/HDD storage to accelerate the construction or expansion of AI data centers.

At the hardware level, AI computing is constrained not only by raw compute but by "data transport capability." Whether on NVIDIA GPUs or TPU systems, the efficiency of large-model training and inference is determined not just by the number of Tensor Cores or matrix units, but by how many weights, KV-cache entries, activations, and intermediate tensors can be fed into the compute cores each second. At the intersection of semiconductors and AI data-center infrastructure, DRAM/NAND storage chips are "perfectly positioned" for the AI boom: they benefit from the expansion of both training and inference, and act as a "universal toll booth" across platforms, architectures, and ecosystems.
As the AI era shifts from training-dominated workloads to those dominated by inference, agents, long contexts, and retrieval augmentation, demand at the capacity, bandwidth, power-efficiency, and data-persistence layers will only intensify. Morgan Stanley forecasts that by 2030 an additional 15 to 45 EB of DRAM demand will be created, equivalent to 26% to 77% of the industry's projected annual supply in 2027.

The Morgan Stanley team stresses that storage chips have become one of the most cash-generative layers of the AI computing stack, with components such as host-level DRAM, memory-interface chips, and CXL expansion and tiered-storage systems playing crucial roles in supporting long-term value. Storage chips are no longer just a capacity-configuration option; they have become core components that directly determine the efficiency and throughput of AI systems. However strong GPUs/TPUs are on raw compute, without HBM supplying bandwidth and efficiency, and without enterprise-grade NAND and high-capacity HDDs holding training checkpoints, vector databases, and inference data lakes, the utilization and efficiency of an AI infrastructure cluster cannot be optimized. That is why global capital markets are willing to assign higher valuations to the storage chain: it benefits from the triple leverage of rising volumes, rising prices, and long-term supply constraints, rather than relying on shipment growth alone.

AI is igniting a "storage super-cycle": from HBM to NAND, shortages are everywhere, and storage giants like SanDisk and Micron Technology sit at the forefront of this super trend. Wall Street's core assessment of storage supply-demand dynamics is that the mismatch between DRAM/NAND supply and demand may persist until around 2028.
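The two ends of Morgan Stanley's range (15 EB at 26%, 45 EB at 77%) imply a single underlying estimate of 2027 annual DRAM supply. A quick check that the cited percentages are internally consistent:

```python
# Back out the 2027 annual DRAM supply implied by Morgan Stanley's range:
# 15 EB of extra demand = 26% of supply, 45 EB = 77% of supply.

implied_low = 15 / 0.26   # EB, from the low end of the range
implied_high = 45 / 0.77  # EB, from the high end

print(f"Implied 2027 DRAM supply: {implied_low:.1f} EB (low end), "
      f"{implied_high:.1f} EB (high end)")
# Both ends land at roughly 58 EB, so the percentages are consistent
# with a single supply assumption of about 58 EB per year.
```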
The market still underestimates storage-chip makers' profit trajectory in this super cycle. That is why SanDisk's stock has surged since the start of 2026 and been re-rated to an extreme degree over the past year: this is not mere speculative froth but the market repricing "AI-driven storage pricing power." Storage chips are being redefined, from traditional "deeply cyclical commodities" to critical assets sitting at the bottleneck of AI infrastructure.

SK Hynix's recent results are almost a textbook sample of this super cycle: first-quarter revenue of about 52.6 trillion won, up 198% year-on-year and about 60% quarter-on-quarter, and operating profit of about 37.6 trillion won, up 405% year-on-year and about 96% quarter-on-quarter. It was the first time the company topped 50 trillion won in single-quarter revenue, with an operating margin as high as 72%, and management stressed that demand for AI-related HBM still far exceeds capacity. Samsung Electronics likewise announced that first-quarter operating profit could reach 57.2 trillion won, roughly an eightfold increase year-on-year, with the core driver being the AI data-center construction boom pulling demand for DRAM/HBM/NAND.

In other words, AI servers need not only GPUs but also HBM for high-bandwidth near-memory, DDR5/LPDDR5 for system memory, and NAND/eSSD for data lakes and inference caching. Storage has gone from "complementary component" to the throughput bottleneck of AI factories, and SanDisk's position makes it the high-beta play on the re-rating of NAND/SSD.
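The SK Hynix figures above can be cross-checked against one another. A minimal sketch using the article's approximate won amounts (rounding in the inputs means the results land near, not exactly on, the cited percentages):

```python
# Cross-check the SK Hynix figures cited above for internal consistency.
# Revenue and operating profit are the approximate amounts from the article.

revenue_t_won = 52.6     # trillion won, Q1 revenue
op_profit_t_won = 37.6   # trillion won, Q1 operating profit

margin = op_profit_t_won / revenue_t_won
print(f"Implied operating margin: {margin:.1%}")  # about 71.5%, in line with the ~72% cited

# Back out the year-earlier revenue implied by the stated 198% YoY growth.
prior_year_rev = revenue_t_won / (1 + 1.98)
print(f"Implied year-earlier revenue: {prior_year_rev:.1f} trillion won")
```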
Morgan Stanley raised SanDisk's target price from $690 to $1,100 and sharply lifted its 2026-2028 earnings forecasts; the core logic is continued NAND ASP increases, strong AI and enterprise-SSD demand, and restrained NAND capital expenditure that keeps supply from catching up with demand. Melius went further, setting bullish targets of $1,350 for SanDisk and $700 for Micron, on the view that AI is transforming storage demand from short PC/mobile-driven cycles into long-term infrastructure demand driven by cloud providers, AI servers, intelligent agents, and physical AI. The market is willing to award higher valuations to storage giants like SanDisk and Micron because it believes they can evolve from "cyclical manufacturers" into "AI infrastructure suppliers" underpinned by long-term supply agreements, prepayments, and high-certainty cash flows.

Moreover, the investment logic is broadening from pricing in an "HBM premium" alone to pricing in shortages across conventional DRAM/NAND. Conventional DRAM margins are even starting to surpass HBM margins: with DDR5, LPDDR5, enterprise SSD, and NAND prices still soaring, storage-chip makers may be unwilling to pour all incremental capital expenditure into HBM-style advanced packaging, which is extremely complex, costly, and of limited scalability, and may instead rebalance capital returns among HBM, conventional DRAM, NAND, and advanced packaging. In other words, while HBM remains the star asset for AI training and large-scale inference workloads, the bigger-than-expected upside now comes from the entire storage pool being in shortage, especially high-performance DRAM and eSSDs used widely for AI inference, whose demand and price potential are likely far from fully realized.
Large cloud providers, major AI ASIC customers, and server makers are indeed pushing for long-term agreements to lock in large supplies of HBM, DDR5, and eSSDs over the coming years. But in an environment where spot and contract prices rise sharply every quarter and supply may stay tight through 2028 or even 2030, storage-chip makers have every incentive to retain quarterly repricing rights or to demand prepayments, price-reset clauses, and stronger return guarantees. The end result may be that the storage bull market is no longer just about "NVIDIA driving up HBM prices" but has evolved into a super cycle running until around 2030, propelled together by the AI capital-expenditure wave, limited wafer-capacity expansion, bottlenecked advanced packaging, cloud-provider stockpiling, and price hikes across conventional storage.