This morning, Micron Technology (NASDAQ:MU) released its fiscal fourth-quarter 2025 earnings, delivering a blockbuster performance that solidifies its status as a linchpin in the AI revolution.
Revenue rocketed 46% year-over-year to a record $11.32 billion, surpassing Wall Street’s $11.22 billion estimate, while adjusted EPS hit $3.03 against the expected $2.86. GAAP net income soared to $3.20 billion, capping a fiscal year in which total revenue reached $37.38 billion — up dramatically from $25.11 billion in fiscal 2024.
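For readers who want to sanity-check the growth math, here is a quick sketch using only figures quoted in the release; the prior-year quarterly base is implied by the stated 46% growth rate rather than taken directly from Micron’s filings.

```python
# Sanity-check of the year-over-year growth figures quoted above.
# All dollar amounts in billions; inputs come from the earnings release.
q4_fy2025_revenue = 11.32
yoy_growth = 0.46  # 46% as reported

# Prior-year (fiscal Q4 2024) revenue base implied by the stated growth rate
implied_q4_fy2024 = q4_fy2025_revenue / (1 + yoy_growth)
print(f"Implied Q4 FY2024 revenue: ${implied_q4_fy2024:.2f}B")

# Full-year growth: $37.38B in fiscal 2025 vs. $25.11B in fiscal 2024
fy2025, fy2024 = 37.38, 25.11
full_year_growth = (fy2025 - fy2024) / fy2024
print(f"Full-year revenue growth: {full_year_growth:.1%}")
```

The implied quarterly base works out to roughly $7.75 billion, and the full-year figures translate to revenue growth of just under 49%.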
The standout story was the data center segment, which exploded to 56% of total revenue with blistering 52% gross margins fueled by insatiable AI demand. CEO Sanjay Mehrotra declared, “AI-driven demand is accelerating, and industry DRAM supply is tight.”
Looking ahead, MU’s fiscal Q1 2026 guidance calls for $12.5 billion in revenue — a 47% year-over-year surge — with margins holding above 50%. Yet, despite this triumph, MU shares are dipping 4% today, providing investors with a rare discount on a stock primed to dominate AI’s memory surge.
Why Micron’s Memory Empire Is Different
While Nvidia (NASDAQ:NVDA) grabs headlines as the undisputed king of AI compute with its graphics processing units (GPUs) powering everything from ChatGPT to autonomous vehicles, Micron carves a different, equally vital niche: memory.
Nvidia designs and sells the “brains” that crunch massive datasets at lightning speed, commanding sky-high 73% gross margins on its data center revenue, which hit $41.1 billion in its last quarter alone. But those GPUs are memory hogs — large language models guzzle terabytes of data, requiring ultra-fast, high-capacity storage to avoid bottlenecks.
That’s where Micron’s memory comes in. Unlike NVDA’s focus on compute silicon, MU specializes in dynamic random-access memory (DRAM) and NAND flash, the high-speed buffers and long-term storage that feed data to GPUs in real time. This complementary role makes Micron indispensable: without reliable, bandwidth-rich memory, even the mightiest GPU stalls.
MU’s portfolio spans consumer gadgets to enterprise servers, but its secret sauce is high-bandwidth memory (HBM) — stacked DRAM chips optimized for AI’s voracious needs. HBM3E, MU’s latest, consumes 30% less power than rivals while delivering blistering throughput, earning nods as a supplier for NVDA’s Blackwell GPUs and Advanced Micro Devices’ (NASDAQ:AMD) MI300 accelerators.
In short, while NVDA builds the engine, MU supplies the fuel tank — and in AI’s data deluge, that’s a trillion-dollar moat.
AI’s Memory Boom Takes Center Stage
Beyond the headline revenue, Micron’s Q4 is a watershed event. DRAM sales alone shattered records at $9 billion, up sharply as AI hyperscalers like Amazon (NASDAQ:AMZN) and Microsoft (NASDAQ:MSFT) ramped up server builds.
The cloud memory unit, which encompasses data centers, ballooned to $4.54 billion — more than triple last year’s figure — while NAND revenue hit a new high of $2.25 billion. Full-year fiscal 2025 totaled $37.38 billion in revenue with non-GAAP net income of $9.47 billion, completing the swing from recent losses back to robust profitability on AI’s tailwinds.
Data center DRAM demand, supercharged by generative AI training, drove 213% segment growth, with HBM sales ramping to “multiple billions” in fiscal 2026 after selling out 2025 allocations. Micron enjoys lush 59% margins in cloud memory, thanks to premium pricing for HBM amid tight supply.
CEO Mehrotra eyes “substantial revenue records” going forward as AI workloads broaden beyond hyperscalers to edge computing and sovereign clouds. As AI models balloon in size and frontier systems like GPT-5 demand ever-greater memory capacity, MU’s U.S.-based fabs position it to capture share from Asian rivals strained by export curbs.
The Market Is Overreacting to Nvidia’s Shadow
So why is the stock falling? Likely because of Nvidia’s recent wins, such as a splashy $100 billion pact with OpenAI and Alibaba’s (NYSE:BABA) announcement that it was incorporating NVDA’s full suite of physical AI development tools directly into its platform — news that lit up NVDA shares. Investors are probably worried MU’s “beat and raise” wasn’t explosive enough to match the compute hype.
NAND weakness, down 5% year-over-year amid softer consumer demand, sparked cyclical fears, even as executives flagged AI’s pull on NAND later in 2026. But this overlooks MU’s sold-out HBM capacity and strong revenue guidance — clear wins in a memory upgrade supercycle where supply lags demand by quarters.
Key Takeaways
Micron’s 94% year-to-date surge has been electric, yet the stock still trades at a steal. A forward P/E around 9x hands investors a deep discount on explosive growth, undervaluing a company on the cusp of AI dominance. MU is unleashing a memory renaissance, with HBM leadership fueling terabyte-scale AI infrastructure.
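For context on the valuation claim, forward P/E is simply share price divided by estimated next-twelve-month EPS. A minimal sketch, where the price and EPS inputs are hypothetical placeholders chosen to illustrate a multiple in the ~9x range the article cites, not figures from the release:

```python
# Forward P/E = current share price / consensus forward (NTM) EPS estimate.
# The inputs below are hypothetical placeholders for illustration only.
def forward_pe(price: float, forward_eps: float) -> float:
    """Price-to-earnings multiple on estimated forward earnings."""
    return price / forward_eps

# A $145 share price against $16 of estimated forward EPS
# works out to roughly a 9x multiple.
print(round(forward_pe(145.0, 16.0), 1))  # 9.1
```

The takeaway is the sensitivity: because the multiple is a simple ratio, upward revisions to forward EPS mechanically compress the P/E even if the share price holds steady.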
With the stock this cheap, don’t sleep on the opportunity. As data centers upgrade en masse, Micron will morph from chipmaker to an overwhelming AI force that promises multiyear compounding returns for investors.