Everyone's obsessed with Nvidia and I can’t blame them. Up 180% last year, printing money, Jensen Huang's leather jacket, all of it. But the real bottleneck in AI isn't compute. It's memory.

And Micron is the best way to play it.

First, if you are skeptical, I get it. Micron's a cyclical memory company that's made you cry before (2022-2023 was brutal), trading at 28x forward earnings after running up 247% in 2025. This should be the top, right? Time to take profits and run?

Wrong. Dead wrong, actually.

This isn't a normal memory cycle. Morgan Stanley calls it "entering uncharted territory" - we're seeing a 2018-style shortage forming except earnings are already at record levels before the shortage really kicks in. DDR5 spot pricing has tripled since November. DRAM prices are expected to surge another 40-50% through mid-2026. And Micron just announced it can only meet 50-67% of demand from its biggest customers because supply is that tight.

This is the memory supercycle some think already happened - but the data suggests that we're just getting started.

The Memory Company

Let me break down what Micron actually does, because this matters for understanding why this time is structurally different.

Micron makes two types of memory chips: DRAM (think short-term memory, like RAM in your computer) and NAND (long-term storage, like SSDs).

They're one of only three companies on the planet that can make both at scale - Samsung and SK Hynix are the other two. That's it. The three form an oligopoly with massive barriers to entry, because building a modern memory fab costs $15-20 billion and takes 3-4 years.

Micron also makes a specialized version of DRAM called High-Bandwidth Memory - HBM for short. If Nvidia's GPUs are the brain doing all the AI thinking, HBM is the working memory right next to it that feeds the brain fast enough to actually do its job. Without HBM, these AI chips are like having a Ferrari engine with a garden hose for a fuel line.

The numbers from Q4 fiscal 2025 (reported December 17, 2025) show how fast this shift is happening:

  • Total revenue: $13.6 billion, up 57% year-over-year
  • Cloud Memory segment (this is the AI stuff): $5.3 billion, up 100% year-over-year
  • Gross margin: 56%, up from 38% the prior quarter and 28% a year ago
  • Operating margin: 47% - software-like margins for a hardware business
  • EPS: $4.78, up 167% from $1.79 a year ago

And then they guided Q1 even higher: $18.7 billion in revenue, 68% gross margins, and $8.19 in EPS. That's not a typo. Sixty-eight percent gross margins - a twelve-point jump in one quarter.

But the biggest takeaway to me was what CEO Sanjay Mehrotra said on the most recent earnings call in December: "We have completed agreements on price and volume for our entire calendar 2026 HBM supply, including Micron's industry-leading HBM4." Entire. 2026. Supply. Sold out.

They literally can't make enough chips to meet demand, and they've locked in pricing for the next 12 months at levels that are 20-25% higher than 2025. This is the memory equivalent of signing long-term oil contracts at $120/barrel when everyone else is scrambling for spot market supply at $150.

In terms of the competitive landscape that Micron operates in, SK Hynix dominates HBM with 60% market share (they had first-mover advantage partnering with Nvidia early), Samsung has 35%, and Micron's got 11%. But - and this is critical - demand is so far ahead of supply that market share doesn't matter as much as it normally would. When you can only fulfill 50-67% of orders from hyperscalers like Microsoft and Amazon, being the third-largest player in a sold-out market is still an incredible position.

Three Reasons This Memory Shortage Is Unique

Catalyst #1: The HBM Market Is Doubling Every 18 Months Through 2028

Most of Wall Street is still pricing Micron like it's a normal memory company bouncing back from a downcycle. But in my view, HBM demand is more structural than they think.

Micron's projecting the HBM Total Addressable Market (TAM) will grow at a 40% CAGR through calendar 2028, hitting $100 billion. That's up from $35 billion in 2025. They moved this timeline up by two years from their prior estimate. Why? Because every major tech company has realized it can't compete in AI without its own massive compute infrastructure. And compute needs memory - lots of it.

Microsoft's spending $80 billion on capex in 2025.

Amazon's at $75 billion (up from $62B).

Google's spending $50 billion.

Meta's at $15 billion.

That's $220 billion of combined hyperscaler capex, and roughly 25-30% of that is going directly to memory subsystems.
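Running the capex numbers above (the 25-30% memory share is the article's estimate, not a disclosed figure):

```python
# Hyperscaler capex figures quoted above, in billions of dollars.
capex = {"Microsoft": 80, "Amazon": 75, "Google": 50, "Meta": 15}

total = sum(capex.values())
print(f"combined capex: ${total}B")  # $220B

# Roughly 25-30% of that is attributed to memory subsystems.
low, high = total * 0.25, total * 0.30
print(f"implied memory spend: ${low:.0f}B-${high:.0f}B")  # $55B-$66B
```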

The critical distinction that makes HBM different from traditional DRAM is that it consumes 3x the wafer capacity per gigabyte compared to standard memory. So when Samsung, SK Hynix, and Micron allocate fab space to HBM production, they're cannibalizing their conventional DRAM output at a 3:1 ratio.

This creates what many are calling a "zero-sum game" in the memory market. Every wafer allocated to an HBM stack for an Nvidia H200 GPU is a wafer denied to the module in a consumer laptop or the chip in a smartphone. The result? Industry-wide DRAM supply growth is projected at just 16% year-over-year in 2026 - well below the historical average of 20-25%.

And demand? Growing at 30-35% annually for AI infrastructure alone, not counting traditional PC/mobile/server refresh cycles.

The math doesn't work. Supply can't catch up to demand because the production technology doesn't scale linearly.
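A toy model makes the squeeze concrete. The inputs here (wafer growth, HBM wafer share) are illustrative assumptions of mine, not industry disclosures - but with ~20% wafer output growth and ~5% of wafers shifted to HBM at the 3:1 penalty, it lands right around the 16% bit-supply growth cited above:

```python
def effective_bit_growth(wafer_growth, hbm_share, hbm_wafer_penalty=3.0):
    """Approximate DRAM bit-supply growth after reallocating wafers to HBM.

    wafer_growth      -- industry wafer-output growth (0.20 = 20%)
    hbm_share         -- fraction of next year's wafers shifted to HBM (assumed)
    hbm_wafer_penalty -- wafers consumed per gigabyte vs. standard DRAM (~3x)
    """
    total_wafers = 1.0 + wafer_growth
    hbm_wafers = total_wafers * hbm_share
    # HBM yields only 1/penalty the bits per wafer of conventional DRAM
    bits = (total_wafers - hbm_wafers) + hbm_wafers / hbm_wafer_penalty
    return bits - 1.0

print(f"{effective_bit_growth(0.20, 0.05):.0%}")  # ~16% bit-supply growth
```

Push `hbm_share` higher and bit growth falls further - that's the zero-sum dynamic in one function.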

Morgan Stanley analyst Joseph Moore put it bluntly in his December 2025 note: "We are entering uncharted territory, as we have a 2018-style shortage forming but from a much higher EPS starting point." His team forecasts $25 EPS in calendar 2026 earnings for Micron. At the current stock price of ~$385, that's a forward P/E of 15.4x, cheap for a company with this growth profile.
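The multiple math checks out:

```python
price = 385.0    # approximate share price cited in the note
eps_2026 = 25.0  # Morgan Stanley's calendar-2026 EPS forecast

forward_pe = price / eps_2026
print(f"forward P/E: {forward_pe:.1f}x")  # 15.4x
```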

Catalyst #2: Pricing Power Is Accelerating, Not Decelerating

Contract pricing (which is what enterprise customers pay) is up over 100% as well. Samsung just raised prices for its 32GB DDR5 modules from $149 to $239 - a 60% increase - in September alone. And according to Korea Economic Daily reporting in early January 2026, Samsung and SK Hynix are planning to raise server memory prices by an additional 60-70% this quarter.

To be clear, this means DRAM prices could effectively double from their 2025 levels by mid-2026.

Now, Micron doesn't disclose exact pricing (none of the memory companies do), but we can infer from their guidance. Q4 FY25 saw DRAM average selling prices (ASPs) up in the mid-30% range. Q1 guidance implies another 20-25% increase, based on analyst estimates.

On the most recent earnings call, CFO Mark Murphy said DRAM inventory days are "tight and below 120 days." For context, healthy inventory levels in the memory industry are typically 130-140 days. Below 120 days means channel inventory is depleted: distributors and OEMs are running lean and can't buffer against supply shocks.

When inventory is this tight and demand is this strong, suppliers have extraordinary pricing power. Think about oil in 2007-2008 or housing in 2021. Except in this case, the supply constraint isn't temporary, it's structural because of the HBM reallocation I mentioned above.

The bull case here is simple: If Micron can sustain even 60% gross margins or better and revenue grows to $80-85 billion annually by fiscal 2027, you're looking at $35-40 in annual earnings per share. At 20x forward earnings, reasonable for a company with these returns on capital and this market position, that's a $700-800 stock price.
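Spelled out (the EPS range and the 20x multiple are the article's scenario inputs, not guidance):

```python
multiple = 20          # assumed bull-case forward P/E
current_price = 385.0  # approximate share price today

for eps in (35, 40):   # bull-case fiscal-2027 annual EPS range
    target = eps * multiple
    upside = target / current_price - 1
    print(f"${eps} EPS x {multiple} = ${target} ({upside:.0%} upside)")
```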

Am I saying it'll definitely get there? No. But the upside case isn't as crazy as it sounds.

Catalyst #3: The Competition Is Just As Constrained As Micron

One thing that always kills memory companies in the good times? Competitors flooding the market with supply and crashing prices. Remember 2018-2019? Exactly.

But that's not happening this cycle. And there's a reason why.

SK Hynix reported in their October 2025 earnings call that HBM, DRAM, and NAND capacity is "essentially sold out" for 2026. Samsung warned in January 2026 that memory shortages will persist "well beyond 2026" and announced plans to expand HBM production capacity by 50% (but that's not coming online until 2027-2028). Even Micron's raising fiscal 2026 capex from $18 billion to $20 billion specifically to add HBM and advanced DRAM capacity (but remember, that's 2027 revenue at the earliest).

New fab construction takes 3-4 years from groundbreaking to production. You can't just flip a switch and add supply.

This is the first memory cycle where all three major players (Samsung, SK Hynix, Micron) are demonstrating pricing discipline. Why? Because they all got absolutely murdered in the 2022-2023 downcycle when they kept production running full-tilt into collapsing demand. Samsung posted a $15 billion operating loss. SK Hynix laid off workers. Micron slashed capex.

They learned their lesson. The three companies implemented "strategic production cuts" in late 2023 to stabilize the market before AI demand exploded. They're not repeating the mistakes of past cycles by rushing to add capacity and destroying pricing.

Plus, U.S.-China trade tensions mean Samsung and SK Hynix have halted sales of older chipmaking equipment to Chinese manufacturers, effectively capping China's ability to add competing capacity. Meanwhile, U.S.-based Micron is getting government support under the CHIPS Act, and building a massive fab in upstate New York with $6.1 billion in federal subsidies. That's not coming online until 2028, but it positions Micron as the domestic supplier for U.S. government and defense applications, which is another growth story entirely.

Ultimately, the supply response is delayed by 2-3 years, which means this pricing environment has legs well into 2027.

The Bear Case

Look, I'm bullish on Micron. That's obvious. But I'd be lying if I said there weren't real risks here that could blow up this thesis.

Risk #1: AI Capex Slowdown

What keeps me up at night? A scenario where hyperscalers realize they've overbuilt AI infrastructure before finding sustainable ROI, and suddenly the $220 billion in annual capex gets slashed by 20-30%.

We've seen this before. The cloud computing buildout from 2014-2016? Capex grew 40% annually, then hit a wall when utilization rates couldn't keep up with capacity. DRAM prices crashed 50% in 2018-2019.

Could it happen again? Yes.

Microsoft, Amazon, Google, and Meta are collectively spending a quarter trillion dollars on data centers, chips, and networking. That's based on the assumption that AI applications will generate enough revenue to justify the investment. If enterprise adoption stalls, if ChatGPT Enterprise and Claude Pro subscriptions don't scale as fast as projected, or if AI agents don't deliver the productivity gains everyone's expecting - those capex budgets get re-evaluated fast.

If hyperscaler capex gets cut by 20% (from $220B to $176B), Micron's data center revenue growth goes from 50% annually to maybe 10-15%. At current valuations, a growth deceleration like that would undoubtedly trigger multiple compression. That would send the stock price under $300 in a hurry.

What's the probability of that happening? I'd put it at 20-25% over the next 18 months. The base case (60% probability) is continued 30-40% growth as AI monetization improves. Bull case (15% probability) is acceleration if models like GPT-5 or Claude 4 require even more compute than expected.

Risk #2: Samsung and SK Hynix Catch Up Faster Than Expected

Right now, Micron's got 11% HBM market share versus SK Hynix's 60%. That gap could narrow or widen dramatically depending on production ramps and technology leadership.

SK Hynix has a structural advantage, since they were first to market with HBM3E and have the tightest relationship with Nvidia (rumors are Nvidia accounts for 90% of SK Hynix's HBM sales). Samsung's been playing catch-up but they've recently cleared qualification hurdles for 12-layer HBM3E with major customers.

If Samsung suddenly ramps production ahead of Micron in late 2026 and captures a bigger share of Nvidia's next-gen Rubin platform orders, Micron's growth story gets pressured. They'd still benefit from overall market growth as one of the three players, but the premium valuation requires them to at least maintain share if not gain it.

Micron's counter-argument to that is technology leadership. They claim to have "industry-leading HBM4" and their 1-gamma DRAM node is ahead of competitors on power efficiency. But this is memory, and technology advantages are fleeting. Obviously Samsung and SK Hynix have much bigger R&D budgets and more capacity to throw at the problem.

Risk #3: You (The Investor)

Real talk. Can you handle a 35% drawdown?

Because it's coming. Maybe not this quarter, maybe not in 2026, but at some point Micron's going to pull back hard. Memory stocks are cyclical by nature, and even structural growth stories have 20-30% corrections on macro fears, Fed policy shifts, or earnings misses.

If you bought at $385 and it drops to $250 on some temporary negative catalyst, are you selling at the bottom? Or are you holding (or better yet, adding) because the long-term thesis is intact?

The best investment you can't stick with is worse than a mediocre one you can hold for five years. If this is 5% of your portfolio and that’s keeping you up at night, then trim it down. There's no shame in position sizing for your personal risk tolerance rather than some theoretical optimal allocation.

What I think

At $385, Micron's trading at roughly 15x forward earnings ($385 divided by ~$25 in calendar 2026 EPS). For 40-50% expected earnings growth over the next two years, that's a PEG ratio of roughly 0.3 - deeply undervalued if the growth materializes.

Even if growth slows to 25% (a bear case scenario), you're still around 0.6x PEG, which is reasonable for a company with Micron's improving returns on invested capital.
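Same math in code (PEG = forward P/E divided by the growth rate, using the ~$25 EPS and $385 price from earlier):

```python
price, eps = 385.0, 25.0
pe = price / eps   # ~15.4x forward

peg_base = pe / 45  # midpoint of 40-50% expected growth
peg_bear = pe / 25  # bear case: growth slows to 25%

print(f"base PEG: {peg_base:.2f}, bear PEG: {peg_bear:.2f}")  # 0.34 and 0.62
```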

Who this is for: Investors with 2-3 year time horizons who can handle volatility and understand that memory stocks can drop 30% on macro fears even when fundamentals are solid. This is not a "sleep well at night" stock - it's a high-conviction bet that the AI infrastructure buildout has years of runway and memory is a key bottleneck.

Who this isn't for: Conservative investors, anyone who panics on 15% drawdowns, people who need the money in 12 months.

What to Watch Next

Key dates:

  • March 19, 2026: Q1 fiscal 2026 earnings (watch for margin sustainability and Q2 guidance)

  • June 18, 2026: Q2 fiscal 2026 earnings (critical to see if HBM revenue growth stays at 40%+ or starts decelerating)

  • Throughout 2026: Hyperscaler earnings calls (Microsoft, Amazon, Google, Meta) for capex guidance

Metrics I'm tracking:

  • Gross margin: Needs to stay above 60%. If it drops below 58%, pricing power is weakening

  • HBM revenue growth: Should be at least 40% year-over-year. If it decelerates below 30%, the supercycle thesis is weakening

  • DRAM ASP trends: TrendForce publishes monthly spot pricing data. If DDR5 prices stop rising or start declining, that's an early warning signal

  • Hyperscaler capex: If any two of the big four cut guidance in the same quarter, I'm trimming

What changes my mind:

  • Microsoft, Amazon, Google, or Meta cut capex by 15% or more in quarterly guidance

  • Gross margin falls below 55% for two consecutive quarters

  • SK Hynix or Samsung announce major HBM capacity additions that come online in late 2026

  • China's CXMT (their domestic memory company) successfully ramps HBM production and disrupts pricing

SK Hynix and Samsung both report earnings in late January 2026. I'll be watching their commentary on HBM supply-demand balance and 2026 capex plans. If either one sounds less optimistic than Micron did in December, that's a yellow flag.

The Bottom Line

Micron's not a traditional memory play anymore. This is the most severe DRAM shortage in 30 years + pricing power is accelerating + the entire 2026 HBM supply sold out + there are only three suppliers globally = structural earnings power that's just beginning to be recognized.

Could I be wrong? Absolutely. AI capex could decelerate faster than expected. Samsung could ramp production and flood the market. Macro could roll over and take everything down 20%. That's why I've got a stop at $325 and I'm not betting the farm.

But the risk-reward at $385? I'll take it. You're risking $1 to make $2 over the next 12-18 months, with a company that's demonstrating pricing power, margin expansion, and structural market position advantages that are rare in the semiconductor space.

This isn't about catching a cyclical bounce. It's about recognizing that the memory market just fundamentally changed, and Micron's in the catbird seat for the coming years of AI infrastructure buildout.

The Earnout Investor provides analysis and research but DOES NOT provide individual financial advice. Jamie Dejter may have a position in some of the stocks mentioned. All content is for informational purposes only. The Earnout Investor is not a registered investment, legal, or tax advisor, or a broker/dealer. Trading any asset involves risk and could result in significant capital losses. Please, do your own research before acquiring stocks.

Subscribe to the Earnout Investor Free Newsletter!
