OmahaLine
MU — MICRON TECHNOLOGY INC (Nasdaq)
$455.07 (+0.00%) · 52-week range: $65.65–$471.34 · as of Apr 17, 2026
Generated Apr 18, 2026

MU — Micron Technology

Micron is the fastest-growing high-bandwidth memory supplier in a genuine three-player oligopoly with $10–20 billion capital barriers, and its fiscal 2026 results — 81% gross margins, $33.5 billion guided quarterly revenue — represent the most profitable period in the company's history. The HBM supercycle has structurally improved Micron's earning power: AI's insatiable appetite for memory bandwidth has transformed a commodity DRAM producer into an indispensable AI infrastructure supplier, and the underlying shift appears durable. At $420 per share — paying roughly 70 times normalized pre-tax earnings for a business that posted a $5.8 billion net loss three years ago — there is no margin of safety, even for an excellent business.


The relationship between artificial intelligence and memory has become one of the defining industrial facts of this decade. Training a large language model requires moving enormous quantities of data between processors and memory at speeds that conventional DRAM was not designed to support. Inferring from a trained model — answering a question, generating an image, completing a code file — requires the same. The result is that every major semiconductor company building AI accelerators has converged on the same constraint: the memory interface is the bottleneck, and the product that addresses it is high-bandwidth memory. HBM stacks DRAM dies vertically using through-silicon vias, placing memory directly on the same package as the logic chip and delivering memory bandwidth that is an order of magnitude beyond what a conventional DIMM module can provide. NVIDIA's H100 GPU uses 80 gigabytes of HBM3 with 3.35 terabytes per second of bandwidth. The Blackwell successor doubles that. Every generation of AI accelerator that has shipped in the past three years has required more HBM than the one before it, and none of the architectural substitutes — CXL memory pooling, optical interconnects, processing-in-memory — solves the compute-proximate bandwidth problem that HBM was designed for.

What makes this fact financially significant is that HBM can be manufactured by exactly three companies in the world. Samsung, SK Hynix, and Micron are the only memory producers with the fabrication technology, the capital base, and the accumulated process expertise to build HBM at volume. A credible new entrant would require ten to twenty billion dollars in capital, seven or more years of process development, and the willingness to enter a market where the incumbents are already selling every unit they produce under long-term supply agreements. The economics of semiconductor fabrication do not reward new entrants: capital costs are fixed regardless of volume, yields improve only with experience, and customers who qualify a supplier spend twelve to eighteen months doing so and are reluctant to repeat the process. This is not a market with a competitive fringe. It is a duopoly that admits one additional participant.

The standard concern about memory as an industry is that it is a commodity subject to brutal price cycles. DRAM prices in 2023 fell to levels that caused all three producers to lose money simultaneously. The industry's history reads as a series of supply gluts followed by supply crunches, with producer economics oscillating accordingly. That concern is legitimate and relevant to valuation. It is not, however, the whole picture in 2026. HBM is not a commodity: it requires different manufacturing processes than standard DRAM, commands a significant price premium (roughly $10 per gigabyte versus $3 per gigabyte for standard DRAM at mid-cycle), and involves customer qualification processes that create multi-year supply relationships. The memory market has bifurcated — standard DRAM remains commodity-like, while HBM behaves more like a specialized component with customer lock-in and demonstrable technical differentiation between suppliers.

Micron's position in that bifurcation is the central investment question. The company entered 2024 as the smallest of the three HBM suppliers, having ceded early ground to SK Hynix while Samsung struggled with yield problems on its 12-stack HBM3E architecture. The yield crisis at Samsung — which kept Samsung's HBM from passing NVIDIA's qualification tests through much of 2024 and into 2025 — created an opening that Micron exploited aggressively. Micron's HBM3E achieved a 30% power efficiency advantage over competing products, a meaningful edge in data centers where power consumption has become a binding constraint. The company qualified its HBM with every major hyperscaler, ramped its Idaho and Singapore manufacturing capacity, and entered 2025 with its entire annual HBM supply contracted under fixed-price, fixed-volume agreements. By fiscal year 2025, HBM and related high-capacity data center DRAM had grown to roughly $10 billion in combined annual revenue — up from essentially zero in fiscal 2023.

The financial profile of the business in 2026 requires careful reading because the numbers in the most recent quarters are unlike anything in Micron's history. Revenue in the second fiscal quarter of 2026 (ended February 2026) reached $23.86 billion — 196% above the same quarter a year prior. Gross margin was 74.7%. GAAP net income was $13.79 billion. Third quarter guidance calls for $33.5 billion in revenue with gross margins of approximately 81% and non-GAAP earnings per share of $19.15. These are the numbers of a business experiencing an extraordinary supply-constrained pricing environment, not a business operating at its normal earning power. Context is essential: in fiscal 2023, Micron reported $15.5 billion in annual revenue, an 8% gross margin, and a net loss of $5.8 billion. The four fiscal years from 2022 through 2025 produced aggregate net income of approximately $13 billion — roughly equal to the single quarter just reported in fiscal 2026. The current quarter is not representative of what this business earns over a cycle.

Fiscal Year | Revenue | Gross Margin | HBM / AI Memory Revenue | Net Income
FY2022 | $30.8B | ~33% | – | $8.7B
FY2023 | $15.5B | 8% | – | -$5.8B
FY2024 | $25.1B | 22% | ~$2B | ~$1.5B
FY2025 | $37.4B | 40% | ~$10B* | $8.5B
Q2 FY2026 (quarterly) | $23.9B | 75% | Majority of DRAM | $13.8B
Q3 FY2026 (guided quarterly) | $33.5B | 81% | Dominant | ~$19B (est.)

*FY2025 HBM/high-capacity DRAM/LP server DRAM combined; HBM alone ran at approximately $8 billion annualized by Q4 FY2025.

The trajectory in this table contains the two most important facts about Micron. The first is that the business demonstrates the standard memory cycle: a $30.8 billion revenue year followed by a $15.5 billion revenue year with an 8% gross margin and a substantial net loss. No investor who has not absorbed this cycle should hold the stock. The second fact is that the HBM column tells a different story: from nothing in fiscal 2023 to $2 billion in fiscal 2024 to $10 billion in fiscal 2025, with the current quarterly run rate implying HBM alone at $20 billion or more annualized. The question is whether HBM's addition to the product mix has permanently elevated the earning floor — the minimum the business will earn at the bottom of the next cycle — enough to justify a higher valuation than the old DRAM-only Micron deserved. The evidence suggests the floor has risen. The stock price suggests the market has forgotten there is a floor.

The competitive position in HBM deserves a direct comparison because assertions about Micron's advantage are only as good as the data that supports them. The three-supplier HBM market has different dynamics than the broader DRAM market precisely because of the technical differentiation between products and the qualification friction between suppliers.

Supplier | Est. HBM Market Share (2026) | Key Technical Position | Current Status
SK Hynix | ~43% | First-mover, dominant NVIDIA supplier | Market leader, supply constrained
Samsung | ~33% | Scale, vertical integration, HBM4 development | Recovering from HBM3E yield problems; HBM4 in testing
Micron | ~24% | 30% power efficiency advantage, fastest-growing | 2026 supply sold out, HBM4 ramping Q2 calendar 2026
New entrant | 0% | $10–20B investment, 7+ years required | –

Samsung's yield problems with HBM3E created the window in which Micron gained share — from roughly 11% of the HBM market in early 2024 to 21–24% today. That window is closing: Samsung passed NVIDIA's HBM3E qualification in September 2025, is now in mass production, and is targeting above 30% share by the end of calendar 2026. Goldman Sachs estimated that HBM3E prices would decline approximately 28% year-over-year in calendar 2026, from roughly $15 per gigabyte to $10 per gigabyte, as Samsung's supply rejoins the market at competitive pricing. Micron's 30% power efficiency advantage is real and validated, but it is not a permanent monopoly on the highest-performance product — SK Hynix and Samsung are both developing HBM4 and HBM4E on overlapping timelines. The moat is genuine and the oligopoly is durable, but the extraordinary pricing of the current quarter reflects a temporary supply disruption as much as it reflects the permanent economics of the business.

Sanjay Mehrotra joined Micron as chief executive in 2017 after co-founding SanDisk and selling it to Western Digital. His strategic orientation at Micron has been consistently toward the data center and toward the highest-value memory products — a capital allocation decision that now appears prescient. The HBM investment program, the TSMC partnership for HBM4E base dies (targeting 2027), and the $200 billion commitment to U.S. manufacturing capacity (two Idaho fabs, a New York facility) represent a long-duration bet on AI memory demand that is increasingly validated by results. The track record includes the full cycle: the 2023 loss year, the disciplined capacity cuts in the trough, the reinvestment in HBM as prices recovered. Micron suspended its buyback program during the downturn and resumed it in August 2024 as the business recovered — the opposite of the pattern that destroys capital in cyclical businesses. Insider selling at the current peak price level is worth noting but not alarming: an EVP selling 24,000 shares at $421 in April 2026 is a small fraction of a presumably larger position, and executives selling at all-time highs in a cyclical supercycle is rational personal financial planning.

The growth runway for memory broadly — and HBM specifically — reflects a genuine secular demand shift. Hyperscalers are expected to spend more than $600 billion on data center infrastructure in calendar 2026, approximately 75% of which targets AI workloads. The HBM market is projected to grow from $35 billion in calendar 2025 to more than $100 billion by 2028, a compound annual growth rate of approximately 40%. Micron has captured approximately 24% of the current HBM market at a moment when HBM supply is meeting only 55 to 60% of core customer demand — the constraint is manufacturing capacity, not market acceptance. New Idaho capacity coming online in mid-2027, combined with Singapore HBM packaging capacity in calendar 2027, will allow Micron to address demand it currently cannot fill. Beyond the data center, mobile DRAM requirements are rising: flagship smartphones now ship with 12 gigabytes of DRAM as a baseline, up from 8 gigabytes two years ago, driven by on-device AI inference. Automotive memory revenue grew 49% year-over-year in fiscal 2025 as advanced driver assistance systems and autonomous driving features require server-class memory attached to vehicle-mounted processors. The secular tailwinds are real and broad-based.

The valuation question does not turn on whether those tailwinds are real. It turns on what an investor is paying for them. At approximately $420 per share, Micron has a market capitalization of roughly $464 billion. Against fiscal 2026 earnings that appear likely to reach $55 or more in non-GAAP earnings per share — a forward P/E of approximately 7.5 times — the stock appears cheap. That appearance is an artifact of using peak earnings as the denominator. The correct denominator for a cyclical business is normalized earning power: what the business earns at a normal point in its cycle, using historical margins and volumes rather than the extraordinary conditions of the current quarter.

A defensible normalization for Micron looks something like this: mid-cycle revenue of $30–35 billion, mid-cycle gross margins of approximately 38–42% (the level Micron achieved in fiscal 2025 before the HBM pricing spike), operating expenses of roughly $5 billion, and pre-tax income of approximately $7 billion — roughly $6 per share on a diluted share count of 1.1 billion. This estimate gives HBM credit for permanently raising Micron's earning floor above the DRAM-only baseline, but it does not embed the current extraordinary supply constraint, which has temporarily pushed gross margins to 75–81%. At $420, the stock trades at approximately 70 times that normalized pre-tax earnings estimate. The fifteen-times threshold that marks a fair entry price for a business of this quality implies a buy price of approximately $90. The stock is currently trading at nearly five times that price.
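The normalization arithmetic can be laid out in a few lines. Every input below is this article's estimate, not a reported figure, and the midpoint inputs land slightly above the ~$7 billion the text pencils in:

```python
# Sketch of the normalized-earnings arithmetic from the paragraph above.
# All inputs are the article's estimates, not reported financials.

revenue      = 32.5e9   # mid-cycle revenue: midpoint of the $30-35B range
gross_margin = 0.40     # mid-cycle gross margin: midpoint of 38-42%
opex         = 5e9      # operating expenses
shares       = 1.1e9    # diluted share count

# Midpoint inputs give ~$8B pre-tax; the article uses ~$7B, toward the
# conservative end of its own ranges.
pretax_midpoint = revenue * gross_margin - opex
eps             = 7e9 / shares           # ~$6.4, rounded to ~$6 in the text
buy_price       = 6 * 15                 # 15x normalized pre-tax EPS

print(f"midpoint pre-tax income: ${pretax_midpoint / 1e9:.1f}B")   # $8.0B
print(f"normalized pre-tax EPS:  ${eps:.2f}")                      # $6.36
print(f"implied buy price:       ${buy_price}")                    # $90
```

The gap between the $8 billion midpoint and the $7 billion used in the text is the estimate's built-in conservatism; even the less conservative figure changes the buy price by only about $20 per share.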

The intelligent bear on Micron argues that the normalized earnings estimate is too generous — that HBM pricing will collapse in 2027–2028 as Samsung and SK Hynix expand capacity, the next memory downcycle will look like 2023 rather than a mild correction, and the $6 normalized pre-tax EPS assumption bakes in a structural improvement to the earning floor that has not yet been demonstrated through a full cycle. That argument deserves serious weight. The rebuttal is that the oligopoly structure is genuinely different from prior memory cycles — only three suppliers, $15 billion capex requirements that have grown since 2023, and AI as a secular demand driver that did not exist in prior cycles. These factors do raise the probable trough versus fiscal 2023. But "a better trough than 2023" does not get a rational investor to $90, let alone $420. The bear's implied buy price and the normalized earnings buy price are directionally aligned.

For the current price to represent fair value, Micron would need to demonstrate that $28 per share in normalized pre-tax earnings — the level implied by $420 divided by 15 — is achievable at a mid-point in the memory cycle. That would require HBM to sustain margins not far below current levels (implying Samsung never fully recovers its cost position), AI-driven demand to grow fast enough to absorb capacity additions through 2028 without a pricing correction, and the memory cycle to remain permanently suppressed in its downswing severity. Any one of these could be true. All three being simultaneously true would constitute a genuine change in the fundamental structure of the memory market — one that the evidence from a single HBM supercycle does not yet support.
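Inverting the fifteen-times rule makes the gap concrete (a sketch using the article's own figures):

```python
# Invert the article's 15x fair-multiple rule: what normalized pre-tax
# EPS would make $420 a fair price?

price         = 420
fair_multiple = 15

required_eps = price / fair_multiple        # EPS the current price implies
gap          = required_eps / 6             # vs. the ~$6 normalized estimate

print(f"required normalized pre-tax EPS: ${required_eps:.0f}")   # $28
print(f"multiple of the ~$6 estimate:    {gap:.1f}x")            # 4.7x
```

The current price, in other words, assumes mid-cycle earning power nearly five times what the business has yet demonstrated.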

The HBM TAM reaching $100 billion by 2028 at Micron's current 24% share would imply $24 billion in annual HBM revenue, which at 70% gross margins would generate approximately $16.8 billion in HBM gross profit alone. Combined with standard DRAM and NAND at more normal margins, the business could plausibly earn $8–10 in normalized pre-tax EPS by 2028–2029 — which would imply a buy price of $120–150 at 15 times normalized earnings. That figure assumes the HBM TAM projections materialize and Micron maintains its share, neither of which is certain. It also lies 65–70% below the current price, suggesting that even optimistic scenario analysis does not support ownership at $420.
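The optimistic scenario runs through directly; the inputs are the article's projections, not independent forecasts:

```python
# Bull-case 2028 scenario from the paragraph above, using the article's
# projected TAM, share, and margin assumptions.

tam_2028   = 100e9     # projected HBM market size by 2028
mu_share   = 0.24      # Micron's current share, assumed held
hbm_margin = 0.70      # assumed HBM gross margin

hbm_revenue = tam_2028 * mu_share            # $24B annual HBM revenue
hbm_gross   = hbm_revenue * hbm_margin       # ~$16.8B HBM gross profit

buy_low, buy_high = 8 * 15, 10 * 15          # 15x the $8-10 normalized EPS
discount = 1 - buy_high / 420                # top of the range vs. $420

print(f"HBM gross profit: ${hbm_gross / 1e9:.1f}B")          # $16.8B
print(f"buy range: ${buy_low}-${buy_high}")                  # $120-$150
print(f"below current price (top end): {discount:.0%}")      # 64%
```

Even granting every projection, the top of the implied buy range sits roughly two-thirds below the current quote.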

Micron has built something real. The oligopoly is genuine, the HBM technology is validated by customer adoption and pricing power, and Sanjay Mehrotra has executed a transformation of the business toward the highest-value memory products at precisely the right moment. The earnings being generated in fiscal 2026 are not fictional — they are the output of a supply-constrained market with legitimate structural barriers to new entry, and they represent a company at the peak of both its technology cycle and its pricing cycle simultaneously. The problem is not the business. The problem is the price.

For Micron to become compelling, one of two things must occur: the stock price must fall to the range of $90–120, at which point an investor pays a fair price for the normalized earnings of a genuine oligopoly with improving secular demand; or the next memory cycle must demonstrate a substantially higher earnings floor than the $6 per share assumption embedded in the current buy price estimate, at which point the normalized earnings analysis itself changes. Neither condition is satisfied today. The memory supercycle has delivered extraordinary results. The market has already priced the next several supercycles in advance.

The business deserves to be owned. The stock, at these prices, does not.
