
Microsoft's AI ambitions may have just taken another major step forward. A new report claims the company has secured a massive high-bandwidth memory deal with SK Hynix — big enough to send the chipmaker's stock soaring nearly 9% in a single day. With AI hardware demand exploding and every major player scrambling for supply, this kind of agreement might give Microsoft a serious edge in the race to build out next-gen cloud and AI infrastructure.
Here's why it matters.
Unless you've been living under a large slab of silicon, you've probably heard that memory prices have exploded in recent months, and it's all because of "artificial intelligence." The demand for compute from AI-first companies is so impossibly vast that memory-makers simply cannot keep up. It's gotten so bad that Samsung has even refused to sell memory to divisions within its own company, with analysts estimating that up to 70% of new silicon could go straight to data centers this calendar year.
In essence, corporations and data center hyperscalers are out-bidding consumer goods manufacturers. As a result, consumer-grade products are increasingly being squeezed out of the picture, as laptops, PC components, and even smart fridges feel the pinch. Pretty much any device with DRAM in it is expected to go up in price (or already has), as tariffs and shortages continue to impact consumer products.
One such memory maker, SK Hynix, recently shuttered its consumer RAM divisions to focus entirely on data center customers. And good old Microsoft looks like it's got a front-row seat at the RAM buffet.
According to a report from Bloomberg, SK Hynix has entered into a major contract with Microsoft to become the sole supplier of advanced memory for the firm's new AI products. The news sent SK Hynix's stock up over 8% on the Korean stock exchange, recovering some earlier losses triggered by fresh U.S. tariff fears.

SK Hynix is one of the world's largest memory-chip makers, and it has become central to the race to mainstream and commercialize artificial intelligence platforms built on large language models (LLMs) and similar tech. Its high-bandwidth memory (HBM) is designed specifically for AI workloads, and as profitability increasingly becomes a concern for investors, companies like Microsoft are making efficiency a top priority. Microsoft's non-Azure AI products are powered partially by OpenAI, which is presently history's most efficient cash-incinerating machine, burning millions per day with profitability well out of sight.
To that end, Microsoft debuted its Maia 200 AI chip today, which it says will offer world-leading efficiency over competing products. It's a positive move for Microsoft's AI hopes as it faces the prospect of competing with Google, whose home-grown Tensor Processing Unit (TPU) server tech is already a fair bit further ahead than Microsoft's. The report states that SK Hynix will become the sole supplier of memory for this chip, driving the beefy stock rally.
SK Hynix is up 35% for the month and a whopping 250% for the year, offering some insight into how critical it has become to the AI supply chain.
Microsoft and other AI hyperscalers are in something of a race to become the "platform du jour" for AI. Google has a significant advantage in that it owns YouTube, holds exclusive access to Reddit content for model training, reportedly powers Apple's revamped Siri, and serves as the default AI option on Android. Microsoft's advantages revolve entirely around enterprise and government compliance: Copilot is embedded directly into Azure, which is trusted the world over and is uniquely proficient at navigating complex local laws and data protection rules, both internally and at a legal, nation-state level.
"Our newest AI accelerator Maia 200 is now online in Azure. Designed for industry-leading inference efficiency, it delivers 30% better performance per dollar than current systems. And with 10+ PFLOPS FP4 throughput, ~5 PFLOPS FP8, and 216GB HBM3e with 7TB/s of memory bandwidth…" — Microsoft on X, January 26, 2026
Microsoft is one of the few companies actively profiting from AI technologies right now, owing to early investments in OpenAI and its ChatGPT model, alongside rapid integrations with Microsoft 365, Azure, and products like GitHub and Microsoft Dynamics. Boosting efficiency via home-grown tech will be crucial to reducing the costs associated with AI.
Microsoft is also one of the companies facing the biggest mainstream backlash against the technology, as forced, often useless integrations into products like Microsoft Word, Edge, and even MS Paint continue to irritate and frustrate users.
For better or worse, Microsoft sees artificial intelligence as the future of computing. The firm hopes to be the company that makes the technology genuinely useful beyond generating meme-slop and investor hype, delivering on some of those lofty promises like new medicines or other scientific breakthroughs. So far, its consumer-grade AI products, like Windows Recall and its Copilot+ PC range, have fallen flat, to say the least.
If the report is accurate, Microsoft is locking down a strategic lifeline in a market where HBM is becoming the most fought‑over resource in tech. The AI boom has turned memory into “the new battleground,” and securing long‑term supply could shape who actually keeps up as workloads scale. Whether this deal reshapes the competitive landscape or simply marks the start of an even bigger arms race, one thing is clear: AI‑grade memory is now as valuable as the silicon it feeds.