
DRAM Shortages, Price Hikes, and AI

I’ve been building, using, and messing around with computers since I was 12. I’m 24 now, which means I’ve spent over a decade watching the PC hardware space go through cycles of hype, shortages, and price spikes. Crypto mining. Chip shortages. Scalpers. And now — AI.

So it feels fair to ask: is PC gaming really on the decline, or is it just being priced out?

If you’ve paid any attention to tech news lately, you’ve probably seen the headlines about RAM prices going through the roof. That’s not random. DRAM prices have jumped hard, and DRAM is used in basically everything: graphics cards, system memory, SSDs, phones, you name it. When memory prices rise, the rest of the market follows.

The big difference this time is who is driving demand. It’s not gamers. It’s AI.

Companies like Nvidia, OpenAI, Microsoft, and others are buying up huge amounts of memory for data centers and AI workloads. OpenAI alone has reportedly been working with Samsung and SK Hynix to lock down a massive chunk of global DRAM production. Some estimates put it as high as 40% of new DRAM going to OpenAI by itself, and that’s before you even factor in everyone else.

On top of that, manufacturers are clearly shifting focus. Companies like Micron, which have historically sold consumer RAM and SSDs, have started pulling back from those markets to sell directly to AI data centers instead. From a business standpoint, it makes total sense. Enterprise buyers pay more and buy in bulk. For consumers, though, it means higher prices and fewer choices.

Some people think this is temporary and that things will balance out sometime in 2026. Others think this is just how it’s going to be from now on. Honestly, it’s probably a bit of both.

Nvidia is the best example of this shift. They used to be a gaming GPU company first. Now they’re one of the biggest AI companies in the world, with a market cap north of four trillion dollars. When your stock is doing that well, it’s pretty obvious where your priorities are going to be.

For the average user, this actually lines up with another trend. In 2025, most people don’t need a dedicated graphics card anymore. Integrated graphics inside modern CPUs are good enough for everyday use, light gaming, and media. That means the consumer GPU market is shrinking at the same time Nvidia is focusing elsewhere.

Nvidia has also reportedly planned to cut consumer GPU production by around 20–50%. We don’t know all the details yet, but the outcome is pretty easy to predict: less supply, same demand, higher prices. If you’re a gamer or a PC builder, that’s not great news.

None of this is new for PC enthusiasts. We’ve been through shortages before — the crypto mining boom in 2020 and the chip shortages from 2021 to 2023 made building a PC miserable for a while. The difference now is that back then, gamers were still the main audience. Crypto miners were a side effect. This time, consumers just aren’t the priority anymore.

So no, PC gaming isn’t dying. If anything, it’s still growing. But it is getting harder to get into. The cost of entry keeps climbing, and that pushes PC gaming further toward enthusiasts and away from the average person who just wants to build a solid rig.

If you’re thinking about building a PC right now, it might be worth waiting a few months if you can. See where prices land. See if supply improves. And if you do build, be ready to pay an arm and a leg, especially for DDR5 RAM.

You might even want to look at a DDR4 build instead. It’s cheaper, and CPUs like the Ryzen 7 5800X3D are still more than capable in 2025.

PC gaming isn’t going anywhere. But for the first time in a long time, it feels like we’re no longer the main focus — and that’s probably the biggest change of all.

Blog written by Joseph Flowers, IT Administrator | Blooming Technical Solutions.