The narrative in AI infrastructure over the last two years has been dominated by the enormous and growing demand for compute capacity and its economic consequences, such as the buildout of data centers and the consequent shortages of key resources such as land, water, power, and copper.
But of all these bottlenecks, memory is by far the most significant. The demand for memory is now outpacing the demand for other drivers of compute capacity. The implications of this will ripple through not just the economics of data centers, but the cost of every single consumer and enterprise hardware device.
In this piece, we unpack the market action around memory prices, its ripple effects across the consumer and industrial electronics market, and the supply and demand curve that is emerging around AI. Critically, we explain why the amount of memory being purchased by AI companies like OpenAI seems to be more than what they need, and how the threat of on-device inference might actually be incentivizing an engineered memory shortage.
The goal is to desensitize the general population into accepting thin clients instead of actual computers so that they can rent their OS as a cloud subscription.
This gives companies more control over the content you consume and your behaviour. It also gives governments more granular control over their citizens, which is in vogue now that democracy is out and barbarism and force are back in.
Asus and Dell are already launching thin clients to save us.
Whether it’s intentional or just fraudulent, it’s malicious either way. This whole datacenter “investment” situation is thoroughly fucked up
It could just be irrational exuberance, and the perverse incentive to keep it going by the people running those companies.
In the dotcom bubble I’m sure there were shortages, same as in the housing bubble, which destroyed people’s ability to buy a home.
It’s really the Fed who’s at fault: with cheap debt and QE, they now exist to create as much misallocated capital as possible. The entire financial system is fraudulent, to quote Burry in The Big Short.
If CXMT tries to fix the memory shortage and the US responds by threatening to sanction them, then the shortage is intentional.
Get ready to only own screens. And everything gets processed via the cloud, in data centers destroying your community and livelihood.
And of course you will be paying for every minute of it.
Currently contracting at an “automated” manufacturing center in the US, and all of our SCADA traffic is on the global network, currently being routed through the UK because oops, and it’s hilariously bad. Systems designed for millisecond polling are taking 1-30 seconds to react.
SCADA systems are universally terrible. Clouding them does not resolve that. Fuck whoever decided to do this.
Yes. This is the first step to doing with technology what they have done with housing, transport, media, and agriculture. The noose has nearly closed.
> Will lower memory availability to consumers increase reliance on cloud-based storage and demand for data centers?
No, because you need more RAM to run a smart terminal than a standalone micro, because there’s no secondary memory available to rely on.
> Therefore, according to our best estimates, OpenAI likely needs less than 30% of the 10.8 million wafers it’s planning to buy
OpenAI hasn’t actually paid for any of that; it’s sold on credit with a 6-year repayment period, on hardware that will only last 2-3 years at most. That’s why no memory manufacturers are increasing capacity, as they would if they thought there was any long-term increase in demand.
For those who don’t want to read several pages of unnecessary text telling you what you probably already know:
The math, while pretty involved, may tell a straightforward story (if you’re interested in the details of our analysis, see the Appendix). OpenAI has contracted 900K memory wafers per month from Samsung and SK Hynix. Partner commentary seems to indicate that’s a monthly number, so that represents 10.8 million wafers over 12 months. In terms of demand, a fully built-out 10GW Stargate cluster would require ~3 million GB200 Bianca Boards. Each board requires ~50% of a memory wafer in total, split between the HBM3e stacks embedded into its two B200 GPUs (~30%) and its 480 GB of LPDDR5X system memory (~20%). That puts total wafer demand for the entire cluster at ~3 million wafers.
Therefore, according to our best estimates, OpenAI likely needs less than 30% of the 10.8 million wafers it’s planning to buy
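The supply/demand comparison above can be sanity-checked with a few lines of arithmetic. This sketch uses the article’s own quoted figures (the ~3 million wafer demand total is their estimate, not independently derived here):

```python
# Sanity check of the article's wafer math, using its quoted figures.

wafers_per_month = 900_000          # contracted from Samsung + SK Hynix (article's figure)
months = 12
supply = wafers_per_month * months  # total contracted supply over a year

demand = 3_000_000                  # article's wafer-demand estimate for a 10GW Stargate cluster

print(f"contracted supply: {supply:,} wafers")      # 10,800,000
print(f"estimated demand:  {demand:,} wafers")
print(f"demand / supply:   {demand / supply:.0%}")  # ~28%, i.e. under 30%
```

Which is where the “less than 30%” claim comes from: ~3M of 10.8M wafers is about 28%, leaving roughly 7.8M wafers of contracted supply beyond the cluster’s estimated needs.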
So this is just putting some numbers to what a lot of people already guessed. The AI companies are not just buying a ton of RAM to build out their data centers. They aren’t buying enough other components to even use that RAM. They’re buying it so that no one else can.
And market supervision is nonexistent.
I’m just not connecting the dots. The amount of money they’re spending on this is astronomical, and they are burning through the cash they have at a rate they can’t sustain, while they’re fighting for their future against Google, Anthropic, plus xAI and Perplexity and others, and maybe foreign competition like Deepseek that the government can’t fully shield them from. While also competing with major data center companies themselves, who may want to build data centers for other non-AI purposes, too. And those competitors have deep, deep pockets.
If they don’t have a revenue model that actually keeps them afloat, then all their capital expenditures will end up going to benefit someone else.
In other words, the central thesis that they want to choke out competition from on-device models kinda ignores that they’re facing a much more immediate, much more pressing threat from their data center competition. It’s like trying to corner the market on snow shovels when a hurricane is bearing down.
Plus one important thing worth noting: OpenAI purchased the option to buy that much memory, enough to persuade the memory manufacturers to change their own investment decisions for the next 5 years. They’re not necessarily going to actually buy that much, and in theory could sell that option to others. 40% of the market is enough to really move prices, but not enough to actually corner it and exclude others from buying memory. They’ll just make it more expensive for themselves at the same time that they make it more expensive, but not impossible, for their true competitors also outfitting data centers.
It’s OpenAI in particular trying to screw everyone else. The wafers they contracted from Samsung and SK Hynix are something like 40% of those companies’ production. There isn’t enough production volume for the other AI companies to over order like that.
So this is the Monopoly strategy of putting 4 houses on your properties and never upgrading them to hotels, because that way there are no houses left for your opponents to buy.
Hopefully this accelerates their crash.
Haven’t read the article just yet, but these companies have been known for price fixing and collusion for decades now.
Oligarchs trying to buy up all the digital real estate so they can be the digital landlords of computerland
Why sell you a computer, when they can rent you one for $$ per query you do
No, clearly it was accidental. 🙄 🙄 🙄