The narrative in AI infrastructure over the last two years has been dominated by the enormous and growing demand for compute capacity and its economic consequences, such as the buildout of data centers and the consequent shortages of key resources such as land, water, power, and copper.

But of all these bottlenecks, memory is by far the most significant. Demand for memory is now growing faster than demand for any other input to compute capacity. The implications will ripple through not just the economics of data centers, but the cost of every consumer and enterprise hardware device.

In this piece, we unpack the market action around memory prices, its ripple effects across the consumer and industrial electronics market, and the supply and demand curve that is emerging around AI. Critically, we explain why the amount of memory being purchased by AI companies like OpenAI seems to be more than what they need, and how the threat of on-device inference might actually be incentivizing an engineered memory shortage.

  • PerogiBoi@lemmy.ca · 4 hours ago

    The goal is to desensitize the general population into accepting thin clients instead of actual computers so that they can rent their OS as a cloud subscription.

    This gives companies more control over the content you consume and your behaviour. It also gives governments more granular control over their citizens, which is in vogue considering democracy is out and barbarism and force are back in.

    • Luffy@lemmy.ml · 3 hours ago

      The goal is to desensitize the general population into accepting thin clients

      Shame on you, because I've been on the thin client hype for 2 years now.

      My backup PC and media center both run on 2015 thin clients running Debian. They are really cheap, don't have any moving parts, draw 15 W at most, and are really space-efficient.