The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor has an output of between 1 and 1.6 GW, and 45 GWh spread over 24 hours works out to an average draw of nearly 1.9 GW, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the continuous output of roughly two nuclear reactors, an amount that could be enough to power a small country.
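A quick back-of-envelope check of those figures (a sketch in Python; the 18 Wh per query, 2.5 billion daily requests, and 1–1.6 GW per reactor are the numbers quoted above, not independent data):

```python
# Back-of-envelope check of the article's figures.
queries_per_day = 2.5e9          # reported daily ChatGPT requests
wh_per_query = 18                # estimated GPT-5 energy per query (Wh)

daily_energy_gwh = queries_per_day * wh_per_query / 1e9   # Wh -> GWh
average_draw_gw = daily_energy_gwh / 24                   # spread over 24 h

reactor_output_gw = (1.0, 1.6)   # quoted output range per reactor
reactor_equivalents = [average_draw_gw / r for r in reactor_output_gw]

print(f"Daily energy: {daily_energy_gwh:.0f} GWh")    # ~45 GWh
print(f"Average draw: {average_draw_gw:.2f} GW")      # ~1.9 GW
print(f"Reactor equivalents: {reactor_equivalents[1]:.1f}-{reactor_equivalents[0]:.1f}")
```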

  • themurphy@lemmy.ml · 10 hours ago

    Well, if they succeed, it will be because of efficiency and lower costs. The second question is how much the data and control are really worth.

    The big companies aren’t just developing LLMs, so they might justify it with other kinds of AI that actually make them a lot of money, either through the market or government contracts.

    But who knows. This is a very new technology. If they actually make a functioning personal assistant so good that it’s inconvenient not to have it, it might work.

    • queermunist she/her@lemmy.ml · 9 hours ago

      I can see government contracts making a lot of money regardless of how functional their technology actually is.

      It’s more about who you know than what you can actually do when it comes to getting money from the government.