• jimjam5@lemmy.world
    15 hours ago

    What? Elon Musk’s xAI data center in Tennessee (when fully expanded and operational) will draw 2 GW of power. That’s as much as some entire cities use.

    • Blue_Morpho@lemmy.world
      11 hours ago

      Rockstar Games: 6,000 employees × 150 square feet per employee × 20 kWh per square foot per year.

      20 kWh per square foot: https://esource.bizenergyadvisor.com/article/large-offices

      150 square feet per employee: https://unspot.com/blog/how-much-office-space-do-we-need-per-employee/

      18,000,000,000 watt hours

      vs

      10,000,000,000 watt hours for ChatGPT training

      https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use/
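      The arithmetic behind those two numbers can be sketched quickly (a back-of-the-envelope check in Python; it assumes the 20 kWh per square foot figure is annual, as in the large-offices link):

```python
# Inputs from the two linked sources (20 kWh/sq-ft assumed to be per year).
employees = 6_000               # Rockstar Games headcount
sq_ft_per_employee = 150        # office space per employee
kwh_per_sq_ft = 20              # large-office electricity intensity

office_kwh = employees * sq_ft_per_employee * kwh_per_sq_ft
office_wh = office_kwh * 1_000  # convert kWh -> Wh

chatgpt_training_wh = 10_000_000_000  # ~10 GWh per the UW article

print(office_kwh)  # 18000000 (18 million kWh)
print(office_wh)   # 18000000000 (18 billion Wh)
```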

      Yet there’s no hand-wringing over the environmental destruction caused by 3D gaming.

      • jimjam5@lemmy.world
        4 hours ago

        Semi-non-sequitur argument aside, your math seems to be off.

        I double-checked my quick phone calculations, and using the figures provided, Rockstar Games’ office energy use is roughly 18,000,000 (18 million) kWh, not 18,000,000,000 (18 billion).

        • Blue_Morpho@lemmy.world
          50 minutes ago

          I put the final answer in watt-hours, not kWh, to match: ChatGPT used 10 billion watt-hours, not 10 billion kWh.
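          The disagreement is purely a unit mismatch; a one-line sanity check (Python) shows both figures describe the same quantity:

```python
office_kwh = 18_000_000             # the 18 million kWh figure
office_wh = office_kwh * 1_000      # same quantity expressed in watt-hours
print(office_wh == 18_000_000_000)  # True: 18 million kWh == 18 billion Wh
```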

          • jimjam5@lemmy.world
            7 minutes ago

            Ahh, I was wondering where the factor of 1,000 came from.

            Without turning this into a complete shootout, I can kind of see the point of comparing energy usage, but as others have said, with these massive data centers it’s like comparing two similar but ultimately different kinds of beasts.

            Beyond just the energy used to train generative AI models in data centers, there’s also the energy needed to fulfill requests once the models are deployed (24/7, thousands of prompts per second).

      • Glitterkoe@lemmy.world
        10 hours ago

        And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.

        You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.

        In comparison, I could game on my modest rig just fine, but I can’t run a 22B model locally in any useful capacity while programming.

        Sure, you could argue gaming is a waste of energy, but that doesn’t mean we can’t argue that it shouldn’t cost the energy of boiling a shitload of eggs to ask an AI how long to boil a single one. Or each time I start typing a line of code, for that matter.