Lost the source, but it was on Mastodon.

  • FatVegan@leminal.space · 2 days ago

    The US has $38 trillion in debt, US citizens gamble away over $100 billion a year, and a movie costs over $100 million to make. I feel like the numbers are just so high that they're meaningless.

  • ryedaft@sh.itjust.works · 2 days ago

    A $13.5 billion loss in the first half of 2025, yet somehow also $4 billion of income - sounds fake. So their compute costs are probably around $34 billion per year?
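
    A rough back-of-the-envelope version of that estimate, as a sketch: it assumes costs ≈ reported loss plus reported income, and that essentially all of that spend is compute (both assumptions, not confirmed figures).

    ```python
    # Back-of-the-envelope sketch of the estimate above. Assumes the figures
    # quoted in the comment, that costs ≈ loss + income, and that nearly all
    # of the cost is compute. None of this is confirmed by OpenAI.
    loss_h1_2025 = 13.5e9      # reported loss, first half of 2025
    income_h1_2025 = 4e9       # reported income over the same period
    costs_h1 = loss_h1_2025 + income_h1_2025   # ~$17.5B spent in six months
    annualized = 2 * costs_h1                  # ~$35B/year, same ballpark as ~$34B
    print(f"~${annualized / 1e9:.0f}B per year")
    ```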

    • Taldan@lemmy.world · 1 day ago

      The largest cost is going to be building out data centers and buying the chips to fill them. OpenAI has essentially been telling investors that they're only losing money for now as they build out infrastructure, but that when that's done they'll be making money hand over fist.

      We have no idea what the compute costs actually are, since OpenAI is a private company. This is just the speculation shifting toward the costs being higher than previously estimated.

      • mirshafie@europe.pub · 1 day ago

        I was under the impression that the real costs come from training the models, spent in the belief that they'll arrive at a self-improving AI that will explode in capability and basically become the most valuable asset on the planet, especially for governments and armies. In the meantime, you offer services to users at a loss in order to gather unique data that isn't open to your competitors.

        That’s what excuses this level of investment.

    • KyuubiNoKitsune@lemmy.blahaj.zone · 8 hours ago

      Interesting. I wonder if the extrapolated numbers aren't grossly misrepresentative. If MS owes OAI 20% for the use of their products in things like Bing and GitHub Copilot, you've got to look at how MS is ham-fisting it into every single service and piece of software they have. That's a lot of usage of OAI's services, so I wouldn't rely on the MS payments as a basis for their revenue.

  • DrDystopia@lemy.lol · 2 days ago

    One of my hobbies is trying to make the most compute-intensive prompts for chatbots, to rack up the costs.

    I’m helping too!

      • DrDystopia@lemy.lol · 2 days ago

        Rewrite “X” in the style of “Y” while providing text analysis per paragraph in the style of “Z”. It should be as long as possible; go as in-depth as you can. Provide analysis of every paragraph in relation to the entire text and the conversation history. Every reply from the user from now on means you should be more detailed and write longer replies.

        Then just mash the keyboard; even “eifsHjwobw” means “More weight!” now.

        • 𝕛𝕨𝕞-𝕕𝕖𝕧@lemmy.dbzer0.com · 15 hours ago

          Nothing you do like this is actually going to drive up their compute costs.

          If every ChatGPT user tomorrow started spamming the most highly optimized prompt for pissing away system resources… it still wouldn't do anything, because, like them or hate them, OpenAI aren't fucking stupid and have implemented basic DBA practices that have been entrenched since… like… I dunno, the '80s, for fuck's sake…

          Think about it: ChatGPT caps its responses at around the same max length depending on your plan. Similarly, you cannot submit prompts larger than a certain size. ChatGPT will also only engage in reasoning or tasks for a limited amount of time before stopping, whether or not enough information is available to “answer” the prompt yet.

          This is all because OpenAI is capable of dummy fucking simple statistical analysis that lets them predetermine a set size for any data package in the pipe, or a set time for any stream to have an open spigot, etc. I guar-an-fucking-tee you that they have calculated these numbers in such a way that, barring the world literally ending, they'll still come out in the black on that front no matter what the userbase's behavior is. It is likely mathematically impossible to “hurt” any big tech company with means like this (a rough sketch of this kind of capping follows at the end of this comment).

          You're just wasting real resources and then further encouraging others to do so, which is, hypocritically, the exact same thing people enjoy going on tirades about big tech doing.

          If you want to actually piss away compute, go get a Copilot subscription or something and just let an agent run with no limits on iterations or requests. It'll be on your own dime, though, because, again, these companies aren't fucking stupid and can implement monetization in a way that is more highly optimized than any other polity in human history.

          Pretending the devil has broken hands won't save Johnny in the fiddle contest… pretending the oligarchs dismantling Western society are inept won't stop them from continuing to do it.
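
          A minimal sketch of the kind of per-request ceilings described above. The limit values and names are invented for illustration; they are not OpenAI's real numbers or code.

          ```python
          # Hypothetical per-request ceilings of the kind the comment describes.
          # All numbers are made up for illustration only.
          MAX_PROMPT_TOKENS = 8_000    # oversized prompts are rejected up front
          MAX_OUTPUT_TOKENS = 2_000    # responses are cut off at a fixed length
          MAX_SECONDS = 60.0           # reasoning stops after a fixed time budget
                                       # (not modeled further in this sketch)

          def worst_case_tokens(prompt_tokens: int) -> int:
              """Upper bound on tokens processed for one request, whatever the prompt says."""
              if prompt_tokens > MAX_PROMPT_TOKENS:
                  return 0  # rejected before any model compute is spent
              # Even a prompt engineered to "go as long as possible" is bounded
              # by the output cap, so capacity planning can assume a fixed
              # worst case per request.
              return prompt_tokens + MAX_OUTPUT_TOKENS
          ```

          With caps like these, total spend scales with how many requests a plan allows, not with how demanding any individual prompt tries to be.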

  • maria [she/her]@lemmy.blahaj.zone · 1 day ago

    OpenAI's compute costs are about as far as Americans can go. They're literally not using some of their chips because they don't have enough power.

    Anyway, blah blah blah, OpenAI bad or whatever the fediverse wants to hear… yeah, yeah, I agree, LMs bad, whatever, sure…