A new survey conducted by the U.S. Census Bureau and reported on by Apollo seems to show that large companies may be tapping the brakes on AI. Large companies (defined as having more than 250 employees) have reduced their AI usage, according to the data. The slowdown started in June, when adoption was at roughly 13.5%, slipping to about 12% by the end of August. Most other lines, representing companies with fewer employees, are also in decline, though some are still increasing.

  • sj_zero@lotide.fbxl.net · 16 hours ago

    IMO, AI is a really good demo for a lot of people, but once you start using it, the gains you can get from it end up being somewhat minimal without doing some serious work.

    Reminds me of ten other technologies that were hyped as world-ending if you didn’t get in on them, but ended up more niche than you’d expect.

    • MagicShel@lemmy.zip · 13 hours ago

      As someone who is excited about AI and thinks it’s pretty neat, I agree we’ve needed a level-set around the expectations. Vibe coding isn’t a thing. Replacing skilled humans isn’t a thing. It’s a niche technology that never should’ve been sold as making everything you do with it better.

      We’ve got far too many companies who think adoption of AI is a key differentiator. It’s not. The key differentiator is almost always the people, though that’s not as sexy as cutting edge technology.

      • floofloof@lemmy.ca · 5 hours ago

        The key differentiator is almost always the people, though that’s not as sexy as cutting edge technology.

        Evidently you haven’t worked with me. I’m actually quite sexy.

      • krunklom@lemmy.zip · 12 hours ago

        The technology is fascinating and useful - for specific use cases and with an understanding of what it’s doing and what you can get out of it.

        From LLMs to diffusion models to GANs there are really, really interesting use cases, but the technology simply isn’t at the point where it makes any fucking sense to have it plugged into fucking everything.

        Leaving aside the questionable ethics many paid models’ creators have relied on to build their models, the backlash against AI is understandable, because it’s being shoehorned into places it just doesn’t belong.

        I think eventually we may “get there” with models that don’t make so many obvious errors in their output - in fact I think it’s inevitable it will happen eventually - but we are far from that.

        I do think that the “fuck ai” stance is shortsighted though, because of this. This is happening, it’s advancing quickly, and while gains on LLMs are diminishing, we as a society really need to be having serious conversations about what things will look like when (and/or if, though I’m more inclined to believe it’s when) we have functional models that are accurate in their output.

        When it actually makes sense to replace virtually every profession with AI (it doesn’t right now, not by a long shot), how are we going to deal with that as a society?

    • Damage@feddit.it · 12 hours ago

      I’ve got a friend who has to lead a team of apparently terrible developers in a foreign country. He loves AI, because “if I have to deal with shitty code and send back PRs three times, then do it myself, I might as well use LLMs.”

      And he’s like one of the nicest people I know, so if he’s this frustrated, it must be BAD.

      • Aceticon@lemmy.dbzer0.com · 5 hours ago

        I had to do this myself at one point and it can be very frustrating.

        It’s basically the “tech makes lots of money” effect, which attracts lots of people who don’t really have any skill at programming and would never have gone into it if it weren’t for the money.

        We saw this back in earlier tech booms, and see it now in poorer countries to which lots of IT work has been outsourced: they have the same fraction of natural techies as everywhere else, but the demand is so large that masses of people with no real tech skill join the profession, get given actual work to do, and suck at it.

        Also beware of cultural expectations and quirks. The team I had to manage was based in India, and during group meetings on the phone they would never admit that they did not understand something about a task they were given, or that something was missing (I believe so as not to lose face in front of others), so they often ended up going with wrong assumptions and doing the wrong things. I solved this by talking to each member of the outsourced team individually after any such group meeting, in a very non-judgemental way (I pretty much had to frame it as me being unsure whether I had explained things correctly), to tease out any questions or doubts. That helped avoid tons of implementation errors caused by misunderstood requirements, or by requirements that lacked details and devs making their own assumptions about what should go there.

        That said, even their shit code (compared to what those of us on the other side, who were all senior developers or above, produced) actually had a consistent underlying logic throughout, with even the bugs being consistent (humans tend to be consistent in the kinds of mistakes they make), all of which helps with figuring out what is wrong. LLMs aren’t as consistent as even incompetent humans.

    • chaosCruiser@futurology.today · 13 hours ago

      Cyberspace, hypertext, multimedia, dot-com, Web 2.0, cloud computing, SaaS, mobile, big data, blockchain, IoT, VR, and so many more. Sure, they can be used for some things, but doing that takes time, effort, and money. On top of that, you need to know exactly when to use these things and when to choose something completely different.