• GissaMittJobb@lemmy.ml · +30 · 8 days ago

    I mostly agree with the article, but this particular part stood out to me:

    Confusing: No comments, no clear structure.

    One of my biggest gripes with AI-generated code has been the high volume of low-value comments - the kind of high-noise, low-signal stuff you just have to go through and remove.

    • silasmariner@programming.dev · +13 · 7 days ago
      /* Calculates the Levenshtein distance between two words
       * @param wordA: the first word to pass to the algorithm
       * @param wordB: the second word to pass to the algorithm
       */
      def levenshteinDistance(wordA: String, wordB: String): Int = ...
      

      – completely fucking pointless innit

    • resipsaloquitur@lemmy.world · +7 · 7 days ago

      One of my biggest gripes with human-generated code has been the high volume of low-value comments.

      int a = 0; // initialize a
      

      Thanks.

    • criss_cross@lemmy.world · +4 · 7 days ago

      A lot of AI comments are just restating the code to you, explaining the “what”, not the “why”.

      Which is fine for educational purposes. Not so much for actual production code.
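
      To illustrate the “what” vs. “why” distinction, here’s a minimal made-up Scala sketch (the names and the deploy scenario are invented, not from any real codebase):

      object CommentStyle {
        var retries: Int = 0

        // "What" comment - just restates the code and adds nothing:
        // increment retries by one
        def bumpRestated(): Unit = retries += 1

        // "Why" comment - records intent the code itself can't express
        // (the 503-during-deploys scenario is purely hypothetical):
        // the upstream API briefly returns 503 while deploys roll out,
        // so we tolerate a few retries before surfacing the error
        def bumpExplained(): Unit = retries += 1
      }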

  • entwine@programming.dev · +22 · 8 days ago

    Once the bubble pops, Claude, OpenAI, etc. will need to raise prices and/or tighten rate limits. I wouldn’t want to be a “vibe coder” in that situation.

  • henfredemars@infosec.pub · +19 · 8 days ago

    It’s getting near impossible to get hired in the computer science field. There have been tons of layoffs. I don’t expect this will make the situation any better.

    • atomicbocks@sh.itjust.works · +10 · 7 days ago

      I’ve been unemployed for almost a year and I have 15 years of experience. It’s getting insane; I spoke with a recruiter a couple of months ago who told me that their office has zero open jobs right now.

        • atomicbocks@sh.itjust.works · +6 · 7 days ago

          My experience is largely in bespoke web applications, specifically ones that require regulatory compliance with HIPAA and/or FERPA. I’m familiar with the full stack but I specialize in UI/UX and API development.

      • FizzyOrange@programming.dev · +3 / -1 · 7 days ago

        I haven’t found that; I found another job within a few weeks of starting to look. I am in a very small niche though (RISC-V verification).

        I did also try looking for software jobs but didn’t get anywhere. Tbf I only tried big companies or very popular ones (Discord, Jane Street, Zed, etc.) and my recent history isn’t really software focused (or at least it doesn’t seem like it). I reckon networking & going for smaller / less prestigious companies would help. They just don’t pay very well.

      • slate@sh.itjust.works · +9 · 8 days ago

        Boeing doesn’t even think aeronautical engineers are necessary for building aircraft, but I digress.

      • JeromeVancouver@lemmy.ca · +8 · 7 days ago

        I work in a custom-built legacy codebase written in BASIC. There is no way AI is taking this job. The code is filled with magic numbers whose meaning you just have to know. I lose hair daily.
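
        For a sense of what that looks like, a made-up sketch (in Scala rather than BASIC, with invented names and values - nothing from the actual codebase):

        object OrderStatus {
          // Magic number: you "just have to know" that 47 means a cancelled
          // order in this hypothetical system.
          def isCancelledMagic(status: Int): Boolean = status == 47

          // The same check with the tribal knowledge written down as a named constant.
          val StatusCancelled: Int = 47
          def isCancelled(status: Int): Boolean = status == StatusCancelled
        }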

      • anothernobody@lemmy.world · +3 / -4 · 8 days ago

        AI has already arrived in nuclear power plants. I don’t expect aircraft design and control to be the big exception.

      • tjoa@feddit.org · +1 / -2 · 7 days ago

        Yeah, and the invention of the camera also wasn’t taking painters’ jobs, since there remained contexts where painting was relevant, right? No, actually most painting jobs before the camera consisted of just replicating reality, and those were gone.

    • Epzillon@lemmy.world · +10 · edited · 8 days ago

      Quite easy to be cheaper than the billion-dollar investment that will come crashing down in 6 months with huge losses, when you need to quickly hire actual developers.

      And don’t get me wrong, I see the appeal. The non-tech-savvy CEO is promised huge savings by big tech if they use AI instead of good developers… However, in reality the product won’t get shipped faster and the tech debt will increase ten-fold. The AI strat works if you’re running a scam you expect to shut down quickly either way, so I guess it makes sense why crypto bros and NFT lovers like it.

      • anothernobody@lemmy.world · +3 · 8 days ago

        I don’t expect it to crash. Not only because of the sunk-cost fallacy, but also because most people are not able to recognize quality. They’re too dumb for that.

        • orclev@lemmy.world · +4 · 7 days ago

          It will definitely crash, because it’s being kept afloat by VC money right now. Once that dries up and these AI companies start raising their rates to turn a profit, companies are going to realize the AI is way more expensive than an actual programmer. It’s also not a question of recognizing quality, because companies have all kinds of metrics in place explicitly for trying to measure program quality (usually poorly, but what they do measure incredibly well is how many bugs software has and how long it takes to deliver new features).

          While AI can deliver code quickly, not only is that code low quality and incredibly hard to fix, it’s also riddled with both obvious and subtle bugs. The QA departments are going to be working overtime and scaling up massively to try to keep up with the pure crap the new vibe-coding departments churn out. The executives won’t be able to tell if the code is low quality, but they will be able to read the reports showing they went from a month to deliver a new release with a 10% defect rate to two months to deliver a release with a 50% defect rate, and it’s still costing them nearly the same amount despite a significantly reduced head count.

          • anothernobody@lemmy.world · +3 · 7 days ago

            It will definitely crash, because it’s being kept afloat by VC money right now. Once that dries up

            …Trump and his MAGAs step in, like they did with Intel recently. There’s still plenty of money left to make the AI bubble grow much larger (just think of all the money in social services). Remember that the MAGAs and tech bros are allies.

            I would love to see the bubble burst soon in a healthy way, but I rather expect the entire economy to collapse first, taking the AI bubble with it. Call me a doomer or whatever, but there is simply no reason to believe there’s enough critical thinking left to see the damage AI has already caused, let alone the damage still to come.

    • henfredemars@infosec.pub · +8 · 8 days ago

      Take my angry vote. Code quality is not a concern for the companies who choose to rely on AI coding. They don’t see, and have never seen, the value in actual humans.

      • anothernobody@lemmy.world · +4 · 8 days ago

        Correct. And it’s not just coding either. Whatever can be replaced with AI will be replaced, as long as the savings exceed the cost of the damage.

    • 0x01@lemmy.ml · +6 / -1 · 8 days ago

      Unfortunately correct; I’ve seen this strategy work already. I wonder if it will end up being a veritable race to the wage floor for workers.

      I’ve seen others go the “grass-fed, free-range, human-made, artisan software” route as well, utterly opposing AI assistance.

      At some point there must have been somebody out there making sweaters and swearing up and down that machines could never do their job as well, but at the end of the day the employee who makes the shirt doesn’t decide whether their job will be automated. The boss cares about the economics, and even if the product is inferior they will choose the cheaper option.

  • plantfanatic@sh.itjust.works · +12 / -45 · 8 days ago

    So does the same apply to mathematics and calculators? Tradespeople and power tools? Writers and spellcheck?

    It’s a bloody tool, why are people so against it? The same crap happens whenever something new comes out ANYWHERE.

    • voracitude@lemmy.world · +42 / -2 · 8 days ago

      Okay, go hire an accountant who only knows how to use a calculator - no formal training. Should be fine; accounting is just maths, and a calculator is a tool to do maths with.

      And honestly, even this analogy is too generous, because at least to use a calculator you have to know what the symbols mean. Vibe coding doesn’t even require that much.

      • Epzillon@lemmy.world · +21 · 8 days ago

        Honestly, a calculator isn’t even a fair comparison. A calculator can be reverse-engineered and proven to produce a certain output from a given input. Even scientists say AI is a black box; there are no guarantees about what it will produce.

      • Zexks@lemmy.world · +6 / -23 · 8 days ago

        Yes it does. People can’t even use a search engine without help. If you don’t know what to ask, you don’t know what you’re looking for.

        Just like your analogy misses the mark. Would you hire an accountant that doesn’t know how to use a calculator and can only do the work by hand?

        • voracitude@lemmy.world · +25 / -1 · edited · 8 days ago

          Would you hire an accountant that doesn’t know how to use a calculator and can only do the work by hand?

          First of all, you don’t want to talk about analogies missing marks and then pull something like this. You didn’t just miss the point, you missed the barn wall behind the point.

          The whole point of the article is that the new generation of developers won’t have the skills to spot the errors the AI makes. So, in your analogy, the accountant already has the skill and experience to know what to ask and to fix the mistakes the tool makes, and they can figure out how to use the tool; they’ll just be slower than without it.

          In my analogy, we have an accountant who was trained to use the tool, not to do the work, which is exactly what’s happening in development now. Their work will be subpar and they will constantly run into blockers, because the tool gets stuck in a loop and the “developer” using it can’t code well enough to tell what it did wrong.

          So, to answer your question: if the calculator were known to consistently and confidently make shit up and leave glaring errors in its work, yes, I would absolutely hire a competent professional who doesn’t use one over a cookie-cutter vibe-numberer who can’t do the work without one. Similarly, I’m not going to hire someone who can’t use a search engine without help either.

    • Godort@lemmy.ca · +41 / -2 · 8 days ago

      If my calculator were wrong 10% of the time but always produced a result that “felt” correct, that would be worse than not using a calculator at all.

      Same thing for power tools or spell check.

    • chicken@lemmy.dbzer0.com · +2 · 7 days ago

      Well yeah. Lots of professions that used to exist just don’t anymore because of that stuff, or at least only exist as a small niche where before they employed tons of people. Like “computer” used to be a job people had doing math by hand.

    • tjoa@feddit.org · +3 / -14 · edited · 7 days ago

      WTF, this thread is wild. These people literally think 100% of dev jobs need full-stack, software-architect-level expertise, and also that code can only have two states: correct or incorrect. LOL

      • voracitude@lemmy.world · +5 · 7 days ago

        That’s a nice strawman you built there, but it’s wrong. The issue is not that “every dev needs to be a senior dev to use AI” nor is it “code is binary: right or wrong” (🥁). The whole point of this article is that new developers are entering the industry but they’re not building the skills that would eventually forge them into senior devs. They’re relying on generative AI for the foundational work, even when assigned a learning exercise by a senior dev, without understanding the “why” behind the output of the LLM they used.

        • tjoa@feddit.org · +2 / -2 · 7 days ago

          The article’s point is correct; that’s not what I am saying, it’s what you guys spiral into, LOL. “Let’s not use AI because it can be wrong sometimes”, “I’d rather have no AI than ‘wrong code’”. Just yesterday (as if it were called for) a YouTuber put out a piece making fun of you for that very reason.

          • voracitude@lemmy.world · +1 · 7 days ago

            So you’re just going to repeat your strawman bullshit as though that makes it what I said?

            le kek the article is right but you and everyone agreeing with it are wrong

            Juniors using AI means they don’t develop the necessary skills to become seniors. Seniors using AI find their critical thinking and problem-solving skills negatively impacted (https://www.mdpi.com/2075-4698/15/1/6).

            I understand that empirical evidence isn’t really your bag, and that you’d prefer to bloviate at randoms on the internet to feel superior without having to work for it rather than learn to communicate or bother to understand anything you read. So, now that I’ve reiterated my point - which is that the article is correct - twice for you, I’m going to block you so you can re-read what I wrote in the first fucking place and reconsider being such a twat.

            • tjoa@feddit.org · +1 / -2 · 7 days ago

              Aaaaah yes, mental offloading with AI and AI use in general are the same thing. That’s why a junior should never use AI at all - not for prototyping, not on legacy code - because they’ll never become a real senior that way. Omg, you guys just keep on giving.