I’ve so far avoided using LLMs unless I need something explicitly explained. I don’t know enough to be able to verify any code it would produce, so I don’t know what the hell these vibe coders are doing.
But I’m also a little long in the tooth to be starting this, so maybe that’s part of my problem.
I am aware of what I don’t know, and having something which hallucinates produce something I want to use seems silly. I wouldn’t be able to verify it independently because I’m not smart enough. It just seems like asking for trouble. Like I would ask the thing which gave me broken code in the first place to fix it? Smort.
Based to reject vibe coding. Similar to “why learn math when we’ve got calculators”.