People are pretending as if job replacement happens all at once, and that’s just not how it works.
A new tool that makes a job 15% more efficient will either produce 15% more goods or cut the required labor by 15%. Some of that labor gets absorbed elsewhere, but the reduction still happened.
Slow improvements are undoubtedly a good thing: it means we can create new positions about as fast as we make old ones obsolete. Maybe LLMs have reached their peak and we don’t have to worry about it, but it’s not a bad idea to prepare for the possibility that they keep getting better.
People really like shitting on overhyped new technologies, but I don’t think people appreciate just how big of a deal it is that a pretty basic algorithm is able to process natural language at all.
I think a better analogy would be something like a loom: it doesn’t operate independently and still requires an operator and mechanics, but it eliminates the need for rows and rows of weavers to complete the same amount of work, which both puts many people out of work and undercuts the labor market, and both of those are big problems. Judging LLMs on a scale of total job replacement is IMHO a little ridiculous, because unless those LLMs are fucking sentient and autonomous, they’ll never completely ‘replace’ a human role. They will certainly make programmers/writers/translators/media producers more productive though, and that’ll put quite a few out of work, and that’s kind of a big problem.