It was quite the paradox!

  • Jimmycrackcrack@lemmy.ml · 6 days ago

    ChatGPT is pretty helpful despite the hate. I’ve found myself using it quite a bit recently. Situations like these, where you don’t get a joke, are particularly good uses, since that’s something you might have struggled to figure out just by Googling before. However, you do need to be able to check the output to get value from it, and that’s one of its limitations: you sometimes end up doing as much research verifying what it tells you as you’d hoped to avoid by using it in the first place.

    In this case, where it’s less a question of facts and more one of interpretation, a simple test of asking yourself “does this make sense?” could have tipped you off that ChatGPT was struggling. One of its problems is that it always tries to be helpful, and because of how it works, that often favours producing some kind of response over an accurate response, even when it can’t really produce an answer. It doesn’t actually magically know everything, and if you can’t confidently explain the joke to someone else in your own words after reading its “explanation”, the odds are good that it just fed you nonsense that superficially looked like it must mean something.

    In this case, it seems the biggest problem was that the joke itself didn’t entirely make sense on its own premise, so there wasn’t really a correct answer, and ChatGPT just tried really hard to conjure one where it didn’t exist.