• kazerniel@lemmy.world · 3 hours ago

    “With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work.”

    I understand the donors aspect, but I don’t think anyone who is satisfied with AI slop would bother to improve wiki articles anyway.

    • drspawndisaster@sh.itjust.works · 2 hours ago

      The idea that there’s a certain type of person that’s immune to a social tide is not very sound, in my opinion. If more people use genAI, they may teach people who could have been editors in later years to use genAI instead.

      • kazerniel@lemmy.world · 2 hours ago

        That’s a good point. It’s scary to think that there are people growing up now for whom LLMs are the default way of accessing knowledge.

        • Hackworth@piefed.ca · 1 hour ago

          Eh, people said the exact same thing about Wikipedia in the early 2000s. A group of randos on the internet is going to “crowd source” truth? Absurd! And the answer to that was always, “You can check the source to make sure it says what they say it says.” If you’re still checking Wikipedia sources, then you’re going to check the sources AI provides as well. All that changes about the process is how you get the list of primary sources. I don’t mind AI as a method of finding sources.

          The greater issue is that people rarely check primary sources. And even when they do, the general level of education needed to read and understand those sources is a somewhat high bar. And the even greater issue is that AI-generated half-truths are currently mucking up primary sources. Add to that intentional falsehoods from governments and corporations, and it already seems significantly more difficult to get to the real data on anything post-2020.

          • llama@lemmy.zip · 51 minutes ago

            But Wikipedia actually is crowd-sourced data verification. Every AI prompt response is made up on the fly, and there’s no way to audit what other people are seeing for accuracy.

            • Hackworth@piefed.ca · 23 minutes ago

              Hey! An excuse to quote my namesake.

              Hackworth got all the news that was appropriate to his situation in life, plus a few optional services: the latest from his favorite cartoonists and columnists around the world; the clippings on various peculiar crackpot subjects forwarded to him by his father […] A gentleman of higher rank and more far-reaching responsibilities would probably get different information written in a different way, and the top stratum of New Chusan actually got the Times on paper, printed out by a big antique press […] Now nanotechnology had made nearly anything possible, and so the cultural role in deciding what should be done with it had become far more important than imagining what could be done with it. One of the insights of the Victorian Revival was that it was not necessarily a good thing for everyone to read a completely different newspaper in the morning; so the higher one rose in society, the more similar one’s Times became to one’s peers’. — The Diamond Age by Neal Stephenson (1995)

              That is to say, I agree that everyone getting different answers is an issue, and it’s been a growing problem for decades. AI’s turbo-charged it, for sure. If I want, I can just have it yes-man me all day long.