• MonkderVierte@lemmy.zip
    edited 1 month ago

    I just thought that a client-side proof-of-work (or even just a delay) bound to the IP might push AI companies to behave instead (because single-visit-per-IP crawlers get too expensive or too slow, and ordinary abusive crawlers can simply be blocked). But they already have mind-blowing computing and money resources and only want your data.

    But if there was a simple-to-use integrated solution and every single webpage used this approach?
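    The IP-bound proof-of-work idea above can be sketched as a hashcash-style challenge: the server ties a random salt to the client's IP, the client brute-forces a counter until the hash clears a difficulty bar, and the server verifies with a single hash. This is a minimal illustration, not any existing product; the function names, the SHA-256 choice, and the difficulty value are all assumptions for the sketch.

    ```python
    import hashlib
    import itertools
    import os

    DIFFICULTY_BITS = 20  # ~1M hashes on average per request; tune per deployment


    def make_challenge(client_ip: str) -> str:
        # Bind the challenge to the client's IP plus a random salt,
        # so a solved proof can't be replayed from another address.
        salt = os.urandom(8).hex()
        return f"{client_ip}:{salt}"


    def leading_zero_bits(digest: bytes) -> int:
        # Count leading zero bits of the hash output.
        bits = 0
        for byte in digest:
            if byte == 0:
                bits += 8
                continue
            bits += 8 - byte.bit_length()
            break
        return bits


    def solve(challenge: str, difficulty: int = DIFFICULTY_BITS) -> int:
        # Client side: brute-force a counter until the hash clears the bar.
        # This is the cost that makes one-visit-per-IP crawling expensive.
        for counter in itertools.count():
            digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
            if leading_zero_bits(digest) >= difficulty:
                return counter


    def verify(challenge: str, counter: int, difficulty: int = DIFFICULTY_BITS) -> bool:
        # Server side: verification is a single cheap hash.
        digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
        return leading_zero_bits(digest) >= difficulty
    ```

    The asymmetry (millions of hashes to solve, one hash to verify) is the whole point: negligible per legitimate visitor, but it compounds across the millions of IPs a large crawl uses.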

    • daniskarma@lemmy.dbzer0.com
      30 days ago

      The solution was invented long ago. It’s called a captcha.

      A little bother for legitimate users, but a good captcha is still hard to bypass, even with AI.

      And from the end user’s standpoint, I’d rather lose five seconds on a captcha than have my browser run an unsolicited, heavy crypto challenge.