• Wxnzxn@lemmy.ml · 6 months ago

    That’s what I suspect, too, but my research so far hasn’t settled it. The question I’m still unsure about: is running the model also that costly, or is the really expensive part “just” training it? I wondered because, when I was messing around, generative text models could run on my potato PC with a bit of Python scripting without too much issue, even if not ideally - as long as I had the already-trained model weights downloaded.
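    (For illustration, a minimal sketch of that kind of local, inference-only use. It assumes the Hugging Face transformers library and the small distilgpt2 model; both are illustrative choices, not something the commenter named. The expensive training step has already been done elsewhere; this script only loads the finished weights and generates text.)

        # Minimal sketch: run an already-trained generative text model locally.
        # `transformers` and `distilgpt2` are illustrative assumptions, not
        # anything specified in the thread.
        from transformers import pipeline

        # Loads pretrained weights from the local cache (downloading them once
        # if missing); no training happens here, only inference.
        generator = pipeline("text-generation", model="distilgpt2")

        # Generation is just repeated forward passes, which is why it can limp
        # along on modest CPU-only hardware.
        result = generator("Running a trained model locally is", max_new_tokens=40)
        print(result[0]["generated_text"])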

    • zbyte64@awful.systems · 6 months ago

      Can’t really answer the expense trade-off until you look at concrete use cases - something general AI is allergic to…