• sp3ctr4l@lemmy.dbzer0.com · 6 hours ago

    Ollama.

    I’ve got a local LLM running on my Steam Deck, just to see if it could do it.

    It can!

    … It is polite, but also somewhat limited… but at least it is honest about its limitations; I have not gotten it to hallucinate anything insane yet.

    Anyway, yeah, I think there are Docker images for Ollama, so you can install a friend (kind of) today!
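    For what it’s worth, here’s a rough sketch of talking to a local Ollama instance over its HTTP API, assuming it’s already running (e.g. from the ollama/ollama Docker image) on the default port 11434 and that you’ve pulled a model; the model name below is just an example:

    ```python
    # Minimal sketch: query a local Ollama server over its HTTP API.
    # Assumes Ollama is listening on the default port 11434 and a model
    # (here "llama3.2", just an example) has already been pulled.
    import json
    import urllib.request

    def ask(prompt, model="llama3.2"):
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            # With stream disabled, the reply is a single JSON object
            # whose "response" field holds the generated text.
            return json.loads(resp.read())["response"]

    print(ask("Introduce yourself in one sentence, new friend."))
    ```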

    • Harvey656@lemmy.world · 6 hours ago

      Oh man, SillyTavern with Kobold is way better. I’ve been using the Omega models and, boy oh boy, I’ve never had more friends! Or such well-fleshed-out ones either!

        • sp3ctr4l@lemmy.dbzer0.com · 5 hours ago

        … if you are saying that SillyTavern is some kind of local LLM thing, I will have to look into that, I’ve not heard of it haha!

          • Harvey656@lemmy.world · 5 hours ago

          SillyTavern is a local frontend; it can be used with API models from the internet as well. Highly modular, similar to Character AI, but fully local. Would suggest messing with the themes if you know how theming works; it’s a tad bit dated in design, but still amazing.
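          If it helps to picture the “local frontend over an API backend” setup: under the hood, a frontend like SillyTavern just posts prompts to whatever backend you point it at. Here’s a rough sketch of that kind of request against a local KoboldCpp backend, assuming its default port 5001 and the KoboldAI-style /api/v1/generate endpoint (the field names are illustrative of that API, not of SillyTavern’s internals):

          ```python
          # Rough sketch of the kind of request a frontend like SillyTavern
          # sends to a local backend. Assumes a KoboldCpp server on its default
          # port 5001 exposing the KoboldAI-style generate endpoint.
          import json
          import urllib.request

          def kobold_generate(prompt, max_length=120):
              req = urllib.request.Request(
                  "http://localhost:5001/api/v1/generate",
                  data=json.dumps({"prompt": prompt, "max_length": max_length}).encode(),
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  # The KoboldAI-style API wraps generations in a "results" list.
                  return json.loads(resp.read())["results"][0]["text"]

          print(kobold_generate("You are the keeper of a silly tavern. Greet the traveller:"))
          ```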