

Local models are actually pretty great! They are not great at everything, but for what most people use LLMs for, they do a fine job.

That's from llama3.1:8b, and the answer is decent. It took about 20 seconds to generate and used no more power than playing a video game for the same amount of time.





I want HomeKit compatibility, or something hackable that requires no cloud connection. These companies are not our friends and have betrayed our trust at every step.