  • Hi, hope you don’t mind me giving my two cents.

    Local models are at their most useful in daily life when they scrape data from a reliable factual database or from the internet and then present/discuss that data with you through natural language conversation.

    Think about searching for things on the internet nowadays. Every search provider stuffs ads into the top results and intentionally obfuscates the links you're looking for, especially if it's a no-no term like pirate torrent sites.

    Local LLMs can act as an advanced, generalized RSS reader that automatically fetches articles and sources: send STEM queries to the Wolfram Alpha LLM API and retrieve answers, fetch the weather directly from the OpenWeather API, retrieve definitions and meanings from a local dictionary, pull Wikipedia article pages from a local Kiwix server, or search arXiv directly for prior research. One of Claude's big selling points is its research-mode tool call, which scrapes hundreds of sites to collect up-to-date data on the thing you're researching and presents its findings in a neat, structured way with cited sources. It does in minutes what would traditionally take a human hours or days of manual googling.

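    To make the tool-call idea concrete, here is a minimal Python sketch of the kind of basic API pipeline I mean: the model emits a small JSON blob naming a tool, and a dispatcher runs the matching function. The tool names, the JSON shape, and the stub functions are all made up for illustration (not any particular LLM framework's API); in a real setup the stubs would wrap live calls to things like OpenWeather or a local Kiwix server.

```python
import json

# Hypothetical tool functions, stubbed for illustration.
# A real version of get_weather would call the OpenWeather HTTP API;
# a real define_word would query a local dictionary.
def get_weather(city: str) -> str:
    return f"(stub) forecast for {city}"

def define_word(word: str) -> str:
    return f"(stub) definition of {word}"

# Registry mapping tool names (as the model would emit them) to functions.
TOOLS = {"get_weather": get_weather, "define_word": define_word}

def dispatch(llm_output: str) -> str:
    """Parse a JSON tool call like {"tool": "get_weather", "arg": "Berlin"}
    and run the matching tool; anything else passes through unchanged."""
    try:
        call = json.loads(llm_output)
        return TOOLS[call["tool"]](call["arg"])
    except (json.JSONDecodeError, KeyError, TypeError):
        # Not a recognizable tool call: treat it as a plain answer.
        return llm_output

print(dispatch('{"tool": "get_weather", "arg": "Berlin"}'))
print(dispatch("Just a normal sentence."))
```

    The whole trick is that the model only has to produce the little JSON blob; everything factual comes from the API on the other side of the dispatcher, which is why this pattern scales to dictionaries, Kiwix, arXiv, and so on.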
    There are genuine uses for LLMs if you're a nerdy computer homelab type of person familiar with databases and data handling who can code up/integrate some basic API pipelines. The main challenge is selling these kinds of functions in an easy-to-understand-and-use way to the tech-illiterate, who already think badly of LLMs and the like because of generative slop. A positive future for LLMs integrated into Firefox would be something trained to fetch from your favorite sources and sift out the crap based on your preferences/keywords. More sites would have APIs for direct scraping, and the key-adding process would be a one-click button.