• 6 Posts
  • 1.37K Comments
Joined 2 years ago
Cake day: March 22nd, 2024

  • I mean, Americans have turned blind eyes to roiling problems for decades.

    What feels different is the curtain is pulled back now, at least for the rest of the world and many folks here, because everything turned into Idiocracy.

    Americans may not be doing anything about it, but I also have a lot of family that’s sorta realizing “Oh… Reconstruction failed really hard, didn’t it?” or “Oh… we’re a surveillance state,” and they’re looking back on their lives and seeing some nasty shit (like bigoted aunts/uncles and worse) for what it was.


  • You don’t have to overclock, but I’d at least look at your mobo’s settings. Many mobos apply really, really bad, non-stock settings by default, especially with XMP memory.

    An example: they might default to 1.3V VSOC, which is absolutely a “default overclock.” It will make idle power skyrocket and can make your CPU unstable, because Infinity Fabric doesn’t like that much voltage. For reference, I personally wouldn’t go over 1.2V VSOC, and I’d shoot for more like 1.1V.

    I’d recommend Buildzoid’s videos:

    https://youtu.be/dlYxmRcdLVw

    https://youtu.be/Xcn_nvWGj7U

    And Igor’s Lab for general text info.

    Also, if you don’t at least turn on XMP (aka the RAM’s rated speed), the RAM will run at a slow JEDEC default and hurt your inference speed rather significantly.
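To put rough numbers on that last point (all figures below are illustrative assumptions, not benchmarks): CPU-side LLM decoding is mostly memory-bandwidth bound, so your token-rate ceiling scales almost linearly with RAM speed.

```python
# Rough sketch: why RAM speed (XMP/EXPO vs. a slow default) matters for
# CPU-side LLM inference. All figures here are illustrative assumptions.

def dual_channel_bandwidth_gbs(mt_per_s: float) -> float:
    """Theoretical peak bandwidth for dual-channel DDR5, in GB/s.
    Each channel moves 8 bytes per transfer."""
    return mt_per_s * 8 * 2 / 1000

def tokens_per_s_ceiling(bandwidth_gbs: float, active_weight_gb: float) -> float:
    """Upper bound on decode speed: every generated token has to stream
    the active weights from RAM once."""
    return bandwidth_gbs / active_weight_gb

jedec = dual_channel_bandwidth_gbs(4800)  # a common non-XMP fallback speed
xmp   = dual_channel_bandwidth_gbs(6000)  # a typical XMP/EXPO kit

weights_gb = 16  # e.g. a hypothetical model with ~16 GB of active weights

print(f"4800 MT/s default: {tokens_per_s_ceiling(jedec, weights_gb):.1f} tok/s ceiling")
print(f"6000 MT/s XMP:     {tokens_per_s_ceiling(xmp, weights_gb):.1f} tok/s ceiling")
```

Same machine, same model, roughly a 25% difference in the decode ceiling just from enabling the profile.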





  • Not anymore.

    I can run GLM 4.6 on a Ryzen/single RTX 3090 desktop at 7 tokens/s, and it blows lesser API models away. I can run 14-49Bs (or GLM Air) for more utilitarian cases, and they do just fine.

    And I can reach for free/dirt cheap APIs called locally when needed.

    But again, it’s all ‘special interest tinkerer’ tier. You can’t get that with a one-line ollama run; you have to mess with exotic libraries, tweaked setups, and RAG chains to squeeze out that kind of performance. But all of that getting simplified is inevitable.
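For a sense of why a single 24GB GPU plus system RAM is even viable here, a back-of-envelope sketch (the parameter counts, quant width, and bandwidth are my own rough assumptions for illustration, not measured figures): big MoE models only read a small “active” slice of their weights per token.

```python
# Back-of-envelope for hybrid CPU+GPU MoE inference. Assumed numbers:
# a ~355B-total / ~32B-active MoE at a ~3.5-bit quant, illustrative only.

def quantized_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate in-memory size of a quantized weight set."""
    return params_billions * bits_per_weight / 8

total_gb  = quantized_size_gb(355, 3.5)  # whole model: ~155 GB, way over 24 GB VRAM
active_gb = quantized_size_gb(32, 3.5)   # weights actually read per token: ~14 GB

# Decode is roughly memory-bound: tokens/s ~ bandwidth / active weights.
ram_bandwidth_gbs = 80  # assumed dual-channel DDR5-ish figure
ceiling = ram_bandwidth_gbs / active_gb

print(f"total model: {total_gb:.0f} GB, active per token: {active_gb:.0f} GB")
print(f"CPU-only decode ceiling: ~{ceiling:.1f} tok/s")
# Keeping attention/shared layers resident on the GPU trims the per-token
# RAM traffic, which is how real setups creep up from this ceiling.
```

The point isn’t the exact numbers; it’s that the active set, not the total size, is what has to fit through your memory bus every token.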





  • “Net might get to where you need AI”

    I hate to say it, but we’re basically there, and AI doesn’t help a ton. If the net is slop and trash, there’s not a lot it can do.

    “Hopefully by then they will have figured out a way to make it free.”

    Fortunately, self-hosting is 100% taking off. Getting a (free) local agent to sift through the net’s sludge will be about as easy as tweaking Firefox before long.

    You can already do it. I already do it (and am happy to ramble about how when asked), but it’s more of an enthusiast/tinkerer thing now.
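Since I offered to ramble: the “sift the sludge” stage doesn’t need anything fancy to start. Here’s a toy, purely hypothetical sketch of the filtering step (the marker list and threshold are invented for illustration); a real setup would hand the survivors to a self-hosted model for summarizing.

```python
# Toy sketch of the "sift the sludge" idea: score fetched pages with cheap
# heuristics before anything reaches you (or a local LLM). The marker list
# and threshold are hypothetical, not from any real product.

SLOP_MARKERS = ("you won't believe", "top 10", "sponsored", "as an ai")

def slop_score(text: str) -> float:
    """Fraction of known slop markers present (0.0 = clean, 1.0 = pure slop)."""
    t = text.lower()
    hits = sum(marker in t for marker in SLOP_MARKERS)
    return hits / len(SLOP_MARKERS)

def keep(text: str, threshold: float = 0.25) -> bool:
    """Whether a page is clean enough to pass along."""
    return slop_score(text) < threshold

print(keep("A plain writeup of BIOS VSOC settings."))       # True
print(keep("Top 10 tricks you won't believe! Sponsored."))  # False
```

Swap the keyword check for a call to a local model and you’ve got the skeleton of the agent I’m talking about.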


  • It’s because ‘anti-woke’ is not the same to every sect. You have:

    • Oldschool conservatives like Mike Pence, a dying breed.

    • Tech Bro billionaires like Musk, and their acolytes.

    • ‘Talk radio/podcast’ conservatives like Steve Bannon, the late Rush Limbaugh, Rumble influencers and such.

    • The ‘Great Replacement Theory’ macho style MAGA like JD Vance.

    There’s some weird intersection between them, and some fundamentally incompatible tenets that are finally colliding now that the Democrats are so useless as lightning rods.







  • What you see online and in the news now is a caricature of the worst of America. Normal people here aren’t chronically online (much less professional influencers); algorithmic engagement scams and busted political parties are largely what made them delusional voters.

    That being said, I’m a U.S. southerner and I have seen some shit you wouldn’t believe. Like shit too dramatic for television. Some is finally out in the open.