Chorus
Batman Arkham
Don’t get Mint if you’re getting a remotely capable laptop or plan to game on it. Its so-called ‘modern’ desktop environment (which still defaults to the old X window system) feels awful to use imo, and while the ‘retro’ ones are better, there’s no point in using them on a new laptop. Choose a distro that ships with KDE, GNOME, or a wlroots-based desktop environment.
I’ve also had driver issues with it that didn’t happen with Ubuntu or Arch.
Pretty much every distro has a caveman-compatible installer.
What has nothing to do with systemd? You open the link and, before the introduction even starts, it says the current release isn’t fit for general use because they couldn’t add systemd yet. If they had picked something with systemd, they wouldn’t need to spend so much effort on it.
It always puzzles me why they chose the one distro without systemd to base this on and are now trying to add it themselves.
Also I have thoughts about this:
Move sudo to community
At present, sudo is in the main repository, which requires us to provide security support for 2 years. Upstream sudo does not provide an “LTS” lifecycle, so this requires either performing security upgrades during the maintenance lifecycle, or backporting security fixes by hand.
Benefit to Alpine
Prior to the creation of the security team, there was an unofficial preference to push doas as the preferred pivot tool for Alpine. This reinforces that messaging. Additionally, we do not have to support sudo for a 2 year lifecycle, since there are no LTS branches for it.
How often does sudo have security vulnerabilities that it’s worth moving to a lesser-used tool whose vulnerabilities are less likely to be discovered against your security team’s wishes? What do all the other distros do?
Their CPUs would also be decent if they only made low-end parts.
These are the answers they gave the first time.
Qwencoder is persistent after 6 rerolls.
Anyways, how do I make these use my GPU? The ollama logs say the model will fit into VRAM / offloading all layers, but GPU usage doesn’t change and the CPU gets the load. And regardless of the model size, VRAM usage never changes and RAM only goes up by a couple hundred megabytes. Any advice? (Linux / Nvidia) Edit: it didn’t have CUDA enabled apparently, fixed now.
Last time I tried using a local LLM (about a year ago) it generated only a couple of words per second and the answers were barely relevant. Also I don’t see how a local LLM can fulfill the glorified search engine role that people use LLMs for.
Kind of unrelated, but why does C sometimes fail to print if it hits a breakpoint right after a print while debugging? Or if it segfaults right after, too, iirc.
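My best guess (an assumption, not something confirmed here) is that it’s stdio buffering rather than anything the debugger does: printf only copies the text into a userspace buffer, and it reaches the terminal when that buffer is flushed, so a breakpoint or a segfault right after the call can hit before the flush happens. A minimal sketch of the effect, where the NULL write just stands in for whatever actually crashes:

```c
#include <stdio.h>

int main(void) {
    /* No trailing '\n': on a line-buffered terminal nothing forces a flush,
     * so this text sits in the stdio buffer instead of reaching the screen.
     * Under an IDE/debugger that pipes stdout, the stream is usually fully
     * buffered, which makes the effect even more likely. */
    printf("you may never see this");

    /* Crash (or stop at a breakpoint here) before the buffer is flushed and
     * the message appears lost / not yet visible. */
    int *p = NULL;
    *p = 42;

    return 0;
}
```

Ending the format string with '\n' (when stdout is a terminal, which makes it line-buffered) or calling fflush(stdout) right after the printf makes the text show up before the crash, which is the usual workaround while debugging.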
Does OSM have any bullet vending machines?
Edit: it has two:
https://www.openstreetmap.org/node/12272507646
https://www.openstreetmap.org/node/12118923148
They asked me to put my hands behind my back and all that stuff, and I realized what was going on.
Because she was too dangerous to be cuffed normally, or not cuffed at all?
Also, I hate this doubly for the kid. Your mom getting arrested for your slightest sign of independence will fuck you up.
Maybe to 5 year olds?
Sony has a patent for an input device having two data streams at once
“You can keep operating, whatever you do don’t be accountable to me”
???
But I keep hearing how the American system isn’t democratic since you don’t directly vote for the president; you vote for some middle person who promises to vote for your president? Those people might not be members of parliament, but they can still form coalitions after the fact by voting for whoever has a chance to win.
Why does she need to do this before the election? They can just form a coalition after the election if Kamala doesn’t win.
Ooh I have another one! Steamland, the game of my childhood. RTS with trains. Would be very cool if it had a sequel.