

If you want automatic updates over major versions most images will have the :latest tag for that.


It doesn't actually bypass the firewall.
When you tell Docker to publish a port on 0.0.0.0, it's just doing what you asked of it.
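For instance, in a compose file you can bind the published port to localhost so it's not reachable from the rest of the network (the image name is just an example):

```yaml
services:
  web:
    image: nginx
    ports:
      - "127.0.0.1:8080:80"   # reachable only from the host itself
      # - "8080:80"           # shorthand for 0.0.0.0:8080:80, open to the LAN
```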


I can’t imagine we currently produce enough electricity for every car to be electric.
Plus all the production processes for the cars themselves, and the energy to power them, put off waste heat. Even solar panels benefit from running cooler by having heat removed from them.


It feels like Bazzite tells you a million times over that you absolutely should not layer packages. It scared me off for sure, since I’m new to immutable systems and don’t fully understand how they work.


Depends on what protocols you need.
If you use SMB, install the Samba server package. If you use WebDAV, install a WebDAV server like SFTPGo, etc…
If you want a Google Drive-like replacement there’s Nextcloud, ownCloud, Seafile, and others.
For the drives themselves you can have traditional RAID with MD, or ZFS for more reliability and neat features, or go with MergerFS + SnapRAID, or just directly mount the disks, store files on some, and back up to the others with Restic or something.
Lots of options!
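For the MergerFS + SnapRAID route, the core of it is roughly this (a sketch — disk paths and options are made up, adjust to your layout):

```
# /etc/snapraid.conf -- hypothetical two-data-disk layout
parity /mnt/parity1/snapraid.parity
content /var/snapraid/snapraid.content
content /mnt/disk1/snapraid.content
data d1 /mnt/disk1
data d2 /mnt/disk2

# /etc/fstab -- pool the data disks into one mount with mergerfs
/mnt/disk1:/mnt/disk2  /mnt/pool  fuse.mergerfs  defaults,category.create=mfs  0 0
```

Then run `snapraid sync` on a schedule to keep parity up to date.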


Yeah I guess these days the majority of users have fast enough connections that it’s not worth it. It sucks if you have crappy internet though, hah.


That’s how I describe Jellyfin: it works fine, it’s just inconvenient to use.


Interesting, it wouldn’t work like rsync where it compares the new files to the old ones and transfers the parts that have changed?
A 6GB download is wild. Is it re-downloading the entire package for each one that needs an update? Shouldn’t it be more efficient to download only the changes and patch the existing files?
At this point it seems like my desktop Linux install needs as much space and bandwidth as Windows does.
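The rsync-style idea is roughly: hash fixed-size blocks of the old file, then only transfer the blocks of the new file whose hashes differ. A toy sketch of that (not the real rolling-checksum algorithm, and real tools use KB-sized blocks):

```python
import hashlib

BLOCK = 4  # tiny block size for the demo


def block_hashes(data: bytes) -> list:
    """Hash each fixed-size block of the file."""
    return [hashlib.sha256(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]


def changed_blocks(old: bytes, new: bytes) -> list:
    """Return (block index, bytes) for blocks of `new` that differ from `old`."""
    old_h = block_hashes(old)
    out = []
    for i in range(0, len(new), BLOCK):
        chunk = new[i:i + BLOCK]
        j = i // BLOCK
        if j >= len(old_h) or hashlib.sha256(chunk).digest() != old_h[j]:
            out.append((j, chunk))
    return out


old = b"aaaabbbbccccdddd"
new = b"aaaaXXXXccccdddd"
# only block 1 differs, so only 4 of the 16 bytes need transferring
print(changed_blocks(old, new))  # → [(1, b'XXXX')]
```

The catch with this fixed-offset version is that a single inserted byte shifts every later block; rsync’s rolling checksum is what lets it re-find matching blocks at any offset.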


It doesn’t really graph over time, it only does so while open and loses the data if you close it.


Here’s an actual answer, a system monitor with historical data: https://beszel.dev/
It’s a webUI but that shouldn’t really matter vs an app with its own GUI.


They do process mapping locally, there’s no reason for a remote connection other than remote control outside your LAN and data collection.
My vacuum running Valetudo works fine with no internet connection, mapping and all.


I wonder how big the crossover is between people that let AI run commands for them, and people that don’t have a single reliable backup system in place. Probably pretty large.


The most frustrating part of running Linux for me is that the experience can vary so much from person to person. Slight hardware differences can cause odd bugs that other people don’t have, and solving them can be really time consuming, because a fix that works for one distro or DE may not work on another.
I’m really happy that Bazzite seems to be gaining so much popularity as an actual Windows replacement, because it makes it a lot easier to find fixes for problems when there’s a huge community using the exact same distro.


It’s a Docker Compose deployment, so it should just work on any system with Docker installed. Copy the compose file and the env file if it has one, then run `docker compose up -d` in that directory.
It can collect analytics from multiple places.
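i.e. for a hypothetical project the whole deployment is usually just a file like this:

```yaml
# docker-compose.yml (service name and image are placeholders)
services:
  app:
    image: example/app:latest
    env_file: .env              # only if the project ships one
    ports:
      - "127.0.0.1:3000:3000"   # bind to localhost, put a reverse proxy in front
    restart: unless-stopped
```

followed by `docker compose up -d` in the same directory.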


It does, but it will be really out of date.


Remote backups that you 100% know the credentials for, and have tested to be reliable, are very important.
And don’t just have one backup, have a second one as well, since stuff can go wrong and render a backup unusable without you knowing.
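A cheap way to catch silent corruption is to compare checksums between your two backup copies (a sketch — the mount points are hypothetical, and tools like `restic check` do this properly):

```python
import hashlib
from pathlib import Path


def dir_digests(root: Path) -> dict:
    """Map each file's relative path to its SHA-256 digest."""
    return {p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()}


def mismatches(a: Path, b: Path) -> list:
    """Files missing from one copy, or whose contents differ between the two."""
    da, db = dir_digests(a), dir_digests(b)
    return sorted(p for p in da.keys() | db.keys() if da.get(p) != db.get(p))


# usage (hypothetical mount points for the two backup copies):
# bad = mismatches(Path("/mnt/backup1"), Path("/mnt/backup2"))
# if bad: print("investigate these files:", bad)
```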


There are some fairly in-depth setups to hide the fact that it’s a VM, normally used for testing malware. I wonder if those would fool it.


Might be time to self-host Vaultwarden if you need real DB features like that.


My favorite is ‘fast and lightweight’ followed by ‘RAM required: >500MB’ for some kind of basic server.