This is the best plan I’ve seen yet.
I think I see where you’re coming from. You just want an occasional “incognito” option for posts.
If I wanted to help out another user and share the story of my struggle with genital warts, I’d probably be more comfortable doing that if it wasn’t tied to my previous post history. Pour one out for Ken Bone.
My incognito posts would be subject to the same community standards as normal posts, so if I used the feature to abuse or spam people, my real account would be affected.
I suspect this is not so much a technical hurdle as an ethical one. It comes down to whether you feel you can trust your (unpaid, volunteer) instance admin not to spill the beans about your genital warts, and whether THEY are happy being custodians of potentially sensitive PII. The inconvenience of a throwaway is also its main advantage: it isolates whatever sensitive thing you want to share from both you and the admin.
A group of us discovered the only range of non-firewalled IPs in our university, which belonged to a particular library building. And because this was Windows 95 and you could just change your IP to whatever you wanted, we could connect to Quakeworld with a ridiculously low ping.
It’s a great predictor of how the rest of your interaction with a site will go. If a dev doesn’t have the space and motivation to give a shit about details like this, they’ll cut corners on other shit too. It doesn’t always mean they’re bad developers; more often they’re just rushed.
Summit is pretty raw in places but it’s fast and simple. The developer is putting out updates every few days and actively taking user feedback on c/summit. I’m going to stick with it.
I don’t think (completely wild guess here) AI content crawlers should have any more impact than the dozens and dozens of search spiders that make up most of my own site’s traffic.
The impact was magnified for Twitter because it generates so much new content every second. That wasn’t an issue when Twitter had a nice, properly cached API and it shouldn’t be an issue for fediverse instances going forward because we have RSS and caching and we’re not so stupid as to turn those off. Like, what kind of moron would do that?
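As a rough sketch of the kind of caching that blunts crawler load, a feed endpoint can derive an ETag from its body and answer repeat requests with 304 Not Modified, so revalidation costs almost nothing. This is illustrative only; the function names are hypothetical, not any real server’s API:

```python
import hashlib

def make_etag(body: bytes) -> str:
    # Strong ETag derived from the feed body; changes only when content changes.
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def respond(body: bytes, if_none_match):
    """Return (status, headers, payload) for a cached feed request."""
    etag = make_etag(body)
    headers = {
        "ETag": etag,
        # Let crawlers and CDNs reuse the response for five minutes.
        "Cache-Control": "public, max-age=300",
    }
    if if_none_match == etag:
        return 304, headers, b""  # the crawler already has this version
    return 200, headers, body

# First fetch pays full cost; subsequent revalidations are near-free.
status1, hdrs, payload = respond(b"<rss>...</rss>", None)
status2, _, empty = respond(b"<rss>...</rss>", hdrs["ETag"])
```

A well-behaved crawler sends `If-None-Match` on repeat visits and gets the cheap 304 path; a CDN in front of the instance absorbs most of the rest.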
I trust that you all are normal, level-headed individuals who can disagree but still respect each other.
Not me!
With “fuck you” money, you could purchase more dicks.