Sometimes I’ll run into a baffling issue with a tech product — be it headphones, Google apps like Maps or Search, Apple products, Spotify, other apps, and so on — and when I look for solutions online I sometimes discover it has been an issue for years. Sometimes for many, many years.
These tech companies are sometimes ENORMOUS. How is it that these issues persist? Why do some things end up being so inefficient, unintuitive, or clunky? Why do I catch myself saying “oh my dear fucking lord” under my breath so often when I use tech?
Are there no employees who check forums? Does the architecture become so huge and messy that something seemingly simple is actually super hard to fix? Do these companies not have teams that test this stuff?
Why is it so pervasive? And why does some of it seem to be ignored for literal years? Sometimes even a decade!
Is it all due to enshittification? Do they trap us in as users and then stop giving a shit? Or is there more to it than that?
Aside from the effort required, which others have mentioned, there’s also an effect of capitalism.
For a lot of their tech, they have a near-monopoly or at least a very large market share. Take Windows from Microsoft. What motivation would they have to fix bugs that impact even 5-10% of their userbase? Their only competition is Linux, with its roughly 4(?)% market share, and macOS, which requires expensive hardware. Not fixing the bug just makes people annoyed, but 90% won’t leave because they can’t. As long as it doesn’t impact enterprise contracts, it’s not worth fixing, because the time spent doing that is a loss for shareholders. Meanwhile, new features that can collect sellable data (like Copilot, for example) generate money.
I’m sure the devs in most places want to make better products and fight management for more time so the features they deliver can be better quality — but it’s an exhausting, sharp uphill battle that never ends. At the end of the day, the person who shipped the broken feature with Data Collector 9000 built in will probably get the promotion, while the person who fixed 800 five-plus-year-old bugs gets a shout-out on a Zoom call.
I’m not sure Windows is a good example here since they’re historically well known for backwards compatibility and fixing obscure bugs for specific hardware.
Whereas Linux has famously always had driver support issues.
Backwards compatibility - yes I agree, it’s quite good at it.
Hardware-specific issues for any OS — disagree. For Windows, that’s 80-90% handled by the hardware manufacturers’ drivers; whether issues get fixed or not isn’t down to an effort from Microsoft. For Linux it’s usually an effort of the maintainers, and if anything, Linux is famous for supporting old hardware that Windows no longer works with.
But the point I was making is not that Linux or macOS is better than Windows or vice versa; it’s that Windows holds by far the largest market share on desktops and neither of the alternatives is really a drop-in replacement. So in the end there’s no pressure on them to improve UX, since switching OS is infeasible for the majority of their users at the moment.
In terms of driver development, it’s more collaborative than just Microsoft releasing an API and the manufacturers creating the drivers. Bug reports from large manufacturers are absolutely taken seriously. Customers usually never see this interaction.
I would say roughly 1990-2010 was Windows’ truly dominant era, and that was also when they were actively improving, but now they have plenty of competition. It’s just that people have moved away from desktops. Mobile platforms are now an existential threat, and fewer and fewer people are buying desktops to begin with. Maybe Microsoft has given up on desktops, and that’s why they’re making Windows worse.