Good grief! The word is excluded. Holy shit.
Linux didn’t exist when I was 12. 😑
I think that being forced to learn about WINE at a young age may have been beneficial actually (if extremely unpleasant)
So unpleasant.
That, of course, raises a very interesting scientific research question, which I took the liberty of looking into in the scientific literature:
“Does early exposure to different operating systems (macOS vs. Windows) correlate with differences in technological literacy and general problem-solving abilities among children and adolescents?”
The available research does not provide conclusive evidence that early exposure to different operating systems directly correlates with differences in technological literacy or problem-solving abilities among children and adolescents.
While studies reveal some interesting distinctions, the evidence is limited. Ronaldo Muyu et al. (2022) found Windows is more popular among university students (84.61% vs. 11.38% for macOS), suggesting potential usage differences. Shahid I. Ali et al. (2019) found no significant competency differences between Mac and Windows users in Excel skills. Cem Topcuoglu et al. (2024) noted that users’ perceptions of operating systems are often based on reputation rather than technical understanding.
Interestingly, Bijou Yang et al. (2003) found Mac users had significantly greater computer anxiety, which might indirectly impact technological literacy.
More targeted research is needed to definitively answer this question, particularly studies focusing on children and adolescents.
I think early exposure to several different OS’s means you’re at least not too poor, and lack of money does correlate a lot with illiteracy of all sorts.
I think you misunderstand: the question is not about exposure to different OSes, but about the correlation/causation of a given OS to later cognitive (and other) abilities. Please do apply adequate scientific rigor here!
So I started with a DOS machine that my dad had at work. Then my school got a few Apple Macs in the library, so I played Oregon Trail on the green screen. Then the first computer we had at home that I was able to spend hours on ran Windows 3.1.
For some reason, Eternity showed this image until I clicked on the post lol
What about people who started on DOS?
Or AmigaOS?
Or Basic 2.0?
They are either database administrators or completely oblivious to modern technology
*Reads comments in thread*
I started with a pair of matchsticks and a trenchcoat that I got at Gallipoli in WW1, using the phosphorus I found in the Bosphorus to craft a makeshift TI calculator based on specs I got via fax from a samurai. I ran Slackware on my slacks until we defeated the Ottomans, but they unleashed their Puppy Linuxes on us, and we stood no chance.
First computer I used was DOS.
Also DOS. The single button on the Mac mouse was a whole new way of using a computer.
Mine had 3.1 on it, but most of the games had to be run through the DOS prompt.
Ummm how do kids turn out if you install Linux Mint on a cheap laptop and give it to them to screw around with? Asking for a friend.
It leads the kid to Arch. I hope you’re prepared to always hear “I use Arch, btw.”
I’ll let you know in 10 years.
Nice… I mean, I gave my 7yo (at the time) a computer we put Mint on. He is 9 now, so by 19 I think we will see how it has changed his skill level vs. the gen pop.
BAAAABE, I WANT A KID.
My cousin became an IT tech. I set her up with Ubuntu on a cheap desktop when she was about 12.
My 8 and 9 year old kids use Xubuntu on a 2013 MacBook Air. They use it for writing stories, making a lot of pixel art with Piko Pixel, and some code-block-style programming with Lego Spike. They are learning about multi-user systems, file management, etc. I’m keeping an eye out for a cheap PC that can run Minecraft (lots of those right now, since people are just trashing old Windows 10 machines) because the older kid wants to learn how to make Minecraft mods.
What operating system was on that Atari with a keyboard that you could plug 2800 carts into?
Atari OS, which could only be used to access the floppy drive; Atari DOS could be booted from a floppy disk. (I never used one of these machines, I just skimmed the Wikipedia article on Atari 8-bit computers.)
That was also AutistOS
“discluded”
🤣
De-un-cluded even
I thought so too, but turns out it is a word, even if it might be misused here: https://english.stackexchange.com/questions/129015/is-disclude-a-word-and-what-authority-says-a-word-is-a-word-or-isnt
Notice that ain’t a dictionary.
We do not wish to exclude the population because it would preclude comparative analysis, but we wish to disclude them from this study in order to conclude the initial hypothesis.
In disclusion I should probably learn my engrish.
Now include perclude and reclude! (Ok, I’m afraid English forgot to loot the last two from Latin’s pockets, after she robbed her in a dark alleyway)
Not to intentionally interclude, but perclude and reclude seem to have seclude from English.
I think that when you started matters a lot.
I’ve seen so many people on the “Only Millennials know how to use computers” train, kinda forgetting how many of that cohort didn’t get their hands on a computer until that first generation of Apples and Dells ended up in resale shops or on eBay at deep discounts.
So many folks who see kids on touch screens and throw fits, because that’s not how a “real computer” works, were throwing fits at their parents ten years ago for not understanding how intuitive a touch screen is.
Feels like it’s all an excuse for people to get mad at one another, while occluding the simple fact that using a thing for a long time gives you more experience with the thing.
Yes, people keep finding ways to put others down in order to feel superior. It’s called being a bully. When everything was “blame and shame millennials for this”, there was a section of us millennials that swore we’d break the cycle of generational blaming. Now it’s all about blaming and shaming Gen-Z, because that shit gets clicks. Apparently being a bully never really goes out of style.
It’s not really so much the form factor of the hardware. I think it’s more to do with the increasing complexity of the apps and how they’re designed to hide a lot of what goes on behind the scenes. Think about how the earliest versions of Android didn’t even come with a basic file browser, for example.
It’s the overall push to turn computers into single-use appliances, rather than general purpose devices.
> Think about how the earliest versions of Android didn’t even come with a basic file browser, for example.
They didn’t offer an official app, but the Google Store was flooded with 3rd party alternatives practically the day the OS was released.
Even then, knowing what an “App Store” is and how/why you’d use it is a skill more common among younger users. My mother, who happily goes on her laptop and installs all sorts of garbage, had no idea how to add an app to her phone. My younger coworkers are much more comfortable working through Citrix and other Cloud Services, because they don’t expect a file system to be living under every app they use.
> It’s the overall push to turn computers into single-use appliances, rather than general purpose devices.
I more felt that the phone was becoming a kind of mono-device or universal remote, with a lot of the IoT tech trying to turn it into an interface for every other kind of physical appliance. If anything, I feel like the phone does too much. As a result, its interface has to become more digital and generic and uniform in a way that makes using distinct applications and operations confusingly vague.
But growing up in an analog world has definitely tilted my expectations. Younger people seem perfectly fine with syncing their phones to anything with a receiver or RFID tag. And the particularly savvy ones seem interested in exploiting the fact that everything has a frequency. I’ve met more than a few kids who have fucked around with the Flipper and other wireless receiver gadgets.
Absolutely. But I don’t think it’s crucial. If you test a bunch of 30 year olds on tech literacy and one started using a computer at 29, he will perform badly. But if you test a 12 year old who has had a PC for 2 years and a 30 year old who has had a PC for 2 years, it becomes so irrelevant that sheer interest in the topic will determine the outcome. Though children do of course find everything interesting.
I think the reason we have the perception of children learning fast is focus. They have unique abilities with their new little unstuffed heads, while a grown-up will worry about not understanding, be thinking about something else entirely, not have time, etc.
I mean younger brains do have more neuroplasticity and other factors, hence it’s easier for children to learn more languages than adults. I assume this applies to more than just language.
> it’s easier for children to learn more languages than adults
Kids are also assumed to operate at a child’s language level. So an 8-year-old speaking both English and Spanish at the 1st grade level is impressive. But a 20-year-old speaking at a 1st grade level is considered remedial.
Even then, there’s a lot to be said for experience. Computers and languages alike benefit from years of exposure. A large English vocabulary will help you pick up Spanish faster. And many years of experience on an Apple will clue you into tricks a naive Windows/Linux user would never consider.
I remember my dad trying to limit my screen time by putting a password lock on the screen saver. He was shocked to discover that an eight year old figured out how to evade it by… restarting the computer. But then he enabled password on restart and got cagey when typing it in, and that slowed my Hackerman attempts down significantly.
Kids tend to learn basic things faster. But they lack the breadth of experience to recall and apply strategies and patterns they’ve accrued over a lifetime. So much of what we consider “smart” versus “dumb” in problem solving is just “how many times have you already seen the answer to this question applied successfully?” Figuring something out for the first time is always harder than applying the heuristic you’ve been using half your life.
You raise really good points, but I’d want to add that abstract thought and the general ability to “think around corners” are at least as important (if not more so the higher you go) as experience insofar as problem solving and “smarts” go.
They do have more neuroplasticity. But we have to define what that means and where the phenomenon comes from. Most just assume that younger equals better. This is not the case. You can even keep the neuroplasticity you had as a child. One of the defining characteristics of neuroplasticity is the ability to adapt to new views. Since children have no views yet, they have no conflicting views either, which makes acceptance easy. This is not the case with adults. But you can use that knowledge to essentially undo your conflicting nature and increase neuroplasticity immensely. There is a cutoff, and you will have a drop in potential for neuroplasticity as you age, but it doesn’t happen in your childhood years. If you’re interested, I remember reading a study on life-long meditation and Alzheimer’s; maybe you can find it. That’s just one technique for increasing neuroplasticity. I didn’t want to mention this part, as it goes against common knowledge. But common knowledge is rarely correct.
Source: I have had to deal with significant loss of neural function through mental illness and have read up a lot about this topic to better myself.
I meant more that it makes a difference if your first computer was a Macintosh SE vs a MacBook Air.
Yeah, my thought exactly. I was raised on Macintosh, and it’s completely different from the current Apple product experience.
Ah, that’s true, and I misunderstood you then. Though it’s hard for me to see how that implies tech literacy in a total sense: my grandpa has had a PC since the 20th century, and while he’s tech literate for his generation, he is really not comparable to gen Y through alpha. I would also wonder how this compares to devs, since most of them grow up relatively the same. PC in front of them, seeing code, monkey see monkey do, wam bam bap, software developer. I am in my 20s, so I do not have first-hand knowledge of the first personal computers.
Agreed, I think it’s the main thing. My parents at the very least were firm believers in using computers from an early age, so as far back as I could remember I had my own PowerMac G3. With the rad blue monitor and round mouse.
I started on a Mac from Apple’s bad days. The school computers were Windows, and it felt like all the other kids had Windows computers at home. I think feeling like I was at a disadvantage probably had an effect on me that led me to Linux. Also, the second family computer ran Windows ME, so…
I started out with old Macs running System 7, and it was great. I had several good games installed from floppy disks and found some great shareware games online when we got our first modem and internet access.
The majority of people I know who have major computer problems solve them by buying another computer
I’m not even that tech illiterate, but I almost did that… My laptop was being slow, and I still had like €4k in overtime hours at work that I could spend on hardware (it’s a great deal because I have to pay neither VAT on the hardware nor income tax on the money from the overtime), so I was like, eh, might as well get a new laptop.
So then I read up on what laptop brands are out there, found out about Framework, and when I excitedly told my electrical engineer husband about it he was like “You knooow that you can easily replace parts in any laptop, right?”
Well, I didn’t know that (just kinda assumed laptops were more like phones than they are like desktop PCs), so I ended up just ordering a new SSD and new RAM for my laptop. It’s back to being butter smooth, but I have a hunch that cleaning the dust from the fans while I was in there was a very large factor in that haha
(photo: the inside of the laptop, fans caked in dust)
I used to work at a locally run computer store, and one of the biggest upgrades for most people was going from a mechanical hard drive to an SSD. Made a night and day difference.
Ooh, totally! I did have an SSD in there before, but it was only 256GB, so I had to store most files on the HDD and be extremely selective about what to install to C:. Going up to 8TB felt very liberating; I no longer have to fear that an npm install might crash my whole machine! (At least not due to space constraints; npm will figure out how to crash it for other reasons.)
If the crashing stopped once you replaced the SSD, the old SSD was probably end-of-life. SSDs wear down with each write, and when they reach their terabytes-written (TBW) limit they can start crashing the system during reads and writes. The smaller the SSD, the lower the TBW limit; 256GB drives are on the low end, so it’s not surprising you reached it.
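If you want to see how worn a drive actually is, smartmontools can tell you. A minimal sketch, assuming an NVMe drive, smartmontools 7.0+ (for JSON output), root privileges, and a hypothetical device path:

```python
import json
import subprocess

def ssd_wear(device: str) -> None:
    # -j: JSON output, -A: health/vendor attributes
    out = subprocess.run(
        ["smartctl", "-j", "-A", device],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(out.stdout)
    # NVMe drives expose a standardized health log; "percentage_used"
    # is the drive's own estimate of how much of its rated endurance
    # (that TBW limit) has been consumed. Data units are 512,000 bytes.
    log = data["nvme_smart_health_information_log"]
    written_tb = log["data_units_written"] * 512_000 / 1e12
    print(f"{device}: {log['percentage_used']}% of rated life used, "
          f"~{written_tb:.1f} TB written")

ssd_wear("/dev/nvme0")  # adjust to your drive
```

For SATA drives the attribute names vary by vendor, so you’d have to read the "ata_smart_attributes" table instead.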
That might be the case too! I do believe it was more of a skill issue in my case, because I was booting Linux Mint from a 40GB partition (couldn’t free any more space than that on the old SSD) and enabled too many system backups (they recommended 2 daily and 2 on boot, and I just followed the recommendation without thinking about the space implications). Those alone put me at around 35-38GB of used space, and an npm install is usually around 1GB, but log and temp files can sometimes balloon when things go wrong. So it wasn’t really a crash per se, just Mint’s “shut down the system when you run out of storage space” protection triggering haha
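The cheap way to keep that protection from ever triggering is to look before you install. A tiny sketch using only Python’s standard library (the 2 GiB threshold is just an arbitrary number I picked):

```python
import shutil

def enough_space(path: str = "/", need_gib: float = 2.0) -> bool:
    # disk_usage reports on the filesystem that `path` lives on,
    # so point it at wherever npm is going to write.
    free_gib = shutil.disk_usage(path).free / 1024**3
    print(f"{free_gib:.1f} GiB free on the filesystem holding {path!r}")
    return free_gib >= need_gib

if not enough_space("/"):
    print("Low on space: prune old snapshots/logs before `npm install`.")
```

Old Timeshift snapshots and stale logs are usually the first things worth pruning when that warns you.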
I’ve been pretty much upgrading my own desktop PC regularly since the 90s (though I did buy a brand new one 6 years ago).
In my experience the upgrade most likely to give the biggest improvement for the least money is RAM, then a graphics card if you’re a gamer.
Upgrading the CPU has always been something that happens less often, and it doesn’t help that the CPU can only be upgraded up to a point without having to replace the motherboard (which then forces replacing the RAM and possibly even the case).
However, there were two transition periods where the best upgrade by far was something else: the first was back when hardware 3D accelerator boards were invented (Quake with a 3dfx was night and day compared to software rendering), and the other was the transition from HDD to SSD; both were massive jumps in performance.
I see you used to have an HDD in there. That alone would’ve made it painfully slow, especially in Windows, but even with Linux.
Now it should stay fast for longer.
I mean, asterisk. Most laptops let you swap the storage and RAM and many let you swap the battery. Beyond that it usually gets difficult.
Framework lets you swap everything, which is a major difference. But of course you pay for that privilege; modular design has its costs.
Still, good on you for getting a cheap upgrade. No need to throw away a perfectly good laptop if you can make it work fast again with a new SSD.
> Framework lets you swap everything
I think there’s still a pretty big asterisk on that, because laptop parts are generally not built to be swappable… So I don’t think you can swap the CPU without the rest of the mainboard, and some parts like the CPU cooler are probably tied to the specific variant of mainboard and need to be swapped together if you want to switch CPUs.
They do let you swap out parts that are reasonably swappable, so it’s pretty much a guarantee you’ll be able to upgrade storage and memory, and even where you can’t swap to different parts they make sure you can replace broken parts more granularly, so it still seems like a good deal.
The logic board has the CPU built in, that’s true. However, the Framework 16 has a swappable GPU, and all models make the ports independent of the logic board through a USB-C-based expansion module system. So that’s even a few parts other manufacturers might consider unreasonable.
(Also, to be fair, I forgot one other thing most laptops let you swap: The WiFi/BT card, if only because it’s cheaper to have that on a swappable module.)
Wow, that’s an amazing amount of dust. I think that’s the most I have seen in a computer, and my only source of laptops used to be old machines from recycling centers.
Could be explained by the fact that my favorite position to program is on my bed, like a teenage girl from a mediocre 2000’s movie writing in her diary. The laptop fans get a taste of all that good good bed sheet fiber.
My back hurts thinking about this.
I use mine on a sofa (for gaming, even!), but I get around the issue a bit by having a pad under the laptop. It’s literally just a hard plastic board with a beanbag attached underneath, I think I got it from IKEA. It isolates the laptop a bit from dust and improves airflow + lets it heat up without burning my knees + the one I have is just large enough that I can also use my wireless mouse on it when I push my laptop to the left.
Ah I see you keep your laptop well fed. I try to keep mine anorexic lol
If you’re a half-decent person you dodged a bullet there:
https://www.theregister.com/2025/10/14/framework_linux_controversy/
I’ve told those kinds of people how easily I could format/reinstall the OS, and they looked at me like some kind of lunatic witch doctor.
That’s why I will nab computers out of the trash if I see 'em. Most of them still have perfectly functional modern parts. A lot of the time, the only thing that even needs replacing is the PSU.