• python@lemmy.world · 14 points · 1 day ago

    Ooh, totally! I did have an SSD in there before, but it was only 256GB, so I had to store most files on the HDD and be extremely selective about what I installed to C:. Going up to 8TB felt very liberating; I no longer have to fear that an npm install might crash my whole machine! (At least not due to space constraints. npm will figure out how to crash it for other reasons.)

    • SkunkWorkz@lemmy.world · 7 points · edited · 1 day ago

      If the crashing stopped after replacing the SSD, the old SSD was probably at end-of-life. SSDs wear down a little with each write, and once they pass their terabytes-written (TBW) limit they can start crashing the system during reads and writes. The smaller the SSD, the lower its TBW limit; 256GB drives are on the low end, so it's not surprising you reached it.
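      If you still have the old drive plugged in, you can estimate how much has been written to it from its SMART data (needs smartmontools; the attribute name varies by vendor). A rough sketch, with an assumed example LBA count and the typical 512-byte logical sector size:

      ```shell
      # smartctl -A /dev/sda | grep -i total_lbas_written   # read the real value (device path assumed)
      lbas=1953525168   # example value only; substitute your drive's Total_LBAs_Written
      sector=512        # bytes per logical sector (typical; check your drive)
      tb_written=$(( lbas * sector / 1000000000000 ))
      echo "${tb_written} TB written"
      ```

      Compare that against the TBW rating on the drive's spec sheet; small 256GB consumer drives are often rated for only ~100-150 TBW.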

      • python@lemmy.world · 3 points · 1 day ago

        That might be the case too! I do believe it was more of a skill issue in my case, because I was booting Linux Mint from a 40GB partition (couldn’t free any more space than that on the old SSD) and enabled too many system backups (they recommended 2 daily and 2 on boot, and I just followed the recommendation without thinking about the space implications). Those alone put me at around 35-38GB of used space, and an npm install is usually around 1 GB, but log and temp files can sometimes balloon when things go wrong. So it wasn’t really a crash per se, just Mint’s “shut down the system when you run out of storage space” protection triggering haha
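        The rough space math from those numbers (taking ~36 GB as a midpoint for the snapshots) shows how little headroom was left before anything else even ran:

        ```shell
        # Back-of-envelope check on the 40 GB partition described above
        part_gb=40        # total partition size
        snapshots_gb=36   # ~35-38 GB used by system backups (midpoint assumed)
        npm_gb=1          # a typical npm install
        free_gb=$(( part_gb - snapshots_gb - npm_gb ))
        echo "~${free_gb} GB headroom left"
        ```

        A couple of GB of logs or temp files is enough to wipe that out, which matches the shutdown protection kicking in.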