So, as a software engineer who has also used Linux for decades, I get what you’re saying, but the simple fact is that Apple stuff tends to be way more rock-solid reliable for “normal users” (browsing, email, etc. - basically, UI- and human-focused tasks) simply because they have vertically integrated everything.
That’s why their stuff “just works” pretty much always for simple activities: when you control the chip architecture, instruction set, system hardware and integration, OS, app code, and everything else I forgot to mention, you can do some really cool and hacky things to make the user experience incredible - things that cross boundaries a fully black-boxed design (that is, one that strictly follows the hardware specs and doesn’t rely on nonstandard tricks or end-runs around normal interfaces) likely wouldn’t.
I get why people do it. I just hate the proposition of throwing out a perfectly good computer that’s potentially upgradable and certainly more repairable than a Mac.
Ask anyone who had their Mac break, and the answer is usually “it can’t be fixed, get a new one.” Their hardware feels nice, but reducing e-waste is a high priority in my book. MacBooks in particular don’t have a great track record for longevity when heavily used, and neither do most cheap laptops.
An enterprise computer designed to be repaired will always be a better option for professionals and individuals alike, but even better is one you already own.