Most of Apple’s history, actually.
Macs have a reputation for being expensive because people compare the cheapest Mac to the cheapest PC, or to a custom-built PC. That’s reasonable if the cheapest PC meets your needs or if you’re into building your own PC, but if you compare a similarly-equipped name-brand PC, the numbers shift a LOT.
From the G3-G5 era ('97-2006) through most of the Intel era (2006-2020), if you went to Dell or HP and configured a machine to match Apple’s specs as closely as possible, you’d find the Macs were almost never much more expensive, and often cheaper. I say this as someone who routinely did such comparisons as part of their job. There were some notable exceptions, like most of the Intel MacBook Air models (they ranged from “okay” to “so bad it feels like a personal insult”), but that was never the rule. Even in the early-to-mid '90s, while Apple’s own hardware was grossly overpriced, you could buy Mac clones for much less (clones were licensed third parties that built Macs, and they were far and away the best value in the pre-G3 PowerPC era).
Macs also historically have a lower total cost of ownership, factoring in lifespan (cheap PCs fail frequently), support costs, etc. One of the most recent and extensive analyses of this I know of comes from IBM. See https://www.computerworld.com/article/1666267/ibm-mac-users-are-happier-and-more-productive.html
Toward the tail end of the Intel era, say 2016-2020, Apple put out some real garbage, e.g. the butterfly keyboards and the aforementioned craptastic Airs. But historically those are the exceptions, not the rule.
As for the “does more”, well, that’s debatable. Considering this is using Apple’s 90s logo, I think it’s pretty fair. Compare System 7 (released in '91) to Windows 3.1 (released in '92), and there is no contest. Windows was shit. This was generally true up until the 2000s, when the first few versions of OS X were half-baked and Apple was only just exiting its “beleaguered” period, and the mainstream press kept ringing the death knell. Windows lagged behind its competition by at least a few years up until Microsoft successfully killed or sufficiently hampered all that competition. I don’t think you can make an honest argument in favor of Windows compared to any of its contemporaries in the 90s (e.g. Macintosh, OS/2, BeOS) that doesn’t boil down to “we’re used to it” or “we’re locked in”.
Windows did a few vital things that Apple failed at miserably in the '90s.
The Mac dropped support for legacy software and hardware with every new OS release in the '90s, while Microsoft maintained backward compatibility. That was a major reason Windows was more resource-intensive and had more bugs, but it was a smart move, because Windows could handle more software and hardware than the Mac could. It’s the top reason Windows demolished the Mac in sales.
Microsoft’s business model also allowed a greater range of price points. Most users at home or in business don’t need even the capabilities of the lowest-priced Mac model. You don’t need much to check e-mail, browse the web, and do some basic word processing. Apple didn’t serve that largest segment of the market at all.
Windows also benefited from not being tied to the hardware. If you could slap together a bunch of parts and swap a few dozen floppies, you had a Windows machine, which meant there were a ton of companies making Windows machines for cheaper than Apple could make Macs.
Apple tried to allow clones, but ran into the same problem because the clone makers could make cheaper machines by slapping together parts.
It’s a shame that they won’t just release macOS as a standalone product, even if it requires specific hardware to run. I would pay for it in a heartbeat.
I was actively into the Hackintosh scene in the early 2010s. You could have an insanely powerful build (although the parts had to be compatible), and it would still be half the price of a lower-end Mac Pro.
Apple is fundamentally a hardware company that uses features, workflows, and integrations to keep people buying hardware.
They’re never going to do something that undercuts hardware sales again.
I see it as: Apple is a full vertical-stack company that doesn’t want to share its vertical stack, because doing so cuts into their profit.
It’s what Nvidia is trying to do, and if Windows on Arm takes off, I bet Nvidia is ready to attempt to remove all competition on Windows, given how reliant some sectors of the industry are on Nvidia hardware.
Yeah, this is exactly what happened, although some of the clone brands were perfectly high-quality (Power Computing in particular made great machines, usually the fastest on the market). In the Mac community at the time, a lot of people (myself included) wished Apple would just exit the hardware business and focus on what they were good at: software.
Then Steve Jobs came back and did exactly the opposite of that. First order of business was to kill cloning. Then came the iPod.
To be fair, the next generation of Power Macs after that were about half the price of the previous gen.
Prior to Steve coming back, Apple had a ton of different product lines. You had three or four models of Performa, two different lines of Power Macs, three different PowerBooks, and even some servers. Maintaining all of these wasted a ton of effort and resources.
Steve divided the segments into four quadrants on two axes: Portable vs. Desktop and Consumer vs. Professional. I think if they’d started by simplifying their product line, there might still be a market for the clones.
Totally agree. Their product line was an absolute mess back then. Their current lineup is getting a little bloated too. I don’t know why they bother having two laptop product lines anymore when they are so similar.
My graphic design teacher in high school had a PowerPC and that thing was awesome.