I think it’s that you need to be able to throw parallel processing at a lot of RAM. If you want to do that on a PC, you need a GPU with a lot of VRAM, and large VRAM capacities are only sold as part of a beefy high-end GPU. You can’t buy a midrange GPU and duct tape DIMMs onto it.
The Apple Silicon architecture has an OK GPU in it, but because it shares a unified memory pool with the CPU, essentially all the RAM in the system is usable as GPU RAM. So Apple Silicon Macs can really punch above their weight for AI applications, because they can throw a lot more RAM at the problem.
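To see why RAM capacity dominates here, a rough back-of-the-envelope sketch (the function name and figures are illustrative, not from any specific library): the weights of a large language model alone need parameter-count × bytes-per-parameter of memory, before counting the KV cache and activations.

```python
def model_ram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough RAM (in GB) needed just to hold the model weights.

    Ignores KV cache, activations, and framework overhead, so real
    usage is higher. bytes_per_param: 2 for fp16, 0.5 for 4-bit quant.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

# A 70B-parameter model at 16-bit precision:
print(model_ram_gb(70, 2))    # 140.0 GB -- far beyond any consumer card's VRAM
# The same model quantized to 4 bits:
print(model_ram_gb(70, 0.5))  # 35.0 GB -- won't fit on a 24 GB GPU,
                              # but fits in a 64 GB unified-memory Mac
```

The point of the sketch: even aggressively quantized large models blow past the 16–24 GB ceiling of consumer GPUs, while a Mac with 64 GB or 128 GB of unified memory can hold them entirely in GPU-addressable RAM.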