I mean, it’s all running on general-purpose hardware. If we decoded 4K video on general-purpose hardware, we’d use more power than every AI company put together. But once 4K became popular, we developed chips capable of decoding it at the hardware level that consume barely any power.
And the exact same thing is starting to happen with dedicated machine learning chips / NPUs.