I remember seeing 3GHz CPUs 20 years ago already. Isn’t technology supposed to improve fast?
I remember when chips first hit 1GHz around 1999. Tech magazines were claiming that we’d hit 7GHz in 5 years.
What they failed to predict is that you start running into major heat issues if you try to go past ~3GHz. Which is why CPU manufacturers started focusing on other ways to improve performance, such as multiple cores and better memory management.
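The heat wall comes from how dynamic power scales with clock speed. A back-of-envelope sketch (using the standard CMOS approximation that power goes as C·V²·f, with voltage rising roughly in proportion to frequency, so power grows roughly as f³ — a simplification, not a real CPU power model):

```python
# Rough model (assumption): dynamic power of CMOS logic scales as
# P ~ C * V^2 * f, and supply voltage must rise roughly with frequency,
# so power grows roughly with the cube of the clock speed.

def relative_power(freq_ghz, base_ghz=1.0):
    """Power draw relative to a 1 GHz baseline, under the P ~ f^3 approximation."""
    return (freq_ghz / base_ghz) ** 3

for f in (1.0, 2.0, 3.0, 7.0):
    print(f"{f} GHz -> ~{relative_power(f):.0f}x baseline power")
```

Under this toy model, a 3GHz chip already dissipates on the order of 27 times the power of a 1GHz one, and the predicted 7GHz part would need hundreds of times the baseline, which is why the magazines' extrapolation fell apart.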
Just use the heat to power the machine.
Yeah, that’s how it works.
And it has. The phone you have is faster than the 3GHz chip back then. A phone powered by a battery. And faster by like 20 times.
My dad had one of the first consumer 3GHz chips available. By the time I inherited it in 2009 it was completely outclassed by a <2GHz dual-core laptop.
Clock speed isn’t improving that quickly anymore. Other aspects have been improving faster instead: power efficiency, memory speeds, cache sizes, instructions that take fewer cycles, and more cores.
That would’ve been a single 3GHz CPU core. Now we have dozens in one chip. Also, the instruction sets and microcode have gotten way better since then as well.
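That shift from one fast core to many cores is easy to see from user code. A toy sketch using Python's standard library on a made-up CPU-bound workload (the prime-counting job is just an illustrative stand-in): independent chunks of work get handed to one worker process per core, which is where chips now get their throughput instead of from higher clocks.

```python
# Toy sketch: spreading independent CPU-bound work across all cores
# using only the standard library. The workload here is arbitrary.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Naive prime count below `limit` -- deliberately CPU-bound."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [20_000] * 8  # eight independent chunks of work
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        results = list(pool.map(count_primes, chunks))
    print(sum(results))
```

On a single core the chunks run one after another; on a modern many-core chip they run side by side, so wall-clock time drops roughly with core count even though no individual core got faster.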
We’re running into hard physical limits now. The transistors in each chip are so small that any smaller and they’d start running into quantum effects that would render them unreliable.