I have heard people say that the operating frequency of CPUs has not been increasing much in recent years because, as a chip runs at a higher frequency, it generates more heat and consumes more power, making it impractical to push clock speeds beyond a certain point. Similarly, supporters of the new high-bandwidth memory (HBM) claim it is essential because DDR memory will consume ever more power as its data rates increase, making it impractical beyond a certain point.
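For context, the relation I usually see quoted for the dynamic switching power of CMOS logic is the one below; if I have it right, power grows linearly with clock frequency at a fixed supply voltage (the symbols here are just my own labels: α for activity factor, C for switched capacitance, V for supply voltage, f for clock frequency):

$$P_{\text{dyn}} \approx \alpha \, C \, V^{2} \, f$$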
However, I have long been under the impression that, as integrated circuits advance, they generate less heat and consume less power (which I understood to be a key aspect of Moore's law), so why is the opposite apparently happening? Could someone please explain this to me? Thank you very much.