Recent Intel systems unfortunately often push higher power to "beat" AMD on performance metrics. Intel's single-core performance is still higher and likely will stay that way, but the multicore performance and efficiency of Zen 4 is generally similar to, if not better than, Intel 13th gen. 14th gen helps efficiency but is barely available. Oh, and the AMD iGPU is quite performant, much better than what 13th gen and lower Intel parts offer.
Btw, as someone with a Skylake laptop that also used to sip power, I suspect there's been a mild across-the-board power increase, especially as newer chips clock much higher. My Ryzen 7 IIRC boosts up to 5.1 GHz and is noticeably faster (I'm at 392 tabs in Edge right now) than my Skylake. I suspect your older laptop wouldn't clock that high, and a 3 GHz-limited Intel/AMD chip would have great battery life.
Colder and hotter are temperatures, not measures of power consumption. A soldering iron can put out 75 W at 800 °F; a CPU can put out 200 W and top out at 175 °F.
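To put rough numbers on that: steady-state temperature rise is approximately power times thermal resistance to ambient (ΔT ≈ P·θ), so the same wattage can land at wildly different temperatures depending on θ. A minimal sketch deriving the implied θ from the figures above (the 25 °C ambient is an assumption; the power and temperature numbers come from the comment):

```python
# Rough steady-state model: T_surface ≈ T_ambient + P * theta,
# where theta is thermal resistance to ambient in K/W.
# Ambient of 25 °C is an assumption; the power/temperature figures
# are taken from the comment above.

def f_to_c(f):
    return (f - 32) * 5 / 9

ambient_c = 25.0

# Soldering iron: 75 W, tip around 800 °F -> implied theta
iron_theta = (f_to_c(800) - ambient_c) / 75      # ≈ 5.4 K/W
# CPU with a cooler: 200 W, topping out around 175 °F -> implied theta
cpu_theta = (f_to_c(175) - ambient_c) / 200      # ≈ 0.27 K/W

print(f"soldering iron theta ≈ {iron_theta:.2f} K/W")
print(f"cpu (with cooler) theta ≈ {cpu_theta:.2f} K/W")
# Same physics, ~20x difference in thermal resistance: the temperature
# tells you about theta as much as it tells you about power.
```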
In the modern era, AMD chips are actually known for running hotter, for quite a number of reasons (a much thicker IHS on AM5, stacked V-Cache on the X3D parts, a boost algorithm that deliberately saturates thermals, etc.), even though the Intel chips pull more power.
Akktually, according to the second law of thermodynamics you cannot get hotter with lower energy consumption (given equivalent heatsinking, i.e. the same thermal resistance in K/W) at idle, because idle is a state of thermal equilibrium (and at a rather low sustained power of 3-5 W, well within the heatsink's ability to dissipate), where none of your reasons apply.
> according to the second law of thermodynamics you cannot get hotter with lower energy consumption (given equivalent heatsinking, i.e. the same thermal resistance in K/W)
CPUs are not ideal thermal systems; they have their own internal thermal resistance. A 7800X3D runs hotter than a 7700X at an equivalent (limited) PPT, which in turn runs hotter than a 13900K at the equivalent (limited) PPT, because the thermal resistance is higher. These are objectively measurable things!
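To make that concrete with the same ΔT ≈ P·θ relation: at an identical, PPT-limited power draw, the chip with the higher junction-to-ambient thermal resistance ends up hotter. The θ values below are purely illustrative assumptions, not measured figures for these parts; the point is only the direction of the effect:

```python
# Same cooler, same power limit; only the chip's internal (junction-to-IHS)
# thermal resistance differs. All theta values are illustrative assumptions.

ambient_c = 25.0
ppt_watts = 88.0                       # example power limit, not a spec claim

cooler_theta = 0.15                    # heatsink + fan, K/W (assumed)
internal_theta = {
    "13900K  (limited PPT)": 0.20,     # assumed
    "7700X   (limited PPT)": 0.30,     # assumed: thicker AM5 IHS
    "7800X3D (limited PPT)": 0.40,     # assumed: thicker IHS + stacked cache
}

for chip, theta in internal_theta.items():
    t_die = ambient_c + ppt_watts * (theta + cooler_theta)
    print(f"{chip}: ~{t_die:.0f} °C at {ppt_watts:.0f} W")
# Identical wattage, different die temperatures: the ordering described in
# the comment above falls out of the difference in thermal resistance alone.
```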
Also, surface area is a component of thermal intensity: take the same flux and spread it over more surface area and you get a lower temperature. A Threadripper putting out 250 W does so with less thermal intensity than a 7800X3D putting out 250 W, and it will run at a lower temperature too.
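The same power over a larger hotspot area means a lower heat flux density (W/mm²), which is what "thermal intensity" refers to here. A quick sketch; the die areas and CCD counts are rough assumptions for illustration only:

```python
# Heat flux density = power / area of the silicon actually producing it.
# Die areas and CCD counts are rough assumptions, not official figures.

power_w = 250.0

ccd_mm2 = 70.0                   # assumed area of one compute die
x3d_area = 1 * ccd_mm2           # 7800X3D-style part: one CCD doing all the work
tr_area = 8 * ccd_mm2            # Threadripper-style part: load spread over 8 CCDs

print(f"single-CCD part: {power_w / x3d_area:.2f} W/mm^2")
print(f"Threadripper:    {power_w / tr_area:.2f} W/mm^2")
# 250 W concentrated in ~70 mm^2 is a far denser heat source than
# 250 W spread over ~560 mm^2, so it runs hotter under the same cooler.
```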
Like, yes, you are correctly describing the measurements in which these CPUs differ, but then making the incorrect leap from "in a spherical-cow world they would be equal" to "these CPUs are equivalent in those metrics in real life", which they are not. Different CPUs have different thermal resistances, and AMD's is generally higher right now because of the decision to go with a thicker IHS (to maintain cooler compatibility) and the move towards stacking (more silicon in the way = more thermal resistance).
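The "thicker IHS" and "more silicon in the way" argument is just thermal resistances in series: every layer between the transistors and the cooler adds its own θ, and the total is the sum. A minimal sketch; all per-layer values are made-up illustrative numbers, not measurements:

```python
# Junction-to-ambient thermal resistance as a sum of series layers.
# All per-layer values are illustrative assumptions.

def total_theta(layers):
    return sum(layers.values())

plain_die = {
    "die": 0.10,
    "solder TIM": 0.05,
    "IHS (thin)": 0.05,
    "cooler": 0.15,
}

stacked_thick_ihs = {
    "die": 0.10,
    "stacked cache layer": 0.10,   # extra silicon in the heat path
    "solder TIM": 0.05,
    "IHS (thick)": 0.10,           # thicker lid kept for cooler compatibility
    "cooler": 0.15,
}

print(f"plain die:            {total_theta(plain_die):.2f} K/W")
print(f"stacked + thick IHS:  {total_theta(stacked_thick_ihs):.2f} K/W")
# More material in the heat path -> higher total theta -> hotter at the
# same wattage, even if the chip is the more efficient one.
```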
And again, don't pretend this is some absurd or unknown concept; we literally spent years and years with AMD fans making memes about "Intel toothpaste". Thermals and wattage dissipated are not the same thing. You can have a great, efficient product with terrible thermal resistance; there have been a number of them!
It's just that AMD isn't on top this time, so everyone pretends not to get it... or volunteers a bunch of theoretical reasons it doesn't matter... or...
Just like "thermal watts aren't the same thing as electrical watts!", etc.
This is with a 14 nm chip; one would think newer systems could hopefully do at least this well.