Unless there are some non-linear interactions I'm unaware of, it seems like the mean temp is the only thing you'd need to know wrt how much energy you need to use to keep a home at roughly some other temp through the season
IDK, extremes matter. -5 to +7 has the same average as -20 to +22, but the effects on energy use are wildly different (e.g. the former may call for a little heating for some of the time; the latter will demand both heavy heating and possibly also cooling).
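A toy degree-day-style calculation makes the point. This is a sketch assuming a fixed 20 °C setpoint and heat flow proportional to the indoor/outdoor difference; the profiles and coefficients are illustrative, not real climate data:

```python
# Two outdoor temperature profiles with the same mean, but very different
# heating/cooling demand against a 20 °C setpoint. Heat loss through the
# envelope is roughly proportional to the indoor/outdoor difference, so
# degrees-below-setpoint is a proxy for heating energy, degrees-above for
# cooling energy. All numbers are illustrative.

SETPOINT = 20.0  # indoor target, °C

def demand(outdoor_temps, setpoint=SETPOINT):
    """Return (heating, cooling) degree-units summed over a temp series."""
    heating = sum(max(0.0, setpoint - t) for t in outdoor_temps)
    cooling = sum(max(0.0, t - setpoint) for t in outdoor_temps)
    return heating, cooling

mild = [-5, 7] * 50      # swings between -5 and +7, mean = +1
extreme = [-20, 22] * 50 # swings between -20 and +22, mean = +1

print(demand(mild))      # heating only
print(demand(extreme))   # more heating, plus a cooling load
```

Both profiles average +1 °C, yet the wide-swing one needs more heating overall and adds a cooling load the mild one never sees, because the hours above the setpoint can't be "banked" against the hours below it.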
It's the same as with climate talk in general. "Ooh, the average yearly temperature only rose this much, or even fell relative to 5/10/20 years ago." - sure, but if the range of temperatures keeps growing, you can end up with both a nice average falling year over year and uninhabitable land, because all the useful vegetation freezes or dries up.
> the latter will command both extreme heating and possibly also cooling
I fully disagree with your use of "extreme" here; you have no evidence for that. Homes have insulation, which means the +22 periods would warm them, and some of that leftover heat would reduce the need for heating. Indeed, it seems likely to me that the rate of heat leakage would keep both situations identical, except that you managed to inject "also using AC" in here as if anyone ever does that.
Quite a weak argument.
The overall climate of our planet has nothing to do with the energy cost to heat a home.
Resistance heating is linear. Furnaces drawing slightly colder outside air are just a tiny bit less efficient, but it's really close to linear. Heat pumps are non-linear, and their electricity may itself be generated from natural gas.
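The contrast above can be sketched numerically. Assuming a simplified Carnot-fraction model of heat-pump efficiency (the 0.45 factor and the building heat-loss coefficient are illustrative assumptions, not measurements):

```python
# Why heat-pump electricity use is non-linear in outdoor temperature,
# while resistance heating is linear. Carnot-fraction model; real units
# behave less ideally, and all coefficients here are illustrative.

def heat_demand(outdoor_c, indoor_c=20.0, ua=0.2):
    """Envelope heat loss in kW, proportional to the indoor/outdoor
    difference. A resistance heater's draw equals this directly (linear)."""
    return max(0.0, ua * (indoor_c - outdoor_c))

def cop(outdoor_c, indoor_c=20.0, carnot_fraction=0.45):
    """Approximate heating COP: a fixed fraction of the Carnot limit,
    which shrinks as the temperature differential grows."""
    t_hot_k = indoor_c + 273.15
    delta = indoor_c - outdoor_c
    return carnot_fraction * t_hot_k / delta

def electricity(outdoor_c):
    """Heat-pump draw: demand grows with the differential AND the COP
    shrinks with it, so draw scales roughly with the differential squared."""
    return heat_demand(outdoor_c) / cop(outdoor_c)

for t in (10, 0, -10, -20):
    print(f"{t:+4d} C outside: resistance {heat_demand(t):.1f} kW, "
          f"heat pump {electricity(t):.2f} kW (COP {cop(t):.1f})")
```

Under this model, doubling the temperature differential doubles a resistance heater's draw but roughly quadruples a heat pump's, which is the non-linearity in question.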
So if I understand you, the only non-linear effect in play is that heat pumps are LESS efficient when the temperature differential is higher, is that correct?