If electricity were $0.65/kWh for me, I'd move out of the country lol. Assuming a more realistic $0.40/kWh (considered very expensive in the US, where most enthusiasts live), 8 hours of gaming a day, and a 200W power limit, you're paying about $19/mo. Not bad.
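That figure checks out; a quick sketch of the arithmetic (the rate, hours, and wattage are the assumptions stated above, and a flat 30-day month is assumed):

```python
# Rough monthly electricity cost for a gaming PC.
RATE_USD_PER_KWH = 0.40  # assumed "very expensive" US rate
HOURS_PER_DAY = 8        # assumed gaming time
POWER_KW = 0.200         # 200 W power limit
DAYS_PER_MONTH = 30      # simplifying assumption

kwh_per_month = POWER_KW * HOURS_PER_DAY * DAYS_PER_MONTH  # 48 kWh
cost = kwh_per_month * RATE_USD_PER_KWH
print(f"${cost:.2f}/mo")  # → $19.20/mo
```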
Don't get me wrong, the 5800X3D is a phenomenal CPU. However, like many owners of that CPU, I'm also in the market for a 4090 and intend to use the full 600W power limit with my water loop and outdoor radiator. CPU power consumption is just not an issue for enthusiasts.
Nice, ha ha. I was going to say, look into undervolting. I saw "95% of the performance at 60% of the power" come up, i.e. ~270W; that would actually be superb (very cool, quiet, and still extremely performant).
So maybe configure that as an optional profile for titles not needing maximum juice?
I'm quite keen myself.
The power price I mentioned was peak-time pricing in Norway; I'm in Australia, where it's nowhere near that bad yet ($0.25 AUD/kWh for me).
> I saw 95% perf at 60% of the power come up, ie 270w
That's neat, but 5% is a lot when you're spending $1600. Also, all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals: specifically, by hotspots you can't see because they sit between the on-die temperature sensors.
> maybe configure that as an optional profile for titles not needing maximum juice?
4K G-Sync monitors are typically limited to 120Hz. If my GPU can run a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit, so tuning this is pointless. I'd still get 1/10th the rendering latency of a GPU that can only manage 120fps. The same logic applies to power-limiting your GPU, which increases latency and kills frame pacing in exchange for a tiny reduction in your electricity bill.
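The duty-cycle and latency numbers above can be sketched with first-order arithmetic (this ignores idle draw and assumes average power scales linearly with busy time, which is a simplification):

```python
# Duty-cycle and latency sketch for a frame-capped GPU (numbers from the comment).
DISPLAY_HZ = 120        # monitor refresh cap
GPU_FPS_CAPABLE = 1200  # hypothetical uncapped throughput of the fast GPU
FULL_POWER_W = 600      # power limit while busy

duty = DISPLAY_HZ / GPU_FPS_CAPABLE        # busy 10% of the time
avg_power_w = duty * FULL_POWER_W          # ~60 W average (idle draw ignored)
fast_latency_ms = 1000 / GPU_FPS_CAPABLE   # time to render one frame
slow_latency_ms = 1000 / DISPLAY_HZ        # a GPU that can only just hit 120 fps
print(f"{avg_power_w:.0f} W avg, {fast_latency_ms:.2f} ms vs {slow_latency_ms:.2f} ms")
```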
> Oh and all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals. Specifically, you're limited by hotspots you can't see because they're between the on-die temperature sensors.
This completely misunderstands heat transfer. If the hotspot temperature is 75°C even with an overclock, you're not limited by thermals: https://www.youtube.com/watch?v=zc-zwQMV8-s
>4K gsync monitors are typically limited to 120Hz. If my GPU is capable of running a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit. So tuning this is pointless. I'd still get 1/10th the rendering latency compared to a GPU that's only capable of 120fps. Same logic applies to power limiting your GPU which will increase latency and kill frame pacing in exchange for a tiny reduction in electricity bills.
This completely misunderstands power consumption and the nonlinear relationship between power and clock speed. Performing the same work over the same wall-clock time in short bursts of high clocks and high voltage uses more power than running at constant low clocks and voltage.
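The nonlinearity can be illustrated with the usual first-order CMOS model: dynamic power is roughly C·V²·f, and since voltage scales roughly with frequency within the DVFS range, P ∝ f³ and energy for a fixed amount of work ∝ f². The cubic exponent is a common rule of thumb, not an exact device model:

```python
# Why racing through fixed work at high clocks costs more energy than
# steady low clocks: P ∝ f^3 (rule of thumb), time ∝ 1/f, so E = P*t ∝ f^2.

def energy_for_fixed_work(freq_ghz, work_cycles=1e9, k=1.0):
    """Energy to retire work_cycles at a given clock (arbitrary units)."""
    power = k * freq_ghz ** 3          # dynamic power, first-order model
    seconds = work_cycles / (freq_ghz * 1e9)
    return power * seconds

e_low = energy_for_fixed_work(1.0)   # steady low clock
e_high = energy_for_fixed_work(2.0)  # burst at double the clock
print(e_high / e_low)  # → 4.0: same work, ~4x the energy
```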