Please correct me if I'm wrong; let's consider this example scenario:
Based on the graph above, if the CPU consumes 50 W at 3.3 GHz, it would consume ~25 W at 2.5 GHz.
The reason is that the voltage at 2.5 GHz is 0.8x that at 3.3 GHz (800 vs 1000 mV), and since dynamic power scales roughly with the cube of the voltage (P ≈ C·V²·f, with f tracking V), the cube of 0.8 is roughly 0.5.
Now, if the voltage/frequency curve had a steeper slope, let's say double the slope, we would already hit 800 mV at 2.9 GHz instead of 2.5 GHz. In that case we would have halved the power consumption (saving ~25 W) just by shaving off a fraction of a GHz, and at 2.5 GHz we would only be using ~10 W.
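To make the arithmetic explicit, here's a rough back-of-the-envelope sketch in Python. It just applies the cube-of-the-voltage-ratio approximation from above; all the wattages and V/f points are the made-up example numbers, not measured data:

```python
# Back-of-the-envelope sketch of the argument above, using the common
# approximation P ~ V^3 (dynamic power ~ C*V^2*f with f scaling roughly
# linearly with V). All numbers are the made-up example values, not real data.

def scaled_power(base_power_w, base_mv, new_mv):
    """Scale power by the cube of the voltage ratio."""
    return base_power_w * (new_mv / base_mv) ** 3

base_w = 50.0  # 50 W at 1000 mV / 3.3 GHz

# Original V/f curve: 800 mV at 2.5 GHz
print(scaled_power(base_w, 1000, 800))   # ~25.6 W -> "~25 W at 2.5 GHz"

# Hypothetical curve with double the slope (500 mV/GHz instead of 250):
# 800 mV is already reached at 2.9 GHz, and 2.5 GHz only needs ~600 mV
print(scaled_power(base_w, 1000, 800))   # ~25.6 W already at 2.9 GHz
print(scaled_power(base_w, 1000, 600))   # ~10.8 W -> "~10 W at 2.5 GHz"
```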
I agree they're back in the game. However, you can never guess how it will turn out in mobile. For example, to my understanding, Intel CPUs have been able to scale down to milliwatts when not under load. I don't know how Ryzen will play out. Sure, it may scale well in the 2-3 GHz range, but can it get down to 0.6 GHz while consuming a tiny amount of power? We don't know yet.
However, the fact is that AMD graphics cards consistently consume around ten times as much power at idle as competing cards from NVIDIA.
Sorry for the misunderstanding; I meant that if you run at a lower frequency, you get lower per-core performance as a result (with constant IPC).
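To spell that out: per-core throughput is roughly IPC × clock, so at constant IPC it falls linearly with frequency. A trivial sketch with a made-up IPC value:

```python
# Illustrative only: per-core throughput ~ IPC * clock, with a made-up IPC.
ipc = 1.5  # hypothetical instructions per cycle, held constant
for ghz in (3.3, 2.5):
    print(f"{ghz} GHz -> {ipc * ghz:.2f} G instructions per second")
```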
Regarding the GPU performance, I don't want to go into too much detail, but they tested it right after AMD released a rather large driver update, and before NVIDIA released theirs (378.78). It will always go back and forth that way, one driver update after another, and in my opinion a single-digit percentage difference isn't something that's noticeable anyway.
However, the fact is this: AOTS was hyped up like mad, and it made AMD cards sound much better than they really are, with the RX 480 or R9 x90 often matching the 980 Ti. They even showed a graph where a dual RX 480 setup would beat the GTX 1080 in AOTS right in the launch presentation. In reality, in games that people actually play, be it DX11 or DX12, that's simply not the case. I wouldn't expect anything less for Ryzen.