Prescaling wastes bandwidth
Any prescaling (whether by the video source itself or by an adapter between the video source and the display) wastes bandwidth because the display receives the video signal at a resolution higher than the logical resolution. In the case of prescaling via GPU, the display receives the video signal at the display’s native resolution. As a result, we may end up with a lower refresh rate or color depth compared with scaling done by the display itself.
For example, with HDMI 1.x, prescaling via GPU to 4K limits us to 30 Hz regardless of the logical resolution, whereas with the display’s own scaling we could enjoy Full HD at 120 Hz under the same conditions. Via HDMI 2.0, we get Full HD at 60 Hz instead of the 120 Hz possible at true Full HD scaled by the display itself.
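The bandwidth arithmetic behind these limits can be sketched roughly. The snippet below is a back-of-the-envelope estimate, not exact video timing: it assumes 24-bit color, an illustrative ~20% blanking overhead (real CVT-RB timings differ), and approximate effective data-rate limits for HDMI 1.4 and 2.0 after 8b/10b encoding.

```python
def data_rate_gbps(width, height, hz, bpp=24, blanking=1.20):
    """Rough uncompressed data rate in Gbit/s for a given video mode.

    `blanking` is an assumed overhead factor for blanking intervals.
    """
    return width * height * hz * bpp * blanking / 1e9

# Approximate effective data-rate limits (Gbit/s), assumptions for illustration:
HDMI_1_4 = 8.16   # 10.2 Gbit/s TMDS minus 8b/10b encoding overhead
HDMI_2_0 = 14.4   # 18.0 Gbit/s TMDS minus 8b/10b encoding overhead

uhd_60  = data_rate_gbps(3840, 2160, 60)   # prescaled 4K signal
uhd_30  = data_rate_gbps(3840, 2160, 30)   # prescaled 4K at 30 Hz
fhd_120 = data_rate_gbps(1920, 1080, 120)  # Full HD left to the display to scale

print(f"4K@60:    {uhd_60:.1f} Gbit/s")   # exceeds HDMI 1.4, fits HDMI 2.0
print(f"4K@30:    {uhd_30:.1f} Gbit/s")   # fits HDMI 1.4
print(f"FHD@120:  {fhd_120:.1f} Gbit/s")  # fits HDMI 1.4
```

Under these assumptions, a prescaled 4K signal needs roughly twice the bandwidth of Full HD at the same refresh rate, which is why HDMI 1.x can carry 4K only at 30 Hz while Full HD at 120 Hz fits comfortably.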
Still useful with hardware not capable of integer scaling or 4K
That said, such a scaling adapter would be great for filling the huge gap we currently have between video sources and displays: it would help video sources that don’t support pixel-perfect (integer) scaling on their own or are unable to output 4K — e.g. game consoles such as Nintendo Switch, MiSTer, Super Nt, Mega Sg, SNES Mini, and hardware video players.
For example, my Dell P2415Q 4K monitor is limited to 60 Hz anyway, so it would not matter that the prescaled signal is limited to the same 60 Hz.
Universal support for any input video mode is important
An important feature would be for such a device to universally support any input video mode without being limited to a fixed hard-coded mode list.