Your Dream Monitor For 2023

I’ve mostly been seeing the MSI boot screen on the EVT ES07D02 QHD test monitor, but if I power it off and boot, I’m seeing it on my Spectrum ES07D03 4K model about 50–65% of the time as well.

This is a big improvement over never seeing it on the 4K at all, no matter what I did. But it seems that if both are powered on, it defaults to the QHD (over DP) instead of the 4K (over HDMI).

1 Like

Neither updating the motherboard BIOS nor updating the GPU BIOS is an option for me.

The latest stable BIOS for my motherboard (Gigabyte GA-Z68X-UD4-B3, 2011) is non-UEFI, and the only UEFI BIOS available for it has remained in beta status forever.

My GPU (GTX 650 Ti Boost, 2013) is not among those that the nVidia GPU-BIOS update applies to. nVidia has traditionally only provided a fixed BIOS for a couple of its latest GPU generations (RTX 2000 and 3000, IIRC). The good thing is that they provided a hint: the DisplayID standard, as opposed to the regular/older EDID standard, may be the reason for the invisible-boot issue. I forwarded this info to Grant (@Lore_Wonder) a while ago; I’m not sure whether the firmware team took it into account when searching for a fix, though.

And again, the Dell P2415Q 4K monitor works fine under the same conditions (except that it lacks integer scaling as a feature, of course). Both issues (invisible boot; 30 Hz maximum at 4K) are specific to the Eve Spectrum. A monitor is a universal device that should not have such compatibility issues with older hardware.

2 Likes

What about booting from the motherboard and not the GPU? Do you get any boot visuals from the CPU? The CPU has its own built-in integrated graphics.

This bugs me as well. It’s such a simple algorithm to calculate the integer scale, and it only needs the most rudimentary mathematical operations.

// Assumption: the signal resolution can never exceed the
// native panel resolution. `resolution` is the input signal’s
// mode and `panel` is the native panel mode.
int scaleFactor = 1;
int scaledWidth = resolution.width;
int scaledHeight = resolution.height;
while (scaledWidth + resolution.width <= panel.width
    && scaledHeight + resolution.height <= panel.height)
{
    scaleFactor += 1;
    scaledWidth += resolution.width;
    scaledHeight += resolution.height;
}
// Bit-shifting right by 1 is the same as dividing by 2;
// this centers the scaled image on the panel.
int xOffset = (panel.width - scaledWidth) >> 1;
int yOffset = (panel.height - scaledHeight) >> 1;

The only reason I can think of for not doing this is that the integer scaling relies on some sort of lookup table rather than calculating pixel offsets. And if so, why couldn’t the lookup table be built during the connection handshake once the resolution is known?
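For illustration, here’s a minimal sketch (plain C, with hypothetical names like rowLUT/colLUT and the MAX_PANEL_* bounds) of how such per-axis tables could be filled once at handshake time, reusing the values computed above:

// Hypothetical sketch: build per-axis lookup tables once the input
// resolution is known. rowLUT[y] / colLUT[x] give the source pixel that
// panel pixel (x, y) samples from, or -1 for the letterbox border.
int rowLUT[MAX_PANEL_HEIGHT];
int colLUT[MAX_PANEL_WIDTH];

for (int y = 0; y < panel.height; y++)
    rowLUT[y] = (y >= yOffset && y < yOffset + scaledHeight)
        ? (y - yOffset) / scaleFactor
        : -1;
for (int x = 0; x < panel.width; x++)
    colLUT[x] = (x >= xOffset && x < xOffset + scaledWidth)
        ? (x - xOffset) / scaleFactor
        : -1;

After that, the per-pixel hot path is just two table reads per output pixel, with no arithmetic at all.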

1 Like

My motherboard does not have a video output, even though the CPU (i7-3770T) has an integrated GPU. Also, this would not allow using the discrete GPU for 3D rendering, so performance would be much worse.

The algorithm of integer scaling with square pixels (without aspect-ratio correction) is indeed extremely simple; it’s described in my article:

Divide the screen width and height by the image width and height respectively. Discard the fractional part of the lesser of the two results. This is the needed integer ratio.

With integer math, the fractional part is even discarded automatically.
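In code (a minimal sketch in the same style as the snippet above, reusing its panel/resolution names), that’s all there is to it:

// Integer division truncates, discarding the fractional part for free.
int ratioX = panel.width / resolution.width;
int ratioY = panel.height / resolution.height;
int scaleFactor = (ratioX < ratioY) ? ratioX : ratioY;  // the lesser result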

1 Like

Using integer division works too. The only reason I avoided it is that div is much more expensive (in terms of clock cycles) than simple addition/subtraction. Depending on how efficient the processor is, a small number of loop iterations can actually outperform a div; for example, scaling 1080p onto a 4K panel, the loop body above runs only once.

I suspect that under the hood, division is actually done at a low level in a way similar to what you did explicitly, but who knows what specifics scaler hardware has.

FWIW, Grant (@Lore_Wonder) said in Discord three months ago that the firmware team confirmed that dynamic integer-scale calculation is “highly doable”, so whether/when this will be implemented is probably just a matter of prioritization.

3 Likes

My ideal monitor would be:

  • 5K (5120 x 2880)
  • 27"
  • OLED / mini-LED for some kind of HDR
  • at least 120Hz refresh rate
  • USB-C input + KVM

5K at 27" is the perfect pixel density. Text is super clear and sharp - I’m a software dev, and the main use for my monitor is work, so I want text as clear as possible. 5K also allows for integer scaling down to 2560 x 1440 for gaming, if needed.

OLED or mini-LED is good, so long as the mini-LED has a decent number of dimming zones. The more the better.

5K 120Hz is doable over DisplayPort 2.0. More would be nice, I guess, but I use 120Hz right now, and I don’t really notice a difference between 120Hz and 144Hz on my monitor. Dropping from 120Hz to 60Hz is difficult, and noticeable. That’s basically the main reason I’ve not bought a Studio Display :smiley:
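Back-of-envelope, assuming 10-bit color: 5120 × 2880 × 120 Hz × 30 bpp ≈ 53 Gbit/s before blanking overhead, which fits within DP 2.0 UHBR20’s ~77 Gbit/s payload but not within DP 1.4’s ~25.9 Gbit/s without DSC.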

Basically, I just want something that’ll actually match my MacBook Pro’s screen in every respect and exceed it in others. I want to be able to use the same display for work and play, without major compromises on either resolution or refresh rate.

2 Likes

This, a million times this. Having a monitor that can display sub-4K resolutions (240p, 480p, 720p, 1080p and 1440p) without blur is way more important to me than being able to actually drive an 8K monitor at its native resolution.

I’d even be willing to accept a 5K monitor as a compromise if there are no 8K panels, because the majority of content I consume is 240/288p, 480p/576p, 720p and 1440p.
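For what it’s worth, all of those divide evenly into a 5K panel’s 2880 lines: 2880 / 240 = 12, 2880 / 288 = 10, 2880 / 480 = 6, 2880 / 576 = 5, 2880 / 720 = 4 and 2880 / 1440 = 2. Notably, 1080p (2880 / 1080 ≈ 2.67) is the one common resolution that doesn’t.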

Also, I don’t care about burn-in, so give me more control over how hard I can drive those OLED panels. Slap a 20 cm fan in there if you have to, but let me ride the thing into the ground.

3 Likes

Agreed with everything except fan cooling. A non-gaming or universal (multi-purpose: games, movies, work/productivity) monitor must be fanless and noiseless, period. In particular, fan cooling is a real deal-breaker for me (besides the low pixel density) in the recent Alienware QD-OLED monitor.

1 Like

The bigger the fan, the quieter the fan, because you can move more air at lower RPM.

I agree with you. Obviously micro-LED would be preferred if it were available and not obscenely expensive.

1 Like

Oh yeah, for sure. Micro-LED would be the ultimate option, but having seen what a high-density mini-LED is like, I’m more than happy with that as an alternative to OLED until micro-LED works.

1 Like

Wait until one of those mini-LEDs dies on you. It’s awful. Far more irritating than having a dead pixel or even burn-in, IMO.

Just sharing my thoughts here…

When it comes to which panel technology to pick, I think it all boils down to the actual quality of the panel itself. OLEDs are getting better and better with time. I know many people religiously avoid OLED panels due to the burn-in issue. But I also see a lot of OLED devices entering their third year or so (including my phone and my OLED TV) that don’t show any signs of burn-in at all.

Of course, mileage will vary, since monitors tend to show static images such as UI elements.

That being said, I have never owned any of the newer techs (mini-LED, micro-LED), so I wouldn’t know how they perform in the long run. From the reviews, they seem like a good step forward as an alternative to OLED when it comes to color reproduction and contrast, while avoiding the risk of burn-in.

I’ve actually never seen a failing mini-LED. I assume it looks like a dead pixel?

3 Likes

If a mini-LED fails, it’s an entire zone out of action. It would look far worse than a dead pixel.

3 Likes

That’s a point that has to be taken into consideration: a monitor is a piece of hardware that is generally kept for many years, and the longevity and reliability of a technology are probably as crucial as the image quality and other features we are usually after.

1 Like

I still prefer to go for OLED or QD-OLED, because micro-LED isn’t even ready; mini-LED will obviously be its first foray, and it won’t surpass OLED unless it offers a better price, which it doesn’t. OLED has a technologically better pixel composition, there is more supply of OLED screens on the market, and on top of that their prices are dropping. I’m sorry, but people who religiously steer clear of OLED screens should keep up with the news, or at least try one of these screens. It’s mostly fear of burn-in and who knows what else, but OLEDs have obviously improved a lot. They have even improved in color space and brightness, and they will hold up better over time.

1 Like

Samsung is releasing the first 4K 240Hz gaming monitor. (Announced on Samsung’s Facebook page: “Pssssst. 🤫 The world’s first 4K, 240Hz gaming monitor is coming soon. Are you ready? 🎮🎮🎮”)

1 Like