An affordable, revolutionary gaming monitor


#1

Revolutionary is not a term to use lightly.
Here’s why I used it.
With a high enough PPI you can do away with the term “native resolution”, because any resolution can be stretched to appear as good as a native resolution would.
At a high enough PPI the naked eye would not be able to tell the difference.
And apparently the maximum resolution of a perfect human eye is ~477 PPI.

Wouldn’t that be interesting? Maybe by the year 2020 we’ll have monitors that suit people with 20/20 vision.
Looking at the new Razer phone, smartphone screens are already there.
Its specs are:
Screen size: 5.7 inches
Resolution: 1440 x 2560 px
Pixel density: 513 PPI
Refresh rate: 120 Hz

When gaming, because of pixel sizes, if you do not run at native resolution you do not get ideal results.
This is why no one has made 2.5k or 3k monitors.
Why did we go straight from 1080p or 1440p to 4k?
Gaming is all about getting the best resolution money can buy. 1440p is nice, but if I’m running a game at 150 Hz, maybe I could run my game at 3k resolution and still get about 90 fps, which is still great. I would even consider lowering graphics for a bigger resolution, going from ultra to just high.

This is why games give us all these ways to tweak graphics settings.
Unfortunately, today mid-range/budget gamers cannot select a resolution above 1080p or 1440p.

The solution is in surpassing the human eye’s threshold for detecting the distortions caused when a resolution divides the screen’s pixels unequally.
I’d be really happy if we could get a ~400 PPI monitor running at 80 to 100 Hz.
What I’m asking for would be closer to around 19 to 22 inches.
(Remember, we’re pushing the envelope here; let’s try and see if it’s feasible.) Eventually a 24 or 27 inch monitor could be made.

For Eve this would involve getting in touch with their screen suppliers to see if they can produce a working prototype, and then going from there.


#2

Something to note: how dense pixels need to be before becoming indistinguishable changes with viewing distance. (See https://designcompaniesranked.com/resources/is-this-retina/) The average viewing distance and screen size will need to be taken into account to determine what PPI the monitor should aim for.


#3

Cool, thanks. Well, that makes this even more feasible if 300 PPI is retina. At the moment gaming monitors hover around 100 to 200 PPI.
For example, the LG 32GK850G-B has 218 PPI; it is 31.5" at a 144 Hz refresh rate.

If we can cram 300 PPI into about 20" or so and still get 100 Hz, that would be ideal for budget gamers.

To calculate the resolution for this I used this site: https://www.sven.de/dpi/
Let’s say 6k = 5760x3240.
At 6k with a 21.5 inch display, that equals 307.38 PPI.
Then it would be a question of being able to get a high frame rate.
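
(If you’d rather sanity-check the math without the website: PPI is just the pixel diagonal divided by the physical diagonal. A quick back-of-the-envelope sketch in Python, using only figures already quoted in this thread:)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = pixel diagonal / physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Figures quoted in this thread:
print(round(ppi(5760, 3240, 21.5), 2))   # ~307.38 ("6k" on a 21.5" panel)
print(round(ppi(2560, 1440, 5.7)))       # ~515, close to the Razer Phone's quoted 513 PPI
```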

Zisworks has a pretty innovative approach to get high hz at high resolutions: http://zisworks.com

Keep in mind you’ll be playing at lower resolutions and the image will be stretched, yet still look as though it were native resolution. This way you won’t need a 4k-ready GPU, and you’ll be able to game at whatever resolution you want: 2k, 2.5k, 3k, whatever works best for your needs. Maybe if you want to stream at the same time or have things running in the background, you could lower the resolution.


#4

You seem to have a lot of misinformation.

The maximum resolution of the human eye is about 876 PPI when viewing an object 4 inches away from the eye. The further you are from something, the lower the resolution required for it to meet your “retina” requirements.

For a monitor, 200 PPI is about the sweet spot. For a gamer, since they tend to sit closer to their monitors, 220 PPI would be ideal. This becomes “retina” at a mere 16" distance.

The LG monitor you mentioned is 93.24 PPI – not nearly 218 PPI – and would become “retina” at a 37" viewing distance.

At 20", a standard 4k resolution would meet your 220ppi needs. However, such a monitor would require an extremely powerful gaming setup (dual Titans, anyone?) to reliably reach 144fps at max settings. That makes the “budget” aspect of such a monitor, basically useless.

A much more reasonable monitor would be a 24" (who wants 20" these days) IGZO at 2560x1440, 100 Hz. This would yield 122 PPI, which goes “retina” at about a 28" viewing distance – which is about the average viewing distance. This could be done with current display technology and could potentially be very inexpensive. Every monitor I could find near these specs makes a sacrifice somewhere – most are 27" TN panels, which are far less color accurate and have lower resolution.
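
For anyone wondering where those distances come from: they follow from the common rule of thumb that “retina” means roughly one arcminute of visual acuity (that threshold is my assumption here, not an official definition). A small sketch reproducing the numbers above:

```python
import math

# Assumption: "retina" means one pixel subtends <= 1 arcminute (roughly 20/20 acuity).
ONE_ARCMIN_RAD = math.radians(1 / 60)

def retina_distance_in(ppi: float) -> float:
    """Viewing distance (inches) beyond which individual pixels blur together."""
    pixel_pitch_in = 1 / ppi
    return pixel_pitch_in / ONE_ARCMIN_RAD   # approximately 3438 / ppi

print(round(retina_distance_in(220), 1))     # ~15.6" -> the "mere 16 inch" figure
print(round(retina_distance_in(93.24), 1))   # ~36.9" -> the LG's ~37"
print(round(retina_distance_in(122), 1))     # ~28.2" -> the 24" 1440p panel's ~28"
```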


#5

If this were the case, why don’t manufacturers pitch the idea that native resolution is essentially a thing of the past? Why did we stop at 1440p and jump to 4k?

According to the top comment’s explanation: “Going from 1080 to 1620 would be an increase of 3/2, and computers aren’t big fans of odd numbers. Especially for displaying images pixel by pixel, you would get weird stretching and blending at 1620p.”

So is it noticeable or not? Theoretically, the higher the PPI, the more any distortion would cease to be visible to the naked eye.

I also don’t get why you would need a great GPU when you could run a game at 1080p and just stretch the image onto more pixels. It’s not like the game itself is running at your max resolution.

As for the LG monitor, I went by their website: http://www.lg.com/us/monitors/lg-32GK850G-B-gaming-monitor
I did Ctrl+F for “ppi” and it showed 218.
My bad, that gave me the spec for their 27MD5KA model, which only runs at 60 Hz and is super expensive.


#6

Well, I enjoy my iMac.


#7

The problem currently is the interface protocol, or rather the bandwidth that a single cable can transport. This is the reason why 3440x1440 @ 200 Hz is a thing, but 4K @ 120 Hz isn’t a thing yet. The 5K iMac can’t even output 5K in Boot Camp, since it works through some proprietary protocol that Windows does not support.

There are 2 workarounds for this:

  • Use multiple cables. This was popular in early 4K panels, where the display would expose itself as two smaller, separate displays, each with its own cable. The average connection today, that is DisplayPort 1.2 and HDMI 2.0, only supports output at 4K@60. That means if you’re looking at 4K@120, you’d need 2 cables. If you need 8K@60, you’d need a whopping 8 cables (two for each 4K screen portion). Sure, it’s technically possible, but the downsides are that a) it requires some configuration, b) you need a graphics card that supports that many cables, and c) depending on the combination, flickering may occur when the computer does not send the display signals at exactly the same time, even though the display unit has to render both images at the same time.
  • Use the latest standard, that is DisplayPort 1.4 or HDMI 2.1. Even then, they only support 8K at 60 Hz, so you’d still need 2 cables to be able to output at higher framerates. At the moment, no gaming GPU supports either of these interfaces. That would limit the reach of your product, since gamers would need to purchase a very expensive workstation-grade GPU that’s sometimes outperformed by much cheaper gaming cards. (Rough bandwidth numbers are sketched below.)
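
To give a rough feel for where those single-cable limits come from, here is a back-of-the-envelope sketch. It only counts raw pixel data (pixels × refresh × bits per pixel) and ignores blanking and protocol overhead, and the link capacities are the commonly quoted effective data rates, so treat the numbers as ballpark only:

```python
def data_rate_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Very rough uncompressed video data rate in Gbit/s (no blanking/overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

# Approximate effective single-cable data rates in Gbit/s.
LINKS = {"DisplayPort 1.2": 17.28, "HDMI 2.0": 14.4, "DisplayPort 1.4": 25.92}

for name, mode in [("4K@60", (3840, 2160, 60)),
                   ("4K@120", (3840, 2160, 120)),
                   ("8K@60", (7680, 4320, 60))]:
    need = data_rate_gbps(*mode)
    fits = ", ".join(link for link, cap in LINKS.items() if cap >= need) or "needs more than one cable"
    print(f"{name}: ~{need:.1f} Gbit/s -> {fits}")
```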

Then we talk about the gaming performance itself. 400 PPI at 22" would be somewhere between 6K and 8K; that would put up to 4 times as much burden on the GPU as 4K resolution, something that today’s GPUs are still struggling with. Add the requirement to push 2x the frame rate, and you’re looking at 6 to 8x the performance requirements. That means your GTX 1080 Ti at 8K@120 would have the same level of detail as a GT 1030 at 1080p@120. I honestly don’t think the resolution increase is worth it with current-generation GPUs, or in the near future.
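
As a rough sanity check on those multipliers (my arithmetic, counting pixel throughput only, with 4K@60 as the baseline):

```python
def pixel_rate(width: int, height: int, fps: int) -> int:
    """Pixels the GPU has to render per second (a crude proxy for load)."""
    return width * height * fps

baseline = pixel_rate(3840, 2160, 60)                    # 4K@60
print(round(pixel_rate(5760, 3240, 120) / baseline, 1))  # "6K"@120 -> ~4.5x
print(round(pixel_rate(7680, 4320, 120) / baseline, 1))  # 8K@120  -> ~8.0x
```

So roughly 4.5x for “6K”@120 and 8x for 8K@120, which is in the same ballpark as the 6 to 8x above.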


#8

Curious why we would need to transport more bandwidth? The GPU could just render 1080p or whatever you set. Then the monitor itself could have a small chip or something that stretches this over the additional pixels. It would just multiply any one pixel onto however many extra pixels are needed to fill the screen. No need for additional cables.


#9

On another note, I think it would be cool if we could get this setup:
https://imgur.com/nRkYdW3
where you take, say, 3 monitors and combine them while trying to make it seamless like the picture above.
Since high PPI on a budget would need to be on a smaller screen, combining 3 screens would help.
I wouldn’t mind a thin seam; it would be a worthwhile trade-off IMO.


#10

Then what’s the point of having the extra pixels in the first place?

Let’s say you have a 2-pixel screen, but both pixels have to display the same color (1 pixel stretched); then why not just use a 1-pixel screen?


#11

You’re not understanding what this guy is saying. Raster data (read this as: pixels) scales best in multiples of 2: 1080p to 2160p, 720p to 1440p, etc. Any time you scale by a different factor, you’re going to get distortion because the scaling requires dithering. Dithering is basically where you make an educated guess of what an extra pixel should look like, based on the pixels near the new one. There are many algorithms to do this, some better than others, but you’re still introducing distortion.

Nobody has ever said that “native resolution is a thing of the past.” Consider a 16x16 grid and pretend it’s a 4k monitor. Now put a non-scaled 8x8 grid, or 2k image, on it. The image takes up an 8x8 section, but each cell (pixel) is the same size as on the 16x16 grid. But that isn’t what you want; you want the 2k image to take up the ENTIRE 16x16 grid. This requires scaling, so each cell from the 8x8 grid (2k image) needs to be duplicated. Suddenly, where one cell was used in the 8x8, 4 cells are used on the 16x16. This means that despite the individual screen pixels being small, the VISIBLE pixel – the data pixel from the 2k image – is FOUR TIMES BIGGER.
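
To make the duplication concrete, here’s a toy sketch (numpy arrays standing in for frames): an exact 2x scale is pure pixel duplication, while a non-integer scale forces the scaler to repeat some pixels more often than others or blend neighbours, which is exactly where the distortion comes from.

```python
import numpy as np

img = np.arange(9).reshape(3, 3)                     # toy 3x3 "image"

# Exact 2x scale: every pixel is duplicated into a 2x2 block -- no guessing needed.
doubled = img.repeat(2, axis=0).repeat(2, axis=1)    # 6x6

# Non-integer scale (3 -> 5): some source pixels get repeated more often than
# others, so a scaler has to pick or blend neighbours.
rows = np.arange(5) * 3 // 5                         # nearest source row per output row
cols = np.arange(5) * 3 // 5
stretched = img[rows][:, cols]                       # 5x5 with uneven duplication

print(doubled)
print(stretched)
```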

So basically if you take a 4k monitor and scale a 2k image up to fit on it, the eye is still going to distinguish the pixels as if it were a 2k monitor. Give or take, dithering helps some but not nearly enough to warrant the cost of a 4k panel.


#12

the eye is still going to distinguish the pixels as if it were a 2k monitor.

Absolutely. I did not want to achieve anything different; did I express myself poorly somewhere?
What’s particular here is that it would appear as nice as on a native 2k monitor,
because the unequal divisions will be so small that you would not be able to tell the two monitors apart,
even though one monitor is native (no unequal division) and the other isn’t (it is past the naked-eye detection threshold)… and that is why it would be a thing of the past to use the term native resolution, since at the end of the day, if it’s past our ability to see it, then it is no longer a concern.

If the PPI is not retina and you try to produce a non-native resolution, then yes, you will notice the distortions caused by dithering.

Also, I did some math, and a 13 inch screen fits about 3 times into a 21.5 inch screen.
The new Dell XPS 13 with InfinityEdge display has 276 PPI, which is close, so tech-wise, in the very near future what I’m asking for is or will be commercially available, if it can run at 80+ Hz.

Consider bezel-less phones: you can get the seam very tiny, 1 millimeter or so maybe.


#13

So if you know that the image quality won’t be any different, why bother with the higher-density, more-expensive panel?


#14

Too late. Someone already did that… a DIY 4k gaming monitor. Linus Tech Tips already made a video on it.


#15

TVs do this. It introduces another working part into the process, which introduces lag. Gamers don’t want lag. That’s why there is a “gaming” mode on TVs nowadays.

As to the OP… I don’t really get it. Most games that have nice graphics won’t run at 4k without an absolute monster of a setup. So the “budget” part doesn’t really work out.

You’re all forgetting that “retina level” is an estimation and differs per person. We don’t see in pixels or fps. Cameras don’t do the same thing we do, and thus it’s hard to give accurate comparisons. For someone with better than 20/20 vision, it’s possible that their “retina screens” aren’t dense enough.

Some other things:
A high refresh rate is only really viable with high FPS.
Who are you targeting? Pro gamers use the lowest graphics settings; they don’t need fancy stuff.
Casual gamers? They most likely prefer better color accuracy over a higher refresh rate. I know refresh rate is a marketing thing nowadays, but wasn’t Eve supposed to be above that? And yes, I agree 120 Hz is nicer to look at than 60 Hz. But I would much rather have way better colors than 120 Hz.


#16

So if you know that the image quality won’t be any different, why bother with the higher-density, more-expensive panel?

The reason is that you’d have all the resolutions between 1440p and 4k effectively as native.
I explained it already: if your GPU is doing well rendering 1440p and could handle a higher resolution, currently the only step up is 4k. I’d like to be able to tap into all the other resolutions in between.

Also, I disagree that anything above 60 Hz is just marketing, gaming-wise. I also don’t think there has been a lot of testing done on this, so there is no real way to say. That said, I would question whether it’s worthwhile to go over 100 Hz, but I think between 60 Hz and 100 Hz you’d notice a difference.
It’s not just for FPS games. I play Battlerite, for example; there are tons of skill shots/aiming in that game.

TVs do this. It introduces another working part into the process, which introduces lag. Gamers don’t want lag. That’s why there is a “gaming” mode on TVs nowadays.

Why is it that I can stretch on a PC monitor and there is no game mode listed?

Perhaps this could be something that could be improved on; maybe the lag gets smaller the less the image has to be stretched. If we devise a way to bring this lag down, it might be worth researching/developing.
Maybe it could do a sort of hybrid. This would require a better GPU, but still not as much as trying to power the whole image at 4k.
The hybrid would have half the game processed at a higher resolution, skipping the extra upscaling step, and the other half would be upscaled.
Or maybe it could be a third, and you could decide where on the screen you’d want it upscaled; it could give you templates to pick from. I’d likely pick the top and bottom 15% of the monitor, areas that may not have as much action in your game.
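
Very roughly, the compositing step on the scaler side could look something like this (purely a toy sketch with made-up numbers, numpy arrays standing in for frames; the 15% bands and the 2x factor are just the example above):

```python
import numpy as np

OUT_H, OUT_W = 2160, 3840          # hypothetical panel resolution
BAND = int(OUT_H * 0.15)           # top/bottom 15% get the upscaled treatment

# Pretend inputs: a native-resolution centre strip and a half-res strip for the edges.
centre_native = np.zeros((OUT_H - 2 * BAND, OUT_W), dtype=np.uint8)
edges_low_res = np.zeros((OUT_H // 2, OUT_W // 2), dtype=np.uint8)

def upscale2x(strip: np.ndarray) -> np.ndarray:
    """Nearest-neighbour 2x upscale by pixel duplication."""
    return strip.repeat(2, axis=0).repeat(2, axis=1)

edges_full = upscale2x(edges_low_res)             # back to OUT_H x OUT_W
frame = np.vstack([edges_full[:BAND],             # upscaled top band
                   centre_native,                 # native-resolution middle
                   edges_full[-BAND:]])           # upscaled bottom band
print(frame.shape)                                # (2160, 3840)
```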


#17

What if we made a little HDMI/whatever pass-through dongle which would enable G-Sync? Can this be done?


#18

Not entirely sure, but my best guess is no. Not only would it be technically difficult to ensure compatibility, but also because of licensing from Nvidia.


#19

You can’t have more than one native resolution on a monitor.


#20

If we could get the FPGA from Nvidia, we could /probably/ have a GSYNC to FreeSync adapter. But they’d never go for that xD