Author Topic: Buying a Monitor  (Read 9218 times)

2020-07-31, 17:48:53

Ekladiuos

  • Active Users
  • **
  • Posts: 30
    • View Profile
Hi guys, I need help with choosing a monitor. My budget is around 500 euros, give or take, and I have come across so many monitors that it started to get confusing. Dell's reputation is very good and it seems like a safe choice.
I am thinking about the Dell UltraSharp 27 monitor U2719D, but I am afraid it's not worth the money. I am not an expert with this.

Thank you

2020-07-31, 21:41:32
Reply #1

sprayer

  • Active Users
  • **
  • Posts: 794
    • View Profile
How about the LG 32GK850F? My friend bought it and says it's good. I have a 43" Philips myself.

2020-08-01, 23:19:02
Reply #2

Ekladiuos

  • Active Users
  • **
  • Posts: 30
    • View Profile
i have checked it, sounds cool, but the high contrast seems aimed at gaming, maybe not at our kind of work

2020-08-04, 14:35:07
Reply #3

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website
That is an absolutely terrible choice, no offense to anyone :- ).

The LG 32GK850F is 32" with only 2560px across, which is a tragically low PPI of around 90 (for comparison, the popular Apple "Retina" is 220, and 140 is the golden standard you can get with 32"/4K). It looks super pixelated, definitely not suitable for work of any kind.
It's also a low-end VA panel with terrible off-axis contrast and color loss.

There are gaming monitors which are also perfect for work, like a few upcoming LG 27" IPS 4K 120-240Hz models, which I think will be available in Q4 2020, so a few months still to go. Those will be superb for gaming, HDR content and work as well. But a low-res VA panel, or even a high-res VA panel, is a total no-go, completely hands off :- ).

Don't get 27" QHD in IPS either, go straight to 27"/4K; it's absolutely worth it, it looks 100perc. incredibly better. I use 27"/4K as my travel monitor, and it's so much sharper than my desktop 32"/4K.
500 euros is plenty to find some outliers or discounts. It doesn't really matter much if you get LG, Dell, BenQ or even AOC, etc.; they use the same LG or AU Optronics panels for the IPS.
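For anyone who wants to verify those density figures, PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the resolutions below are the standard native resolutions for these panel classes):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The figures quoted above, rounded:
print(round(ppi(2560, 1440, 32)))  # LG 32GK850F (32" QHD) -> 92
print(round(ppi(3840, 2160, 32)))  # 32"/4K -> 138, the "140 golden standard"
print(round(ppi(3840, 2160, 27)))  # 27"/4K -> 163
print(round(ppi(5120, 2880, 27)))  # 27" 5K "Retina" iMac -> 218
```

So a 32" QHD panel really does land near 90 PPI, while 27"/4K sits comfortably above the 140 mark.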
Please follow my new Instagram for latest projects, tips&tricks, short video tutorials and free models
Behance  Probably best updated portfolio of my work
lysfaere.com Please check the new stuff!

2020-08-04, 15:30:21
Reply #4

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8779
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Hmm Juraj, i don't want to argue with you, but people are different and their requirements are different too. 90 ppi might be tragic for you, but my current monitors have ~90 ppi and i know for sure that my next one will definitely have the same, or very similar, pixel density, because anything higher would cause too much strain on my poor eyesight. In fact i'm thinking about a 32" 2560px monitor, but not LG. The only concern for me is physical size - i'm not sure if 32" would be too big for me.
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures

2020-08-04, 16:58:20
Reply #5

Ekladiuos

  • Active Users
  • **
  • Posts: 30
    • View Profile
Thank you, Juraj so much!
I will consider the BenQ 2700U. But if I'm willing to spend more, up to 700 euros, do you think it's worth it to buy the 3200U now?

2020-08-04, 21:12:37
Reply #6

sprayer

  • Active Users
  • **
  • Posts: 794
    • View Profile
Juraj, PPI is useless for people who work with graphics: you need to see the pixels and all the artifacts. You have to act as a guide for average users on normal monitors; you won't notice artifacts and noise on a HiDPI panel, but they will be very visible on a normal monitor. Also, the bigger the panel, the farther you sit from it, so you don't need HiDPI on a big panel. What can you see from 1 meter away? This is not a mobile panel.
As for VA panels, you are also wrong: they are on par with IPS, and both have issues. IPS has low contrast, no true black and huge glow. VA is better there; it may lose a bit in colors and black gradients (that depends on the individual panel), and that's it, otherwise it's all pluses: nice blacks and contrast. VA also often has much higher brightness and may support HDR, like my HDR 1000 monitor.

Ekladiuos, BenQ and many other companies do not develop panels, so I prefer to choose from the actual developers. The panel makers are Samsung, LG, Philips (TP Vision) and AUO, that's all, if I am not mistaken. Other companies just use the same panels and raise the cost; for example, the HP 43" has an identical panel to the LG 43" but costs $300 more for nothing.
You can check two resources with nice monitor tests and choose from their recommendations:
https://www.tftcentral.co.uk/
https://pcmonitors.info/reviews/
« Last Edit: 2020-08-05, 09:58:43 by sprayer »

2020-08-05, 01:41:47
Reply #7

sebastian___

  • Active Users
  • **
  • Posts: 200
    • View Profile
From experience, from testing a few monitors and from reading online (maybe old info), I would say IPS is a must for graphics work.

You could use a cheap lower quality monitor as a second monitor. The quality would not matter as you would move most of your toolbars and buttons there, freeing your main monitor. And as a bonus you can verify on that monitor how people with cheap monitors and TVs will see your work.

But I also dislike high-resolution monitors because I feel they take precious CPU and GPU cycles from rendering and from navigating the viewport, with little benefit. I mean, I have an iPad with a so-called "retina" screen, and yes, having such a high resolution is nice, but going without it is far from a deal-breaker, at least for me, especially if it comes with big drawbacks.

But I'm also torn, as I have a problem verifying pixel sharpness properly when producing 4K movies. I can zoom in and compare with other 4K movies, but it's still a problem. This issue comes up rarely, though.

Still, I'm hoping I will be able to buy high-quality monitors in the future with a 2K resolution, or about 94 ppi like my current one has.

2020-08-06, 21:40:26
Reply #8

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website
Hmm Juraj, i don't want to argue with you, but people are different and their requirements are different too. 90 ppi might be tragic for you, but my current monitors have ~90 ppi and i know for sure that my next one will definitely have the same, or very similar, pixel density, because anything higher would cause too much strain on my poor eyesight. In fact i'm thinking about a 32" 2560px monitor, but not LG. The only concern for me is physical size - i'm not sure if 32" would be too big for me.

HiDPI displays don't cause eyestrain unless you are using them without scaling :- ).

On 32"/4K, my scaling is 150perc. The 32"/6K Apple XDR uses 200perc. scaling, which, being an integer factor, is the sharpest, smoothest and most pleasant to look at.
My clients require 8K renderings; I need to see them comfortably during post-production and not at a 1/8 zoom factor :- ).

Performance-wise, they do require more resources to drive, but not necessarily during rendering. Go high enough in PPI for integer scaling, and you can even set rendering to use 2X pixel size and still retain crystal-sharp and smooth font rendering.
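The scaling arithmetic can be sketched like this: the OS divides the physical resolution by the scale factor to get the logical workspace, and at 200% every logical pixel maps to an exact 2x2 block of physical pixels, which is why integer factors stay sharp. (The 6016x3384 figure below is the Apple XDR's native resolution.)

```python
def logical_resolution(width_px: int, height_px: int, scale_pct: int):
    """Logical (effective) workspace size at a given OS scaling factor."""
    factor = scale_pct / 100
    return round(width_px / factor), round(height_px / factor)

print(logical_resolution(3840, 2160, 150))  # 32"/4K at 150% -> (2560, 1440)
print(logical_resolution(6016, 3384, 200))  # Apple XDR 6K at 200% -> (3008, 1692)
```

So a 4K panel at 150% gives you the same workspace as a QHD monitor, just rendered from four times as many pixels per logical area at 200%, or a fractional mapping at 150%.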


And now to your arguments Sprayer:

VA being equal tech to IPS? Definitely not in monitors. We are not talking about Samsung QLED TVs with FALD, contrast ratios of 8000:1 and higher, and better viewing angles. We are talking about monitors, and every single VA monitor is a budget monitor.
Because color calibration is far easier to achieve with IPS, the market developed so that only IPS-like panels (LG IPS, Sharp IGZO, Samsung PLS, AU Optronics AHVA) are made to a high standard of quality.

A common VA monitor has barely a 2000:1 contrast ratio, a far cry from where TV VA panels are. The viewing angles are abysmal; those aren't ideal in high-end TVs either, but here they are barely tolerable. I have a 32" 500-euro Samsung VA monitor in the living room because it is only ever seen as pretty decoration, and... it's just bad.
Brightness has nothing to do with panel type, and most VA monitors on the market don't have stronger backlighting than the IPS offerings. This is a made-up point. The brightest HDR monitors on the market, the ProArt PA32UCX and the Apple XDR, both at 1600 nits peak brightness, are both IPS. And for good reasons.
High-end IPS monitor panels reach 1300:1 contrast in bigger sizes (27"+) or even 1500:1 in laptops with glass coverings (like the latest Dell XPS 16:10 panels with 500 nits and 100perc. DCI-P3 coverage). The advantage VA has here... is minimal, almost imperceptible.
I am not saying they are bad, they still have their audience, but it's not professional users. And that is why you won't find one even marketed as such.

The argument for buying from the factory maker: in theory it makes sense, in practice it only applies on a few occasions. LG IPS is one such case, but Dell, BenQ or Apple have historically offered better build quality, factory calibration, etc.

Highly agree with https://www.tftcentral.co.uk/ . Another great source is RTINGS. But both have pretty limited sample sizes.




2020-08-06, 22:55:51
Reply #9

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8779
  • Let's move this topic, shall we?
    • View Profile
    • My Models
HiDPI displays don't cause eyestrain unless you are using them without scaling :- ).

But that's exactly how i want to use the monitor :] Somehow scaling makes no sense for me at all.

2020-08-07, 00:26:39
Reply #10

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website
HiDPI displays don't cause eyestrain unless you are using them without scaling :- ).

But that's exactly how i want to use the monitor :] Somehow scaling makes no sense for me at all.

But have you experienced it :- )? People have a poor opinion of scaling because it was implemented poorly in Windows from the start. It's very good right now, at least OS-wide, but still not a smooth experience like on macOS for all software, since apps don't follow a unified approach.

I, for example, thought high-refresh-rate displays were overblown... until I bought a Razer laptop last week with 144Hz (so not even the trendy 240). I got converted... in five seconds. It was a 'wow' moment. And not from playing any games... just from moving the cursor around.
The same goes for integer-scaled HiDPI displays. I am in absolute love with the screen of my Surface Pro: 12" at 3K, resulting in 270 PPI. When I use it next to my 140 PPI main display... it's just a sad moment. I want to scale that Surface display up to 32".

Ask any programmer what they think of HiDPI; they love it. I know guys running code on multiples of the 8K Dell UP3218K. It's a different, much more natural experience, the clarity is unreal, and the ease of reading text and vector graphics is almost comparable to a high-end print magazine.
Now, there are some odd, controversial arguments people make about visual acuity calculations and how much you can perceive with 20/20 eyesight. But the pixel grid ceasing to be perceptible is not the limiting factor; you still perceive more clarity up to absurd PPI, and it still affects anti-aliasing.
It's similar to how digital sensors aren't stopping at 100 MP despite diffraction setting in (and grumpy folks complaining about marketing drive). You still get more accurate color information.

Regardless, this is where the industry is moving, although slowly and from multiple directions:

-) Near future (next 5 years): Mini-LED FALD with 500-1000 zones, 120 Hz refresh rate, 140-180 PPI, 1500:1 static contrast for desktops via IPS, 8000:1 static contrast via VA for TVs, 1,000,000:1 dynamic contrast for both, plus any OLED. Full coverage of DCI-P3.
-) Far future (next 10 years): Micro-LED as an alternative to OLED. 240+ Hz, 180+ PPI, static and dynamic contrast like OLED, much higher peak brightness, the best displays reaching Rec. 2020.

2020-08-07, 05:48:44
Reply #11

sebastian___

  • Active Users
  • **
  • Posts: 200
    • View Profile

Ask any programmer what they think of HiDPI; they love it. I know guys running code on multiples of the 8K Dell UP3218K. It's a different, much more natural experience, the clarity is unreal, and the ease of reading text and vector graphics is almost comparable to a high-end print magazine.

................

 much higher peak brightness,

I don't get the attraction of some of these features.
I mean, we had high-quality print magazines and posters for many, many years and I never thought: if only my monitor had detail or text like this. I also never heard anyone, before the advent of "retina" screens, wishing their text looked like high-quality print. I guess it would make sense for people sitting very close to their monitors, but what about those who sit at a sizable distance?
I'm not saying it's bad, I'm just saying it should be a low priority. For my part, bring on the infinite resolution and huge refresh rates, as long as they don't take away from other, more important things.

I still cringe when I see people playing games with all the effects turned off and geometry set to low poly, but at super high resolution, so you have simple objects and bad lighting with low poly, but super sharp lines. I mean, don't people wish for film-like video games anymore? Since when are films super sharp but low poly? If anything, films have infinite polygons (because it's reality), tons of effects, and motion blur, glow, DOF, bloom, but they're not super sharp.
The most expensive cameras in the world, the Arri cameras, did not even do 4K until recently.

And about higher brightness: thank god my monitor has a software slider on my desktop so I can constantly move the brightness up and down between minimum and maximum. I think it's a good idea when you're looking at white pages for hours, or when using the monitor close to bedtime; I have to remember to bring the brightness down, or else, I've heard, it interferes with sleep.
And now I'm trying to imagine my monitor at 3 times the brightness, or 10 times. Uhhh, my eyes :)

 /end of the mini rant :)

2020-08-07, 11:40:33
Reply #12

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8779
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Juraj, you're right, i had a horrible experience with scaling, but that was a very long time ago, probably in Windows XP. I believe the situation is a lot better now. I also believe you when you say that everything on hidpi monitors looks so much better. I've never sat in front of such a monitor and i believe it would be a wow moment when i saw it for the first time, but i also know how quickly that wow factor wears off and you're basically left only with the downsides of such a monitor.

And the biggest downside for me would be the higher resource demand. My current pc is almost 7 years old (the monitor is 13 years old!) and i still feel comfortable working on it. I'm sure that wouldn't be the case if my monitor had 4K resolution.

If i had all the money in the world, sure, it would be nice to have all the latest toys with the bling bling they bring, but for now i value real estate on my workspace over perceived crispness. If i buy a higher resolution monitor, i want every single pixel to contribute to more space. I don't need virtual pixels :] I'm not saying that hidpi is a gimmick for everyone, but it's useless for me and i see scaling as waste.

2020-08-11, 13:26:17
Reply #13

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website


I got somewhat lost here :- )

Priorities between resolution, refresh rate, backlight quality, color gamut, etc... it's all being advanced at the same time. Obviously everyone prefers different aspects. Good, because the market has a billion options for everyone.
If you argue about the science of HiDPI displays being far more pleasant on your eyesight, ask an optometrist what they suggest :- ).

And people not wanting magazine-like print quality? Look at Kindle E-Ink type displays and books; the difference in experience is still colossal, and companies are racing to bring high-DPI e-ink, but the price is prohibitive.

And biggest downside for me would be higher resource demand of such monitor.

And that is indeed the biggest, but also the only, downside. (Unless we consider a big price tag a downside too :- )
The original scepticism above, from you and others, was that it is more demanding on eyesight, but that is built on the premise of a lack of proper scaling. I only wanted to argue the opposite: it's actually far easier on the eyes, and your optometrist would agree.
Looking at my 400+ DPI cellphone is just glorious, and I enjoy it so much I wanted to swap it for the latest 550 one. I just keep reading all my news and articles on it instead of on my PC.

2020-08-11, 14:46:53
Reply #14

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8779
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Just to clarify, when i talked about strain on the eyes, i had in mind smaller text and other UI elements because of the smaller pixel pitch. I don't think that hidpi in itself can cause eye fatigue or other issues. Not sure about the benefits though :]

2020-08-12, 03:10:11
Reply #15

Luke

  • Active Users
  • **
  • Posts: 75
    • View Profile
    • LC3D
Would this be a good option?

Dell UltraSharp U2720Q 27" 4K UHD 99% sRGB USB-C IPS Monitor
https://www.dell.com/en-au/shop/ultrasharp-27-4k-usb-c-monitor-u2720q/apd/210-auzu/monitors-monitor-accessories

2020-08-12, 13:21:43
Reply #16

sebastian___

  • Active Users
  • **
  • Posts: 200
    • View Profile
I got somewhat lost here :- )

Maybe I did not phrase everything quite right.
The last time I bought a Kindle I chose the high-PPI version instead of the lower-res one. But in that case it came with no drawbacks.
Same with phones: they just need to render a bunch of text and photos, so they can afford high resolutions and it looks good. But I believe that in many intensive 3D games, phones use much lower resolutions and upscale to fit the screen.

And some years ago when I checked my eyesight it was good, no need for glasses :) So yes, I can see the difference, and having a screen like a high-quality magazine is nice, but years ago (and today) everyone struggled with not having enough power for their graphics, and no one complained about their Crysis game or 3ds Max not looking like a super-high-DPI magazine.
People say "once you go retina screen, you can't go back", but I guess I'm lucky in that regard, because I see the difference but have no discomfort working on "normal" resolution screens.

I think people should switch to HiDPI and super high refresh rates once most things in graphics become real-time.
I used to work in music production and I witnessed that happening there. Everything switched to real-time at some point and it was great.

I guess the same thing will happen in graphics. With 2d graphics work it already kind of happened. So now we just need to wait for the 3d part :)

2020-08-12, 16:46:25
Reply #17

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website
The optometrist remark wasn't about sharp eyesight :- ) My eyesight sucks: minus five diopters of myopia. It was a remark about 'ease' on the eyes. Displays with higher acuity are less straining; the clean anti-aliasing is easier on the eyes.

People didn't complain about Crysis not looking sharp enough because no one was even able to run it properly :- ). Just kidding. People didn't complain about 8MB of memory either; that doesn't say much. Even nowadays most gamers/users will reach for some kind of "XYZ doesn't make sense" argument and think that's it.

I am using 3x 4K displays for work; it doesn't really stress my GPU to any perceivable performance drawback. If I were using an 8K display, I would just set the pixel size in the Corona framebuffer to 2X (effectively creating a constant 200perc. zoom), and IR rendering speed would be identical. An average high-end GPU has no issue driving 3x 8K 60Hz displays through DisplayPort (or the upcoming HDMI 2.1).
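As a rough sanity check on those bandwidth figures (a sketch; real links also carry blanking and protocol overhead, and DisplayPort 1.4's usable payload is about 25.9 Gbit/s, so an uncompressed 8K/60 stream already relies on DSC compression):

```python
def raw_video_gbps(width: int, height: int, refresh_hz: int,
                   bits_per_px: int = 24) -> float:
    """Uncompressed pixel payload in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_px / 1e9

print(round(raw_video_gbps(3840, 2160, 60), 1))  # one 4K/60 stream -> ~11.9
print(round(raw_video_gbps(7680, 4320, 60), 1))  # one 8K/60 stream -> ~47.8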

2020-08-14, 02:26:00
Reply #18

sebastian___

  • Active Users
  • **
  • Posts: 200
    • View Profile
I am using 3x 4K displays for work; it doesn't really stress my GPU to any perceivable performance drawback.

Maybe GPU usage grows non-linearly, or logarithmically, or something, as you move to higher resolutions, at least on the 2D side, as I imagine that's the part most involved in driving monitors.
But still, with my weird setup I have to watch out for every extra percent of GPU/CPU usage. I have a super old i7 930, 24GB RAM, an Intel X58 mainboard with PCIe 2.0, and an RTX 2080; that's what makes it weird, I guess. A somewhat new GPU on a 10-year-or-older mainboard and CPU.

A year ago I bought a higher-core-count Xeon based on advice on this forum and other articles; it should be compatible with my mainboard, but I have not installed it yet because I have only worked on elements and not on whole scenes, and you just don't need power for a single tree, a single rock, a single car and so on. But I will, as I move on to jungles.

I wonder who else on the whole internet has a setup like mine? :) BTW, new games with high graphics like Forza Horizon 4 run with all the effects on max, decent resolution (for me), and 70 fps.

2020-08-14, 23:55:55
Reply #19

Ekladiuos

  • Active Users
  • **
  • Posts: 30
    • View Profile
Guys, I got the PD2700U and I am so happy with it so far. First impression: great colors and sharpness.

Thank you, everyone.
Thanks, Juraj.

2020-08-17, 13:40:12
Reply #20

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website

I wonder who else on the whole internet has a setup like mine? :) BTW, new games with high graphics like Forza Horizon 4 run with all the effects on max, decent resolution (for me), and 70 fps.

WOW :- )! A pre-Sandy Bridge CPU, that is getting somewhat archaic :- ).

10 years ago, I bought my first PC workstation, and I waited three weeks after Christmas for Sandy Bridge (the legendary i7 2600K!) to arrive, since it was a massive step (at least 30perc. more performance) at the time.
The fact that you kept this CPU is impressive :- ). It does limit your GPU quite a lot, but only above 60fps in real-time, and probably not at all for rendering, which doesn't depend on bandwidth.

If you were to upgrade to a current CPU like a Ryzen 3xxx or Ice Lake Intel (10xxx), you would feel like you're on another planet; the jump would be so drastic it can't be described :- ).

A lot of people are still working on 10-year-old CPUs, but those are mostly Sandy Bridge ones, not Bloomfield.

And who else has that setup? I know of Francesco (Cecofuli) :- ). Not sure why he never upgraded.

2020-08-17, 15:47:32
Reply #21

sebastian___

  • Active Users
  • **
  • Posts: 200
    • View Profile
Maybe I'm getting a bit off-topic, but I think these are interesting times, when a 10-year-old computer can perform absolutely adequately for a lot of tasks.
I remember this not being the case many years ago, when every year or two there was a dramatic improvement and you absolutely could not play a top-of-the-line game (with effects set to high) on a 5-year-old computer.

I can open countless tabs in Chrome, I can play 4K videos on YouTube without a problem, and most importantly I can get pretty fast updates on interactive renderings as long as I have simple scenes, or just one complex object like a huge tree.
Using Corona for a high-resolution render and waiting until the noise is very low would probably take a pretty long time (I'm not sure, as I never did a final render), but for just playing with and adjusting materials or moving the viewport, the IR is pretty fast.
The Octane or Redshift demos do not update quite as fast as Corona, but final renders are much faster, probably because of the oversized GPU (compared to the CPU).

DaVinci Resolve works in real time, and Premiere Pro plays 4.5K RED RAW footage in real time with GPU acceleration. Plus GPU effects like color correction and Gaussian blur and so on. What more would you need? :)

And I'm sure lots of games would play very well because of that GPU and 24GB ram, even though bottlenecked.

CPU-heavy things like cloth simulation and tyFlow and so on take a big hit, I imagine, compared to new CPUs.

I still have the Xeon X5670 six-core, Thermal Grizzly Kryonaut paste and a huge Noctua cooler sitting in a box. I will install them as soon as my scenes get complex.

Can you guess, if I were to mount a second RTX 2080, whether I would get roughly double the render speed in a GPU renderer? Or would my computer explode and disintegrate from the abomination of pairing 2 new cards with such old components?

2020-08-17, 20:45:06
Reply #22

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4743
    • View Profile
    • studio website
Two 2080s would be pushing it a little bit :- ). Starting a fully off-topic thread wouldn't be a bad idea though; I'll make it once I finish my projects this week, so I am not tempted to procrastinate there.

CPU development had an "interesting curve". i7 9xx (Bloomfield) -> i7 2xxx (Sandy Bridge) was a massive upgrade 10 years ago, one of the biggest advances ever. But then the next 7 years brought almost no movement at all. So those early Sandy Bridge CPUs kept their value among gamers; I sold my i7 2600K for 200 euros 5 years after I bought it for 330 euros 10 years ago. Just ridiculous :- ).

3 years ago, though, AMD Ryzen finally accelerated the pace and made another rapid improvement similar to Bloomfield -> Sandy Bridge. This year's CPUs are just amazing. They are so good that laptops are finally powerful enough for work, not just "eh, ok, doable I guess".

GPUs, on the other hand, had much more linear development until the last 3 years, when things started to slow down, so the exact opposite of CPU development. And it looks like that will be over when nVidia Ampere 3xxx and AMD Navi 2 arrive this fall. It might be another small revolution. The best idea then will be to upgrade rather than buy a second GPU.

2022-10-20, 12:48:23
Reply #23

fabio81

  • Active Users
  • **
  • Posts: 444
    • View Profile
Go high enough in PPI for integer scaling, and you can even set rendering to use 2X pixel size and still retain crystal-sharp and smooth font rendering.

Sorry to reopen this topic.
Juraj, could you explain this step to me?
Soon I will have a Windows configuration with two 4K monitors side by side, and I'm afraid of getting the necessary resolution wrong.
With 150perc. scaling, won't I get a slight delay with the mouse?
Also, I didn't understand how to set the rendering to 2X pixel size.

thanks