Author Topic: Buying a Monitor  (Read 7206 times)

2020-08-12, 03:10:11
Reply #15

Luke

  • Active Users
  • **
  • Posts: 70
    • View Profile
    • LC3D
Would this be a good option?

Dell UltraSharp U2720Q 27" 4K UHD 99% sRGB USB-C IPS Monitor
https://www.dell.com/en-au/shop/ultrasharp-27-4k-usb-c-monitor-u2720q/apd/210-auzu/monitors-monitor-accessories

2020-08-12, 13:21:43
Reply #16

sebastian___

  • Active Users
  • **
  • Posts: 195
    • View Profile
I got somewhat lost here :- )

Maybe I did not phrase everything quite right.
Last time I bought a Kindle I chose the high-PPI version instead of the lower-res one. But in that case it came with no drawbacks.
Same with phones - they just need to render a bunch of text and photos, so they can afford high resolutions and it looks good. But I believe that in many intensive 3D games, phones render at a much lower resolution and upscale to fit the screen.

And some years ago when I checked my eyesight it was good, no need for glasses :) So yes, I can see the difference, and having a screen like a high-quality magazine is nice. But years ago (and today) everyone struggled with not having enough graphics power, and no one complained about Crysis or 3ds Max not looking like a super-high-DPI magazine.
People say "once you go retina you can't go back", but I guess I'm lucky in that regard: I see the difference, but I have no discomfort working on "normal" resolution screens.

I think people should switch to HiDPI and very high refresh rates once most graphics work becomes real-time.
I used to work in music production and I witnessed that happening there. Everything switched to real-time at some point, and it was great.

I guess the same thing will happen in graphics. With 2D graphics work it has already kind of happened, so now we just need to wait for the 3D part :)

2020-08-12, 16:46:25
Reply #17

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4446
    • View Profile
    • studio website
The optometrist remark wasn't about sharp eyesight :- ) My eyesight sucks: minus five dioptres of myopia. It was a remark about 'ease' on the eyes. Displays with higher acuity are less straining; the clean anti-aliasing is easier on the eyes.

People didn't complain about Crysis not looking sharp enough because no one was even able to run it properly :- ). Just kidding. People didn't complain about 8 MB of memory either; that doesn't say much. Even nowadays most gamers/users will settle on some "XYZ doesn't make sense" argument and think that's it.

I am using 3x 4K displays for work; it doesn't stress my GPU to any perceivable performance drawback. If I were using an 8K display, I would just set the pixel size in the Corona framebuffer to 2X (effectively creating a constant 200% zoom), and IR rendering speed would be identical. An average high-end GPU has no issue driving 3x 8K 60 Hz displays through DisplayPort (or the upcoming HDMI 2.1).
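A quick back-of-envelope check on that 2X pixel size point (my own sketch, not Corona's internals): rendering an 8K framebuffer where each output pixel covers a 2x2 block traces exactly as many pixels as a native 4K render, which is why the IR speed stays identical.

```python
# Illustration of the pixel-count argument above (assumed figures:
# 3840x2160 for 4K UHD, 7680x4320 for 8K UHD).

def rendered_pixels(width, height, pixel_size):
    """Pixels actually rendered when each output pixel spans a
    pixel_size x pixel_size block of display pixels."""
    return (width // pixel_size) * (height // pixel_size)

native_4k = rendered_pixels(3840, 2160, 1)  # native 4K render
scaled_8k = rendered_pixels(7680, 4320, 2)  # 8K display at 2X pixel size

print(native_4k, scaled_8k)  # same workload for the renderer
```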
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2020-08-14, 02:26:00
Reply #18

sebastian___

  • Active Users
  • **
  • Posts: 195
    • View Profile
I am using 3x 4K displays for work, doesn't really stress my GPU to any perceivable performance drawback.

Maybe GPU usage scales non-linearly (logarithmically or something) as you move to higher resolutions, at least on the 2D side, as I imagine that's the part most involved in driving monitors.
But still, with my weird setup I should watch out for every extra percent of GPU/CPU usage. I have a super old i7 930, 24 GB RAM, an Intel X58 mainboard with PCIe 2.0, and an RTX 2080 - that's what makes it weird, I guess. A fairly new GPU on a 10-year-old (or older) mainboard and CPU.
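For context on that PCIe 2.0 board, a rough bandwidth comparison (the per-lane rates and encodings come from the published PCIe specs; the function itself is just my illustration):

```python
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding;
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.

def pcie_bandwidth_gbs(gt_per_s, encoding_efficiency, lanes=16):
    """Approximate usable one-direction bandwidth in GB/s
    for a PCIe link of the given width."""
    return gt_per_s * encoding_efficiency / 8 * lanes

pcie2_x16 = pcie_bandwidth_gbs(5.0, 8 / 10)     # ~8.0 GB/s
pcie3_x16 = pcie_bandwidth_gbs(8.0, 128 / 130)  # ~15.75 GB/s

print(pcie2_x16, pcie3_x16)
```

So an x16 slot on this board offers roughly half the bandwidth the RTX 2080 was designed for, which mostly matters for texture uploads and real-time work rather than for rendering itself.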

A year ago I bought a higher-core-count Xeon based on advice on this forum and other articles; it should be compatible with my mainboard, but I haven't installed it yet because I've only worked on elements, not whole scenes, and you just don't need power for a single tree, a single rock, a single car and so on. But I will once I move on to jungles.

I wonder who else on the whole internet has a setup like mine? :) BTW, new games with demanding graphics like Forza Horizon 4 run with all the effects on max, a decent resolution (for me), and 70 fps.

2020-08-14, 23:55:55
Reply #19

Ekladiuos

  • Active Users
  • **
  • Posts: 22
    • View Profile
Guys, I got the PD2700U and I am so happy with it so far. First impression: great colors and sharpness.

Thank you, everyone.
Thanks, Juraj.

2020-08-17, 13:40:12
Reply #20

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4446
    • View Profile
    • studio website

I wonder who else from the whole internet has a setup like me ? :)  BTW, new games with high graphics like Forza Horizon 4 work with all the effects on max, decent resolution (for me), and 70 fps.

WOW :- )! A pre-Sandy Bridge CPU, that is getting somewhat archaic :- ).

10 years ago I bought my first PC workstation, and I waited three weeks after Christmas for Sandy Bridge (the legendary i7 2600K!) to arrive, since it was a massive step (at least 30% more performance) at the time.
The fact you kept this CPU is impressive :- ). It does limit your GPU quite a lot, but only above 60 fps in real-time work, and probably not at all for rendering, which doesn't depend on bandwidth.

If you were to upgrade to a current CPU like Ryzen 3xxx or Ice Lake Intel (10xxx) you would feel like you're on another planet; the jump would be so drastic it can't be described :- ).

A lot of people are still working on 10-year-old CPUs, but mostly Sandy Bridge ones, not Bloomfield.

And who else has that? I know of Francesco (Cecofuli) :- ). Not sure why he never upgraded.

2020-08-17, 15:47:32
Reply #21

sebastian___

  • Active Users
  • **
  • Posts: 195
    • View Profile
Maybe I'm getting a bit off-topic, but I think these are interesting times, where a 10-year-old computer can perform absolutely adequately for a lot of tasks.
I remember this wasn't the case many years ago, when every year or two there was a dramatic improvement and you absolutely could not play a top-of-the-line game (with effects set to high) on a 5-year-old computer.

I can open countless tabs in Chrome, I can play 4K videos on YouTube without a problem, and most importantly I can get pretty fast updates in interactive rendering as long as I have simple scenes or just one complex object like a huge tree.
Using Corona for a high-resolution render and waiting until the noise gets very low would probably take a long time (I'm not sure, as I've never done a final render), but for just playing around, adjusting materials or moving the viewport, the IR is pretty fast.
The Octane and Redshift demos don't update quite as fast as Corona, but final renders are much faster, probably because of the oversized GPU (compared to the CPU).

DaVinci Resolve works in real time, and Premiere Pro plays 4.5K RED raw footage in real time with GPU acceleration, plus GPU effects like color correction, Gaussian blur and so on. What more would you need? :)

And I'm sure lots of games would play very well thanks to that GPU and 24 GB of RAM, even though it's bottlenecked.

CPU-heavy things like cloth simulation, tyFlow and so on take a big hit, I imagine, compared to new CPUs.

I still have a Xeon X5670 six-core, Thermal Grizzly Kryonaut paste and a huge Noctua cooler sitting in a box. I will mount them as soon as my scenes get complex.

Can you guess: if I were to mount a second RTX 2080, would I get roughly double render speed in a GPU renderer? Or would my computer explode and disintegrate from the abomination of pairing two new cards with such old components?

2020-08-17, 20:45:06
Reply #22

Juraj

  • Moderator
  • Active Users
  • ***
  • Posts: 4446
    • View Profile
    • studio website
Two 2080s would be pushing it a little :- ). Starting a fully off-topic thread wouldn't be a bad idea, though; I'll make one once I finish projects this week, so I'm not tempted to procrastinate there.

CPU development has had an "interesting curve". The i7 9xx (Bloomfield) → i7 2xxx (Sandy Bridge) jump 10 years ago was a massive upgrade, one of the biggest advances ever. But then the next 7 years brought almost no movement at all, so those early Sandy Bridge CPUs stayed super-valued among gamers. I sold my i7 2600K for 200 Euro five years after buying it for 330 Euro. Just ridiculous :- ).

Three years ago, though, AMD Ryzen finally accelerated the pace and made another rapid improvement similar to Bloomfield → Sandy Bridge. This year's CPUs are just amazing. They are so good that laptops are finally powerful enough for work, not just "eh, ok, doable I guess".

GPUs, on the other hand, had much more linear development until the last 3 years, when things started to slow down - the exact opposite of CPU development. And it looks like that will be over when nVidia Ampere 3xxx and AMD Navi 2 come this fall. It might be another small revolution. The best idea then will be to upgrade rather than buy a second GPU.

2022-10-20, 12:48:23
Reply #23

fabio81

  • Active Users
  • **
  • Posts: 395
    • View Profile
Go high enough PPI for integer scaling, and you can set even rendering to use 2X pixel size, and still retain crystal-sharp and smooth font rendering.

Sorry to reopen this topic,
but Juraj, could you explain this step to me?
Soon I will have a Windows configuration with two 4K monitors side by side, and I'm afraid of getting the necessary resolution wrong.
With 150% scaling, don't I get a slight delay with the mouse?
Also, I didn't understand how to set the rendering to 2X pixel size.

thanks
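A sketch of the scaling arithmetic behind the quoted advice (my own illustration; 3840x2160 is the assumed 4K panel size): at 200% every logical point maps to a clean 2x2 block of pixels, while 150% maps one point to 1.5 pixels, which forces resampling of the UI.

```python
# Logical desktop size and integer-scaling check for common
# Windows display scale factors on a 4K panel.

def logical_resolution(width, height, scale_percent):
    """Logical desktop size presented at a given scale factor."""
    return (round(width * 100 / scale_percent),
            round(height * 100 / scale_percent))

def is_integer_scale(scale_percent):
    """True when each logical point maps to a whole number of pixels."""
    return scale_percent % 100 == 0

# 150% gives more workspace but needs fractional resampling;
# 200% halves the workspace but maps pixels exactly.
print(logical_resolution(3840, 2160, 150), is_integer_scale(150))
print(logical_resolution(3840, 2160, 200), is_integer_scale(200))
```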