Author Topic: Realtime engine and render engine?  (Read 6027 times)

2018-04-18, 11:21:27

Christa Noel

  • Active Users
  • **
  • Posts: 911
  • God bless us everyone
    • View Profile
    • dionch.studio
Based on Ondra's comment in "Calculate Lighting for whole scene Like Unreal Engine", where he said something like "Corona is not a realtime engine and never will be": I thought a realtime engine was a faster render engine that uses a lot of approximation techniques, not a game engine.
So I'm wondering: what is the difference between a realtime engine (a game engine) and a render engine (Corona)?

2018-04-18, 11:27:08
Reply #1

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
There can be some naming ambiguity - I am not claiming I am using the terms correctly, and there is probably no super-clear standard. But I have always understood "realtime" engines to be technologically very close or identical to "game" engines - the difference being that in one case the technology is used for architecture etc., and in the other for games.
Rendering is magic. How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2018-04-18, 11:43:51
Reply #2

Christa Noel

  • Active Users
  • **
  • Posts: 911
  • God bless us everyone
    • View Profile
    • dionch.studio
.. the difference being that in one case the technology is used for architecture etc., and in the other for games
Hmm, understood. But can you explain something about the technologies used in a realtime engine and in Corona that makes them really different, and why you don't want that in Corona?
Sorry, I know this is a dumb question, but I'd like to understand this better.

2018-04-18, 14:22:00
Reply #3

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 12768
  • Marcin
    • View Profile
Hmm, understood. But can you explain something about the technologies used in a realtime engine and in Corona that makes them really different, and why you don't want that in Corona?
Sorry, I know this is a dumb question, but I'd like to understand this better.

It would be best if Ondra (or someone else from the dev team) answered this, but here are my 2 cents. There is no amazing info here, just some obvious ideas:

There are various types of "render engines", and if we are considering "real-time" vs "offline" render engines, I would divide them into:

-Full real-time (game) engines:
Most of the work is done on the GPU, because GPUs are designed for exactly this kind of work. We are dealing with relatively low-poly geometry (for faster load times and good FPS) and complex shaders rendered by the GPU (bump, reflections, other effects). Such render engines are used in games and also for some archviz work (like UE, Unity). The "rendering" speed is at least 20+ fps, to give smooth motion. The idea is to have a good balance between realism, performance, and bias.
The output is a 3D scene, where you can manipulate objects, move the camera, etc.

-Offline render engines:
Can use the CPU or GPU (or both). The idea is to have as photorealistic an output as possible. We can have arbitrarily complex geometry and materials, as long as they fit in RAM. An example of this is regular rendering in Corona. The expected "fps" is very low - like 1 frame per hour or so, depending on scene complexity. It is so slow because the engine is very carefully calculating physical phenomena, such as light bouncing around, for each pixel.
The output is a 2D image, which is generated based on the 3D scene. You can no longer manipulate objects or move the camera. You can only edit the image in 2D.

-Semi-real-time engines:
Rendering is done on the GPU or CPU, and basically it is the same as "offline" rendering, but the image is rendered as quickly as possible, based on some optimizations. An example of this is interactive rendering in Corona. The expected fps is not as high as in real-time engines, but should be as high as possible, and definitely not as low as in offline rendering.
The output is a 2D image, as with an offline renderer.

Note:
-If you render an extremely complex scene at a high enough resolution in a real-time engine, it may take 1 hour or more to render!
-Likewise, if you render an extremely simple scene in an offline renderer, you may get 20+ fps! (The toy sketch below makes the per-pixel difference concrete.)
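
To make the offline vs. interactive difference concrete, here is a minimal Python sketch (my own toy example with made-up numbers, not Corona code). The "offline" function computes one clean value per pixel from many samples; the "interactive" one shows a noisy estimate immediately and then refines it pass by pass:

Code: (Python)
import random

# Toy stand-in for "shading one pixel": a Monte Carlo estimate of a
# made-up brightness function. The function and the sample counts are
# invented for illustration; real engines integrate actual light transport.

def light(x: float) -> float:
    """Hypothetical scene radiance along one direction parameter x."""
    return 3.0 * x * x  # arbitrary; its average over [0, 1) is 1.0

def render_offline(samples: int) -> float:
    """Offline-style rendering: many samples, one final clean value."""
    return sum(light(random.random()) for _ in range(samples)) / samples

def render_interactive(total: int, per_pass: int):
    """Interactive-style rendering: yield a (noisy) running estimate
    after every pass, refining it as more samples accumulate."""
    acc, n = 0.0, 0
    while n < total:
        acc += sum(light(random.random()) for _ in range(per_pass))
        n += per_pass
        yield acc / n  # current estimate, still converging

print("offline (100k samples):", round(render_offline(100_000), 4))
for i, estimate in enumerate(render_interactive(4_000, 1_000), start=1):
    print(f"interactive pass {i}: {estimate:.3f}")

Both paths compute the same thing; the interactive mode just reorders the work so you get early, noisy feedback instead of waiting for the final frame.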

Now, we cannot say that we "do not want" real-time rendering in Corona. It is just a very different thing. It is like comparing a petrol engine to a waterfall turbine, and asking why we don't want waterfall turbines in a car. ;)
Marcin Miodek | chaos-corona.com
3D Support Team Lead - Corona | contact us

2018-04-18, 15:22:26
Reply #4

burnin

  • Active Users
  • **
  • Posts: 1535
    • View Profile
And another (very important) divide:
ray tracers (game engines) vs path tracers (Corona, Arnold, Maxwell, Lux, Cycles, PRMan, Clarisse...)

An RT render engine runs at 60+ fps...


Quote
A few thoughts on the recent realtime ray tracing demos
22 March 2018 05:21 PM

Hey guys,

In the last few days, we received a number of questions regarding the recent ray tracing demos that were shown around GDC. We are referring to Microsoft's DXR and NVidia's RTX.

As some of you might already know, the Redshift founders are all ex-videogame developers: Rob and I were rendering leads and Nic was doing tools. So it goes without saying that seeing this tech arrive in videogames is great news! We think better reflections, ray traced AO, better area lights, etc. will really help the look and realism!

One question we've been getting asked a lot is "will this tech make it into Redshift?". The answer to any such tech question is: of course we'll consider it! And, in fact, we often do before we even get asked the question!

However, we think it's important that everyone fully understands what they've been seeing in these demos, and what the misconceptions and "technical gotchas" are that might not be clearly mentioned in the context of professional/production rendering.

So, the first misconception we see people fall victim to is that these demos are fully ray traced. I'm afraid this is not the case! These demos use ray tracing for specific effects - namely reflections, area lighting and, in some cases, AO. To our knowledge, none of these demos ray traces ("path traces") GI or does any elaborate multi-bounce tracing. That is to say: they use rasterization, like modern videogames do. In plain English, this means that a fairly good chunk of what you see on screen is done with tech that exists today - if you're a user of Unity or Unreal or any other DirectX/OpenGL render engine. What you don't have today are the fully ray traced reflections (instead, you get screen-space or cubemap solutions), the more realistic ray traced area shadows (instead, you get shadow map area shadows) and ray traced AO (instead, you get SSAO).

The second misconception has to do with what hardware these demos are running on. Yes, this is Volta (a commercial GPU), but in quite a few of these cases it's either with multiple of them or with extreme hardware solutions like the DGX-1 (which costs $150,000). Of course, despite all that, seeing the tech arrive is still exciting because, as we all know, hardware evolves, and the performance you get today from a $3,000 or $150,000 solution you'll get in a few years' time from a much cheaper one. So while this is "bad" today, it does show what will be possible in the not-so-far future.

The third misconception is that, if this technology were to be used in a production renderer like Redshift, you'd get the same realtime performance and features. Or that "ok, it might not be realtime, but surely it will be faster than today". Well... this one has a slightly longer answer...

The main reason why a production renderer (GPU or not) cannot produce "true" realtime results at 30 or 60 fps isn't that you don't have multiple Voltas. The reason is its complicated rendering code - which exists because of user expectations. It simply does too much work. To explain: when a videogame wants to ray trace, it has a relatively simple shader to generate the rays (reflection, AO, shadow) and relatively simple shaders to execute when these rays hit something in the scene. And such a renderer typically shoots very few rays (a handful) and then denoises. On the other hand, when a renderer like Redshift does these very same operations, it has to consider many things that are not (today) necessary for a videogame engine. Examples include: importance sampling, multiple (expensive) BRDFs, nested dielectrics, prep work for volume ray marching, ray marching, motion blur, elaborate data structures for storing vertex formats and user data, mesh-light links, AOV housekeeping, deep rendering, mattes, trace sets, point based techniques.

And last but certainly not least... the shaders themselves! Curvature, for example, uses ray tracing on each and every intersection. Same with round corners. And then there are the concepts of layering multiple materials (each one shooting its own rays) and procedural bump maps, which mean lots more behind-the-scenes shading work you guys don't even see. And let's not forget the concept of out-of-core rendering! The list goes on and on, and I'm pretty sure I've neglected topics here! A videogame doesn't need most of those things today. Will it need them a few years from now? Sure, maybe. Could it implement all (or most) of that today? Yeah, it could. But if it did, it'd be called Redshift!

We're fully aware that the above might sound like we are trying to belittle these demos. I want to stress that this is not the case at all! We are genuinely excited about it and have no doubt that it will keep evolving as GPUs get faster and faster. And if it evolves in a way that we can 'serve' it to Redshift users without having to sacrifice 70% of Redshift's features, then we will absolutely use it!

Closing, I just wanted to reiterate that we're always closely watching all rendering-related tech and always asking the question "can our users benefit from this?". It is part of our job to do so! In this case, this question doesn't have a super-easy answer, but you can bet we're thinking about it! If any decisions are made, you'll hear about them here in these forums.

If you have any thoughts that you'd like to share below... please feel free!

Thanks!

-Panos

 - source @ RedShift forums -
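
To give a feel for just one item on Panos' list - importance sampling - here is a tiny, self-contained Python sketch (my own toy example, not Redshift code). Both estimators compute the same integral, but the importance-sampled one is noticeably less noisy, which is exactly the kind of machinery a production path tracer can't skip:

Code: (Python)
import random
import statistics

# Toy importance-sampling demo: estimate the integral of
# f(x) = x^3 over [0, 1] (exact value: 0.25).

def f(x: float) -> float:
    return x ** 3

N = 10_000

# Naive Monte Carlo: uniform random samples.
uniform = [f(random.random()) for _ in range(N)]

# Importance sampling: draw x from the pdf p(x) = 2x via the inverse
# CDF (x = sqrt(u), u in (0, 1]), then weight each sample by f(x)/p(x).
weighted = []
for _ in range(N):
    u = 1.0 - random.random()          # in (0, 1], so x is never 0
    x = u ** 0.5
    weighted.append(f(x) / (2.0 * x))  # f(x)/p(x) simplifies to x*x/2

print(f"uniform:    mean={statistics.fmean(uniform):.4f}, "
      f"stdev={statistics.pstdev(uniform):.4f}")
print(f"importance: mean={statistics.fmean(weighted):.4f}, "
      f"stdev={statistics.pstdev(weighted):.4f}")
# Same answer, roughly half the noise -> fewer samples for a clean image.

Multiply that kind of bookkeeping by every feature on the list above and the fps gap between a game and a production renderer stops being surprising.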
« Last Edit: 2018-04-18, 15:47:16 by burnin »

2018-04-18, 15:32:15
Reply #5

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 8856
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Maybe you wanted to say rasterizers vs ray-tracers/path-tracers?
I'm not Corona Team member. Everything i say, is my personal opinion only.
My Models | My Videos | My Pictures

2018-04-18, 15:45:59
Reply #6

burnin

  • Active Users
  • **
  • Posts: 1535
    • View Profile
All need rasterization... I added extra info above; hopefully it will clear most of this up.
« Last Edit: 2018-04-18, 15:49:33 by burnin »

2018-04-20, 01:14:28
Reply #7

Benny

  • Active Users
  • **
  • Posts: 170
    • View Profile
The article from the Redshift guys was very interesting; it feels as if we are approaching a crossroads in the market. It can't be easy positioning a product like Corona on a 2-3 year outlook with all this market chatter and noise. I think the general impression is that real-time alternatives are progressing faster than traditional render engines, and not only due to hardware.

Enscape for Revit is an incredibly automatic solution compared to anything I've seen so far, and although I personally dislike Lumion's workflow, it too can generate decent imagery close to real time. The Redshift guy seems to argue that real-time solutions such as Unreal/Nvidia RTX are faster because they don't have to carry extensive code support for heavy render algorithms. Blender's Eevee is further complicating the landscape by offering a real-time modeling environment, paired with a renderer (Cycles) that produces similar end results when you want to squeeze out extra quality.

This makes me worried about the approach by Chaos/Corona. V-Ray RT seems to be V-Ray's GPU offering, but it is still uncompromising and therefore not that real-time, and Corona seems to reject the whole GPU trend altogether.

If 3ds Max announced something like Eevee, with the option to render and bake in V-Ray or Corona, I would be fine, but right now I'm wondering what the value proposition will be in 3 years, when some of the things we've seen in Eevee/Unreal are more than good enough for 99% of the architectural market. Our clients want VR today, not tomorrow.

2018-04-20, 01:51:54
Reply #8

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
Things seem to be moving really quickly these days indeed! What a wonderful period! Every year there is new groundbreaking tech announced.

I've been particularly impressed by this one:


2018-04-20, 12:19:48
Reply #9

Christa Noel

  • Active Users
  • **
  • Posts: 911
  • God bless us everyone
    • View Profile
    • dionch.studio
I've been particularly impressed by this one:
Vimeo is still banned here in my country... I can't believe it. But fortunately, there's VPN :D
Yes, I remember Atom View - that's really amazing, and streamable via the cloud. This way we can help designers give better presentations to their clients :) with Corona (the video showed V-Ray for Maya, didn't it?). But I guess it is very pricey and needs a sky-high GPU card to operate.

Maru's "a petrol engine and a waterfall turbine" that's funny :) and nice explanation in RedShift forum shared by Burnin.
and Eevee by Blender is so so amazing, everything is amazing when it comes to reach the best presentation quality in efficient ways..

So I have another stupid question... is Nitrous in my viewport a realtime engine?

2018-04-20, 13:06:30
Reply #10

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
So I have another stupid question... is Nitrous in my viewport a realtime engine?

Sure it is! It's a shitty one TBH, but it remains a real-time engine.

2018-04-20, 13:23:13
Reply #11

Fluss

  • Active Users
  • **
  • Posts: 553
    • View Profile
One of the best examples of a true real-time path tracer is Brigade, an OTOY product from the creators of Octane:


2018-04-25, 11:03:56
Reply #12

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 9048
  • Turning coffee to features since 2009
    • View Profile
I wanted to post something, but the Redshift explanation is a correct and unbiased explanation of the situation ;)
Rendering is magic. How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)