Author Topic: practical lightfields are almost there

2020-04-29, 14:57:40

Fluss

Have you seen this?


Lightfields are almost there!

Imagine rendering a few views of a scene and then being able to visit it in VR. The future is exciting!

Paper: http://www.matthewtancik.com/nerf

edit: paper link

2020-04-29, 15:13:24
Reply #1

Juraj

The crazy thing to me is that it seemingly only needs to generate a depth map first? Can't we feed it CGI data with Z-depth (and world normals, etc.) and get even better results faster?
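
To make that concrete, here's a minimal sketch of what I mean; the helper is hypothetical, and only the transforms.json layout is borrowed from the authors' synthetic Blender dataset, where the camera poses are known exactly so no pose estimation is needed:

Code:
import json
import math

# Hypothetical sketch: pack CG renders with their known camera-to-world
# matrices into the transforms.json layout used by the NeRF authors'
# synthetic Blender dataset. With CGI input, the photogrammetry-style
# pose-estimation step is unnecessary -- the scene already knows its cameras.

def make_transforms(frames, fov_x_deg):
    """frames: list of (image_path, 4x4 camera-to-world matrix as nested lists)."""
    return {
        "camera_angle_x": math.radians(fov_x_deg),  # horizontal FOV in radians
        "frames": [
            {"file_path": path, "transform_matrix": c2w}
            for path, c2w in frames
        ],
    }

if __name__ == "__main__":
    # one camera 4 units up the +z axis, looking down -z (Blender convention)
    c2w = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 4.0], [0, 0, 0, 1]]
    data = make_transforms([("./train/r_0", c2w)], fov_x_deg=39.6)
    with open("transforms_train.json", "w") as f:
        json.dump(data, f, indent=2)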

2020-04-29, 15:26:40
Reply #2

Fluss

Quote from: Juraj
The crazy thing to me is that it seemingly only needs to generate a depth map first? Can't we feed it CGI data with Z-depth (and world normals, etc.) and get even better results faster?

That's how I understand it too; the first steps don't seem that different from photogrammetry... then the AI brings the magic. As for the CGI input, that's exactly what came to my mind too. We could also use render passes to help solve the tricky parts such as reflections and refractions. I haven't read the paper yet, but this stuff looks really impressive.
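
For anyone skimming, here's my rough read of where the "magic" actually is (equation rewritten from memory of the paper, so double-check me): a small MLP is queried along each camera ray for a density sigma and a view-dependent color c, and the pixel color comes from classic volume rendering:

C(r) = \int_{t_n}^{t_f} T(t) \, \sigma(r(t)) \, c(r(t), d) \, dt, \quad \text{where} \quad T(t) = \exp\left(-\int_{t_n}^{t} \sigma(r(s)) \, ds\right)

So the depth map isn't really an input at all; depth just falls out of where the learned density concentrates along each ray.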


2020-04-30, 10:41:39
Reply #3

Fluss

As for getting faster results, imagine reconstructing a whole scene from just a couple of carefully placed 360s.
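
Something like this is how I picture slicing a 360 into the pinhole views the method expects; purely illustrative, all names and conventions here are mine, not from the paper:

Code:
import numpy as np

# Illustrative sketch: sample a perspective (pinhole) crop out of an
# equirectangular 360 panorama, so a handful of 360 renders could be
# turned into an ordinary multi-view training set.

def equirect_to_pinhole(pano, out_w, out_h, fov_deg, yaw_deg, pitch_deg):
    """pano: (H, W, 3) equirectangular image; returns an (out_h, out_w, 3) crop."""
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    xs = np.arange(out_w) - (out_w - 1) / 2            # pixel grid centered
    ys = np.arange(out_h) - (out_h - 1) / 2            # on the principal point
    x, y = np.meshgrid(xs, ys)
    dirs = np.stack([x, y, np.full_like(x, f)], axis=-1)  # rays: z forward, y down
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # orient the virtual camera: pitch around x, then yaw around the vertical axis
    cp, sp = np.cos(np.radians(pitch_deg)), np.sin(np.radians(pitch_deg))
    cy, sy = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    rot_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rot_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    dirs = dirs @ rot_pitch.T @ rot_yaw.T
    # ray direction -> longitude/latitude -> panorama pixel (nearest neighbour)
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])   # [-pi, pi], 0 = pano centre
    lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))  # [-pi/2, pi/2], y-down
    h, w = pano.shape[:2]
    u = ((lon / np.pi + 1) / 2 * (w - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (h - 1)).astype(int)
    return pano[v, u]

Bake the same yaw/pitch into each crop's camera matrix and you'd have a normal posed image set.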

2020-04-30, 11:04:20
Reply #4

Juraj

Ah... I actually imagined the exact opposite usage :- ).

I never do 360 "VR" renders for clients; people rarely ask for it. But I can imagine that if we could look around inside them... they would suddenly look a lot more impressive. And a LOT more impressive than Unreal, which can naturally already provide this effect because the data is there in real-time...

That was my first thought on seeing this being used: moving around inside static rendered 360 images. High quality from offline ray-tracing, but with the freedom to look around as in real-time, from a static position (which is the future; even Half-Life: Alyx uses a static position, because they did the research and apparently locomotion sucks).

2020-04-30, 12:56:11
Reply #5

Fluss

Quote from: Juraj
Ah... I actually imagined the exact opposite usage :- ).

I never do 360 "VR" renders for clients; people rarely ask for it. But I can imagine that if we could look around inside them... they would suddenly look a lot more impressive. And a LOT more impressive than Unreal, which can naturally already provide this effect because the data is there in real-time...

That was my first thought on seeing this being used: moving around inside static rendered 360 images. High quality from offline ray-tracing, but with the freedom to look around as in real-time, from a static position (which is the future; even Half-Life: Alyx uses a static position, because they did the research and apparently locomotion sucks).

I tried this and I wasn't that impressed: it takes ages to render and the definition is poor. Not to mention that refraction isn't handled nicely: https://www.presenzvr.com/