Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - hybaj

1
It's here and it's amazing (for the price)


At 9:51: Corona render times!

i7-6900K - 128 seconds
Ryzen 1800X - 138 seconds
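Quick back-of-the-envelope math on those two numbers (my own calculation; the scene and Corona build are whatever was used in the video, so treat it as rough only):

```python
# Rough relative-performance comparison of the two render times quoted above.
i7_6900k_s = 128     # seconds, i7-6900K
ryzen_1800x_s = 138  # seconds, Ryzen 1800X

ratio = ryzen_1800x_s / i7_6900k_s
print(f"Ryzen 1800X took {ratio:.2f}x the i7-6900K's time "
      f"(~{(ratio - 1) * 100:.0f}% slower on this particular scene)")
```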

2
A practical idea: ditch the current filmic, it's just not working at all. It behaves very oddly, the shoulder is not very pleasing, and the shadows are somehow oddly exposure dependent.
So what should be the replacement?

My idea: let you modify all of the curves that filmic uses for its color voodoo, Photoshop style (by adding/subtracting points on the curve and adjusting the bezier handles), plus the ability to save the result as a LUT (for when you're playing around with colors in DaVinci Resolve, Nuke, etc.) or just as a preset for later re-use. A rough sketch of what I mean is below.
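A minimal sketch of the idea, assuming a per-channel curve driven by user-placed control points (PCHIP interpolation stands in for proper bezier handles here, and the LUT is baked for display-referred 0-1 input; none of this is Corona's actual implementation):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical user-placed control points (input, output) in 0..1.
points_in  = [0.00, 0.18, 0.60, 0.90, 1.00]
points_out = [0.00, 0.22, 0.70, 0.95, 1.00]
curve = PchipInterpolator(points_in, points_out)  # smooth, monotone curve

def apply_curve(img):
    """Apply the tone curve per channel to a float image in [0, 1]."""
    return np.clip(curve(np.clip(img, 0.0, 1.0)), 0.0, 1.0)

def save_cube_1d(path, size=1024):
    """Bake the curve into a 1D LUT in the .cube format for Resolve/Nuke/etc."""
    samples = apply_curve(np.linspace(0.0, 1.0, size))
    with open(path, "w") as f:
        f.write(f"LUT_1D_SIZE {size}\n")
        for v in samples:
            f.write(f"{v:.6f} {v:.6f} {v:.6f}\n")

save_cube_1d("my_tone_curve.cube")
```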


3
[Max] General Discussion / Re: GPU Doubts....
« on: 2017-02-23, 17:56:52 »
Quote
ASICs for rendering have been proposed so god damn often, I lost count.
Not to mention insane dev costs; not a single prototype has ever been shown, so by now I ignore all of the ASIC nonsense (Xeon Phi doesn't count). The closest we got were the bitcoin ASIC units.

Umm, Imagination Technologies have already released the Caustic R2100 and R2500 cards, which had a limited sort of commercial run; people used them with the Caustic Visualizer (development of the Visualizer ended in 2015) and liked them. The cards were released in 2013 and discontinued just a year later, in 2014. So, umm, I'm not sure if we're talking about the same things... if not, then I'm sorry.

Quote
No, we are not. There are multiple players in this field, OTOY's Brigade at the forefront. If there is anyone who will show a prototype, it is them. And even they have stated often that this is far, far off into the Future™.
If it actually happens, SIGGRAPH will have a conference and every outlet will write about it. We will notice.

Until then, next time you read "look at this proposal X, that will revolutionize field Y with Incredible Product Z by company Ω", don't click and give them AdSense money.

It was Jules Urbach who said that they received a new 2W PowerVR PCI-E prototype for testing and achieved 50 to 120 Mrays/s - in a video there are little glimpses of 2 realtime scenes being rendered and it seems fast. He also said they received a 10W prototype and that it "scaled perfectly", which presumably means at least 5x the performance (5x the power budget). Then in a recent interview for Road to VR he said that the probably upcoming 120W card would do 6 Brays/s (billion rays per second). This all seems to be their plan for the future of Brigade in Unity. Foveated rendering for VR HMDs in real-time - sort of his words too.

Even John Carmack showed optimism about the whole thing in a tweet from 18 March 2014: "I am very happy with the advent of the PVR Wizard ray tracing tech. RTRT HW from people with a clue!"

So if I'm getting this correctly, Urbach is not someone from Imagination Technologies talking about their own product. I'm not sure if there is any lying or over-hyping involved. He does seem to be invested, since he's planning a grand future for both Octane and Brigade in the two biggest game engines.

4
[Max] General Discussion / Re: GPU Doubts....
« on: 2017-02-23, 13:37:30 »
GPUs might not actually be the real competitor to CPUs when it comes to raytracing.

The CEO of a company that makes a GPU rendering package said in a recent interview that they theorize a new version of the raytracing ASIC (application-specific integrated circuit) from Imagination Technologies (the company that makes the GPUs for iPhones) could do 6 billion rays/sec at a power draw of 120W.

A 2 Watt mobile version already does around 150 million rays/sec!

So 6 billion rays/sec at 120 watts, even if you take into account scenes where rays have to follow complicated paths (which are the enemy of the GPU), is still orders of magnitude faster than what today's CPUs or GPUs can do. We're basically talking about realtime path-tracing at, let's say, 1080p (rough math below).
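If the 6 billion rays/sec figure holds, the rough math behind that 1080p realtime claim looks like this (my own back-of-the-envelope, ignoring bounces per path, shadow rays, hardware utilization, etc.):

```python
rays_per_sec = 6e9          # claimed figure for the 120W ASIC
pixels_1080p = 1920 * 1080  # ~2.07 million pixels

for fps in (30, 60):
    rays_per_pixel_per_frame = rays_per_sec / (pixels_1080p * fps)
    print(f"{fps} fps: ~{rays_per_pixel_per_frame:.0f} rays per pixel per frame")
# ~96 rays/pixel at 30 fps, ~48 at 60 fps - a workable realtime
# path-tracing budget if each path only costs a handful of rays.
```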

I do wonder what the downsides of using such an architecture might be - which types of scenes/materials etc. might pose a problem for such tech when it comes to use in high-end VFX or real-time games.

5
This is such an interesting topic, but also one that's very, very difficult to fully understand.

In fact it's so hard on the brain cells that many, many cinema production professionals still don't understand that when they get their 12/14/16-bit RAW files from their 60-thousand-dollar cameras, they just get an image that has not been debayered but has already been heavily modified by the firmware magic in the camera - the secret sauce of every camera manufacturer (just as pokoy wrote earlier). They actually believe that what they get is the direct signal from the sensor, which is very far from the truth.

Canon and Nikon DSLRs perform amazingly in studio lighting (light that is usually very "white" in kelvin terms) and really rival analog film in those situations. But when you switch to outdoors or different types of lights, the image usually falls apart and no longer looks good (while analog did) - it needs to be rescued in Photoshop or some other image processing software. Canon, even with their understanding of color, have failed at creating a proper cinema camera (C100, C300, C500) - they have created a sort of bland, depressing look which actually works for documentaries but not for cinema. Their C700 camera has colors and "tone-mapping" that are almost identical to their DSLR range, which in my opinion won't work for cinema - it only shows that they are trying to backtrack to something that worked for them in the past... they are out of ideas. A multibillion-dollar imaging hardware corporation has really run out of ideas, which is something really amazing.

Arri Alexa - the first camera ever to provide a very durable "cinematic" look right out of the camera. Great dynamic range and beautiful color - very nice desaturation of highlights. Its color processing and dynamic range are still unrivaled, and the camera hardware is already over 6 years old.

Maxwell Render - during the beta and the first version, Maxwell had a very special tone-mapping and color response... it made images look really good without any post work.

So really... it'd be amazing if someone ever got their hands on the actual firmware code of these cameras to see what color-math acrobatics they do. LUTs are comparatively simple transforms which do not capture all of the intricacies of what goes on in the firmware (a tiny illustration of that limitation is below).
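To make that LUT limitation concrete, here's my own sketch (nothing to do with any actual camera firmware): a 3D LUT is just a fixed per-pixel mapping of RGB to RGB, so anything in the firmware that is spatially or content adaptive (local noise reduction, highlight reconstruction and so on) simply cannot be baked into one.

```python
import numpy as np

def apply_lut3d(img, lut):
    """Apply a 3D LUT (shape [N, N, N, 3], values in 0..1) to a float RGB
    image via nearest-neighbour lookup (real tools interpolate)."""
    n = lut.shape[0]
    idx = np.clip(np.rint(img * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT as a stand-in for a real camera-matching LUT.
n = 33
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

frame = np.random.rand(4, 4, 3)  # stand-in for a debayered frame
out = apply_lut3d(frame, identity_lut)
# The output depends only on each pixel's own RGB value, never on its
# neighbours - which is exactly why a LUT can't reproduce adaptive firmware.
assert np.allclose(out, frame, atol=1 / (n - 1))
```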

Knowing what the cameras really do (from sensor signal to RAW file) would really, really help renderer developers too - if you could quickly match your renderings to live-action footage, that would be insanely helpful for VFX.

analog film - Kodak Vision 5207

digital camera - Arri Alexa
