I never understood why that option was on this list in the first place. The official answer to that request was always that it isn't possible to develop (reasonably speaking). It's a complete redo of Corona.
I get the reasoning behind why it's tricky for the developers, but hear me out: CPUs are a dead end for render engines unless you have unlimited resources.
Let's look at recent shifts in the hardware space objectively. Take the Threadripper platform: the 3970X and 3990X are dead ends, single-socket parts that only lasted a generation and are no longer being made by AMD. AMD then decided to shaft everyone with "Threadripper Pro", charging twice as much for the CPU alone, and locked it down further by forcing consumers to buy a complete system from Lenovo. It's questionable whether AMD will ever make a non-Pro "affordable" Threadripper again.
Current reports suggest Intel is still scrambling to get anything close to a 3990X.
Now let's look at Nvidia: you can get a 3090 for £1,400 retail. You can put as many of them as you want into any motherboard with enough PCIe lanes, going back generations and regardless of DDR RAM version. Reports suggest the 4090 will double the 3090's performance.
With the above in mind, tell me objectively which method of rendering offers consumers and businesses the best generational leaps in performance, flexibility, upgradeability, and cost?