Author Topic: Python code. Rendertime  (Read 1748 times)

2015-05-25, 19:16:39

alebul

Hello. Sorry for my English and my Python code style. This code was written just for fun. The idea is simple: we have a render output at low resolution, and we want to estimate, without rendering, how long the render would take at high resolution. Take the pixel count and render time of the low-res output and divide the render time by the number of pixels to get a per-pixel coefficient. Then multiply that coefficient by the pixel count of the resolution we actually want, and we get an estimated render time without rendering the high-resolution image. Python code:
Code: [Select]
def render_coeff(width, height, time):
    # Seconds spent per pixel in the low-resolution test render.
    return time / (width * height)

def render_time(width, height, coeff):
    # Predicted render time for a frame of the given size.
    return (width * height) * coeff

ri = input("enter rendered image width, height, time (comma separated): ")
l = ri.split(',')
coeff = render_coeff(int(l[0]), int(l[1]), int(l[2]))

ri2 = input("enter width and height for the final image (comma separated): ")
l2 = ri2.split(',')
print("render time for this image is", render_time(int(l2[0]), int(l2[1]), coeff), "seconds")