@michalfaber Thanks for your great work!!
When I test the image 'ski.jpg' (shape: 712x673x3), 'model.predict(input_img)' takes about 1200 ms on a TITAN X GPU (with scale 1 only). In the Caffe version, 'output_blobs = net.forward()' takes only about 72 ms. Can you help me figure out where the difference comes from? Thanks a lot!!
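For what it's worth, here is a minimal timing sketch (not from the repo, just one way the measurement could be done) that separates the one-off warm-up cost from the steady-state cost of model.predict; `model` and `input_img` stand in for the loaded Keras model and the preprocessed image batch:

```python
import time
import numpy as np

def benchmark_predict(model, input_img, warmup=3, runs=10):
    # Warm-up calls absorb one-time costs (graph compilation, CUDA init).
    for _ in range(warmup):
        model.predict(input_img)

    # Time only the steady-state iterations.
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        model.predict(input_img)
        timings.append((time.perf_counter() - start) * 1000.0)

    print("mean %.1f ms, min %.1f ms" % (np.mean(timings), np.min(timings)))
    return timings
```

If the first call dominates, the gap to the Caffe numbers is mostly framework start-up overhead rather than per-image inference cost.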
@mingmingDiii @allenwangcheng I ran several iterations of model.predict(input_img), with scale 1 only and shape 712x673x3. The first iteration took around 2300 ms, but all subsequent iterations took around 260 ms (GTX 1070). I am not sure, but there may be an additional cost of recompiling the computation graph. I also noticed a latency peak every time the input image size changes (scale = multiplier[m]).
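If the per-shape recompilation is indeed the culprit, one possible workaround is to pad every scaled image onto a fixed-size canvas so model.predict always sees the same input shape. This is only a sketch under the assumption that the network is fully convolutional and that the padded region can be cropped out of the output maps afterwards; `pad_to_fixed`, `max_h`, and `max_w` are hypothetical names, not part of the repo:

```python
import numpy as np

def pad_to_fixed(img, target_h, target_w, pad_value=0):
    """Pad an (H, W, 3) image into a fixed (target_h, target_w, 3) canvas
    so the model input shape stays constant across scales."""
    h, w = img.shape[:2]
    canvas = np.full((target_h, target_w, 3), pad_value, dtype=img.dtype)
    canvas[:h, :w, :] = img
    return canvas

# Sketch of the per-scale loop (assumed usage, not the repo's code):
#   scaled = cv2.resize(orig_img, (0, 0), fx=multiplier[m], fy=multiplier[m])
#   padded = pad_to_fixed(scaled, max_h, max_w)
#   output = model.predict(padded[np.newaxis, ...])
#   # then crop the output maps back to the scaled image's region
```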