I was following the official example but encountered an issue with containerization while using the default conda, due to a dependency error. Then I ran into this issue and changed the bentofile.yaml to:
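(The actual file contents were not captured in this copy of the issue. Purely for orientation, a minimal bentofile.yaml that pins dependencies through the python/pip section rather than the default conda one could look like the sketch below; the service entry point and package list are assumptions, not the author's real file.)

```yaml
# Hypothetical minimal bentofile.yaml (illustration only)
service: "service.py:svc"   # assumed service entry point
include:
  - "service.py"
python:
  packages:                 # assumed package list; pins not the author's
    - mlflow
    - torch
    - torchvision
```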
2023-07-14T07:14:11+0000 [DEBUG] [api_server:10] Default runner method set to 'predict', it can be accessed both via 'runner.run' and 'runner.predict.async_run'.
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/bentoml/_internal/server/http_app.py", line 341, in api_func
    output = await api.func(*args)
  File "/home/bentoml/bento/src/service.py", line 20, in predict
    return await mnist_runner.run(input_arr)
  File "/usr/local/lib/python3.7/site-packages/bentoml/_internal/runner/runner.py", line 52, in run
    return self.runner._runner_handle.run_method(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/bentoml/_internal/runner/runner_handle/remote.py", line 290, in run_method
    *args,
  File "/usr/local/lib/python3.7/site-packages/anyio/from_thread.py", line 45, in run
    raise RuntimeError("This function can only be run from an AnyIO worker thread")
RuntimeError: This function can only be run from an AnyIO worker thread
2023-07-14T07:14:59+0000 [INFO] [api_server:9] 172.17.0.1:33646 (scheme=http,method=POST,path=/predict,type=application/json,length=24007) (status=500,type=application/json,length=110) 4.461ms (trace=bf16c819f82aadfe0a0292c52d7064ac,span=701c33aad979d04c,sampled=0,service.name=mlflow_pytorch_mnist_demo)
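(The RuntimeError above is anyio's guard against calling a thread-bridged synchronous function from inside the event loop: the async predict in service.py awaits the synchronous mnist_runner.run instead of the awaitable runner method. A rough, stdlib-only sketch of that failure mode follows; the Runner class below is a hypothetical stand-in, not BentoML's real implementation.)

```python
import asyncio

class Runner:
    """Hypothetical stand-in for a BentoML-style runner (illustration only)."""

    async def async_run(self, x):
        # Stand-in for model inference.
        return x * 2

    def run(self, x):
        # Synchronous entry point: only safe when no event loop is running
        # in this thread, mirroring the anyio guard from the traceback.
        try:
            asyncio.get_running_loop()
        except RuntimeError:
            # No running loop: drive the coroutine ourselves.
            return asyncio.run(self.async_run(x))
        raise RuntimeError(
            "run() called from inside a running event loop; use async_run()"
        )

async def predict_broken(runner, x):
    # What the traceback shows: the sync method called from async code.
    return runner.run(x)

async def predict_fixed(runner, x):
    # The fix: await the async variant instead.
    return await runner.async_run(x)
```

If this is indeed the cause, the equivalent change in a BentoML 1.x service would be awaiting mnist_runner.async_run(input_arr) inside the async API function, or declaring the API function as a plain def so the synchronous runner.run is invoked from a worker thread.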
Describe the bug
While bentoml serve service.py:svc works absolutely fine, same as bentoml containerize, the container returns an error:

The only thing I changed in mnist.py is the addition of an MLflow tracking server URI & experiment with these lines:

However, changing the model in service.py from bentoml.mlflow to bentoml.pytorch (and adjusting the model's name) also produces the same error.

I also tried changing service.py from:
to:
But it also produced the same error.
According to line:
I also tried changing service.py to:
But ended up with this error:
anyio 3.7.1 & aiohttp 3.8.4 are installed.
I've been doing multiple tests with different versions of dependencies (including locking the same versions as mentioned in conda.yaml), different models, etc., and the result is still the same.
I even logged into a bash session in the container to check whether all required files were there, and they were present.
However, running bentoml models list or bentoml list directly in the container does not return any results; is that expected behaviour?

The example from the BentoML Tutorial works fine.
Unfortunately, after deeper research and support from another person, I still have no idea what was not found.
To reproduce
Steps to reproduce:
- bentoml containerize
- bentofile.yaml changed to:
- docker run -it --rm -p 3000:3000 mlflow_pytorch_mnist_demo:latest serve
Expected behavior
The model should behave the same as with bentoml serve, i.e. return a 200 and the prediction results:
Environment
bentoml: 1.0.23, tried also with 1.0.1 & 1.0.8
python: 3.7.16, tried also with 3.8
platform: Linux
mlflow: 1.30.1, tried also with 2.4, 2.4.2
torch: 1.8.1, tried also with 2.0.1
torchvision: 0.9.1, tried also with 0.15.2