azure-inference-test

Test code for Azure nnU-Net pipeline inference.

How it works

Deployment with Docker:

  1. Build the Docker image

    docker build -t nnunet_local_inference_env .
    
  2. Run the container

    docker run -v "$(pwd)":/app -p 5001:5001 nnunet_local_inference_env

  (In PowerShell, use ${pwd} instead of $(pwd).)
    

Note: if you are running from Docker Desktop, don't forget to mount the project root directory to /app.
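
For reference, a minimal Dockerfile along these lines would support the commands above. This is an illustrative sketch only, not the repository's actual Dockerfile; the base image, entry script path, and port are assumptions.

```dockerfile
# Illustrative sketch; the repository ships its own Dockerfile.
FROM python:3.10

# The project root is bind-mounted to /app at run time (see the note above),
# so the image only needs the Python environment baked in.
COPY requirements.txt /tmp/requirements.txt
RUN pip install azureml-inference-server-http && \
    pip install -r /tmp/requirements.txt

WORKDIR /app/server
EXPOSE 5001
CMD ["azmlinfsrv", "--entry_script", "score.py", "--port", "5001"]
```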

Deployment with a bare Python environment:

Quick note: the workflow was tested with Python 3.10.6.

  1. Clone this repository

    git clone https://github.com/tekmen0/azure-inference-test.git
    
  2. Move to repository folder

    cd azure-inference-test
    
  3. (Optional) Create and activate a Python virtual environment.

  4. Install the inference test server

    python -m pip install azureml-inference-server-http
    
  5. Install the requirements

    pip3 install -r requirements.txt
    
  6. Move to server directory

    cd server
    
  7. Start inference server

    azmlinfsrv --entry_script score.py
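
The entry script passed to azmlinfsrv must define an `init()` function (called once at startup) and a `run()` function (called per request). The repository's score.py runs nnU-Net inference; the placeholder below is only a minimal sketch of that contract.

```python
import json

model = None  # populated once by init()


def init():
    # azmlinfsrv calls this once when the server starts; the repository's
    # score.py would load the nnU-Net model here instead of this placeholder.
    global model
    model = lambda data: {"echo": data}


def run(raw_data):
    # Called for every POST to /score; raw_data is the JSON request body
    # as a string. The return value is serialized into the response.
    data = json.loads(raw_data)
    return model(data)
```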
    

Now your scoring server is running locally at 127.0.0.1:5001.

You can send requests to 127.0.0.1:5001/score to execute the 'run' function in score.py.

An example request using Python can be found in request.py; you may also use curl or Postman.
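
If you prefer to build the request by hand, a minimal stdlib-only Python client might look like the sketch below. The payload shape here is a placeholder assumption; it must match whatever `run()` in score.py actually expects.

```python
import json
import urllib.request


def build_request(payload: dict) -> urllib.request.Request:
    # Placeholder payload schema -- adapt it to what run() in score.py expects.
    return urllib.request.Request(
        "http://127.0.0.1:5001/score",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def score(payload: dict) -> str:
    # POST to the locally running inference server and return the raw body.
    with urllib.request.urlopen(build_request(payload)) as resp:
        return resp.read().decode("utf-8")


if __name__ == "__main__":
    print(score({"input": "sample"}))
```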
