This sample contains code that performs TensorRT inference on Jetson.
- Download the Face Landmark model from PINTO_model_zoo.
- Convert the ONNX model to a serialized TensorRT engine and run inference on Jetson.
- Jetson
- JetPack 4.6
Clone the PINTO_model_zoo repository and download the Face Landmark model.
git clone https://github.com/PINTO0309/PINTO_model_zoo.git
cd PINTO_model_zoo/043_face_landmark/01_float32
./download.sh
Install onnxruntime and tf2onnx.
pip3 install onnxruntime tf2onnx
Convert the TensorFlow Lite model to an ONNX model.
python3 -m tf2onnx.convert --opset 13 --tflite ./keypoints.tflite --output ./keypoints.onnx
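The converted model expects a preprocessed input tensor rather than a raw camera frame. A minimal NumPy preprocessing sketch is shown below; the 192x192 input resolution and [-1, 1] normalization are assumptions based on the MediaPipe-style face landmark model, so check the model's actual input spec (e.g. with trtexec or Netron) before relying on them:

```python
import numpy as np

def resize_nearest(img: np.ndarray, size: int) -> np.ndarray:
    """Nearest-neighbor resize of an HxWxC image using pure NumPy indexing."""
    h, w = img.shape[:2]
    ys = np.arange(size) * h // size
    xs = np.arange(size) * w // size
    return img[ys][:, xs]

def preprocess(frame: np.ndarray, size: int = 192) -> np.ndarray:
    """Resize a camera frame and normalize it to a 1xSxSx3 float32 tensor in [-1, 1].
    The size and normalization are assumptions; verify against the model."""
    resized = resize_nearest(frame, size)
    tensor = resized.astype(np.float32) / 127.5 - 1.0
    return tensor[np.newaxis, ...]

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
inp = preprocess(frame)
print(inp.shape, inp.dtype)  # (1, 192, 192, 3) float32
```

In the real demo, frames come from OpenCV capture instead of random data; only the resize/normalize steps matter here.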
Check that the ONNX model parses with trtexec.
/usr/src/tensorrt/bin/trtexec --onnx=./keypoints.onnx
Install pycuda.
For details, see the pycuda installation documentation.
sudo apt install python3-dev
pip3 install --user cython
pip3 install --global-option=build_ext --global-option="-I/usr/local/cuda/include" --global-option="-L/usr/local/cuda/lib64" pycuda
Clone repository.
cd ~
git clone https://github.com/NobuoTsukamoto/tensorrt-examples
cd tensorrt-examples
git submodule update --init --recursive
Copy keypoints.onnx to tensorrt-examples/models.
cp ~/PINTO_model_zoo/043_face_landmark/01_float32/keypoints.onnx ~/tensorrt-examples/models/
cd ~/tensorrt-examples/python/utils
python3 convert_onnxgs2trt.py \
    --model /home/jetson/tensorrt-examples/models/keypoints.onnx \
    --output /home/jetson/tensorrt-examples/models/keypoints.trt
Finally, you can run the demo.
python3 trt_face_landmark_capture.py \
--model ../../models/keypoints.trt
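The demo draws landmarks on the captured frame, which requires mapping the model's keypoint output back to frame coordinates. The sketch below shows that rescale step; the 192-pixel input size and flat (x, y, z)-triplet output layout are assumptions based on the MediaPipe-style face mesh model, so check trt_face_landmark_capture.py for the real decoding:

```python
import numpy as np

def decode_keypoints(raw: np.ndarray, frame_w: int, frame_h: int,
                     in_size: int = 192) -> np.ndarray:
    """Reshape a flat (x, y, z, x, y, z, ...) output into Nx3 points and scale
    x/y from model-input pixels to frame pixels; z is left in model units."""
    pts = np.asarray(raw, dtype=np.float32).reshape(-1, 3).copy()
    pts[:, 0] *= frame_w / in_size
    pts[:, 1] *= frame_h / in_size
    return pts

# Two hypothetical keypoints in 192x192 model-input space.
raw = np.array([96.0, 96.0, 0.0, 192.0, 0.0, -5.0], dtype=np.float32)
pts = decode_keypoints(raw, frame_w=640, frame_h=480)
print(pts)
```

The `.copy()` keeps the in-place scaling from mutating the caller's output buffer, which matters when the buffer is reused across inference calls.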