
Try using all of the available inference providers #12

Merged 1 commit into main on Feb 12, 2024

Conversation

@wkentaro (Owner) commented on Feb 12, 2024

So that CUDA, TensorRT, and CoreML are used when possible.

Close #11
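The PR title and description suggest the change selects hardware-accelerated inference providers (TensorRT, CUDA, CoreML) when they are available, falling back to CPU otherwise. A minimal sketch of that selection logic, assuming the project uses ONNX Runtime (the provider names follow ONNX Runtime's conventions, but the `select_providers` helper itself is illustrative, not the actual diff):

```python
# Hypothetical sketch: prefer hardware-accelerated execution providers
# when available, keeping the CPU provider as a final fallback.
PREFERRED_ORDER = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CoreMLExecutionProvider",
    "CPUExecutionProvider",
]


def select_providers(available):
    """Return the available providers sorted by preference."""
    return [p for p in PREFERRED_ORDER if p in available]


# On a machine with CUDA but no TensorRT or CoreML:
print(select_providers(["CPUExecutionProvider", "CUDAExecutionProvider"]))
# -> ['CUDAExecutionProvider', 'CPUExecutionProvider']
```

With `onnxruntime` installed, such a list would typically be obtained from `onnxruntime.get_available_providers()` and passed to `onnxruntime.InferenceSession(model_path, providers=...)`.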

@wkentaro force-pushed the try_using_all_of_the_available_inference_providers branch from 605b0e2 to 6e5700a on February 12, 2024 at 11:58
@wkentaro self-assigned this on Feb 12, 2024
@wkentaro added the feature for pr label on Feb 12, 2024
@wkentaro merged commit a4adb1e into main on Feb 12, 2024
2 checks passed
@wkentaro deleted the try_using_all_of_the_available_inference_providers branch on February 12, 2024 at 12:36
@wkentaro mentioned this pull request on Feb 12, 2024