
Can NATTEN be integrated into TensorRT? #193

Open
QishuoLu opened this issue Jan 21, 2025 · 1 comment
Comments

@QishuoLu

No description provided.

@alihassanijr
Member

Thank you for your interest.

NATTEN kernels are almost entirely independent of PyTorch, and the backend API is specifically designed to avoid torch's tensor API so as to make binding to other frameworks/engines easier.
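To illustrate what a torch-independent backend API looks like, here is a minimal sketch of a 1D neighborhood-attention forward pass operating purely on raw pointers and plain integers. The function name, signature, and the clamped-window neighborhood rule are illustrative assumptions, not NATTEN's actual API; the point is that an entry point of this shape has no torch::Tensor dependency and could in principle be wrapped by another engine's custom-op mechanism.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical framework-agnostic entry point (not NATTEN's real API).
// q, k, v, out: [len][dim], row-major. Each query position i attends to
// positions within +/- (kernel_size / 2), clamped to the sequence bounds
// (a simplified stand-in for the real neighborhood rule).
void na1d_forward(const float* q, const float* k, const float* v,
                  float* out, int len, int dim, int kernel_size) {
    const int radius = kernel_size / 2;
    const float scale = 1.0f / std::sqrt(static_cast<float>(dim));
    std::vector<float> logits(kernel_size);
    for (int i = 0; i < len; ++i) {
        const int lo = std::max(0, i - radius);
        const int hi = std::min(len - 1, i + radius);
        const int n = hi - lo + 1;
        // Scaled dot-product logits over the local window.
        float maxv = -1e30f;
        for (int j = 0; j < n; ++j) {
            float dot = 0.0f;
            for (int d = 0; d < dim; ++d)
                dot += q[i * dim + d] * k[(lo + j) * dim + d];
            logits[j] = dot * scale;
            maxv = std::max(maxv, logits[j]);
        }
        // Numerically stable softmax over the window.
        float denom = 0.0f;
        for (int j = 0; j < n; ++j) {
            logits[j] = std::exp(logits[j] - maxv);
            denom += logits[j];
        }
        // Attention-weighted sum of the neighboring values.
        for (int d = 0; d < dim; ++d) {
            float acc = 0.0f;
            for (int j = 0; j < n; ++j)
                acc += (logits[j] / denom) * v[(lo + j) * dim + d];
            out[i * dim + d] = acc;
        }
    }
}
```

Because the interface is just pointers, strides, and sizes, binding it to an inference engine reduces to the engine handing over its own device or host buffers, which is the property the comment above describes.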

However, my understanding is that TRT's support for binding custom ops is only "partial" and limited to inference, so as long as a model that depends on NATTEN only needs to be served (not trained) with TRT, I think it should be possible.
