
[Feature request] Support for LiteLLM and LiteLLM proxy #1213

Open
danrr opened this issue Jan 30, 2025 · 0 comments
danrr commented Jan 30, 2025

LiteLLM (https://github.com/BerriAI/litellm) provides a common, OpenAI-compatible interface to many models and providers.
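
For reference, a minimal sketch of what that common interface looks like with the litellm package (the model names below are illustrative, and provider credentials are assumed to be set via environment variables):

```python
from litellm import completion

messages = [{"role": "user", "content": "Hello!"}]

# One call shape for every provider; LiteLLM routes on the provider
# prefix embedded in the model string.
openai_resp = completion(model="gpt-4o", messages=messages)
anthropic_resp = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)
azure_resp = completion(model="azure/my-deployment", messages=messages)  # deployment name is a placeholder

# Responses mirror the OpenAI response shape.
print(openai_resp.choices[0].message.content)
```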

Has support for LiteLLM been considered?

I can imagine it as a prefix on the model name given as input, e.g. `litellm/azureai/llama-2-70b-chat-wnsnw` for the litellm package and `litellm-proxy/azureai/llama-2-70b-chat-wnsnw` for the proxy via the openai client.
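
Since the LiteLLM proxy exposes an OpenAI-compatible endpoint, the proxy case could in principle just point the stock openai client at it. A minimal sketch, assuming a proxy running locally (the base URL, API key, and model name are placeholders):

```python
from openai import OpenAI

# The LiteLLM proxy speaks the OpenAI chat completions API, so any
# OpenAI client works against it unchanged.
client = OpenAI(
    base_url="http://localhost:4000",  # placeholder: wherever the proxy is running
    api_key="sk-placeholder",          # placeholder: a key configured on the proxy
)

response = client.chat.completions.create(
    model="azureai/llama-2-70b-chat-wnsnw",  # model name as configured in the proxy
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```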

This could help expand the range of models and features beyond what's natively available in inspect_ai, without requiring consumers to write a custom model implementation.
