LiteLLM (https://github.com/BerriAI/litellm) provides a common interface (using the OpenAI API) for a lot of models and providers.
Has support for LiteLLM been considered?
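To illustrate what LiteLLM's unified interface looks like from the consumer side, here is a minimal sketch (the model string and Azure deployment name are purely illustrative, and credentials are assumed to come from the provider's usual environment variables):

```python
# Minimal sketch of calling a model through LiteLLM's unified interface.
# The provider prefix and deployment name below are illustrative only;
# actual values depend on your provider configuration and env credentials.
from litellm import completion

response = completion(
    model="azure/llama-2-70b-chat-wnsnw",  # illustrative provider prefix + deployment
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

# LiteLLM returns an OpenAI-style response object.
print(response.choices[0].message.content)
```

The same call shape works across providers; only the model string (and the corresponding credentials) changes.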
I can imagine it as a prefix for the model given as input, e.g. `litellm/azureai/llama-2-70b-chat-wnsnw` for the `litellm` package and `litellm-proxy/azureai/llama-2-70b-chat-wnsnw` for the proxy using `openai`.

This could help expand the range of models and features beyond what's available in `inspect_ai` without requiring a custom model implementation from the consumer.
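To make the idea concrete, a rough sketch of what consumption could look like if such prefixes existed (the `litellm/` and `litellm-proxy/` prefixes are purely hypothetical; only `get_model` itself is existing `inspect_ai` API, and the deployment name is just an example):

```python
# Hypothetical usage sketch: the "litellm/..." and "litellm-proxy/..." prefixes
# do not exist in inspect_ai today; they are the proposal in this issue.
from inspect_ai.model import get_model

# Route through the litellm package directly.
model = get_model("litellm/azureai/llama-2-70b-chat-wnsnw")

# Or route through a running LiteLLM proxy, which exposes the OpenAI API.
proxy_model = get_model("litellm-proxy/azureai/llama-2-70b-chat-wnsnw")
```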