feat: add OpenRouter support #189

Open
BitVortex opened this issue Jan 9, 2025 · 1 comment

@BitVortex

Feature Request

OpenRouter is a popular gateway to multiple LLMs. It provides an abstraction over various models behind a single unified API. Adding OpenRouter support would make a multitude of additional LLMs available through just one added provider.

Proposal

It makes the most sense to add a providers::openrouter module that implements the CompletionModel and EmbeddingModel traits.
OpenRouter supports the OpenAI SDK as well as its own API directly, so an implementation could likely reuse large parts of the providers::openai implementation.
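For illustration, one possible shape such a module could take is a thin wrapper around the existing OpenAI-compatible client rather than fresh trait implementations. This is only a sketch: the module path, constant, and helper function below are hypothetical and not part of rig today.

// Hypothetical providers::openrouter module inside rig that delegates to the
// existing OpenAI-compatible client. All names here are illustrative.
pub mod openrouter {
    use crate::providers::openai;

    /// OpenRouter's OpenAI-compatible base URL.
    pub const OPENROUTER_API_BASE_URL: &str = "https://openrouter.ai/api/v1";

    /// Build a client pointed at OpenRouter. Because OpenRouter speaks the
    /// OpenAI wire format, the completion and embedding types of the openai
    /// provider can be reused as-is.
    pub fn client(api_key: &str) -> openai::Client {
        openai::Client::from_url(api_key, OPENROUTER_API_BASE_URL)
    }
}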

@cvauclair
Contributor

Hey @BitVortex! If the OpenRouter API is exactly the same as the OpenAI API (which seems to be the case), then you can simply reuse the existing openai client by changing the URL with from_url, e.g.:

use rig::providers::openai;

// Reuse the OpenAI-compatible client, pointed at OpenRouter's base URL.
// The first argument is your OpenRouter API key.
let openrouter = openai::Client::from_url("OPENROUTER_API_KEY", "https://openrouter.ai/api/v1");
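From there the client can be used like any other rig provider. A minimal sketch, assuming the key is read from an OPENROUTER_API_KEY environment variable and using an OpenRouter "<vendor>/<model>" slug (the model name below is just an example):

use rig::completion::Prompt;
use rig::providers::openai;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Read the OpenRouter key from the environment rather than hard-coding it.
    let api_key = std::env::var("OPENROUTER_API_KEY")?;
    let openrouter = openai::Client::from_url(&api_key, "https://openrouter.ai/api/v1");

    // OpenRouter model names use "<vendor>/<model>" slugs.
    let agent = openrouter
        .agent("anthropic/claude-3.5-sonnet")
        .preamble("You are a helpful assistant.")
        .build();

    let response = agent.prompt("Say hello from OpenRouter!").await?;
    println!("{response}");
    Ok(())
}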
