Description
Currently, deep-research appears to use a predefined LLM (i.e., o3-mini) for conducting deep research. It would be beneficial to allow users to choose which LLM to use based on their preferences, access, or API availability.
Proposed Enhancement
Introduce a configuration setting (e.g., in a config file or as an argument) that allows users to specify which LLM they want to use.
Ensure that the integration is modular so that adding support for new models in the future is straightforward.
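As a rough illustration of the kind of modular configuration I have in mind, a small resolver could map a user-supplied setting (e.g., an environment variable) to a model id, with validation and a sensible default. All names here (`DEEP_RESEARCH_MODEL`, the supported-model list, `resolveModel`) are hypothetical placeholders, not part of the current codebase:

```typescript
// Hypothetical sketch of user-configurable model selection.
// The env var name and model list are assumptions for illustration only.
const SUPPORTED_MODELS = ["o3-mini", "gpt-4o", "claude-3-5-sonnet"];

function resolveModel(env: Record<string, string | undefined>): string {
  const requested = env["DEEP_RESEARCH_MODEL"];
  // No setting provided: fall back to the current default model.
  if (requested === undefined) {
    return "o3-mini";
  }
  // Fail fast on unknown models so misconfiguration is caught early.
  if (!SUPPORTED_MODELS.includes(requested)) {
    throw new Error(
      `Unsupported model "${requested}". Supported: ${SUPPORTED_MODELS.join(", ")}`
    );
  }
  return requested;
}
```

Keeping the supported-model list in one place would make it straightforward to add new providers later; the rest of the code would only ever see the resolved model id.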
Expected Benefits
Provides users with flexibility in selecting models based on cost, availability, or performance.
Increases the project's adaptability as new LLMs emerge.
Request for Approval
I am a beginner in open-source contributions and would love to work on implementing this feature. Would it be okay for me to work on this? If the maintainers have any preferences on how this should be implemented, I’d be happy to follow those guidelines.