Improve the tool calling #242
/start
@0x4007 Should the examples be predefined and stored, or should they be retrieved from the manifest along with the command?
This is the only viable option. Plugins will continue to be built.
I think the first step would be updating the manifest to ensure it includes the examples for the plugin, and then including them in the prompt along with the command.
The manifest is supposed to automatically update with CI when plugin source code is updated. I am pretty sure the examples are also automatically included in there so I'm not sure what the problem is but I don't think you need to change anything. @gentlementlegen rfc
I couldn't find anything directly related to that in the manifest.
/help
Available Commands
^ Currently these are retrieved from the manifest. These are meant to be manually populated by the developer, allowing you to describe the commands as you please.
Is there any other field that could be used to add the same tool calls in the manifest? Otherwise, the manifest type would have to be modified for this. An alternative would be to begin with dynamic few-shot examples stored in the main kernel for now. We can move those to the manifest if this works out well.
What are you trying to achieve exactly? From what I understand you want to add more examples to the manifest?
I wanted to add some examples of tool calls for each plugin. When the prompt is built at the kernel, it should retrieve these along with the command. These examples would include a few scenarios where the tool was used and what the output was.
If it is meant only to be used for the LLM, we can consider adding a field in the manifest. Otherwise we can make the example section an array so we can include multiple examples.
I think, to begin with, for the core plugins at least, let's start by storing the examples directly in the kernel. If that works well, we can update the manifest and potentially expand this to other plugins. @0x4007 rfc
I don't have experience with this sort of problem but when it comes to LLM related matters I generally would go with shiv810's judgement. I hope that it won't clutter the codebase too much because I understand that gentlementlegen is likely trying to keep the kernel codebase clean as it should be.
I am not sure how you can store examples in the kernel since it is dynamically generated by fetching manifests for plugins, or maybe you meant just feeding them all regardless of what plugin is running? @whilefoo rfc
In my opinion, the manifest approach is best from the beginning as we only have core plugins right now and ultimately that's the ideal option for scalability. The other option might look like a hardcoded mapping of examples against plugin names taken from the fetched manifest (sketched below), so it makes sense to embed it in the manifest from the jump, I think.
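To make that trade-off concrete, a hardcoded kernel-side mapping would look roughly like the sketch below; the plugin name and helper are hypothetical and only illustrate why embedding the data in the manifest scales better.

```typescript
// Hypothetical kernel-side registry of few-shot tool-call examples, keyed by
// the plugin name taken from the fetched manifest. Names and contents are placeholders.
interface ToolCallExample {
  command: string;                     // natural-language request from the user
  parameters: Record<string, unknown>; // arguments the LLM should emit
  response: string;                    // what the plugin returned
}

const PLUGIN_EXAMPLES: Record<string, ToolCallExample[]> = {
  "command-query": [
    {
      command: "Give me the details about @xxx wallet",
      parameters: { username: "xxx" },
      response: "| Property | Value |\n|----------|-------|\n| Wallet | <WALLET_ADDRESS> |",
    },
  ],
};

// Every new plugin would require editing this file, which is the scalability problem.
function getExamplesFor(pluginName: string): ToolCallExample[] {
  return PLUGIN_EXAMPLES[pluginName] ?? [];
}
```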
The existing manifest could be extended like this:

```json
{
  "commands": {
    "query": {
      "ubiquity:example": "/query @UbiquityOS",
      "description": "Returns the user's wallet, access, and multiplier information.",
      "parameters": {
        "type": "object",
        "properties": {
          "username": {
            "description": "Github username",
            "type": "string"
          }
        }
      }
    },
    "example": [
      {
        "command": "Give me the details about @xxx wallet",
        "parameters": {
          "username": "xxx"
        },
        "response": "| Property | Value |\n|----------|-------|\n| Wallet | <WALLET_ADDRESS> |"
      }
    ]
  }
}
```
This would be retrieved from the manifest. As an alternative, just to make sure this properly works with the system, we can test this out by adding the examples in the kernel, and if that works fine we can make the change in the manifest.
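A rough sketch of how the kernel could fold such an `example` array into the prompt when it builds tool definitions; the type names and message format below are assumptions, not the actual kernel code.

```typescript
// Sketch: turn a manifest command plus its examples into an OpenAI-style tool
// definition and a few-shot block for the system prompt.
interface ManifestExample {
  command: string;
  parameters: Record<string, unknown>;
  response: string;
}

interface ManifestCommand {
  description: string;
  "ubiquity:example": string;
  parameters: Record<string, unknown>; // JSON Schema for the command arguments
}

function buildToolAndFewShot(name: string, cmd: ManifestCommand, examples: ManifestExample[]) {
  const tool = {
    type: "function" as const,
    function: { name, description: cmd.description, parameters: cmd.parameters },
  };

  // Examples go into the prompt text, not into the parameters schema,
  // so the tool schema itself stays unchanged.
  const fewShot = examples
    .map((e) => `User: ${e.command}\nCall: ${name}(${JSON.stringify(e.parameters)})\nResult: ${e.response}`)
    .join("\n\n");

  return { tool, fewShot };
}
```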
Why not directly add the example to the `command` itself?

```json
{
  "commands": {
    "query": {
      "ubiquity:example": "/query @UbiquityOS",
      "description": "Returns the user's wallet, access, and multiplier information.",
      "command": {
        "example": [
          {
            "command": "Give me the details about @xxx wallet",
            "parameters": {
              "username": "xxx"
            },
            "response": "| Property | Value |\n|----------|-------|\n| Wallet | <WALLET_ADDRESS> |"
          }
        ],
        "parameters": {
          "type": "object",
          "properties": {
            "username": {
              "description": "Github username",
              "type": "string"
            }
          }
        }
      }
    }
  }
}
```
/ask design the manifest to solve the problem we are discussing
To address the issue of integrating examples for each plugin command within the manifest, we can enhance the manifest structure.
I'm concerned that putting examples in the manifest will clutter it, for example if every command has 3 examples and each example includes the response.
@shiv810 Did you test it just by switching to a different model?
Currently we are using strict mode, so I think it won't allow arbitrary properties in the schema. We can disable strict mode, but that means the function call sometimes won't adhere to the schema. Also, I'm not sure whether OpenAI does any processing of the tool calling and removes unknown properties, or just feeds it to the prompt.
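For context, with strict function calling the parameters schema must declare every property and set `additionalProperties: false`, so example data cannot ride along inside the schema; here is a minimal sketch with the OpenAI Node SDK, with the model name and prompt text as placeholders:

```typescript
import OpenAI from "openai";

const client = new OpenAI();

// Strict tool definition: only declared properties are accepted, so the few-shot
// examples live in the messages rather than in the "parameters" schema.
const completion = await client.chat.completions.create({
  model: "gpt-4o", // placeholder model
  messages: [
    {
      role: "system",
      content:
        "You can call tools.\nExample: 'Give me the details about @xxx wallet' -> query({\"username\": \"xxx\"})",
    },
    { role: "user", content: "Give me the details about @xxx wallet" },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "query",
        description: "Returns the user's wallet, access, and multiplier information.",
        strict: true,
        parameters: {
          type: "object",
          properties: {
            username: { type: "string", description: "GitHub username" },
          },
          required: ["username"],
          additionalProperties: false,
        },
      },
    },
  ],
});

console.log(completion.choices[0].message.tool_calls);
```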
Perhaps we can assume that this LLM calling feature of plug-ins should only work with our official plug-ins. Perhaps they should not be required properties for normal plug-ins.
It does increase the accuracy. I think it would be better if we include some examples, at least for core plugins.
It could be made an optional property.
Then let's proceed and make this an optional manifest property. Let's also use the highest-ranking LLM on the LLM leaderboards that supports tool calling. I figure that cost optimization can be handled much later.
Updated the manifest with an optional property. ubiquity-os/plugin-sdk#68
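For reference, a minimal sketch of what such an optional property could look like if the manifest schema is defined with TypeBox; the field names here are illustrative and may differ from what plugin-sdk#68 actually ships.

```typescript
import { Type, type Static } from "@sinclair/typebox";

// Optional few-shot example entries attached to a manifest command.
const commandExample = Type.Object({
  command: Type.String({ description: "Natural-language request from the user" }),
  parameters: Type.Record(Type.String(), Type.Unknown()),
  response: Type.String({ description: "What the plugin answered" }),
});

const manifestCommand = Type.Object({
  description: Type.String(),
  "ubiquity:example": Type.String(),
  parameters: Type.Optional(Type.Record(Type.String(), Type.Unknown())),
  // Optional, so ordinary plugins are unaffected if they omit it.
  examples: Type.Optional(Type.Array(commandExample)),
});

export type ManifestCommand = Static<typeof manifestCommand>;
```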
Make sure to use OpenRouter so we can pick the latest and best model easily. I imagine that OpenAI has newer stuff with tool calling like o3. Also I'm pretty sure Claude also supports tool calling. 4o seems kind of dated already, right? Also I'm not concerned about cost optimization for a while. Mostly focused on top performance.
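Since OpenRouter exposes an OpenAI-compatible endpoint, switching models is mostly a configuration change; a sketch, where the model slug and environment variable name are assumptions:

```typescript
import OpenAI from "openai";

// OpenRouter speaks the OpenAI API, so the same client works with a different baseURL.
const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY, // assumed env var name
});

// Placeholder slug: pick whatever currently tops the tool-calling leaderboards.
const res = await openrouter.chat.completions.create({
  model: "anthropic/claude-3.5-sonnet",
  messages: [{ role: "user", content: "Give me the details about @xxx wallet" }],
  // pass the same tool definitions as in the strict-mode sketch above
});

console.log(res.choices[0].message);
```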
Improve the tool calling approach by:
- using the `gpt4o` model for better performance on complex queries.