Function Call Response Is Not Inside ChatResponse Object #368
Comments
I closed the issue after seeing that my return value was appended to the response. However, even though my function callback is invoked every time, the callback's response does not always appear in the output. Also, is it possible to retrieve just the function's return value, so that I can return only my function's output to the client? Thanks a lot! Example prompt for my service: localhost:8080/ai/generate3?message=please draw a graph of top selling products aggregate by country
I debugged the outgoing connections a bit and realized that Spring AI makes two subsequent requests. Is that intentional? The second one really messes up my desired output. Is there a way to disable this behavior?
Hi. I'm not sure which two requests you are referring to. Can you please point to the places in the code base that you are talking about, and we can look into it. Thanks.
I think this would be very useful. Sometimes I just need to call a function to access a simple interface, and I don't always need the model to summarize or process the return value for me. Perhaps we need a switch to decide whether the return value of each function should be processed by the model.
Hello there, I completely agree with your point of view. We should be able to access the raw response from the function call, so that we can decide how to process it further, rather than receiving a response that has already been processed by the LLM. In the current implementation of Spring AI, the framework sends the function output back to the model for further processing, which, as you mentioned, may not always be ideal. Having the ability to retrieve the original function response would give us better control over the conversation flow.
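To make the two behaviors discussed above concrete, here is a minimal plain-Java sketch. It models the flow with `java.util.function.Function` (the same contract Spring AI uses for function callbacks), but all names (`Request`, `Response`, the simulated summary text) are illustrative assumptions, not Spring AI APIs:

```java
import java.util.function.Function;

// Hypothetical sketch: contrasts the current behavior (function output is
// sent back to the model and rewritten into prose) with the requested
// behavior (raw function output returned directly to the caller).
public class FunctionReturnSketch {

    record Request(String city) {}
    record Response(String payload) {}

    // The user-registered function callback: returns structured data.
    static final Function<Request, Response> callback =
            req -> new Response("{\"city\":\"" + req.city() + "\",\"temp\":22}");

    // Current behavior: the framework feeds the callback output back to the
    // LLM, which post-processes it before the caller sees it (simulated here).
    static String modelProcessed(Request req) {
        String raw = callback.apply(req).payload();
        return "Here is the data you asked for: " + raw; // simulated LLM summary
    }

    // Requested behavior: a switch that short-circuits the second model call
    // and hands back the untouched function result.
    static String rawReturn(Request req) {
        return callback.apply(req).payload();
    }
}
```

With such a switch, `rawReturn` would yield the exact JSON the callback produced, while `modelProcessed` shows why the caller currently cannot recover the original payload reliably.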
@oguzhantortop I think that 5017749 might help address this issue?
@oguzhantortop, @samzhu, @qinfengge I believe that 5017749 provides all the flexibility you are looking for. Line 269 in 110a520
Are those changes flexible enough for your use cases? |
Potentially, the short-term solution is to add the ToolContext into ChatGenerationMetadata's metadata. This could be related to #2049.
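The proposal above could be sketched as follows. This is a hypothetical model in plain Java, using a `Map` in place of the real Spring AI `ChatGenerationMetadata` type; the `toolResult` key, the `Generation` record, and the example payloads are assumptions for illustration:

```java
import java.util.Map;

// Hypothetical sketch of carrying the raw tool output inside generation
// metadata, so callers can read it from the response without depending on
// the model's prose rewrite. Not real Spring AI code.
public class MetadataSketch {

    record Generation(String content, Map<String, Object> metadata) {}

    static Generation generate() {
        // Untouched result produced by the tool/function callback.
        String rawToolResult = "{\"top_product\":\"Widget\",\"country\":\"DE\"}";
        // The model's prose answer remains the primary content...
        String content = "The top selling product in Germany is the Widget.";
        // ...while the raw tool output rides along in the metadata map.
        return new Generation(content, Map.of("toolResult", rawToolResult));
    }
}
```

A caller could then pull `metadata().get("toolResult")` to decide for itself how to render or post-process the function's output.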
Hi,
I am using OpenAI and registered a function for setting SQL parameters. When I ask a related question such as "draw a chart of top selling products", I can clearly see that the function is being called, and that the expected parameters are set. But inside the ChatResponse object I can't see the values that I return from the method.
Controller Class: