
[Observability] Add observability support for function calling. #1924

Open
Cirilla-zmh opened this issue Dec 13, 2024 · 5 comments


@Cirilla-zmh

Expected Behavior

The observability of Spring AI is quite impressive, and it has helped us resolve numerous issues. Thank you so much!

However, function calling, a crucial component of Spring AI, lacks observability support. Our agent application contains a significant number of function calling invocations, and we would like to see them as spans in the trace. This visibility would help us understand how the LLM organizes these function calls and allow us to see the input and output during the invocation process (important data for evaluating LLM correctness).

We have conducted some preliminary research and believe that adding observability instrumentation to the org.springframework.ai.chat.model.AbstractToolCallSupport#executeFunctions method would be a good solution. If you plan to support this but don't currently have the capacity, we would be more than happy to contribute.
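To illustrate the idea, here is a rough sketch of wrapping a function invocation in a Micrometer Observation so it surfaces as a span. The class and method names below are hypothetical, not the actual Spring AI API; only the Micrometer `Observation` calls are real.

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;

import java.util.function.Function;

// Hypothetical sketch: wrap each tool/function invocation in a Micrometer
// Observation so it appears as a span in the trace. The executor name and
// signature are illustrative; Spring AI's executeFunctions differs.
public class ObservedFunctionExecutor {

    private final ObservationRegistry registry;

    public ObservedFunctionExecutor(ObservationRegistry registry) {
        this.registry = registry;
    }

    public String execute(String functionName, Function<String, String> function, String input) {
        return Observation.createNotStarted("spring.ai.function.call", registry)
                .lowCardinalityKeyValue("function.name", functionName)
                // Inputs and outputs are high-cardinality; record them deliberately.
                .highCardinalityKeyValue("function.input", input)
                .observe(() -> function.apply(input));
    }
}
```

With a real `ObservationRegistry` (e.g. the one auto-configured by Spring Boot with a tracing bridge), each call would produce a span carrying the function name and input.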

Current Behavior

Function calling lacks observability support.

Context

No additional context. If there is anything more you need to know, feel free to ask. :)

@chickenlj

I have encountered exactly the same issue when using Spring AI; it would be great if the function calling branch could be displayed in the tracing result.

@Cirilla-zmh
Author

By the way, we are also attempting to use the opentelemetry-java-instrumentation to export trace data generated by Spring AI to an OTLP-compatible observability backend. However, we have not yet found a suitable implementation approach. Are there any best practices or similar solutions that you could recommend? I would greatly appreciate any guidance.

Additionally, I have already raised this issue in the OpenTelemetry community:
open-telemetry/opentelemetry-java-instrumentation#12878

@xiaohai-78

Observability for function calls would help me pinpoint performance bottlenecks more accurately. Looking forward to this feature!

@ThomasVitale
Contributor

@Cirilla-zmh thanks for submitting this issue. I've started working on an implementation for this.

It's worth noting that Spring AI is instrumented the same way as the rest of the Spring portfolio: based on the Micrometer Observation API. You can export traces and metrics via OTLP to an OpenTelemetry backend by adding the needed Micrometer dependencies.
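Concretely, in a Spring Boot application this usually means adding the Micrometer/OpenTelemetry bridge dependencies and pointing the exporters at your collector. A minimal sketch (the endpoints below are placeholders for your own backend):

```properties
# Requires these dependencies on the classpath (Maven coordinates):
#   io.micrometer:micrometer-tracing-bridge-otel   (bridges Micrometer tracing to OTel)
#   io.opentelemetry:opentelemetry-exporter-otlp   (OTLP trace exporter)
#   io.micrometer:micrometer-registry-otlp         (OTLP metrics, optional)

# Placeholder endpoints; point these at your OTLP-compatible backend.
management.otlp.tracing.endpoint=http://localhost:4318/v1/traces
management.otlp.metrics.export.url=http://localhost:4318/v1/metrics

# Sample everything while evaluating; tune down in production.
management.tracing.sampling.probability=1.0
```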

Using opentelemetry-java-instrumentation would be an alternative to Spring's own instrumentation. You would get the metrics and traces supported by the OpenTelemetry Java Instrumentation, but not the ones defined by Spring itself (including Spring AI).

You can find full observability examples for models and vector stores here: https://github.com/ThomasVitale/llm-apps-java-spring-ai/tree/main/observability

@Cirilla-zmh
Author

> I've started working on an implementation for this.

Cool! Thank you, and I look forward to seeing it soon. ;)

As for best practices when using OpenTelemetry, let's discuss those under the OpenTelemetry project.

Labels: None yet
Projects: None yet
Development: No branches or pull requests

4 participants