End event contains wrong data when streaming structured output #7114
Comments
The issue you're experiencing with the `end` event seems to be related to how the streamed tokens are combined before the event is emitted. To address this, you might want to ensure that the event handling logic correctly accumulates and processes the streamed tokens before emitting the `end` event. If this is indeed a bug in the library, it might require a fix in the event handling logic to ensure that only the final, correctly assembled output is included in the `end` event.
Hello, we are a group of students at the University of Toronto looking into fixing this issue.
Amazing, thanks!
I have identified this to be an issue with the _streamIterator method of the RunnableSequence class within /src/runnables/base.ts. It looks like concatSupported is hardcoded to true and only the true case is implemented. I am almost finished adding support for concatSupported = false, which the ChatOpenAI model should be run with.
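For illustration, here is a simplified, self-contained sketch of the pattern that comment describes: accumulating a final output while streaming, either by concatenating chunks or by keeping only the last one. This is not the actual langchainjs source; the `Chunk` type and `concat` helper are stand-ins for illustration.

```typescript
// Illustrative only: `Chunk` and `concat` are stand-ins, not langchainjs internals.
type Chunk = string;
const concat = (a: Chunk, b: Chunk): Chunk => a + b;

async function* streamWithFinalOutput(
  source: AsyncIterable<Chunk>,
  concatSupported: boolean
): AsyncGenerator<Chunk, Chunk | undefined> {
  let finalOutput: Chunk | undefined;
  for await (const chunk of source) {
    yield chunk;
    if (concatSupported && finalOutput !== undefined) {
      // Chunks are additive deltas: concatenating them builds the final result.
      finalOutput = concat(finalOutput, chunk);
    } else {
      // Chunks are full accumulated states (as with streamed structured output):
      // keep only the last chunk; concatenating them would duplicate earlier content.
      finalOutput = chunk;
    }
  }
  return finalOutput;
}
```

Under this reading, running a model whose streamed chunks are already cumulative (as with structured output) with `concatSupported = false` would make the final output the last streamed value rather than a concatenation of every intermediary state.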
Hi, @Stadly. I'm Dosu, and I'm helping the LangChain JS team manage their backlog and am marking this issue as stale.

Issue Summary:
Next Steps:
Thank you for your understanding and contribution!
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
The `end` event is a concatenation of all the intermediary messages, and not just the final result. If, for example, the streamed tokens are `Why`, `was`, and `the`, the `end` event contains `WhyWhy wasWhy was the` instead of `Why was the`.
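The original example code was not preserved in this copy of the issue. The following is only a minimal sketch of a setup where this behaviour could show up, assuming a ChatOpenAI model wrapped with `withStructuredOutput` (using a hypothetical zod schema), `streamEvents` with the v2 event schema, and an `OPENAI_API_KEY` set in the environment.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

// Hypothetical schema for illustration; the reporter's actual schema is unknown.
const answerSchema = z.object({
  answer: z.string().describe("The answer to the question"),
});

// Assumes OPENAI_API_KEY is set in the environment.
const model = new ChatOpenAI({ model: "gpt-4o-mini" }).withStructuredOutput(answerSchema);

for await (const event of model.streamEvents("Why was the sky blue?", { version: "v2" })) {
  if (event.event === "on_chat_model_stream") {
    // Intermediary chunks emitted while the answer is being generated.
    console.log("chunk:", event.data.chunk);
  }
  if (event.event === "on_chain_end") {
    // Expected: only the final assembled result.
    // Reported here: a concatenation of all intermediary values instead.
    console.log("end output:", event.data.output);
  }
}
```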
System Info