fix: enable handleLLMNewToken events for useResponsesApi=true #8578
Conversation
- Fix ChatOpenAI._streamResponseChunks to pass runManager to responses._streamResponseChunks
- Fix ChatOpenAIResponses._streamResponseChunks to accept and use the runManager parameter
- Add handleLLMEnd event emission for non-streaming responses
- Add a regression test to verify handleLLMNewToken events are emitted
- Resolves the token-by-token streaming issue when useResponsesApi=true is enabled
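The core of the fix in the first two bullets can be sketched with a minimal, self-contained reproduction. The names below (RunManager, streamResponseChunks, streamWithCallbacks) are illustrative stand-ins, not LangChain's actual classes or signatures: the point is only that the outer stream method must forward its run manager to the inner generator, which emits a token event per chunk.

```typescript
// Illustrative mock of a LangChain-style run manager (not the real API).
interface RunManager {
  handleLLMNewToken(token: string): Promise<void>;
}

// Stand-in for ChatOpenAIResponses._streamResponseChunks: it must accept the
// runManager and emit a handleLLMNewToken event for each chunk it yields.
async function* streamResponseChunks(
  chunks: string[],
  runManager?: RunManager,
): AsyncGenerator<string> {
  for (const chunk of chunks) {
    await runManager?.handleLLMNewToken(chunk);
    yield chunk;
  }
}

// Stand-in for ChatOpenAI._streamResponseChunks: before the fix it delegated
// to the inner generator without passing runManager, so no token events fired.
async function streamWithCallbacks(
  chunks: string[],
  runManager?: RunManager,
): Promise<string[]> {
  const out: string[] = [];
  for await (const c of streamResponseChunks(chunks, runManager)) {
    out.push(c);
  }
  return out;
}
```

With the runManager threaded through, a caller attaching a handler sees one handleLLMNewToken call per streamed chunk; with the pre-fix behavior (runManager dropped at the outer layer), the handler is never invoked even though chunks still stream.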
I believe this PR also fixes #8283
nice catch @ninjudd! I don't believe this is something that was ever supported. The writeup + PR is very much appreciated ❤️
One note I had is about handleLLMEnd, and I don't think the params being passed to handleLLMNewToken are exactly accurate. I'll pick up on this tomorrow since I think it's a little trickier than it initially lets on.
// Emit handleLLMEnd event for non-streaming responses
await runManager?.handleLLMEnd({
  generations: [result.generations],
  llmOutput: result.llmOutput,
});

return result;
I don't believe this is necessary? This gets handled upstream by the base chat model (if it doesn't, it should; I'll take a look).
Ah, you may be right! Given that everything but streaming works without this PR, emitting handleLLMEnd
here is probably not necessary. I can update the PR to remove this if you agree.
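The concern above can be illustrated with a small sketch. The classes below are mocks, not LangChain's actual base chat model: they only show why, if the base class already emits handleLLMEnd after the subclass's generate hook returns, a second emission inside the subclass double-fires the handler.

```typescript
// Illustrative mock of an end-of-run callback handler (not the real API).
interface EndHandler {
  handleLLMEnd(output: string): Promise<void>;
}

// Stand-in for the base chat model: it emits handleLLMEnd exactly once,
// after the subclass's _generate hook returns.
class BaseChatModel {
  async generate(handler: EndHandler): Promise<string> {
    const result = await this._generate(handler);
    await handler.handleLLMEnd(result); // upstream emission
    return result;
  }
  protected async _generate(_handler: EndHandler): Promise<string> {
    return "ok";
  }
}

// Stand-in for a subclass that also emits handleLLMEnd itself: the handler
// now fires twice for a single generation, once here and once upstream.
class DuplicatingModel extends BaseChatModel {
  protected override async _generate(handler: EndHandler): Promise<string> {
    const result = "ok";
    await handler.handleLLMEnd(result); // redundant duplicate emission
    return result;
  }
}
```

This is why dropping the subclass-level handleLLMEnd (and relying on the upstream emission) keeps each generation to a single end event.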
Thanks @hntrl! Any idea when the next release will go out? Also, where do I track that? Looks like the last tagged release on GitHub was June 19.
Hey @ninjudd! Apologies for the delay! We haven't done a good job of posting our releases as of late. This was published under
Fixes #8577