Hi! 👋
Firstly, thanks for your work on this project! 🙂
Today I used patch-package to patch @langchain/[email protected]
for the project I'm working on.
Issue: For one of my projects I'm using ChatOpenAI with a custom tool. I noticed that the response wasn't streaming while the tool call was being generated. A look into the distribution build shows that the Responses API event response.custom_tool_call_input.delta isn't handled in the function _convertResponsesDeltaToBaseMessageChunk.
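For context, here is roughly how I trigger the behaviour. This is a minimal sketch, not my exact code: the model name, the custom tool definition, and passing a raw OpenAI-style custom tool object through bindTools are placeholders/assumptions for what my project actually does.

```ts
import { ChatOpenAI } from "@langchain/openai";

// Placeholder model name and custom tool definition; my real project uses its own.
const model = new ChatOpenAI({
  model: "gpt-5",
  useResponsesApi: true,
}).bindTools([
  {
    type: "custom",
    name: "run_query",
    description: "Runs a free-form text query.",
  },
]);

const stream = await model.stream("Use the run_query tool to look something up.");

for await (const chunk of stream) {
  // Without the patch below, this stays empty while the custom tool call input
  // is being generated, because response.custom_tool_call_input.delta events
  // are dropped by _convertResponsesDeltaToBaseMessageChunk.
  console.log(chunk.tool_call_chunks);
}
```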
Here is the diff that solved my problem:
diff --git a/node_modules/@langchain/openai/dist/chat_models.cjs b/node_modules/@langchain/openai/dist/chat_models.cjs
index 3d06fbe..06a2c92 100644
--- a/node_modules/@langchain/openai/dist/chat_models.cjs
+++ b/node_modules/@langchain/openai/dist/chat_models.cjs
@@ -1374,7 +1374,7 @@ class ChatOpenAIResponses extends BaseChatOpenAI {
response_metadata[key] = value;
}
}
- else if (chunk.type === "response.function_call_arguments.delta") {
+ else if (chunk.type === "response.function_call_arguments.delta" || chunk.type === "response.custom_tool_call_input.delta") {
tool_call_chunks.push({
type: "tool_call_chunk",
args: chunk.delta,
diff --git a/node_modules/@langchain/openai/dist/chat_models.js b/node_modules/@langchain/openai/dist/chat_models.js
index d87a382..01b2984 100644
--- a/node_modules/@langchain/openai/dist/chat_models.js
+++ b/node_modules/@langchain/openai/dist/chat_models.js
@@ -1368,7 +1368,7 @@ export class ChatOpenAIResponses extends BaseChatOpenAI {
response_metadata[key] = value;
}
}
- else if (chunk.type === "response.function_call_arguments.delta") {
+ else if (chunk.type === "response.function_call_arguments.delta" || chunk.type === "response.custom_tool_call_input.delta") {
tool_call_chunks.push({
type: "tool_call_chunk",
args: chunk.delta,
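With this change, custom tool call input deltas are pushed as tool_call_chunk entries in the same way as response.function_call_arguments.delta, so the tool input streams incrementally again.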
This issue body was partially generated by patch-package.