When LangChain makes a streaming call to a function with no parameters, it does not receive a tool-call response. #8518

@LuckyMouseLai
Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import { z } from "zod";
export async function runQwenWithTools() {
    const llm_openai = new ChatOpenAI({
        model: "qwen-max",
        apiKey: process.env.QWEN_API_KEY,
        configuration: {
            baseURL: process.env.QWEN_API_URL
        },
    });

    // Zero-parameter tool: an empty object schema is what triggers the bug in streaming mode.
    const getCurrentTime = tool(
        async () => { 
            return (`current time: ${new Date().toLocaleString()}`); 
        },
        {
            name: "get_current_time",
            description: "get current time",
            // schema: z.object({ dummy: z.optional(z.string().describe("placeholder"))}),
            schema: z.object({}),
        }
    );

    console.log(JSON.stringify(getCurrentTime));
    const llm_with_tools = llm_openai.bindTools([getCurrentTime]);

    const dialogs = [ 
        new SystemMessage({content: "You are a helpful assistant."}),
        new HumanMessage({content: "get current time"}),
    ];

    console.log("streaming mode:");
    const stream = await llm_with_tools.stream(dialogs);
    for await (const chunk of stream) {
        console.log(`chunk: ${JSON.stringify(chunk)}`);
    }

    console.log("\nnon-streaming mode:");
    const result = await llm_with_tools.invoke(dialogs);
    console.log(`full response: ${JSON.stringify(result)}`);
}
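For context, a streaming consumer reassembles tool calls by concatenating the `args` fragments of the `tool_call_chunks` across chunks. A minimal, dependency-free sketch of that aggregation (plain objects standing in for LangChain's `ToolCallChunk`; all names here are illustrative, not LangChain's API):

```typescript
// Minimal sketch: merge streamed tool_call_chunks into complete tool calls.
// Chunk shapes mimic LangChain's tool_call_chunks but are plain objects.
interface ToolCallChunk {
  name?: string;
  args?: string; // partial JSON, concatenated across chunks
  id?: string;
  index: number;
}

function mergeToolCallChunks(
  chunks: ToolCallChunk[],
): { name: string; args: unknown; id: string }[] {
  const byIndex = new Map<number, { name: string; args: string; id: string }>();
  for (const c of chunks) {
    const acc = byIndex.get(c.index) ?? { name: "", args: "", id: "" };
    if (c.name) acc.name = c.name;
    if (c.id) acc.id = c.id;
    if (c.args) acc.args += c.args;
    byIndex.set(c.index, acc);
  }
  return [...byIndex.values()].map((t) => ({
    name: t.name,
    id: t.id,
    // A zero-parameter tool legitimately yields empty args; treat that as {}.
    args: t.args ? JSON.parse(t.args) : {},
  }));
}

// The fragments from the workaround run reassemble cleanly:
const calls = mergeToolCallChunks([
  { name: "get_current_time", args: '{"dummy', id: "call_1", index: 0 },
  { args: '": "x"}', index: 0 },
]);
console.log(calls); // → [{ name: "get_current_time", id: "call_1", args: { dummy: "x" } }]
```

The point of the empty-args branch is that a tool with `z.object({})` should still surface as a call with `args: {}`; in the streaming logs above, no `tool_call_chunks` arrive at all, so there is nothing to merge.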

Error Message and Stack Trace (if applicable)

streaming mode:
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","ChatMessageChunk"],"kwargs":{"content":"","response_metadata":{"prompt":0,"completion":0,"usage":{}},"additional_kwargs":{}}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","ChatMessageChunk"],"kwargs":{"content":"","response_metadata":{"prompt":0,"completion":0,"finish_reason":"tool_calls","system_fingerprint":null,"model_name":"qwen-max","usage":{}},"additional_kwargs":{}}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","response_metadata":{"usage":{"prompt_tokens":175,"completion_tokens":10,"total_tokens":185,"prompt_tokens_details":{"cached_tokens":0}}},"usage_metadata":{"input_tokens":175,"output_tokens":10,"total_tokens":185,"input_token_details":{"cache_read":0},"output_token_details":{}},"tool_calls":[],"invalid_tool_calls":[],"tool_call_chunks":[],"additional_kwargs":{}}}

non-streaming mode:
full response: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessage"],"kwargs":{"content":"","additional_kwargs":{"tool_calls":[{"function":{"name":"get_current_time","arguments":"{}"},"index":0,"id":"call_9950f1ec785b4fe5b044a3","type":"function"}]},"response_metadata":{"tokenUsage":{"promptTokens":175,"completionTokens":10,"totalTokens":185},"finish_reason":"tool_calls","model_name":"qwen-max"},"id":"chatcmpl-e5c0637f-5e0f-9ea6-a5d0-62664b777462","tool_calls":[{"name":"get_current_time","args":{},"type":"tool_call","id":"call_9950f1ec785b4fe5b044a3"}],"invalid_tool_calls":[],"usage_metadata":{"output_tokens":10,"input_tokens":175,"total_tokens":185,"input_token_details":{"cache_read":0},"output_token_details":{}}}}

Description

In streaming mode, the tool call never appears in the streamed chunks (see the logs above: the stream finishes with finish_reason "tool_calls" but tool_call_chunks stays empty), so the tool is never invoked. In non-streaming mode the same request works as expected.
As a workaround, I added an optional dummy parameter to the tool's schema.
However, I would still expect streaming mode to deliver the tool call even when the tool takes no actual parameters.
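Until this is fixed, a defensive consumer can detect the failure mode and re-run the request without streaming. A self-contained sketch (the `Chunk` shape and both function names are illustrative stand-ins, not LangChain's API):

```typescript
// Defensive sketch: if a stream ends with finish_reason "tool_calls" but no
// tool_call_chunks were ever received (the bug reported here), fall back to a
// single non-streaming call to recover the tool call.
interface Chunk {
  tool_call_chunks?: { name?: string; args?: string; index: number }[];
  finish_reason?: string;
}

async function collectWithFallback(
  stream: AsyncIterable<Chunk>,
  invokeFallback: () => Promise<Chunk>, // e.g. a non-streaming invoke()
): Promise<Chunk[]> {
  const chunks: Chunk[] = [];
  let sawToolChunk = false;
  let finishedWithToolCalls = false;
  for await (const c of stream) {
    chunks.push(c);
    if (c.tool_call_chunks?.length) sawToolChunk = true;
    if (c.finish_reason === "tool_calls") finishedWithToolCalls = true;
  }
  if (finishedWithToolCalls && !sawToolChunk) {
    // The provider claimed a tool call but streamed no chunks for it:
    // re-run non-streaming (which works per the logs below) to recover it.
    return [await invokeFallback()];
  }
  return chunks;
}
```

This trades a duplicate request (and its token cost) for correctness, so it is only a stopgap while streaming drops the call.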

Workaround: add an optional dummy parameter to the schema:

    const getCurrentTime = tool(
        async () => { 
            return (`current time: ${new Date().toLocaleString()}`); 
        },
        {
            name: "get_current_time",
            description: "get current time",
            schema: z.object({ dummy: z.optional(z.string().describe("placeholder"))}),
            // schema: z.object({}),
        }
    );
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","tool_call_chunks":[{"name":"get_current_time","args":"{\"dummy","id":"call_7ddc512083844de7b3a15d","index":0,"type":"tool_call_chunk"}],"additional_kwargs":{"tool_calls":[{"index":0,"id":"call_7ddc512083844de7b3a15d","type":"function","function":{"name":"get_current_time","arguments":"{\"dummy"}}]},"id":"chatcmpl-3c9fcd3a-4155-90a4-9597-194b9adb0adc","response_metadata":{"prompt":0,"completion":0,"usage":{}},"tool_calls":[],"invalid_tool_calls":[{"name":"get_current_time","args":"{\"dummy","id":"call_7ddc512083844de7b3a15d","error":"Malformed args.","type":"invalid_tool_call"}]}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","tool_call_chunks":[{"args":"\": \"dummy\"}","id":"","index":0,"type":"tool_call_chunk"}],"additional_kwargs":{"tool_calls":[{"index":0,"id":"","type":"function","function":{"arguments":"\": \"dummy\"}"}}]},"id":"chatcmpl-3c9fcd3a-4155-90a4-9597-194b9adb0adc","response_metadata":{"prompt":0,"completion":0,"usage":{}},"tool_calls":[],"invalid_tool_calls":[{"args":"\": \"dummy\"}","id":"","error":"Malformed args.","type":"invalid_tool_call"}]}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","tool_call_chunks":[{"args":null,"id":"","index":0,"type":"tool_call_chunk"}],"additional_kwargs":{"tool_calls":[{"function":{"arguments":null},"index":0,"id":"","type":"function"}]},"id":"chatcmpl-3c9fcd3a-4155-90a4-9597-194b9adb0adc","response_metadata":{"prompt":0,"completion":0,"usage":{}},"tool_calls":[{"name":"","args":{},"id":"","type":"tool_call"}],"invalid_tool_calls":[]}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","tool_call_chunks":[],"additional_kwargs":{},"id":"chatcmpl-3c9fcd3a-4155-90a4-9597-194b9adb0adc","response_metadata":{"prompt":0,"completion":0,"finish_reason":"tool_calls","system_fingerprint":null,"model_name":"qwen-max","usage":{}},"tool_calls":[],"invalid_tool_calls":[]}}
chunk: {"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"","response_metadata":{"usage":{"prompt_tokens":190,"completion_tokens":19,"total_tokens":209,"prompt_tokens_details":{"cached_tokens":0}}},"usage_metadata":{"input_tokens":190,"output_tokens":19,"total_tokens":209,"input_token_details":{"cache_read":0},"output_token_details":{}},"tool_calls":[],"invalid_tool_calls":[],"tool_call_chunks":[],"additional_kwargs":{}}}

System Info

linux
langchain: "^0.3.27"
node version v20.19.2
yarn version v1.22.22
