jbrowning/llama-index-openairesponses-bug

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

2 Commits
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

# LlamaIndex OpenAIResponses Bug Demo

This repository demonstrates a bug in the LlamaIndex `OpenAIResponses` LLM implementation.

When all of the following conditions are met:

  1. a `FunctionAgent` is configured with tools,
  2. `OpenAIResponses` is used as the agent's LLM, and
  3. the response is NOT streamed,

LlamaIndex will return an empty response immediately after the tool call instead of fetching the LLM's final response.
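The repository's `main.py` is not reproduced here, but a minimal repro along these lines would exercise the failing path. This is a sketch under assumptions: it presumes `llama-index` with the `llama-index-llms-openai` integration installed, and the tool name `multiply_and_add`, the prompt, and the model name are illustrative rather than taken from the repo.

```python
def multiply_and_add(a: float, b: float, c: float) -> float:
    """Compute a * b + c. Exposed to the agent as a tool; matches the
    arithmetic in the sample output below (2 * 3 + 4 = 10)."""
    return a * b + c


async def run_repro() -> str:
    # Imports are local so the pure tool function above stays importable
    # without llama-index installed. Requires OPENAI_API_KEY to be set.
    from llama_index.core.agent.workflow import FunctionAgent
    from llama_index.llms.openai import OpenAIResponses

    agent = FunctionAgent(
        tools=[multiply_and_add],
        llm=OpenAIResponses(model="gpt-4o-mini"),  # model choice is illustrative
    )
    # Non-streamed run: per this repo, the agent returns an empty final
    # response immediately after the tool call on this path.
    result = await agent.run("What is 2 * 3 + 4?")
    return str(result)
```

Swapping `OpenAIResponses` for the plain `OpenAI` LLM class in the sketch above is what produces the non-empty response shown in the sample output.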

To run this example:

  1. Copy .env.example to .env
  2. Add your OpenAI key to .env
  3. Run the script:

```sh
$ uv run python main.py
```

This will result in the following output (note that the `OpenAIResponses` response is empty):

```
OpenAI response: The result of \(2 \times 3 + 4\) is 10.
OpenAIResponses response:
```
