Conversation

@adityaomar3
Contributor

This PR adds streaming of replies via the OpenAI Assistants API.
Previously, the full reply was assembled and only then displayed on screen, so the wait time was too long.
Streaming delivers the reply incrementally over server-sent events, which reduces latency and perceived wait time.
Below is a sample test-video displaying the working of stream.

streaming-2024-06-26_17.47.02.mp4

Contributor

@rmackay9 left a comment

Thanks for this. This is the feature that I've most been looking forward to adding to the chat module!

I've tested this myself and it is working as expected.

In case anyone is interested in the details, the OpenAI documentation shows two different ways that streaming can be done. We chose to go with the first method shown below.

1. One method involves overriding the `EventHandler` class and using a with/as block (a sketch of such a handler follows after this list):

```python
with client.beta.threads.runs.stream(
  thread_id=thread.id,
  assistant_id=assistant.id,
  instructions="Please address the user as Jane Doe. The user has a premium account.",
  event_handler=EventHandler(),
) as stream:
  stream.until_done()
```
2. Another method involves creating a run with `stream=True` and then iterating over the stream events (see here and slide the "Streaming" slider to the right):

```python
stream = client.beta.threads.runs.create(
  thread_id="thread_123",
  assistant_id="asst_123",
  stream=True
)

for event in stream:
  print(event)
```
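For anyone curious about the first method, here is a minimal sketch of what the `EventHandler` override might look like, subclassing the SDK's `AssistantEventHandler` as shown in the OpenAI documentation; the print-based handler bodies are illustrative, not the chat module's actual implementation:

```python
from typing_extensions import override
from openai import AssistantEventHandler

class EventHandler(AssistantEventHandler):
  @override
  def on_text_created(self, text) -> None:
    # Called once when the assistant starts producing a reply
    print("\nassistant > ", end="", flush=True)

  @override
  def on_text_delta(self, delta, snapshot) -> None:
    # Called for each incremental chunk of reply text as it arrives
    print(delta.value, end="", flush=True)
```

With handlers like these, each text delta is shown as soon as it arrives instead of after the full reply completes, which is where the latency win comes from.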

@peterbarker peterbarker merged commit 6bd0450 into ArduPilot:master Jun 27, 2024
@adityaomar3 adityaomar3 deleted the streaming_chat branch June 27, 2024 05:33