
@AlejandroArciniegas (Contributor) commented Mar 21, 2023

Based on this issue, I have added a drop-in conditional import that creates the HTTP client from the fetch client when the app is compiled for the web.

This allows Flutter Web clients to receive streamed chat completion responses word by word, providing a better user experience for chat applications built with this package.

P.S.: This change doesn't affect unit tests or other platforms, and it is implemented with minimal overhead.
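
For reference, a drop-in conditional import of this kind usually looks roughly like the sketch below. File names and the `fetch_client` parameters here are illustrative assumptions, not necessarily the exact code in this PR:

```dart
// client_io.dart — non-web platforms keep the regular client, which
// already supports streamed responses.
import 'package:http/http.dart' as http;

http.Client createClient() => http.Client();
```

```dart
// client_web.dart — when compiled for the web, build the client on top of
// the browser Fetch API (via package:fetch_client) so the response body
// can be read as it streams in, instead of being buffered by XHR.
import 'package:fetch_client/fetch_client.dart';
import 'package:http/http.dart' as http;

http.Client createClient() => FetchClient(mode: RequestMode.cors);
```

```dart
// Inside the package, the right factory is then picked at compile time:
import 'client_io.dart' if (dart.library.html) 'client_web.dart';
```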

@AlejandroArciniegas (Contributor, Author) commented

@anasfik please check when you can :)

@anasfik (Owner) commented Mar 24, 2023

Hi, sorry for the late response, I will check it right now.

@AlejandroArciniegas (Contributor, Author) commented

Strange, the test ran locally just fine :/

@anasfik (Owner) commented Mar 24, 2023

Hi, there is no issue with your code; the problem is on OpenAI's side. Merging...

@anasfik merged commit 32889e5 into anasfik:main on Mar 24, 2023
@anasfik (Owner) commented Mar 24, 2023

There is especially an issue right now with the /edits API, which throws an error saying that the model code-davinci-edit-001 does not exist. Your code is merged, and thank you very much for your contribution.

@odedsolutions commented

I'm still seeing an issue with streaming on Flutter Web with dart_openai: ^4.0.0.

Streaming works great on other platforms, but not on the web. It seems the entire response arrives and is then released to the code all at once.

My understanding from this thread is that this should be resolved by now. Am I missing a configuration step?

Many thanks!! 🙏🏻
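
For anyone debugging this: independent of dart_openai, you can check whether the underlying HTTP client streams at all with plain package:http. A minimal sketch (the URL and function name are placeholders):

```dart
import 'dart:convert';

import 'package:http/http.dart' as http;

Future<void> probeStreaming(http.Client client, Uri url) async {
  final response = await client.send(http.Request('GET', url));
  // With the default browser (XHR-based) client, the whole body tends to
  // arrive in a single chunk at the end; with a fetch-based client, chunks
  // are printed as the server sends them.
  await for (final chunk in response.stream.transform(utf8.decoder)) {
    print('chunk: ${chunk.length} chars');
  }
}
```

If the chunks only appear once the response is complete, the buffering is happening in the client, not in this package.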

@vikramsinghrajpoot commented

Is it fixed?
