Conversation

danny-avila (Owner)

Fully addresses and closes #229 so that the max context is handled appropriately for all GPT models (around 4k, 8k, and 32k tokens for gpt-3.5, gpt-4, and gpt-4-32k, respectively), with no manual edit of the code necessary.

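For context, the core of the change is knowing each model family's context window. Below is a minimal sketch of that idea, not the actual LibreChat code; `MODEL_MAX_CONTEXT` and `getMaxContextTokens` are hypothetical names used only for illustration:

```js
// Illustrative sketch only — not LibreChat's implementation.
// Approximate max context windows (in tokens) for the model families named above.
const MODEL_MAX_CONTEXT = {
  'gpt-3.5-turbo': 4095,
  'gpt-4': 8191,
  'gpt-4-32k': 32767,
};

function getMaxContextTokens(modelName) {
  // Longest matching prefix wins, so 'gpt-4-32k-0314' resolves to 32k rather than 8k.
  const match = Object.keys(MODEL_MAX_CONTEXT)
    .filter((key) => modelName.startsWith(key))
    .sort((a, b) => b.length - a.length)[0];
  return match ? MODEL_MAX_CONTEXT[match] : 4095; // fall back to ~4k for unknown models
}
```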
#263 was actually causing issues with the OpenAI API, as I was mistaken about the max_tokens parameter.

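On the max_tokens point: in the OpenAI API, max_tokens caps only the completion, and the API rejects requests where prompt tokens plus max_tokens exceed the model's context window. A minimal sketch of budgeting the completion under that constraint (hypothetical helper, not the code from this PR):

```js
// Illustrative sketch only: max_tokens limits the reply, not the total context,
// so the completion budget is whatever the prompt leaves unused in the window.
function getCompletionBudget(maxContextTokens, promptTokenCount) {
  return Math.max(maxContextTokens - promptTokenCount, 0);
}

// e.g. a gpt-4 prompt of 6,000 tokens leaves 8191 - 6000 = 2191 tokens for max_tokens
console.log(getCompletionBudget(8191, 6000)); // 2191
```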
Closes #237

Fixes React errors introduced by #260.

@danny-avila danny-avila merged commit 6049c9e into main May 14, 2023
@danny-avila danny-avila deleted the minor-fixes branch May 14, 2023 21:26
cnkang pushed a commit to cnkang/LibreChat that referenced this pull request Feb 6, 2024
…vila#269)

* fix: react errors

* fix: max tokens issue

* fix: max tokens issue
jinzishuai pushed a commit to aitok-ai/LibreChat that referenced this pull request May 20, 2024
…vila#269)

* fix: react errors

* fix: max tokens issue

* fix: max tokens issue
Development

Successfully merging this pull request may close these issues:

popup window display problem
How to set the Max token of GPT-4