Conversation

danny-avila
Owner

Summary

This PR addresses several fixes across the Anthropic client and related services: client options are now passed correctly during Anthropic client initialization, database writes are refactored to occur after the final response is sent to the client, default temperature values are set properly, and preset titles and settings for the Anthropic endpoint are corrected.
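
As a rough illustration of the client-option fix, initialization might look like the minimal sketch below; the import path, helper shape, and option names are assumptions for illustration, not a copy of the diff:

```js
// Hypothetical sketch only: `AnthropicClient`, the import path, and the exact
// option names are assumed and may not match the actual LibreChat source.
const { AnthropicClient } = require('~/app/clients');

const initializeClient = async ({ req, res, endpointOption }) => {
  const anthropicApiKey = process.env.ANTHROPIC_API_KEY;

  // The constructor receives only two params: the API key and a single options
  // object that carries the user's endpoint settings (model, temperature, etc.)
  // alongside the request/response objects.
  const client = new AnthropicClient(anthropicApiKey, {
    req,
    res,
    ...endpointOption,
  });

  return { client, anthropicApiKey };
};

module.exports = initializeClient;
```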

The PR also includes several critical fixes that preserve intended functionality and improve the user experience. Notably, it resolves a user-reported issue where editing Claude's responses incorrectly altered the model preset and temperature settings. The issue, reproduced across multiple browsers and devices, caused the model preset to default to Claude-1 and overrode user-defined temperature settings. This PR corrects that behavior to ensure consistency and reliability when interacting with the Claude models.

The PR re-introduces Claude-1.2 into the ModelService default list of models for Anthropic.
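
For context, a hedged sketch of what the default list might look like after the change; the surrounding model names and their order are assumptions, not copied from ModelService:

```js
// Illustrative only: the other entries and their ordering are assumed.
const defaultAnthropicModels = [
  'claude-1',
  'claude-1-100k',
  'claude-instant-1',
  'claude-instant-1-100k',
  'claude-2',
  'claude-1.2', // re-introduced by this PR
];
```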

Change Type

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Refactoring (non-breaking change which optimizes existing functionality)

Testing

  • Confirmed that messages are saved to the database only after the final response is sent to the client.
  • Wrote a new test for getModelMaxTokens (a minimal sketch follows this list).
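
A minimal Jest sketch of such a test, assuming an aliased import path and illustrative context-window sizes; the real values and assertions in the PR may differ:

```js
// Sketch only: the import path, token counts, and fallback behavior are assumptions.
const { getModelMaxTokens } = require('~/utils/tokens');

describe('getModelMaxTokens', () => {
  it('returns the max context tokens for known Anthropic models', () => {
    // 100k context windows are assumed here for illustration.
    expect(getModelMaxTokens('claude-2')).toBe(100000);
    expect(getModelMaxTokens('claude-instant-1')).toBe(100000);
  });

  it('returns undefined for unrecognized models (assumed fallback behavior)', () => {
    expect(getModelMaxTokens('not-a-real-model')).toBeUndefined();
  });
});
```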

To validate the fixes, conduct thorough testing encompassing the following scenarios:

  • Ensure the Claude-1.2 model is successfully integrated and functions as expected.
  • Test the passing of client options during initialization to verify that the settings are correctly applied.
  • After editing a response, confirm that the model preset and temperature settings remain unchanged and accurately reflect the user's original configuration.
  • Verify that messages are saved to the database only after the final response is sent to the client, and that no conversation data is lost or incorrectly saved (see the route sketch after this list).
  • Lastly, ensure that the preset titles and default settings within the Anthropic endpoint are displayed correctly.
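
The ordering change in the database scenario can be pictured as a hypothetical Express handler; every name below (`saveMessages`, the route path, the response shape) is a placeholder rather than the actual controller code:

```js
// Placeholder handler illustrating the ordering only: respond first, persist
// messages afterwards, and do not save the conversation from the controller.
const express = require('express');
const router = express.Router();

// Stand-in for the real message-persistence service.
const saveMessages = async (messages) => {
  console.log(`persisting ${messages.length} message(s) after responding`);
};

router.post('/ask/anthropic', async (req, res) => {
  // Stand-in for the actual client call that produces the final response.
  const responseMessage = { text: 'final answer', sender: 'Anthropic' };

  // 1) Send the final response to the client first.
  res.json(responseMessage);

  // 2) Only then write the messages to the database; no conversation save here.
  await saveMessages([responseMessage]);
});

module.exports = router;
```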

Checklist

  • My code adheres to this project's style guidelines
  • I have performed a self-review of my own code
  • I have commented in any complex areas of my code
  • I have made pertinent documentation changes
  • My changes do not introduce new warnings
  • I have written tests demonstrating that my changes are effective or that my feature works
  • Local unit tests pass with my changes
  • Any changes dependent on mine have been merged and published in downstream modules

danny-avila merged commit d7ef459 into main Nov 26, 2023
danny-avila deleted the anthropic-fixes branch November 26, 2023 19:44
cnkang pushed a commit to cnkang/LibreChat that referenced this pull request Feb 6, 2024
* fix: correct preset title for Anthropic endpoint

* fix(Settings/Anthropic): show correct default value for LLM temperature

* fix(AnthropicClient): use `getModelMaxTokens` to get the correct LLM max context tokens, correctly set default temperature to 1, use only 2 params for class constructor, use `getResponseSender` to add correct sender to response message

* refactor(/api/ask|edit/anthropic): save messages to database after the final response is sent to the client, and do not save conversation from route controller

* fix(initializeClient/anthropic): correctly pass client options (endpointOption) to class initialization

* feat(ModelService/Anthropic): add claude-1.2
jinzishuai pushed a commit to aitok-ai/LibreChat that referenced this pull request May 20, 2024
BertKiv pushed a commit to BertKiv/LibreChat that referenced this pull request Dec 10, 2024