
Add: documentation for enhanced save_pretrained parameters #1377

Merged: 2 commits into main, Apr 23, 2025

Conversation

rahul-tuli (Collaborator)

This PR adds comprehensive documentation for the compression parameters available in the enhanced save_pretrained method. These parameters are critical for users working with model compression but were previously undocumented.

Changes

  • Adds a new docs/save_pretrained.md file explaining:
    • How the enhanced save_pretrained method works
    • Detailed descriptions of all compression parameters
    • Code examples showing common usage patterns
    • Notes on compatibility with loading compressed models
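The usage patterns described above can be sketched as a small helper. Note this is an illustrative sketch only: the keyword name `save_compressed` is an assumption based on llm-compressor conventions, and the new `docs/save_pretrained.md` file is the authoritative reference for the actual parameter list.

```python
# Minimal sketch of calling the enhanced save_pretrained with compression
# options. The keyword names shown (e.g. save_compressed) are assumptions;
# consult docs/save_pretrained.md for the real parameter list.

def save_with_compression(model, output_dir, **compression_kwargs):
    """Forward compression-related keyword arguments to save_pretrained.

    `model` is expected to have been loaded through llm-compressor, which
    wraps save_pretrained so it accepts the extra compression parameters.
    """
    model.save_pretrained(output_dir, **compression_kwargs)

# Typical call (hypothetical parameter values):
# save_with_compression(model, "my-model-compressed", save_compressed=True)
```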

Benefits

  • Better User Experience: Users can clearly understand all available options
  • Improved Onboarding: New users can quickly learn how to save compressed models
  • Comprehensive Examples: Shows different approaches for saving models with compression

This documentation supports [INFERENG-578](https://issues.redhat.com/browse/INFERENG-578) and will help users leverage the full capabilities of the compression functionality in the save process.

@rahul-tuli rahul-tuli requested a review from Copilot April 23, 2025 20:49

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: This is required to complete the testing suite, please only add the label once the PR is code complete and local testing has been performed.


Copilot AI left a comment


Pull Request Overview

This PR introduces new documentation in docs/save_pretrained.md to explain the enhanced save_pretrained functionality with additional compression parameters.

  • Provides detailed descriptions of new parameters.
  • Offers complete code examples for both automatic and manual wrapping approaches.
  • Includes notes on compatibility with loading compressed models.
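The "manual wrapping" approach the review mentions could look roughly like the following. This is a hedged sketch, not the library's actual API: the helper name `wrap_save_pretrained` and the idea of baking in compression defaults are illustrative assumptions; the documented examples in `docs/save_pretrained.md` show the real approach.

```python
import functools

# Sketch of a manual-wrapping approach: patch a model's save_pretrained in
# place so compression defaults are applied on every subsequent save.
# The helper name and default keywords are illustrative assumptions.

def wrap_save_pretrained(model, **compression_defaults):
    original = model.save_pretrained

    @functools.wraps(original)
    def wrapped(save_directory, **kwargs):
        # Explicit call-site kwargs take precedence over the baked-in defaults.
        merged = {**compression_defaults, **kwargs}
        return original(save_directory, **merged)

    model.save_pretrained = wrapped
    return model
```

Wrapping at the instance level keeps the original method reachable via the closure, so call-site overrides still work and the patch can be layered without touching the class.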

@rahul-tuli rahul-tuli self-assigned this Apr 23, 2025
@rahul-tuli rahul-tuli marked this pull request as ready for review April 23, 2025 20:49
kylesayrs previously approved these changes Apr 23, 2025
@rahul-tuli rahul-tuli force-pushed the add-save-pretrained-doc branch from c6e7bbd to 38faad8 Compare April 23, 2025 22:01
@dsikka dsikka enabled auto-merge (squash) April 23, 2025 22:05
@dsikka dsikka merged commit e627edf into main Apr 23, 2025
7 of 8 checks passed
@dsikka dsikka deleted the add-save-pretrained-doc branch April 23, 2025 22:09
kylesayrs pushed a commit that referenced this pull request Apr 29, 2025
Signed-off-by: Rahul Tuli <[email protected]>
4 participants