
[Logging] Support logging once #1431


Merged

merged 4 commits into main from kylesayrs/log-once-filter on May 14, 2025

Conversation

kylesayrs (Collaborator)

Purpose

  • Add ability to log messages only once

Changes

  • Add `support_log_once` filter
  • Integrate `_initialize_metric_logging` function to reduce verbosity

Testing

```python3
from llmcompressor.logger import logger
logger.bind(log_once=True).info("This will only log once")
# 2025-05-14T10:14:30.987512-0400 | <module> | INFO - This will only log once
logger.bind(log_once=True).info("This will only log once")
logger.bind(log_once=True).info("This will only log once")

logger.bind(log_once=True).info("This is unique")
# 2025-05-14T10:14:42.987610-0400 | <module> | INFO - This is unique
logger.bind(log_once=True).warning("This is unique")
# 2025-05-14T10:14:57.907901-0400 | <module> | WARNING - This is unique
```
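
For context, here is a minimal sketch of how a log-once filter like this can be wired into loguru. This is an illustration under assumptions: `log_once_filter` and `_logged_once` are invented names and need not match the PR's actual `support_log_once` implementation.

```python3
# Sketch only: deduplicate records bound with log_once=True.
# Names below are illustrative, not the PR's actual implementation.
import sys

from loguru import logger

_logged_once = set()  # messages already emitted with log_once=True

def log_once_filter(record) -> bool:
    # logger.bind(log_once=True) stores the flag in record["extra"]
    if record["extra"].get("log_once", False):
        message = record["message"]
        if message in _logged_once:
            return False  # suppress repeats of an identical message
        _logged_once.add(message)
    return True  # unbound or first-seen records pass through

logger.remove()  # replace the default sink with a filtered one
logger.add(sys.stdout, filter=log_once_filter)

logger.bind(log_once=True).info("This will only log once")
logger.bind(log_once=True).info("This will only log once")  # suppressed
```

Keying on the rendered message means identical text from different call sites is deduplicated together; keying on the record's file and line instead would deduplicate per call site.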

Signed-off-by: Kyle Sayers <[email protected]>

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: this is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.

@kylesayrs added the `ready` (When a PR is ready for review) label on May 14, 2025
Signed-off-by: Kyle Sayers <[email protected]>
@brian-dellabetta (Collaborator) left a comment


Might be worth adding this to a logger invocation somewhere in this PR, so we can see an example in the code

Signed-off-by: Kyle Sayers <[email protected]>
@brian-dellabetta (Collaborator) left a comment


very nice!

@kylesayrs enabled auto-merge (squash) May 14, 2025 20:19
@kylesayrs merged commit c00f238 into main May 14, 2025
11 checks passed
@kylesayrs deleted the kylesayrs/log-once-filter branch May 14, 2025 21:00
dsikka pushed a commit that referenced this pull request May 29, 2025
## Purpose ##
* Test discrepancies between initialized parameters and values
calculated by observers
* Reveal potential issue with how qparams are initialized
neuralmagic/compressed-tensors#308
* Add warning for when users attempt to quantize groups that aren't
perfectly divisible

## Prerequisites ##
* #1431

## Changes ##
* Added `test_observers_update` in
`tests/llmcompressor/modifiers/calibration/test_observers.py`
* Add a warning for attempts to quantize indivisible groups

```
Attempting to quantize a module weight whose columns (3420) are not divisible by group_size (128). This scheme is not supported by vLLM, please consider adjusting the group_size for modules with this number of columns
```
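
As a hypothetical sketch of the check that could emit this warning: the helper name `warn_if_indivisible` is invented here for illustration, and the once-only behavior assumes the `log_once` filter from #1431 is installed.

```python3
# Hypothetical sketch; not the actual llm-compressor implementation.
from loguru import logger

def warn_if_indivisible(num_columns: int, group_size: int) -> None:
    """Warn when a weight's columns cannot be split into whole groups."""
    if num_columns % group_size != 0:
        # log_once=True relies on the filter added in #1431
        logger.bind(log_once=True).warning(
            f"Attempting to quantize a module weight whose columns "
            f"({num_columns}) are not divisible by group_size ({group_size}). "
            "This scheme is not supported by vLLM, please consider adjusting "
            "the group_size for modules with this number of columns"
        )

warn_if_indivisible(3420, 128)  # 3420 % 128 == 92, so the warning fires
```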

## Testing ##
* This test fails without the compressed-tensors (CT) changes, but succeeds with them

---------

Signed-off-by: Kyle Sayers <[email protected]>
aireilly pushed a commit to aireilly/llm-compressor that referenced this pull request Jul 30, 2025
aireilly pushed a commit to aireilly/llm-compressor that referenced this pull request Jul 30, 2025