
♻️ fix vllm:main for model_config.task #341


Merged: 14 commits into main on Jul 29, 2025

Conversation

@prashantgupta24 prashantgupta24 commented Jul 28, 2025

It seems model_config.task is deprecated; instead, from what I understand, we can use model_config.supported_tasks, which is initialized when an LLM engine is instantiated:
https://github.com/vllm-project/vllm/pull/21470/files#diff-7eaad0b7dee0626bf29d10081b0f0c5e3ea15a4af97e7b182a4e0d35f8346953R705-R706

To maintain backward compatibility, it's a bit tricky since:

  • Earlier versions had model_config.task pointing to the selected task, and model_config.supported_tasks holding a set of all supported tasks, which could contain more than one:
    model_config.task:  generate
    model_config.supported_tasks:  {'embed', 'reward', 'generate', 'classify'}

  • Latest main leaves model_config.task unset and populates model_config.supported_tasks with only the task the model supports:
    model_config.task:  None
    model_config.supported_tasks:  ['generate']



👋 Hi! Thank you for contributing to vLLM support on Spyre.
Just a reminder: make sure your code passes all the linting checks, otherwise your PR can't be merged. To do so, first install the linting requirements, then run format.sh and commit the changes. This can be done with uv directly:

uv sync --frozen --group lint --active --inexact

Or this can be done with pip:

uv pip compile --group lint > requirements-lint.txt
pip install -r requirements-lint.txt
bash format.sh

Now you are good to go 🚀

@prashantgupta24 prashantgupta24 marked this pull request as ready for review July 28, 2025 22:28
@prashantgupta24 prashantgupta24 changed the title 🐛 fix vllm:main ♻️ fix vllm:main Jul 28, 2025
Signed-off-by: Prashant Gupta <[email protected]>
Co-authored-by: Max de Bayser <[email protected]>
Signed-off-by: Max de Bayser <[email protected]>
@prashantgupta24

bot:test
MARKERS="spyre"

@yannicks1 yannicks1 left a comment

lgtm

@maxdebayser maxdebayser left a comment

LGTM

@prashantgupta24 prashantgupta24 merged commit aa3874e into main Jul 29, 2025
18 checks passed
@prashantgupta24 prashantgupta24 deleted the fix-main-999 branch July 29, 2025 20:49
@prashantgupta24 prashantgupta24 changed the title ♻️ fix vllm:main ♻️ fix vllm:main for model_config.task Aug 13, 2025