
Conversation

@jkaniecki

No description provided.

@michalkuligowski force-pushed the private/jkaniecki/cross_attn_kv_cache_update branch from 8990608 to 0757696 on March 25, 2025 06:55

@michalkuligowski left a comment


/run-gaudi-tests

@michalkuligowski

/run-gaudi-tests

@jkaniecki changed the title from "Align cross-attn kv cache update to new hpu-extansion" to "Align cross-attn kv cache update to new hpu-extension" on Mar 25, 2025
@michalkuligowski force-pushed the private/jkaniecki/cross_attn_kv_cache_update branch from 4079ab3 to 80f087a on March 25, 2025 08:22
@jkaniecki
Author

/run-gaudi-tests

1 similar comment
@michalkuligowski

/run-gaudi-tests

@michalkuligowski

@jkaniecki only a code owner can trigger tests: https://github.com/HabanaAI/vllm-fork/actions/runs/14079642472/job/39429422419

@michalkuligowski

/run-gaudi-tests

@michalkuligowski

/run-gaudi-tests

@michalkuligowski merged commit fb1f49a into habana_main on Mar 28, 2025 (39 checks passed)
@michalkuligowski deleted the private/jkaniecki/cross_attn_kv_cache_update branch on March 28, 2025 09:22

Labels: None yet
Projects: None yet
3 participants