Conversation

@KohakuBlueleaf (Collaborator)

Description

Fix the bugs triggered by torch.nn.MultiheadAttention, since it doesn't have a "weight" attribute (its projection weights live in in_proj_weight and out_proj.weight instead).

Also improve the logic for passing orig_weight into network_module.
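For context, a minimal sketch of the failure mode (the `get_orig_weights` helper below is hypothetical, illustrating the kind of dispatch needed, not the webui's actual code):

```python
import torch

# Unlike torch.nn.Linear, torch.nn.MultiheadAttention does not expose a
# .weight attribute: its projections live in in_proj_weight (q/k/v stacked)
# and out_proj.weight, so generic code reading module.weight fails.
mha = torch.nn.MultiheadAttention(embed_dim=8, num_heads=2)

print(hasattr(mha, "weight"))      # False -- accessing mha.weight raises AttributeError
print(mha.in_proj_weight.shape)    # torch.Size([24, 8]) -- q/k/v weights stacked
print(mha.out_proj.weight.shape)   # torch.Size([8, 8])


def get_orig_weights(module: torch.nn.Module):
    """Hypothetical helper: fetch the original weights from whichever
    attributes the module actually exposes."""
    if isinstance(module, torch.nn.MultiheadAttention):
        return module.in_proj_weight, module.out_proj.weight
    return (module.weight,)


print([w.shape for w in get_orig_weights(mha)])
print([w.shape for w in get_orig_weights(torch.nn.Linear(8, 8))])
```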


@KohakuBlueleaf changed the title from "Fix bugs for torch.nn.MultiheadAttention" to "Fix bugs in built-in lora system for torch.nn.MultiheadAttention" on Mar 9, 2024
@KohakuBlueleaf changed the title from "Fix bugs in built-in lora system for torch.nn.MultiheadAttention" to "Fix built-in lora system bugs caused by torch.nn.MultiheadAttention" on Mar 9, 2024
@AUTOMATIC1111 merged commit 4c9a7b8 into dev on Mar 9, 2024
@AUTOMATIC1111 deleted the dora-weight-decompose branch on March 9, 2024 at 05:29