Conversation

@runzhech (Contributor) opened this pull request:

PR Category

Custom Device

PR Types

Improvements

Description

Use "ShareDataWith" instead of "TensorCopy" for the logits and softmax tensors to avoid a redundant memory allocation and copy.
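
For context, a minimal sketch of the before/after pattern this describes. The "after" lines follow the snippet quoted in the review threads below; the "before" lines are an assumption based on the PR description, since the removed TensorCopy calls are not quoted in this thread.

```cpp
// Sketch only, not the full XPU kernel. Assumes phi::DenseTensor inputs
// `logits`/`softmax`, an execution context `ctx`, and row/column counts N, D.

// Before (assumed from the PR description): copy into 2-D temporaries,
// which allocates fresh device memory and issues a device-side copy.
//   phi::DenseTensor logits_2d, softmax_2d;
//   framework::TensorCopy(*logits, ctx.GetPlace(), &logits_2d);
//   framework::TensorCopy(*softmax, ctx.GetPlace(), &softmax_2d);
//   logits_2d.Resize({N, D});
//   softmax_2d.Resize({N, D});

// After (as quoted in the review below): share the existing buffers and only
// reshape the views, so no extra allocation and no copy are needed.
phi::DenseTensor logits_2d, softmax_2d;
logits_2d.ShareDataWith(*logits).Resize({N, D});
softmax_2d.ShareDataWith(*softmax).Resize({N, D});
```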

paddle-bot bot commented Oct 23, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

paddle-bot added the XPU label Oct 23, 2024
zhiqiu previously approved these changes Oct 24, 2024

@zhiqiu (Contributor) left a comment


LGTM for shareDataWith

@houj04 (Contributor) commented Oct 24, 2024

This PR changes the relevant code in CSoftmaxWithCrossEntropyProcessGroupFunctor. CSoftmaxWithCrossEntropyFunctor contains similar code; does it need to be changed as well?

@lj970926 (Contributor) left a comment


LGTM

A review thread was opened on the changed lines:

-  logits_2d.Resize({N, D});
-  softmax_2d.Resize({N, D});
+  logits_2d.ShareDataWith(*logits).Resize({N, D});
+  softmax_2d.ShareDataWith(*softmax).Resize({N, D});
A reviewer (Contributor) commented:

Could the code below in CSoftmaxWithCrossEntropyFunctor (line 389 ~ line 395) also be changed in the same way?

A second review thread was opened on the same lines; another reviewer (Contributor) commented:

As I recall, the Paddle team has said that ShareDataWith makes two tensors access the same memory address, which carries a potential risk. Paddle's cast op previously used ShareDataWith when the source and destination data types were the same, and model training would sometimes show strange behavior.

The PR author (Contributor Author) replied:

This operator is a special case: ShareDataWith is applied to two temporary tensors. logits_2d is never modified, and softmax_2d was already written back to its original address after being modified, so the logic is the same as before and there should be no problem.
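
To make the data-flow argument above concrete, a hedged illustration; the exact copy-back call in the old code is assumed rather than quoted in this thread.

```cpp
// softmax_2d is now just a reshaped view of *softmax:
softmax_2d.ShareDataWith(*softmax).Resize({N, D});

// Any writes the kernel makes into softmax_2d land directly in *softmax,
// so the old copy-back step (something along the lines of
//   framework::TensorCopy(softmax_2d, ctx.GetPlace(), softmax);
// ) is no longer needed. logits_2d is only ever read, so sharing its
// storage cannot change the computed result.
```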

@zhiqiu (Contributor) left a comment


LGTM

@lj970926 (Contributor) left a comment


LGTM

@RuohengMa (Contributor) left a comment


LGTM

houj04 merged commit c22c2f5 into PaddlePaddle:develop Oct 25, 2024
27 checks passed
runzhech deleted the softmax_crossentropy branch December 6, 2024 09:44