
Conversation

@winter-wang (Contributor) commented Jan 2, 2025

PR Category

Auto Parallel

PR Types

New features

Description

Add the high-level API LocalLayer as a wrapper for the local view. (#70525)

Other

Pcard-67164
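For orientation, a minimal usage sketch of the LocalLayer API added by this PR, pieced together from the docstring snippets quoted in the review below. The import path of LocalLayer, the base-class constructor arguments, and the dtensor_from_local placements are assumptions; the merged signature may differ (see the out_dist_attrs discussion further down):

import paddle
import paddle.distributed as dist
from paddle import nn
from paddle.distributed import LocalLayer  # assumed import path

class CustomLayer(LocalLayer):
    def __init__(self, mesh):
        # Base-class constructor arguments are an assumption for this sketch.
        super().__init__(mesh)
        self.fc = nn.Linear(4, 4)

    def forward(self, x):
        # Inside forward, x is the per-rank local shard, not the global tensor.
        return self.fc(x)

# doctest: +REQUIRES(env:DISTRIBUTED)
mesh = dist.ProcessMesh([0, 1], dim_names=["x"])
custom_layer = CustomLayer(mesh)
local_input = paddle.rand([2, 4])  # each rank supplies its own local data
input_tensor = dist.auto_parallel.api.dtensor_from_local(
    local_input, mesh, [dist.Shard(0)]  # placements are an assumption
)
out = custom_layer(input_tensor)  # outputs are re-wrapped into the global view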

@winter-wang force-pushed the develop branch 2 times, most recently from 11c28f1 to 82e4348 on January 3, 2025, 02:30
paddle-bot commented Jan 6, 2025

Your PR has been submitted. Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

@jeff41404 (Contributor) left a comment

LGTM

from paddle import nn

class CustomLayer(LocalLayer):
    def __init__(self, mesh):
Contributor:

For the usage we recommend to users, wouldn't it be better to pass dist_attr directly into CustomLayer?

Contributor (reply):

OK, the next PR will change this to pass in out_dist_attrs.
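As an illustration of the suggestion, a sketch of how the follow-up interface might look, assuming out_dist_attrs is a list with one (ProcessMesh, placements) entry per output; this is not the code merged in this PR:

import paddle.distributed as dist
from paddle import nn
from paddle.distributed import LocalLayer  # assumed import path

class CustomLayer(LocalLayer):
    def __init__(self, mesh):
        # Hypothetical format: one (mesh, placements) pair per layer output.
        out_dist_attrs = [(mesh, [dist.Shard(0)])]
        super().__init__(out_dist_attrs=out_dist_attrs)
        self.fc = nn.Linear(4, 4)

    def forward(self, x):
        return self.fc(x)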

# doctest: +REQUIRES(env:DISTRIBUTED)
mesh = dist.ProcessMesh([0, 1], dim_names=["x"])
custom_layer = CustomLayer(mesh)
input_tensor = dist.auto_parallel.api.dtensor_from_local(
Contributor:

In this example, could you add some comments showing how the data of input_tensor changes before and after passing through local_layer, so that users can more intuitively understand the difference between local-view and global-view operations?

Contributor (reply):

The next PR will change this to an example similar to mask_lm_loss.
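In the meantime, a commented sketch of what such annotations could look like, continuing the docstring example above, with hypothetical shapes assuming a 1-D mesh of two ranks and a [2, 4] local tensor per rank:

# Local view: each rank builds and sees only its own shard.
local_input = paddle.rand([2, 4])           # shape on every rank: [2, 4]

# Global view: the distributed tensor is logically [4, 4], sharded along dim 0.
input_tensor = dist.auto_parallel.api.dtensor_from_local(
    local_input, mesh, [dist.Shard(0)]      # placements are an assumption
)

# Inside CustomLayer.forward, x is the per-rank local shard again ([2, 4]), so
# anything computed there is a local result; LocalLayer wraps the outputs back
# into distributed tensors, i.e. back into the global view.
out = custom_layer(input_tensor)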

        inputs[idx]
    )
    # Run the wrapped layer's forward on the local (per-rank) tensors.
    outputs = Layer.__call__(self, *inputs, **kwargs)
    # Flatten possibly nested outputs into a flat list of tensors.
    list_outs = paddle.utils.flatten(outputs)
Contributor:

Could we add a check here that the number of out_dist_attrs matches the number of list_outs? That seems like a mistake users could easily make.

Contributor (reply):

This will be added in the next PR.
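For reference, a minimal sketch of such a check, assuming the user-supplied attributes are available as out_dist_attrs with one entry per flattened output (the helper name is hypothetical):

def _check_outputs_match_dist_attrs(list_outs, out_dist_attrs):
    # Fail fast when the number of flattened outputs does not match the number
    # of distribution attributes supplied by the user.
    if len(list_outs) != len(out_dist_attrs):
        raise ValueError(
            f"LocalLayer produced {len(list_outs)} outputs but received "
            f"{len(out_dist_attrs)} out_dist_attrs; the two must match."
        )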

@winter-wang merged commit afcd24b into PaddlePaddle:develop on Jan 8, 2025
30 of 31 checks passed
@AndSonder (Contributor) commented:

@From00 The follow-up changes have been added in #70668.
