
Commit 261d2c4

fix bug with encoder relative pos emb
1 parent e033224 commit 261d2c4

2 files changed: +2 additions, -2 deletions


setup.py

Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@
 setup(
   name = 'x-transformers',
   packages = find_packages(exclude=['examples']),
-  version = '0.0.19',
+  version = '0.0.20',
   license='MIT',
   description = 'X-Transformers - Pytorch',
   author = 'Phil Wang',

x_transformers/x_transformers.py

Lines changed: 1 addition & 1 deletion

@@ -239,7 +239,7 @@ def __init__(self, dim, depth, dim_head = 64, heads = 8, use_scalenorm = False,
         super().__init__()
         self.dim = dim
         self.layers = nn.ModuleList([])
-        self.rel_pos = RelativePositionBias(causal = True) if rel_pos_bias else None
+        self.rel_pos = RelativePositionBias() if rel_pos_bias else None
 
         norm_class = ScaleNorm if use_scalenorm else nn.LayerNorm
         prenorm_fn = partial(PreNorm, dim, norm_class = norm_class)
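
Note on the fix: T5-style relative position bias buckets the signed offset between key and query positions, and its causal flag changes that bucketing. The snippet below is a minimal, self-contained sketch of that behavior (the function name, defaults, and exact formula are assumptions, not the library's verbatim code). With causal = True, every key to the right of the query collapses into bucket 0, which is fine for a decoder but discards distance information that an encoder's bidirectional attention relies on, hence removing the flag here.

# Rough sketch of T5-style relative position bucketing (assumed names/defaults,
# not the exact x-transformers implementation).
import math
import torch

def relative_position_bucket(relative_position, causal = False, num_buckets = 32, max_distance = 128):
    # relative_position[i, j] = key_pos[j] - query_pos[i]
    ret = 0
    n = -relative_position
    if not causal:
        # bidirectional (encoder): half the buckets for keys after the query, half for keys before
        num_buckets //= 2
        ret += (n < 0).long() * num_buckets
        n = n.abs()
    else:
        # causal (decoder): every future key is clamped into bucket 0
        n = torch.max(n, torch.zeros_like(n))

    max_exact = num_buckets // 2
    is_small = n < max_exact

    # larger distances fall into logarithmically spaced buckets up to max_distance
    val_if_large = max_exact + (
        torch.log(n.float() / max_exact) / math.log(max_distance / max_exact) * (num_buckets - max_exact)
    ).long()
    val_if_large = torch.min(val_if_large, torch.full_like(val_if_large, num_buckets - 1))

    return ret + torch.where(is_small, n, val_if_large)

# With causal = True the upper triangle (keys after the query) is all zeros,
# which is why an encoder loses relative-position information under that flag:
rel = torch.arange(6)[None, :] - torch.arange(6)[:, None]
print(relative_position_bucket(rel, causal = True))
print(relative_position_bucket(rel, causal = False))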
