TransformerEncoder with a padding mask

Pou*_*lou 2 transformer-model attention-model pytorch

I am trying to use torch.nn.TransformerEncoder with a src_key_padding_mask that is not None. Picture an input of shape src = [20, 95] and a binary padding mask of shape src_mask = [20, 95], with 1 at the positions of padding tokens and 0 elsewhere. I build a 6-layer Transformer encoder, where every layer contains an attention with 8 heads and a hidden dimension of 256:

import torch

layer = torch.nn.TransformerEncoderLayer(256, 8, 256, 0.1)
encoder = torch.nn.TransformerEncoder(layer, 6)
embed = torch.nn.Embedding(80000, 256)

src = torch.randint(0, 1000, (20, 95))    # token ids, shape [20, 95]
src = embed(src)                          # embedded, shape [20, 95, 256]
src_mask = torch.randint(0, 2, (20, 95))  # binary padding mask, shape [20, 95]
output = encoder(src, src_mask)

But I get the following error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-107-31bf7ab8384b> in <module>
----> 1 output =  encoder(src, src_mask)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    545             result = self._slow_forward(*input, **kwargs)
    546         else:
--> 547             result = self.forward(*input, **kwargs)
    548         for hook in self._forward_hooks.values():
    549             hook_result = hook(self, input, result)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/transformer.py in forward(self, src, mask, src_key_padding_mask)
    165         for i in range(self.num_layers):
    166             output = self.layers[i](output, src_mask=mask,
--> 167                                     src_key_padding_mask=src_key_padding_mask)
    168 
    169         if self.norm:

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    545             result = self._slow_forward(*input, **kwargs)
    546         else:
--> 547             result = self.forward(*input, **kwargs)
    548         for hook in self._forward_hooks.values():
    549             hook_result = hook(self, input, result)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/transformer.py in forward(self, src, src_mask, src_key_padding_mask)
    264         """
    265         src2 = self.self_attn(src, src, src, attn_mask=src_mask,
--> 266                               key_padding_mask=src_key_padding_mask)[0]
    267         src = src + self.dropout1(src2)
    268         src = self.norm1(src)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    545             result = self._slow_forward(*input, **kwargs)
    546         else:
--> 547             result = self.forward(*input, **kwargs)
    548         for hook in self._forward_hooks.values():
    549             hook_result = hook(self, input, result)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/activation.py in forward(self, query, key, value, key_padding_mask, need_weights, attn_mask)
    781                 training=self.training,
    782                 key_padding_mask=key_padding_mask, need_weights=need_weights,
--> 783                 attn_mask=attn_mask)
    784 
    785 

~/anaconda3/lib/python3.7/site-packages/torch/nn/functional.py in multi_head_attention_forward(query, key, value, embed_dim_to_check, num_heads, in_proj_weight, in_proj_bias, bias_k, bias_v, add_zero_attn, dropout_p, out_proj_weight, out_proj_bias, training, key_padding_mask, need_weights, attn_mask, use_separate_proj_weight, q_proj_weight, k_proj_weight, v_proj_weight, static_k, static_v)
   3250     if attn_mask is not None:
   3251         attn_mask = attn_mask.unsqueeze(0)
-> 3252         attn_output_weights += attn_mask
   3253 
   3254     if key_padding_mask is not None:

RuntimeError: The size of tensor a (20) must match the size of tensor b (95) at non-singleton dimension 2

I was wondering if anyone could help me with this.

Thanks

Mic*_*ngo 8

The required shapes are shown in nn.Transformer.forward - Shape (all of the transformer's building blocks refer to it). The ones relevant for the encoder are:

  • src: (S, N, E)
  • src_mask: (S, S)
  • src_key_padding_mask: (N, S)

where S is the sequence length, N is the batch size, and E is the embedding dimension (number of features); a small sketch with these shapes follows.
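For concreteness, here is a minimal sketch with made-up sizes (S=4, N=2, E=8 are illustrative choices, not values from the question):

import torch

S, N, E = 4, 2, 8  # illustrative sizes
layer = torch.nn.TransformerEncoderLayer(d_model=E, nhead=2)
encoder = torch.nn.TransformerEncoder(layer, num_layers=1)

src = torch.randn(S, N, E)                       # (S, N, E)
src_key_padding_mask = torch.zeros(N, S).bool()  # (N, S), True = ignore position
output = encoder(src, src_key_padding_mask=src_key_padding_mask)
print(output.shape)  # torch.Size([4, 2, 8])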

The shape of the padding mask should therefore be [95, 20], not [20, 95]. This assumes that your batch size is 95 and your sequence length is 20; if it is the other way around, you would have to transpose src instead.
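In practice you would usually derive the padding mask from the token ids rather than generate it randomly. A minimal sketch, assuming the sequences are stored sequence-first like src above and that 0 is the padding index (both are assumptions, not given in the question):

import torch

PAD_IDX = 0  # hypothetical padding token id

tokens = torch.tensor([[ 5,  8],
                       [17, 23],
                       [42,  0],
                       [ 0,  0],
                       [ 0,  0]])  # (S, N) = (5, 2), zero-padded

src_key_padding_mask = (tokens == PAD_IDX).t()  # transpose to the required (N, S)
print(src_key_padding_mask.shape)  # torch.Size([2, 5])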

Also, when calling the encoder you are not passing src_key_padding_mask but src_mask (the attention mask), because the signature of torch.nn.TransformerEncoder.forward is:

forward(src, mask=None, src_key_padding_mask=None)

The padding mask has to be passed as the keyword argument src_key_padding_mask, not as the second positional argument. And to avoid confusion, your src_mask should be renamed to src_key_padding_mask:

src_key_padding_mask = torch.randint(0, 2, (95, 20))
output = encoder(src, src_key_padding_mask=src_key_padding_mask)
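Putting both fixes together, a corrected version of the snippet from the question (the .bool() conversion is my addition; newer PyTorch versions deprecate integer masks for key_padding_mask):

import torch

layer = torch.nn.TransformerEncoderLayer(256, 8, 256, 0.1)
encoder = torch.nn.TransformerEncoder(layer, 6)
embed = torch.nn.Embedding(80000, 256)

src = torch.randint(0, 1000, (20, 95))  # (S, N) = (20, 95) token ids
src = embed(src)                        # (S, N, E) = (20, 95, 256)

src_key_padding_mask = torch.randint(0, 2, (95, 20)).bool()  # (N, S) = (95, 20)

output = encoder(src, src_key_padding_mask=src_key_padding_mask)
print(output.shape)  # torch.Size([20, 95, 256])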