PyTorch: why create multiple instances of the same type of layer?

use*_*330 · 5 · tags: python, instance, pytorch, dropout

This code is from the PyTorch Transformer:

    self.linear1 = Linear(d_model, dim_feedforward, **factory_kwargs)
    self.dropout = Dropout(dropout)
    self.linear2 = Linear(dim_feedforward, d_model, **factory_kwargs)
    self.norm1 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
    self.norm2 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
    self.norm3 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
    self.dropout1 = Dropout(dropout)
    self.dropout2 = Dropout(dropout)
    self.dropout3 = Dropout(dropout)

Why do they add self.dropout1, ...2, ...3 when self.dropout already exists and does exactly the same thing?

Also, what is the difference between self.linear1 (and self.linear2) and self.linear?

Sau*_*Rai · 0

It is to keep each Linear and Dropout layer separate from the others. The logic is simple: every self.linearN and self.dropoutN is a distinct instance, i.e. a distinct layer in the network, just like the instance created by self.dropout = Dropout(dropout). self.linear1 and self.linear2 must be separate objects because each Linear layer has its own learnable weights (and they even have different shapes here). The dropout layers are created as separate instances so that each point in the network where dropout is applied is its own named submodule.
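
As a minimal sketch of that point (toy values d_model=4, dim_feedforward=8 and the names drop_a/drop_b are my own, not from the question's code): the two Linear instances own separate weight tensors, while two Dropout instances with the same p behave identically, so splitting them into dropout1/2/3 is mainly about giving each application point its own submodule.

    import torch
    from torch.nn import Linear, Dropout

    d_model, dim_feedforward = 4, 8

    # Each Linear instance owns its own weight tensor, so the two cannot be one object.
    linear1 = Linear(d_model, dim_feedforward)
    linear2 = Linear(dim_feedforward, d_model)
    print(linear1.weight.shape)  # torch.Size([8, 4])
    print(linear2.weight.shape)  # torch.Size([4, 8])

    # Dropout has no learnable parameters; two instances with the same p
    # produce the same result given the same RNG state.
    drop_a, drop_b = Dropout(p=0.1), Dropout(p=0.1)
    x = torch.randn(2, d_model)
    torch.manual_seed(0); out_a = drop_a(x)
    torch.manual_seed(0); out_b = drop_b(x)
    print(torch.equal(out_a, out_b))  # True

So reusing one Dropout module would compute the same thing, but separate instances make the module tree (and the forward pass) easier to read.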