I implemented a tied-weights autoencoder in Keras and trained it successfully.
My goal is to use only the decoder part of the autoencoder as the last layer of another network, and then fine-tune both that network and the decoder.
As you can see in the summary below, with my tied-weights implementation the decoder has no parameters of its own, so there is nothing to fine-tune (decoder.get_weights() returns []).
My question is: should I change the tied-weights implementation so that the tied layer still holds weights, i.e. the transposed weights of the encoder? If so, how?
Or am I completely off track?
Below is the summary of the autoencoder model, along with the class for the tied Dense layer (slightly modified from https://github.com/nanopony/keras-convautoencoder/blob/master/autoencoder_layers.py).
Layer (type) Output Shape Param # Connected to
====================================================================================================
encoded (Dense) (None, Enc_dim) 33000 dense_input_1[0][0]
____________________________________________________________________________________________________
tieddense_1 (TiedtDense) (None, Out_Dim) 0 encoded[0][0]
====================================================================================================
Total params: 33,000
Trainable params: 33,000
Non-trainable params: 0
________________________________________________________________________
class TiedtDense(Dense):
    def __init__(self, output_dim, master_layer, init='glorot_uniform', activation='linear', weights=None,
                 W_regularizer=None, b_regularizer=None, activity_regularizer=None,
                 W_constraint=None, b_constraint=None, input_dim=None, **kwargs):
        self.master_layer = master_layer
        super(TiedtDense, self).__init__(output_dim, **kwargs)

    def build(self, input_shape):
        assert len(input_shape) >= 2
        input_dim = input_shape[-1]
        self.input_dim = input_dim
        # …