Tags: nlp, machine-learning, python-3.x, tf.keras, tensorflow2.0
I have a question about validation data. I have a neural network, and I split my data into train_generator, val_generator, and test_generator.
I made a custom model with a customized fit (a custom train_step).
class MyModel(tf.keras.Model):
    def __init__(self):
    def __call__(.....)
    def train_step(....)
Then I have:
train_generator = DataGenerator(....)
val_generator = DataGenerator(....)
test_generator = DataGenerator(....)
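For context, DataGenerator is my own class (not shown here); it behaves roughly like a keras.utils.Sequence that yields (inputs, targets) batches. A simplified sketch of what I mean (not my actual code, the names and batching are just placeholders):

import numpy as np
import tensorflow as tf

class DataGenerator(tf.keras.utils.Sequence):
    # Simplified sketch only: my real generator is more involved.
    def __init__(self, y_hat, z_true, batch_size=32):
        self.y_hat = y_hat            # model inputs
        self.z_true = z_true          # targets
        self.batch_size = batch_size

    def __len__(self):
        # number of batches per epoch
        return int(np.ceil(len(self.y_hat) / self.batch_size))

    def __getitem__(self, idx):
        batch = slice(idx * self.batch_size, (idx + 1) * self.batch_size)
        # each item is an (inputs, targets) pair, matching what train_step unpacks
        return self.y_hat[batch], self.z_true[batch]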
Then:
model = MyModel()
model.compile(optimizer=keras.optimizers.Adam(clipnorm=5.),
              metrics=["accuracy"])
model.fit(train_generator, validation_data=val_generator, epochs=40)
OK, the program doesn't give me any errors, but my question is: how do I know what happens to my validation_data? Is it processed the same way as the training data (train_generator) in the train_step function, or do I need to specify how the validation data should be handled?
In case it helps, here is the full MyModel class as well:
class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.dec2 = Decoder2()

    def __call__(self, y_hat, **kwargs):
        print(y_hat.shape)
        z_hat = self.dec2(y_hat)
        return z_hat

    def train_step(self, dataset):
        with tf.GradientTape() as tape:
            y_hat = dataset[0]
            z_true = dataset[1]
            z_pred = self(y_hat, training=True)
            #print("This is z_true : ", z_true.shape)
            #print("This is z_pred : ", z_pred.shape)
            loss = tf.reduce_mean(tf.abs(tf.cast(z_pred, tf.float64) - tf.cast(z_true, tf.float64)))
            print("loss: ", loss)
            global_loss.append(loss)  # global_loss is a list defined elsewhere in my script
        # Compute gradients. TODO: apply gradient clipping here.
        trainable_vars = self.trainable_variables
        gradients = tape.gradient(loss, trainable_vars)
        # Update weights
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))
        # Update metrics (includes the metric that tracks the loss)
        self.compiled_metrics.update_state(z_true, z_pred)
        # Return a dict mapping metric names to current value
        return {m.name: m.result() for m in self.metrics}
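If I understand the Keras docs on customizing fit correctly, validation data would be routed through test_step rather than train_step, and the default test_step relies on the loss and metrics passed to compile. So do I need to override test_step myself? Is something like the following untested sketch (a method added inside MyModel, mirroring my train_step but without the gradient update) what is expected?

    def test_step(self, dataset):
        # Same unpacking as train_step, but no GradientTape and no weight update.
        y_hat = dataset[0]
        z_true = dataset[1]
        z_pred = self(y_hat, training=False)
        loss = tf.reduce_mean(tf.abs(tf.cast(z_pred, tf.float64) - tf.cast(z_true, tf.float64)))
        # Update the compiled metrics so fit() can report validation results.
        self.compiled_metrics.update_state(z_true, z_pred)
        results = {m.name: m.result() for m in self.metrics}
        results["loss"] = loss
        return results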