Him*_*oon 16 python python-3.x tensorflow tensorflow2.0
In the TF 2.0 DCGAN example from the TensorFlow 2.0 guide, there are two gradient tapes. See below.
@tf.function
def train_step(images):
    noise = tf.random.normal([BATCH_SIZE, noise_dim])

    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
        generated_images = generator(noise, training=True)

        real_output = discriminator(images, training=True)
        fake_output = discriminator(generated_images, training=True)

        gen_loss = generator_loss(fake_output)
        disc_loss = discriminator_loss(real_output, fake_output)

    gradients_of_generator = gen_tape.gradient(gen_loss, generator.trainable_variables)
    gradients_of_discriminator = disc_tape.gradient(disc_loss, discriminator.trainable_variables)

    generator_optimizer.apply_gradients(zip(gradients_of_generator, generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(gradients_of_discriminator, discriminator.trainable_variables))
As you can see, there are two gradient tapes. I want to know what the difference is if I use a single tape and change the code to the following:
@tf.function
def train_step(images):
    noise = tf.random.normal([BATCH_SIZE, noise_dim])

    with tf.GradientTape() as tape:
        generated_images = generator(noise, training=True)

        real_output = discriminator(images, training=True)
        fake_output = discriminator(generated_images, training=True)

        gen_loss = generator_loss(fake_output)
        disc_loss = discriminator_loss(real_output, fake_output)

    gradients_of_generator = tape.gradient(gen_loss, generator.trainable_variables)
    gradients_of_discriminator = tape.gradient(disc_loss, discriminator.trainable_variables)

    generator_optimizer.apply_gradients(zip(gradients_of_generator, generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(gradients_of_discriminator, discriminator.trainable_variables))
This gives me the following error:
RuntimeError: GradientTape.gradient can only be called once on non-persistent tapes.
I want to know why two tapes are needed. As of now, documentation on the TF 2.0 API is scarce. Can anyone explain, or point me to the right documentation/tutorial?
Spa*_*y05 14
From the documentation of GradientTape:
By default, the resources held by a GradientTape are released as soon as the GradientTape.gradient() method is called. To compute multiple gradients over the same computation, create a persistent gradient tape. This allows multiple calls to the gradient() method, with resources released when the tape object is garbage collected.
A persistent tape can be created with tf.GradientTape(persistent=True) as tape, and it can/should be deleted manually afterwards with del tape (credits to @zwep and @Crispy13 for this).
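As a minimal sketch of the point above (using only standard TensorFlow 2.x, not the DCGAN code from the question): a persistent tape lets gradient() be called more than once over the same recorded computation, and del releases its resources once you are done.

```python
import tensorflow as tf

x = tf.Variable(3.0)

# persistent=True keeps the tape's resources alive after the first gradient() call
with tf.GradientTape(persistent=True) as tape:
    y = x * x  # y = x^2
    z = y * y  # z = x^4

# Both calls succeed; on a non-persistent tape the second would raise
# "GradientTape.gradient can only be called once on non-persistent tapes."
dy_dx = tape.gradient(y, x)  # dy/dx = 2x   -> 6.0
dz_dx = tape.gradient(z, x)  # dz/dx = 4x^3 -> 108.0

del tape  # release the tape's resources manually

print(dy_dx.numpy(), dz_dx.numpy())
```

The same idea applies to the single-tape train_step in the question: passing persistent=True to the one tape removes the RuntimeError, at the cost of the tape holding its resources until it is deleted.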
Viewed 7,102 times