Update: see my own answer to this question. This is a bug in TensorFlow's EfficientNet.
What I want
I want to fine-tune EfficientNet. First, I successfully ran training and saved a model. It consists of a frozen EfficientNet plus fully connected layers, and I saved it in the SavedModel format (see train.py). Then, at the fine-tuning stage (see finetune.py), I tried to load that SavedModel, but it failed.
Problem
I cannot successfully load and retrain a SavedModel that contains EfficientNet.
What I tried
I tried load_model and load_weights, but neither helped. Does anyone know how to do this? Is GradientTape incompatible with SavedModel? Should I use something other than load_model or load_weights?
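Roughly, the two loading attempts looked like the sketch below (the checkpoint path 'saved_models/100' is only an illustration, and MyModel is the class from the finetune.py listing further down):

import tensorflow as tf

# Attempt 1: restore the whole SavedModel.
model = tf.keras.models.load_model('saved_models/100')

# Attempt 2: rebuild the architecture in Python and restore only the weights,
# using the SavedModel's variables directory as a checkpoint prefix.
model = MyModel()
model(tf.zeros((1, 256, 256, 3)))  # call once so the variables get built
model.load_weights('saved_models/100/variables/variables')

Neither attempt let me resume training.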
Environment
macOS: 10.15.6
tensorflow==2.3.1
Log output
... (a very long run of warnings like the ones below)
WARNING:tensorflow:Importing a function (__inference_my_model_layer_call_and_return_conditional_losses_3683150) with ops with custom gradients. Will likely fail if a gradient is requested.
WARNING:tensorflow:Importing a function (__inference_my_model_layer_call_and_return_conditional_losses_3683150) with ops with custom gradients. Will likely fail if a gradient is requested.
...
File "finetune.py", line 90, in <module>
_train_loss = train_step(train_images, train_labels).numpy()
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 780, in __call__
result = self._call(*args, **kwds)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 823, in _call
self._initialize(args, kwds, add_initializers_to=initializers)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 697, in _initialize
*args, **kwds))
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 2855, in _get_concrete_function_internal_garbage_collected
graph_function, _, _ = self._maybe_define_function(args, kwargs)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 3213, in _maybe_define_function
graph_function = self._create_graph_function(args, kwargs)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 3075, in _create_graph_function
capture_by_value=self._capture_by_value),
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 986, in func_graph_from_py_func
func_outputs = python_func(*func_args, **func_kwargs)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 600, in wrapped_fn
return weak_wrapped_fn().__wrapped__(*args, **kwds)
File "/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 973, in wrapper
raise e.ag_error_metadata.to_exception(e)
tensorflow.python.autograph.impl.api.StagingError: in user code:
finetune.py:54 train_step *
gradients = tape.gradient(loss, model.trainable_variables)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/backprop.py:1073 gradient **
unconnected_gradients=unconnected_gradients)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/imperative_grad.py:77 imperative_grad
compat.as_str(unconnected_gradients.value))
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:797 _backward_function
return self._rewrite_forward_and_call_backward(call_op, *args)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:712 _rewrite_forward_and_call_backward
forward_function, backwards_function = self.forward_backward(len(doutputs))
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:621 forward_backward
forward, backward = self._construct_forward_backward(num_doutputs)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:669 _construct_forward_backward
func_graph=backwards_graph)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py:986 func_graph_from_py_func
func_outputs = python_func(*func_args, **func_kwargs)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:659 _backprop_function
src_graph=self._func_graph)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/ops/gradients_util.py:669 _GradientsHelper
lambda: grad_fn(op, *out_grads))
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/ops/gradients_util.py:336 _MaybeCompile
return grad_fn() # Exit early
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/ops/gradients_util.py:669 <lambda>
lambda: grad_fn(op, *out_grads))
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:712 _rewrite_forward_and_call_backward
forward_function, backwards_function = self.forward_backward(len(doutputs))
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:621 forward_backward
forward, backward = self._construct_forward_backward(num_doutputs)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:669 _construct_forward_backward
func_graph=backwards_graph)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py:986 func_graph_from_py_func
func_outputs = python_func(*func_args, **func_kwargs)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/eager/function.py:659 _backprop_function
src_graph=self._func_graph)
/Users/a/my_awesome_project/.venv/lib/python3.7/site-packages/tensorflow/python/ops/gradients_util.py:623 _GradientsHelper
(op.name, op.type))
LookupError: No gradient defined for operation 'efficientnetb0/top_activation/IdentityN' (op type: IdentityN)
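The failure mode can be sketched without my full training script. Below is a rough, self-contained illustration under TensorFlow 2.3.1 (SmallEffNet is a hypothetical stand-in for the MyModel class shown later; the path, batch size, and class count are arbitrary): the EfficientNet activation ops come back from the SavedModel as IdentityN ops with custom gradients, and asking GradientTape for gradients through the reloaded model is what triggers the LookupError above. As noted at the end of the post, my Colab attempt showed a different error, so this sketch may not reproduce the failure exactly.

import tensorflow as tf

class SmallEffNet(tf.keras.Model):  # hypothetical stand-in for MyModel below
    def __init__(self):
        super(SmallEffNet, self).__init__()
        self.base = tf.keras.applications.EfficientNetB0(
            input_shape=(256, 256, 3), include_top=False, weights=None)
        self.base.trainable = False
        self.pool = tf.keras.layers.GlobalAveragePooling2D()
        self.head = tf.keras.layers.Dense(200)

    def call(self, x):
        return self.head(self.pool(self.base(x)))

model = SmallEffNet()
model(tf.zeros((1, 256, 256, 3)))      # build variables before saving
model.save('/tmp/effnet_savedmodel')   # SavedModel format

reloaded = tf.keras.models.load_model('/tmp/effnet_savedmodel')

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
images = tf.random.uniform((2, 256, 256, 3))
labels = tf.constant([0, 1])
with tf.GradientTape() as tape:
    loss = loss_fn(labels, reloaded(images, training=True))
# On TF 2.3.1 this is where the LookupError for the IdentityN op surfaces.
gradients = tape.gradient(loss, reloaded.trainable_variables)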
Source code
train.py
finetune.py (I refactored it into a minimal reproduction, so the line numbers do not match those in the error log)
import datetime
import os

import tensorflow as tf

from myutils import decode_jpg  # defined in another module


class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        # Frozen EfficientNet backbone plus a small classification head.
        self.base_model = tf.keras.applications.EfficientNetB0(
            input_shape=(256, 256, 3),
            include_top=False,
            weights='imagenet')
        self.base_model.trainable = False  # unfreeze at the fine-tuning stage later
        self.global_average_layer = tf.keras.layers.GlobalAveragePooling2D()
        self.prediction_layer = tf.keras.layers.Dense(200)

    def call(self, x):
        x = self.base_model(x)
        x = self.global_average_layer(x)
        x = self.prediction_layer(x)
        return x


model = MyModel()
loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()


@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images, training=True)
        loss = loss_object(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))


data = tf.data.Dataset.list_files('./data/*/*.jpg').batch(128).map(decode_jpg)

for epoch in range(100):
    for images, labels in data:
        train_step(images, labels)
    model.save('saved_models/{}'.format(epoch + 1))  # SavedModel format
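The loading step itself is not part of the refactored listing above; roughly, the fine-tuning stage restores one of the SavedModels written by this script and unfreezes the backbone before running train_step again (the path is illustrative):

import tensorflow as tf

# Restore one of the checkpoints written above (path is illustrative).
model = tf.keras.models.load_model('saved_models/100')
model.trainable = True  # unfreeze everything, including the EfficientNet backbone

loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()

@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images, training=True)
        loss = loss_object(labels, predictions)
    # This is the tape.gradient call that raises the LookupError in the log above.
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))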
I tried to reproduce this on Colab, but saw a different error message: https://colab.research.google.com/drive/1gzOwSWJ1Kvwzo01SEpjqGq6Lb-OsI-ob?usp=sharing
I have now filed an issue on the tensorflow/tensorflow repository: https://github.com/tensorflow/tensorflow/issues/43806