How to plot a Keras/TensorFlow subclassing API model?

Nic*_*ais 9 python plot deep-learning keras tensorflow

I built a model with the Keras subclassing API and it runs correctly; model.summary() also works as expected. But when I try to visualize the architecture with tf.keras.utils.plot_model(), it only outputs the following image:

(image: plot_model output showing the whole model as a single block)

This almost feels like a joke from the Keras development team. Here is the full architecture:

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
from sklearn.datasets import load_diabetes
import tensorflow as tf
tf.keras.backend.set_floatx('float64')
from tensorflow.keras.layers import Dense, GaussianDropout, GRU, Concatenate, Reshape
from tensorflow.keras.models import Model

X, y = load_diabetes(return_X_y=True)

data = tf.data.Dataset.from_tensor_slices((X, y)).\
    shuffle(len(X)).\
    map(lambda x, y: (tf.divide(x, tf.reduce_max(x)), y))

training = data.take(400).batch(8)
testing = data.skip(400).map(lambda x, y: (tf.expand_dims(x, 0), y))

class NeuralNetwork(Model):
    def __init__(self):
        super(NeuralNetwork, self).__init__()
        self.dense1 = Dense(16, input_shape=(10,), activation='relu', name='Dense1')
        self.dense2 = Dense(32, activation='relu', name='Dense2')
        self.resha1 = Reshape((1, 32))
        self.gru1 = GRU(16, activation='tanh', recurrent_dropout=1e-1)
        self.dense3 = Dense(64, activation='relu', name='Dense3')
        self.gauss1 = GaussianDropout(5e-1)
        self.conca1 = Concatenate()
        self.dense4 = Dense(128, activation='relu', name='Dense4')
        self.dense5 = Dense(1, name='Dense5')

    def call(self, x, *args, **kwargs):
        x = self.dense1(x)
        x = self.dense2(x)
        a = self.resha1(x)
        a = self.gru1(a)
        b = self.dense3(x)
        b = self.gauss1(b)
        x = self.conca1([a, b])
        x = self.dense4(x)
        x = self.dense5(x)
        return x


skynet = NeuralNetwork()
skynet.build(input_shape=(None, 10))
skynet.summary()

model = tf.keras.utils.plot_model(model=skynet,
         show_shapes=True, to_file='/home/nicolas/Desktop/model.png')

M.I*_*nat 10

I found some workarounds for plotting a model built with the subclassing API. The reason for the limitation is obvious: the subclassing API does not support sequential- or functional-API-style model.summary() output or pretty visualization with plot_model out of the box. Here I will show both.

import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras import layers as L
from tensorflow.keras.applications import VGG16

class my_model(Model):
    def __init__(self, dim):
        super(my_model, self).__init__()
        self.dim   = dim
        self.Base  = VGG16(input_shape=dim, include_top=False, weights='imagenet')
        self.GAP   = L.GlobalAveragePooling2D()
        self.BAT   = L.BatchNormalization()
        self.DROP  = L.Dropout(rate=0.1)
        self.DENS  = L.Dense(256, activation='relu', name = 'dense_A')
        self.OUT   = L.Dense(1, activation='sigmoid')
    
    def call(self, inputs):
        x  = self.Base(inputs)
        g  = self.GAP(x)
        b  = self.BAT(g)
        d  = self.DROP(b)
        d  = self.DENS(d)
        return self.OUT(d)
    
    # AFAIK: The most convenient method to print model.summary() 
    # similar to the sequential or functional API like.
    def build_graph(self):
        # Wrap the subclassed model in a functional Model so that
        # summary() and plot_model() can see the full layer graph.
        x = Input(shape=self.dim)
        return Model(inputs=[x], outputs=self.call(x))

dim = (124,124,3)
model = my_model((dim))
model.build((None, *dim))
model.build_graph().summary()

Running this will produce the following:

Layer (type)                 Output Shape              Param #   
=================================================================
input_67 (InputLayer)        [(None, 124, 124, 3)]     0         
_________________________________________________________________
vgg16 (Functional)           (None, 3, 3, 512)         14714688  
_________________________________________________________________
global_average_pooling2d_32  (None, 512)               0         
_________________________________________________________________
batch_normalization_7 (Batch (None, 512)               2048      
_________________________________________________________________
dropout_5 (Dropout)          (None, 512)               0         
_________________________________________________________________
dense_A (Dense)              (None, 256)               131328    
_________________________________________________________________
dense_7 (Dense)              (None, 1)                 257       
=================================================================
Total params: 14,848,321
Trainable params: 14,847,297
Non-trainable params: 1,024

Now, by using the build_graph function, we can simply plot the whole architecture, as in the sketch below.

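A minimal sketch of that call (the output file name, dpi, and the expand_nested choice are illustrative; all arguments are standard tf.keras.utils.plot_model parameters):

# Plot the functional wrapper rather than the subclassed model itself.
tf.keras.utils.plot_model(
    model.build_graph(),       # the trick: pass the wrapped functional model
    to_file='model.png',       # where to save the diagram (illustrative path)
    dpi=96,                    # resolution of the saved image
    show_shapes=True,          # annotate tensor shapes on the edges
    show_layer_names=True,     # annotate layer names in the boxes
    expand_nested=False        # set True to expand the nested VGG16 block
)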

It will produce the architecture diagram shown below :-)

(image: plot_model diagram of the full architecture)
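The same trick applies to the NeuralNetwork model from the question. Below is a sketch using a standalone helper (build_graph here is my own helper, not a Keras API), assuming a flat input of shape (10,):

# Hypothetical helper: wrap any subclassed model in a functional Model for plotting.
def build_graph(model, input_shape):
    x = tf.keras.Input(shape=input_shape)
    return tf.keras.Model(inputs=[x], outputs=model.call(x))

skynet = NeuralNetwork()
skynet.build(input_shape=(None, 10))

# summary() and plot_model() now see every layer of the subclassed model.
build_graph(skynet, (10,)).summary()
tf.keras.utils.plot_model(build_graph(skynet, (10,)),
                          show_shapes=True, to_file='model.png')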