Wrapping a TensorFlow component for use in Keras


I am using Keras for the rest of my project, but would also like to make use of the Bahdanau attention module that TensorFlow implements (see tf.contrib.seq2seq.BahdanauAttention). I have been trying to do this through the Keras Layer convention, but I am not sure whether this is an appropriate fit.

Is there a convention for wrapping TensorFlow components in this way so that they remain compatible with the computation graph?

I have included the code I have written so far (it does not work yet), and would appreciate any pointers.

from keras import backend as K
from keras.engine.topology import Layer
from keras.models import Model
import numpy as np
import tensorflow as tf

class BahdanauAttention(Layer):

    # The Bahdanau attention layer has to attend to a particular set of memory states.
    # These are usually the outputs of some encoder process, e.g. the sequence of
    # GRU states.
    def __init__(self, memory, num_units, **kwargs):
        self.memory = memory
        self.num_units = num_units
        super(BahdanauAttention, self).__init__(**kwargs)

    def build(self, input_shape):
        # The attention component will be in control of attending to the given memory.
        self.attention = tf.contrib.seq2seq.BahdanauAttention(self.num_units, self.memory)
        cell = tf.contrib.rnn.GRUCell(self.num_units)
        self.cell_with_attention = tf.contrib.seq2seq.DynamicAttentionWrapper(
            cell, self.attention, self.num_units)
        super(BahdanauAttention, self).build(input_shape)

    def call(self, x):
        # Run the attention-wrapped cell over the layer's input sequence.
        outputs, _ = tf.nn.dynamic_rnn(self.cell_with_attention, x, dtype=tf.float32)
        return outputs

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.memory[1], self.num_units)

Answer (All*_*hvk):

Newer versions of Keras provide tf.keras.layers.AdditiveAttention(), which implements Bahdanau-style (additive) attention. This should work out of the box.
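
For reference, a minimal sketch of dropping AdditiveAttention into a functional model. The tensor names (query_input, value_input) and the size of 64 units are illustrative assumptions, not from the question:

import tensorflow as tf

units = 64  # illustrative size, not from the question

# Query = decoder states, value = encoder states (the "memory") to attend over.
query_input = tf.keras.Input(shape=(None, units))
value_input = tf.keras.Input(shape=(None, units))

# AdditiveAttention implements Bahdanau-style (additive) attention and
# returns a tensor of shape (batch, query_timesteps, units).
context = tf.keras.layers.AdditiveAttention()([query_input, value_input])

model = tf.keras.Model([query_input, value_input], context)
model.summary()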

Alternatively, you can write a custom Bahdanau layer yourself in roughly six lines of code, as shown in: Custom Attention Layer using in Keras
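
For completeness, here is a hedged sketch of what such a custom layer can look like. This is an illustrative reimplementation, not the exact code from the linked answer; the class name BahdanauScore and the usage model below are made up for the example:

import tensorflow as tf
from tensorflow.keras import backend as K

class BahdanauScore(tf.keras.layers.Layer):
    # Collapses an encoder sequence (batch, timesteps, features) into a
    # single context vector using additive (Bahdanau-style) scoring.
    def build(self, input_shape):
        feat = int(input_shape[-1])
        self.W = self.add_weight(name="att_weight", shape=(feat, 1),
                                 initializer="glorot_uniform", trainable=True)
        self.b = self.add_weight(name="att_bias", shape=(1,),
                                 initializer="zeros", trainable=True)
        super().build(input_shape)

    def call(self, x):
        # Score each timestep, normalize across time, take the weighted sum.
        e = K.tanh(K.dot(x, self.W) + self.b)   # (batch, timesteps, 1)
        a = K.softmax(e, axis=1)                 # attention weights over time
        return K.sum(x * a, axis=1)              # (batch, features)

# Usage sketch: attend over the outputs of an encoder GRU.
inputs = tf.keras.Input(shape=(20, 32))
h = tf.keras.layers.GRU(64, return_sequences=True)(inputs)
context = BahdanauScore()(h)   # shape (batch, 64)
model = tf.keras.Model(inputs, context)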