Mar*_*oma 35 python keras keras-layer
Sometimes the default standard activations such as ReLU, tanh, and softmax, and the advanced activations such as LeakyReLU, are not enough. The activation you need may not be in keras-contrib either.
How do you create your own activation function?
Mar*_*oma 51
# Creating a model
from keras.models import Sequential
from keras.layers import Dense

# Custom activation function
from keras.layers import Activation
from keras import backend as K
from keras.utils.generic_utils import get_custom_objects

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

get_custom_objects().update({'custom_activation': Activation(custom_activation)})

# Usage
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation(custom_activation, name='SpecialActivation'))
print(model.summary())
Keep in mind that you have to import this function when you save and restore the model. See the note in keras-contrib.
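A minimal sketch of what that save/restore round trip looks like, assuming the model file name `model.h5` is illustrative: `load_model` must be told about the custom function via `custom_objects` (or the function must already be registered with `get_custom_objects`), otherwise loading fails with an unknown-activation error.

```python
# Sketch: saving and restoring a model that uses the custom activation.
# 'model.h5' is an illustrative file name, not one from the answer above.
from keras import backend as K
from keras.models import Sequential, load_model
from keras.layers import Dense

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

model = Sequential()
model.add(Dense(4, input_dim=2, activation=custom_activation))
model.save('model.h5')

# Without custom_objects (or prior registration), load_model cannot
# resolve the name 'custom_activation' and raises an error.
restored = load_model('model.h5',
                      custom_objects={'custom_activation': custom_activation})
print(len(restored.layers))
```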
Epo*_*ous 12
A little simpler than Martin Thoma's answer: you can just create a custom element-wise backend function and pass it as the `activation` argument. You still need to import this function before loading the model.
from keras import backend as K

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

model.add(Dense(32, activation=custom_activation))
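To see what this activation actually computes, here is the same formula sketched in plain NumPy (an illustration, not part of either answer): since the sigmoid squashes its input into (0, 1), `(sigmoid(x) * 5) - 1` maps inputs into the range (-1, 4).

```python
# NumPy illustration of the custom activation (K.sigmoid(x) * 5) - 1.
import numpy as np

def custom_activation_np(x):
    # sigmoid in (0, 1), so the result lies in (-1, 4)
    return (1.0 / (1.0 + np.exp(-x))) * 5 - 1

print(custom_activation_np(0.0))  # sigmoid(0) = 0.5, so 0.5 * 5 - 1 = 1.5
```

This kind of shifted, scaled sigmoid is useful when a layer's output must stay inside a specific bounded interval.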