I am using the Keras library to create a neural network. I have an iPython Notebook that loads the training data, initializes the network, and fits the neural network's weights. At the end, I save the weights using the save_weights() method. The code is as follows:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD
from keras.regularizers import l2
from keras.callbacks import History
[...]
input_size = data_X.shape[1]
output_size = data_Y.shape[1]
hidden_size = 100
learning_rate = 0.01
num_epochs = 100
batch_size = 75
history = History()  # callback object referenced in model.fit() below
model = Sequential()
model.add(Dense(hidden_size, input_dim=input_size, init='uniform'))
model.add(Activation('tanh'))
model.add(Dropout(0.2))
model.add(Dense(hidden_size))
model.add(Activation('tanh'))
model.add(Dropout(0.2))
model.add(Dense(output_size))
model.add(Activation('tanh'))
sgd = SGD(lr=learning_rate, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='mse', optimizer=sgd)
model.fit(X_NN_part1, Y_NN_part1, batch_size=batch_size, nb_epoch=num_epochs, validation_data=(X_NN_part2, Y_NN_part2), callbacks=[history])
y_pred = model.predict(X_NN_part2) # works well
model.save_weights('keras_w')
Then, in another iPython Notebook, I just want to use these weights and predict some output values from the inputs. I initialize the same neural network and then load the weights. …
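For reference, a minimal sketch of what the second notebook could look like, assuming the same architecture is rebuilt and input_size, hidden_size and output_size are redefined to the training values; X_new is a hypothetical placeholder for the new input array, not a name from the original code:

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation

# Rebuild exactly the same architecture as in the training notebook.
model = Sequential()
model.add(Dense(hidden_size, input_dim=input_size, init='uniform'))
model.add(Activation('tanh'))
model.add(Dropout(0.2))
model.add(Dense(hidden_size))
model.add(Activation('tanh'))
model.add(Dropout(0.2))
model.add(Dense(output_size))
model.add(Activation('tanh'))

# Compile before predicting (needed in older Keras versions);
# the loss/optimizer choice does not affect inference.
model.compile(loss='mse', optimizer='sgd')

# Load the previously saved weights and predict on new inputs.
model.load_weights('keras_w')
y_pred = model.predict(X_new)  # X_new: hypothetical array of shape (n_samples, input_size)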
I am trying to predict a time series: given the 50 previous values, I want to predict the next 5 values.
To do this, I am using the skflow package (based on TensorFlow), and this question is fairly close to the Boston example provided in the GitHub repo.
My code is as follows:
%matplotlib inline
import numpy as np
import pandas as pd
import skflow
from sklearn import cross_validation, metrics
from sklearn import preprocessing
filepath = 'CSV/FILE.csv'
ts = pd.Series.from_csv(filepath)
nprev = 50
deltasuiv = 5
def load_data(data, n_prev=nprev, delta_suiv=deltasuiv):
    # Build sliding windows: 50 previous values as X, the next 5 values as Y.
    docX, docY = [], []
    for i in range(len(data) - n_prev - delta_suiv):
        docX.append(np.array(data[i:i+n_prev]))
        docY.append(np.array(data[i+n_prev:i+n_prev+delta_suiv]))
    alsX = np.array(docX)
    alsY = np.array(docY)
    return alsX, alsY
X, y = load_data(ts.values)
# Scale data to 0 mean and unit std dev.
scaler …
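The snippet is cut off at the scaler. For context, here is a minimal sketch of how the scaling and model fitting might continue, loosely following the skflow Boston example; the regressor choice, the hyperparameter values, and the assumption that the regressor accepts the 5-dimensional target directly are my own assumptions, not taken from the original code:

# Scale inputs to zero mean and unit standard deviation, as in the Boston example.
scaler = preprocessing.StandardScaler()
X_scaled = scaler.fit_transform(X)

# Hold out part of the windows for evaluation.
X_train, X_test, y_train, y_test = cross_validation.train_test_split(
    X_scaled, y, test_size=0.2, random_state=42)

# A small DNN regressor; hidden_units/steps/learning_rate/batch_size are illustrative only.
regressor = skflow.TensorFlowDNNRegressor(hidden_units=[100, 50],
                                          steps=5000,
                                          learning_rate=0.01,
                                          batch_size=32)
regressor.fit(X_train, y_train)

y_pred = regressor.predict(X_test)
print('MSE:', metrics.mean_squared_error(y_test, y_pred))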
I would like to use a Gaussian process to solve a regression task. My data looks like this: each X vector has length 37, and each Y vector has length 8.
I am using the sklearn package in Python, but trying to use the Gaussian process results in an Exception:
from sklearn import gaussian_process
print "x :", x__
print "y :", y__
gp = gaussian_process.GaussianProcess(theta0=1e-2, thetaL=1e-4, thetaU=1e-1)
gp.fit(x__, y__)
x : [[ 136. 137. 137. 132. 130. 130. 132. 133. 134. 135. 135. 134. 134. 1139. 1019.
       0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.
       70. 24. 55. 0. 9. 0. 0.]
     [ 136. 137. 137. 132. 130. 130. 132. 133. 134. 135. 135. 134. 134. 1139. 1019.
       0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.
       70. 24. 55. 0. 9. 0. 0.]
     [ 82. 76. 80. 103. 135. 155. 159. 156. 145. 138. …
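Not from the original post, but one workaround worth sketching: this generation of sklearn's GaussianProcess is normally used with a single output column, so a common approach is to fit one model per output dimension. A minimal sketch under that assumption (x__ and y__ as above):

import numpy as np
from sklearn import gaussian_process

x__ = np.asarray(x__, dtype=float)   # shape (n_samples, 37)
y__ = np.asarray(y__, dtype=float)   # shape (n_samples, 8)

models = []
predictions = []
for j in range(y__.shape[1]):
    # One independent Gaussian process per output column.
    gp = gaussian_process.GaussianProcess(theta0=1e-2, thetaL=1e-4, thetaU=1e-1)
    gp.fit(x__, y__[:, j])
    models.append(gp)
    predictions.append(gp.predict(x__))

y_pred = np.column_stack(predictions)  # shape (n_samples, 8)

Note also that this version of GaussianProcess raises an exception when X contains duplicate rows, which may be worth checking independently of the output dimensionality.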
I am currently using OpenCV 2.4 to apply morphological transformations to a binary image.
I just noticed that, with OpenCV's built-in function, all of my pixels end up shifted one position to the right and one down (i.e. a pixel previously at (i,j) is now at (i+1, j+1)).
import cv2
import numpy as np
from skimage.morphology import opening
image = cv2.imread('input.png', 0)
kernel = np.ones((16,16), np.uint8)
opening_opencv = cv2.morphologyEx(image, cv2.MORPH_OPEN, kernel)
opening_skimage = opening(image, kernel)
cv2.imwrite('opening_opencv.png', opening_opencv)
cv2.imwrite('opening_skimage.png', opening_skimage)
Input:
(input image)
Output:
(OpenCV opening result, shifted by one pixel)
Since I did not understand why, I simply ran the same operation through skimage, and it does not produce this "gap" during the morphological transformation.
Output:
(skimage opening result, no shift)
Any ideas about this issue?
Thanks!
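Not part of the original question, but one hypothesis worth testing: with an even-sized (16x16) kernel the structuring element has no symmetric centre, and OpenCV's default anchor is the rounded kernel centre, which can offset the result by one pixel relative to skimage. A minimal sketch of two things to try, an explicit anchor and an odd-sized kernel:

import cv2
import numpy as np

image = cv2.imread('input.png', 0)

# Variant 1: keep the 16x16 kernel but experiment with the anchor.
# The default anchor (-1, -1) resolves to (8, 8) for a 16x16 kernel,
# so (7, 7) moves the reference point by one pixel.
kernel_even = np.ones((16, 16), np.uint8)
opening_anchored = cv2.morphologyEx(image, cv2.MORPH_OPEN, kernel_even, anchor=(7, 7))

# Variant 2: use an odd-sized kernel so the default anchor is the true centre.
kernel_odd = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
opening_odd = cv2.morphologyEx(image, cv2.MORPH_OPEN, kernel_odd)

cv2.imwrite('opening_anchored.png', opening_anchored)
cv2.imwrite('opening_odd.png', opening_odd)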
I am running into a problem when trying to fit my GRU model to my training data. After a quick look on StackOverflow, I found this post, which is very similar to my issue:
My own model is as follows:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.embeddings import Embedding
from keras.layers.recurrent import GRU
from keras.callbacks import History

nn = Sequential()
nn.add(Embedding(input_size, hidden_size))
nn.add(GRU(hidden_size_2, return_sequences=False))
nn.add(Dropout(0.2))
nn.add(Dense(output_size))
nn.add(Activation('linear'))
nn.compile(loss='mse', optimizer="rmsprop")
history = History()
nn.fit(X_train, y_train, batch_size=30, nb_epoch=200, validation_split=0.1, callbacks=[history])
The error is:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-14-e2f199af6e0c> in <module>()
1 history = History()
----> 2 nn.fit(X_train, y_train, batch_size=30, nb_epoch=200, validation_split=0.1, callbacks=[history])
C:\Users\XXXX\AppData\Local\Continuum\Anaconda\lib\site-packages\keras\models.pyc in fit(self, X, y, batch_size, nb_epoch, verbose, callbacks, validation_split, validation_data, shuffle, show_accuracy, class_weight, sample_weight)
    487         verbose=verbose, callbacks=callbacks,
    488         val_f=val_f, val_ins=val_ins,
--> 489         shuffle=shuffle, metrics=metrics)
    490
    491     def predict(self, X, batch_size=128, verbose=0): …
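The traceback is cut off before the KeyError itself, so this is only a guess on my part: in this generation of Keras, fit() slices X_train/y_train with integer batch indices, which raises a KeyError when the data are pandas DataFrames/Series rather than NumPy arrays. A minimal sanity-check sketch (X_train/y_train as in the snippet above):

import numpy as np

# Make sure the training data are plain NumPy arrays, not pandas objects.
X_train = np.asarray(X_train)
y_train = np.asarray(y_train)

print(X_train.shape, X_train.dtype)  # Embedding expects 2D integer indices: (n_samples, sequence_length)
print(y_train.shape, y_train.dtype)  # Dense(output_size) expects targets of shape (n_samples, output_size)

history = History()
nn.fit(X_train, y_train, batch_size=30, nb_epoch=200,
       validation_split=0.1, callbacks=[history])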
forecasting ×2
keras ×2
theano ×2
gaussian ×1
keyerror ×1
opencv ×1
regression ×1
scikit-learn ×1
skflow ×1
tensorflow ×1