Posts by Ras*_*pel

Parallel fitting of multiple Keras Models on single GPU

I'm trying to fit multiple small Keras models in parallel on a single GPU. For reasons I can't avoid, I need to take them out of a list and train them one step at a time. Since I had no luck with the standard multiprocessing module, I'm using pathos.

What I tried to do is something like this:

from pathos.multiprocessing import ProcessPool as Pool
import tensorflow as tf
import keras.backend as K

def multiprocess_step(self, model):
    K.set_session(sess)               # sess: a tf.Session created outside this function
    with sess.graph.as_default():     # run this model's ops inside the shared graph
        model = step(model, sess) …
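For context, a minimal sketch of one possible workaround (an assumption, not from the original question): skip multiprocessing entirely and round-robin a single train_on_batch step over the models in one process. It uses tf.keras rather than the standalone keras shown above, and build_model plus the toy x, y data are hypothetical placeholders:

import numpy as np
import tensorflow as tf

def build_model():
    # hypothetical small model; stands in for whatever the list actually holds
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

models = [build_model() for _ in range(4)]
x = np.random.rand(32, 8).astype("float32")   # toy batch
y = np.random.rand(32, 1).astype("float32")

for step in range(10):              # outer training loop
    for model in models:            # one gradient step per model, per iteration
        model.train_on_batch(x, y)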

python multiprocessing keras tensorflow

Score: 6
Answers: 1
Views: 1486

Accessing a file selected with paper-input

I'm trying to upload a file selected through a Polymer <paper-input type="file" id="filepicker"> element, but when I try to access the file:

var file = this.$.filepicker.files

I get a "files is not defined" error.

I haven't found any other way to access the file from the paper-input element, so I'm not sure what the problem is here.

Any help would be appreciated!

javascript web-component polymer

Score: 5
Answers: 1
Views: 1120

Strange NaN loss from a custom Keras loss

I'm trying to implement a custom loss in Keras, but I can't get it to work.


I've implemented it in both numpy and keras.backend:

import numpy as np
import keras.backend as K

def log_rmse_np(y_true, y_pred):
    d_i = np.log(y_pred) - np.log(y_true)
    loss1 = (np.sum(np.square(d_i)) / np.size(d_i))
    loss2 = ((np.square(np.sum(d_i))) / (2 * np.square(np.size(d_i))))
    loss = loss1 - loss2
    print('np_loss = %s - %s = %s' % (loss1, loss2, loss))
    return loss

def log_rmse(y_true, y_pred):
    d_i = (K.log(y_pred) - K.log(y_true))
    loss1 = K.mean(K.square(d_i))
    loss2 = K.square(K.sum(K.flatten(d_i), axis=-1)) / (K.cast_to_floatx(2) * K.square(K.cast_to_floatx(K.int_shape(K.flatten(d_i))[0])))
    loss = loss1 - loss2
    return loss

When I test and compare the losses with the following function, everything seems to work fine.

def check_loss(_shape):
    if _shape == '2d':
        shape = (6, 7)
    elif _shape == …
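One hedged guess at the NaN (not confirmed by the excerpt): K.log() applied to zero or negative predictions. A minimal sketch of a guarded variant, clamping the inputs to a strictly positive range and taking the element count from the runtime tensor shape instead of K.int_shape, which can be None for placeholder dimensions; log_rmse_safe is a hypothetical name, not from the question:

import keras.backend as K

def log_rmse_safe(y_true, y_pred):
    # clamp to a strictly positive range so K.log never sees 0 or negatives
    y_true = K.maximum(y_true, K.epsilon())
    y_pred = K.maximum(y_pred, K.epsilon())
    d_i = K.log(y_pred) - K.log(y_true)
    d_flat = K.flatten(d_i)
    # element count from the runtime shape, cast to float for the division
    n = K.cast(K.shape(d_flat)[0], K.floatx())
    loss1 = K.mean(K.square(d_i))
    loss2 = K.square(K.sum(d_flat)) / (2.0 * K.square(n))
    return loss1 - loss2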

python machine-learning python-3.x keras tensorflow

Score: 5
Answers: 1
Views: 4784

Removing duplicates from a list of lists

I have a list of ~300 lists, but some of them are duplicates and I want to remove them. I tried:

cleanlist = [cleanlist.append(x) for x in oldlist if x not in cleanlist]

But it keeps throwing RuntimeError: maximum recursion depth exceeded in comparison. I tried sys.setrecursionlimit(1500), but that didn't help.

Is there a better way to do this?
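For reference, a minimal sketch of the usual fix: inner lists aren't hashable, so key each one by a tuple and track what has already been seen in a set. The toy oldlist below is only illustrative:

oldlist = [[1, 2], [3, 4], [1, 2], [5]]   # toy data standing in for the ~300 lists

seen = set()
cleanlist = []
for inner in oldlist:
    key = tuple(inner)        # tuples are hashable, lists are not
    if key not in seen:
        seen.add(key)
        cleanlist.append(inner)

print(cleanlist)              # [[1, 2], [3, 4], [5]]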

python list duplicates nested-lists

Score: 0
Answers: 1
Views: 1385