I want to figure out why tf.GradientTape().gradient sometimes returns None, so I tried the following three loss functions (mmd0(), mmd1(), mmd2()). Although mmd0 and mmd1 are written a bit differently, they both still return gradients, but for mmd2 the gradient is None. I printed the loss from all three functions and they all have values. Can anyone explain why this happens?
def mmd0(x, y):  # x and y are lists of arbitrary length
    return x

def mmd1(x1, x2):  # x1 and x2 are lists of arbitrary length
    dis = sum([x**2 for x in x1]) / len(x1) - sum([x**2 for x in x2]) / len(x2)
    return dis**2

def mmd2(x, y):
    dis = x - y
    return [tf.convert_to_tensor(elem) for elem in dis]
def get_MMD_norm(errors, sigma=0.1):
    x2 = np.random.normal(0, sigma, len(errors))
    …
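My current guess is that the gradient comes back as None whenever the computation leaves the TensorFlow graph, e.g. when values pass through numpy and are then re-wrapped with tf.convert_to_tensor, because the tape no longer sees a path back to the watched variables. Here is a minimal sketch (separate from the code above, just to illustrate what I mean):

import tensorflow as tf

x = tf.Variable([1.0, 2.0, 3.0])

with tf.GradientTape(persistent=True) as tape:
    # Stays on tensors: every op is recorded, so a gradient exists.
    loss_connected = tf.reduce_sum(x ** 2)

    # Leaves the graph: .numpy() yields plain floats, and tf.convert_to_tensor
    # wraps them in brand-new constants with no recorded link back to x.
    pieces = [tf.convert_to_tensor(v) for v in x.numpy()]
    loss_detached = tf.add_n(pieces)

print(tape.gradient(loss_connected, x))  # tf.Tensor([2. 4. 6.], ...)
print(tape.gradient(loss_detached, x))   # None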
I followed the instructions at https://github.com/huggingface/transfer-learning-conv-ai to install conv-ai from Huggingface, but I am stuck at the docker build step: docker build -t convai .
I am on macOS 10.15 with Python 3.8, and I have increased Docker's memory to 4G.
I have tried the following to resolve the issue (see the Dockerfile sketch after this list):

- Changing the numpy entry in requirements.txt
- Adding RUN pip3 install --upgrade setuptools to the Dockerfile
- Adding --upgrade to RUN pip3 install -r /tmp/requirements.txt in the Dockerfile
- Adding RUN pip3 install numpy before RUN pip3 install -r /tmp/requirements.txt
- Adding RUN apt-get install python3-numpy before RUN pip3 install -r /tmp/requirements.txt
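For reference, this is roughly how the relevant part of the Dockerfile looks with those edits combined; the FROM line is only a placeholder (the repo's Dockerfile uses its own base image), and /tmp/requirements.txt is the path already used by its install step:

# Placeholder base image; the real Dockerfile's FROM line differs.
FROM python:3.8

COPY requirements.txt /tmp/requirements.txt

# Attempted workarounds: make sure build tooling and numpy are in place
# before the rest of the requirements are resolved.
RUN apt-get update && apt-get install -y python3-numpy
RUN pip3 install --upgrade setuptools
RUN pip3 install numpy
RUN pip3 install --upgrade -r /tmp/requirements.txt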