Everything I have read about applying dropout to RNNs cites the paper by Zaremba et al., which says not to apply dropout on the recurrent connections: neurons should be dropped randomly before or after the LSTM layers, not on the timestep-to-timestep connections within them. OK.
In the paper everyone cites, it appears that a fresh random "dropout mask" is applied at each timestep, rather than generating one random mask and reusing it across all timesteps of a given layer, then sampling a new mask for the next batch.
Also, and probably more importantly, how does TensorFlow do it? I have checked the TensorFlow API and looked for a detailed explanation, but have not found one.
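To make the distinction concrete, here is a minimal NumPy sketch of the two masking strategies the question contrasts: sampling an independent mask at every timestep versus sampling one mask per sequence and reusing it at all timesteps (the latter is what Gal & Ghahramani's "variational" RNN dropout does). The function names `per_step_dropout` and `per_sequence_dropout` are illustrative, not any library's API; in Keras, the `dropout` and `recurrent_dropout` arguments to `LSTM` control masking of the inputs and of the recurrent state, respectively.

```python
import numpy as np

def per_step_dropout(x, rate, rng):
    # x: (timesteps, units). Sample an independent mask at every timestep.
    keep = 1.0 - rate
    mask = rng.random(x.shape) < keep        # fresh mask per timestep
    return x * mask / keep                   # inverted-dropout scaling

def per_sequence_dropout(x, rate, rng):
    # Sample ONE mask of shape (units,) and reuse it at every timestep,
    # so the same units are dropped for the whole sequence.
    keep = 1.0 - rate
    mask = rng.random(x.shape[1]) < keep     # one mask for all timesteps
    return x * mask / keep

rng = np.random.default_rng(0)
x = np.ones((5, 8))                          # 5 timesteps, 8 units

a = per_step_dropout(x, 0.5, rng)            # rows usually differ
b = per_sequence_dropout(x, 0.5, rng)        # every row zeroes the same units

print((b == b[0]).all())                     # True: shared mask across timesteps
```

With the shared mask, a dropped unit stays dropped for the entire sequence, which is the behavior the per-batch "reuse the mask" reading of the question describes.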
I have not been able to install Keras in an Anaconda environment...
~$ pip install keras
Collecting keras
Collecting pyyaml (from keras)
Using cached PyYAML-3.12.tar.gz
Complete output from command python setup.py egg_info:
running egg_info
creating pip-egg-info/PyYAML.egg-info
writing top-level names to pip-egg-info/PyYAML.egg-info/top_level.txt
writing dependency_links to pip-egg-info/PyYAML.egg-info/dependency_links.txt
writing pip-egg-info/PyYAML.egg-info/PKG-INFO
writing manifest file 'pip-egg-info/PyYAML.egg-info/SOURCES.txt'
warning: manifest_maker: standard file '-c' not found
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-build-b74rx7yf/pyyaml/setup.py", line 339, in <module>
cmdclass=cmdclass,
File "/home/bee/anaconda3/envs/roar/lib/python3.5/distutils/core.py", line 148, in setup
dist.run_commands()
File "/home/bee/anaconda3/envs/roar/lib/python3.5/distutils/dist.py", line 955, …