How do I suppress verbose TensorFlow logging?

ton*_*ian 16 nose deep-learning tensorflow

I'm unit-testing my TensorFlow code with nosetests, but it produces such verbose output that it's useless.

The following test

import unittest
import tensorflow as tf

class MyTest(unittest.TestCase):

    def test_creation(self):
        self.assertEquals(True, False)

produces a huge amount of useless logging when run with nosetests:

FAIL: test_creation (tests.test_tf.MyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/cebrian/GIT/thesis-nilm/code/deepmodels/tests/test_tf.py", line 10, in test_creation
    self.assertEquals(True, False)
AssertionError: True != False
-------------------- >> begin captured logging << --------------------
tensorflow: Level 1: Registering Const (<function _ConstantShape at 0x7f4379131c80>) in shape functions.
tensorflow: Level 1: Registering Assert (<function no_outputs at 0x7f43791319b0>) in shape functions.
tensorflow: Level 1: Registering Print (<function _PrintGrad at 0x7f4378effd70>) in gradient.
tensorflow: Level 1: Registering Print (<function unchanged_shape at 0x7f4379131320>) in shape functions.
tensorflow: Level 1: Registering HistogramAccumulatorSummary (None) in gradient.
tensorflow: Level 1: Registering HistogramSummary (None) in gradient.
tensorflow: Level 1: Registering ImageSummary (None) in gradient.
tensorflow: Level 1: Registering AudioSummary (None) in gradient.
tensorflow: Level 1: Registering MergeSummary (None) in gradient.
tensorflow: Level 1: Registering ScalarSummary (None) in gradient.
tensorflow: Level 1: Registering ScalarSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering MergeSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering AudioSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering ImageSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering HistogramSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering HistogramAccumulatorSummary (<function _ScalarShape at 0x7f4378f042a8>) in shape functions.
tensorflow: Level 1: Registering Pack (<function _PackShape at 0x7f4378f047d0>) in shape functions.
tensorflow: Level 1: Registering Unpack (<function _UnpackShape at 0x7f4378f048c0>) in shape functions.
tensorflow: Level 1: Registering Concat (<function _ConcatShape at 0x7f4378f04938>) in shape functions.
tensorflow: Level 1: Registering ConcatOffset (<function _ConcatOffsetShape at 0x7f4378f049b0>) in shape functions.

......

whereas importing TensorFlow from an ipython console is nowhere near as verbose:

$ ipython
Python 2.7.11+ (default, Apr 17 2016, 14:00:29) 
Type "copyright", "credits" or "license" for more information.

IPython 4.2.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

In [1]: import tensorflow as tf
I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcublas.so locally
I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcudnn.so locally
I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcufft.so locally
I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcuda.so locally
I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcurand.so locally

In [2]:

How can I suppress the former logging when running nosetests?

cra*_*ael 30

1.0更新(5/20/17):

In TensorFlow 1.0, per this issue, you can now control logging via the environment variable TF_CPP_MIN_LOG_LEVEL; it defaults to 0 (all logs shown), but can be set to 1 to filter out INFO logs, 2 to additionally filter out WARNING logs, and 3 to additionally filter out ERROR logs. See the following generic, OS-independent Python example:

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'  # must be set before TensorFlow is imported
import tensorflow as tf
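Since TF_CPP_MIN_LOG_LEVEL is an ordinary environment variable, it can equally be set in the shell rather than in Python; any test runner launched afterwards inherits it, as this small sketch shows:

```shell
# Export for the current shell session; processes started afterwards
# (e.g. your test runner) inherit the setting.
export TF_CPP_MIN_LOG_LEVEL=3

# Child processes see the value:
sh -c 'echo "$TF_CPP_MIN_LOG_LEVEL"'  # → 3
```

The key point either way is that the variable must be in the environment before TensorFlow is imported, because the C++ runtime reads it at import time.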

For earlier versions of TensorFlow or TF-Learn logging, see the following:

Check out the page below for information on TensorFlow logging; with the newer updates, you can set the logging verbosity to DEBUG, INFO, WARN, or ERROR. For example:

tf.logging.set_verbosity(tf.logging.ERROR)

That page also covers monitors that can be used with TF-Learn models. Here is the page.

This doesn't block all logging, though (only TF-Learn). I have two solutions: one is a "technically correct" (Linux) solution, and the other involves rebuilding TensorFlow. For reference, the TF_CPP_MIN_LOG_LEVEL levels map as follows:

  Level | Level for Humans | Level Description                  
 -------|------------------|------------------------------------ 
  0     | DEBUG            | [Default] Print all messages       
  1     | INFO             | Filter out INFO messages           
  2     | WARNING          | Filter out INFO & WARNING messages 
  3     | ERROR            | Filter out all messages      

For the latter, see this answer, which involves modifying the source and rebuilding TensorFlow.
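The "technically correct" Linux-side workaround mentioned above amounts to filtering TensorFlow's C++ informational lines (the ones prefixed `I tensorflow/`, as seen in the ipython session in the question) out of the output. A minimal sketch, simulating such output with `printf` rather than a real test run:

```shell
# Simulate a program whose output mixes TensorFlow's informational C++
# lines ("I tensorflow/...") with real output, then filter them out:
printf 'I tensorflow/stream_executor/dso_loader.cc:108] opened libcublas.so\nOK\n' \
  | grep -v '^I tensorflow/'   # → OK
```

In practice you would pipe the test runner's combined output through the same filter, e.g. `python my_tests.py 2>&1 | grep -v '^I tensorflow/'` (script name illustrative).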


小智 9

Running the tests with nosetests --nologcapture will disable the display of these logs. More information on the logcapture plugin: https://nose.readthedocs.io/en/latest/plugins/logcapture.html
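Alternatively, note that the captured lines in the failure output come from the Python-side `tensorflow` logger (hence the `tensorflow: Level 1:` prefix in nose's capture). Raising that logger's threshold with the standard `logging` module, before the noisy code runs, should also quiet them without turning off log capture entirely. A minimal sketch using only the stdlib:

```python
import logging

# TensorFlow's Python code logs through the standard "tensorflow" logger;
# raising its level filters out the Level-1/DEBUG registration chatter
# that nose's logcapture plugin would otherwise record.
logging.getLogger('tensorflow').setLevel(logging.WARNING)

# DEBUG-level records are now suppressed on this logger:
print(logging.getLogger('tensorflow').isEnabledFor(logging.DEBUG))  # → False
```

This could go in a test-suite setup hook so it runs before any test imports or exercises TensorFlow.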


Yar*_*tov 0

Here is an example of how to do this. Unfortunately, it requires modifying the source and rebuilding. There is a tracking bug open to make this easier.


Archived:

Views:

21038 times

Last updated:

6 years, 3 months ago