How to handle BigTable Scan InvalidChunk exceptions?

Jos*_*ben · 6 · Tags: python, happybase, google-cloud-bigtable

I am trying to scan BigTable data where some rows are "dirty" — but depending on the scan, this fails with a (serialization?) InvalidChunk exception. The code is as follows:

from google.cloud import bigtable
from google.cloud import happybase
client = bigtable.Client(project=project_id, admin=True)
instance = client.instance(instance_id)
connection = happybase.Connection(instance=instance)
table = connection.table(table_name)

for key, row in table.scan(limit=5000):  #BOOM!
    pass

Omitting certain columns, limiting the number of rows, or specifying start and stop keys allows the scan to succeed. I cannot tell from the stack trace which values are at fault — it varies across columns — the scan just fails. That makes cleaning the data at the source problematic.

When I step in with the Python debugger, I see that the chunk (of type google.bigtable.v2.bigtable_pb2.CellChunk) has no value (it is NULL/undefined):

ipdb> pp chunk.value
b''
ipdb> chunk.value_size
0

I can confirm this from the HBase shell using the rowkey (which I obtained from self._row.row_key).

So the question becomes: how can a BigTable scan filter out columns that have undefined/empty/null values?
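One possible server-side workaround (a sketch, not verified against a live instance): Bigtable value filters compare cell values lexicographically as byte strings, and the empty value b'' sorts strictly below b'\x00', so a ValueRangeFilter with an inclusive lower bound of b'\x00' should exclude empty-valued cells before they reach the client. The live calls are shown in comments because they need a running instance; the byte-ordering fact they rely on is checked locally:

```python
# Hypothetical server-side filter (untested against a live instance): drop
# cells whose value is the empty byte string.
#
#     from google.cloud.bigtable import row_filters
#     non_empty = row_filters.ValueRangeFilter(start_value=b'\x00',
#                                              inclusive_start=True)
#     rows = table.read_rows(limit=5000, filter_=non_empty)
#     rows.consume_all()
#
# The byte-ordering facts the filter relies on, checked locally:
assert b'' < b'\x00'            # empty value falls below the lower bound
assert not (b'\x00' < b'\x00')  # b'\x00' itself is kept (inclusive bound)
```

Whether this avoids the InvalidChunk during streaming depends on the server applying the filter before chunking, which I have not confirmed.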

I get the same problem from both Google Cloud APIs that return generators which internally stream the data as chunks over gRPC:

  • google.cloud.happybase.table.Table#scan()
  • google.cloud.bigtable.table.Table#read_rows().consume_all()

The abbreviated stack trace follows:

---------------------------------------------------------------------------
InvalidChunk                              Traceback (most recent call last)
<ipython-input-48-922c8127f43b> in <module>()
      1 row_gen = table.scan(limit=n) 
      2 rows = []
----> 3 for kvp in row_gen:
      4     pass
.../site-packages/google/cloud/happybase/table.py in scan(self, row_start, row_stop, row_prefix, columns, timestamp, include_timestamp, limit, **kwargs)
    391         while True:
    392             try:
--> 393                 partial_rows_data.consume_next()
    394                 for row_key in sorted(rows_dict):
    395                     curr_row_data = rows_dict.pop(row_key)

.../site-packages/google/cloud/bigtable/row_data.py in consume_next(self)
    273         for chunk in response.chunks:
    274 
--> 275             self._validate_chunk(chunk)
    276 
    277             if chunk.reset_row:

.../site-packages/google/cloud/bigtable/row_data.py in _validate_chunk(self, chunk)
    388             self._validate_chunk_new_row(chunk)
    389         if self.state == self.ROW_IN_PROGRESS:
--> 390             self._validate_chunk_row_in_progress(chunk)
    391         if self.state == self.CELL_IN_PROGRESS:
    392             self._validate_chunk_cell_in_progress(chunk)

.../site-packages/google/cloud/bigtable/row_data.py in _validate_chunk_row_in_progress(self, chunk)
    368         self._validate_chunk_status(chunk)
    369         if not chunk.HasField('commit_row') and not chunk.reset_row:
--> 370             _raise_if(not chunk.timestamp_micros or not chunk.value)
    371         _raise_if(chunk.row_key and
    372                   chunk.row_key != self._row.row_key)

.../site-packages/google/cloud/bigtable/row_data.py in _raise_if(predicate, *args)
    439     """Helper for validation methods."""
    440     if predicate:
--> 441         raise InvalidChunk(*args)

InvalidChunk: 

Can you tell me how to scan BigTable from Python while ignoring/logging the dirty rows that raise InvalidChunk? (try ... except won't work around the generator, which lives inside the Google Cloud API's row_data PartialRowsData class.)
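Since a generator that raises is finished and cannot be resumed, one client-side pattern is to restart the scan just past the last good key whenever validation fails. The sketch below assumes you can recover the failing row's key from the exception (in the real library it is held in the private attribute PartialRowsData._row.row_key); BadRowError and fake_scan are hypothetical stand-ins for InvalidChunk and table.scan:

```python
class BadRowError(Exception):
    """Stand-in for InvalidChunk; carries the dirty row's key."""
    def __init__(self, row_key):
        super().__init__(row_key)
        self.row_key = row_key

def scan_skipping_bad_rows(scan, row_start=b'', max_skips=100):
    """Yield (key, row) pairs from `scan`, restarting the scan just
    past any row whose chunks fail validation."""
    while True:
        try:
            for key, row in scan(row_start=row_start):
                row_start = key + b'\x00'  # smallest key after `key`
                yield key, row
            return                         # scan finished cleanly
        except BadRowError as exc:
            if max_skips == 0:
                raise
            max_skips -= 1
            print('skipping dirty row %r' % exc.row_key)  # or logging
            row_start = exc.row_key + b'\x00'  # resume past the dirty row

# Demo with a fake scan over a table where row b'b' is dirty:
def fake_scan(row_start=b''):
    table = {b'a': 1, b'b': None, b'c': 3}  # None marks the dirty row
    for key in sorted(table):
        if key < row_start:
            continue
        if table[key] is None:
            raise BadRowError(key)
        yield key, table[key]

print(list(scan_skipping_bad_rows(fake_scan)))  # [(b'a', 1), (b'c', 3)]
```

The cost is that each dirty row forces a fresh scan RPC starting at the resume key, so this is only reasonable when dirty rows are rare.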

Also, can you show me how to stream a table scan from BigTable in chunks? HappyBase's batch_size / scan_batching do not appear to be supported.

小智 1

This may be due to this bug: https://github.com/googleapis/google-cloud-python/issues/2980

That bug has since been fixed, so this should no longer be an issue.