I am working with a large dataset in CSV format. I am trying to process the data column by column and append each column to a frame in an HDF file, all using Pandas. My motivation is that while the whole dataset is much larger than my physical memory, a single column is a manageable size. At a later stage I will perform stepwise logistic regression over the features by loading the columns back into memory one at a time and operating on them.

I am able to create a new HDF file and create a new frame with the first column:
import pandas

hdf_file = pandas.HDFStore('train_data.hdf')
feature_column = pandas.read_csv('data.csv', usecols=[0])
hdf_file.append('features', feature_column)
But after that, I get a ValueError when trying to append a new column to the frame:
feature_column = pandas.read_csv('data.csv', usecols=[1])
hdf_file.append('features', feature_column)
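For context, the two read_csv calls above produce frames with different column names, and a table written with HDFStore.append grows row-wise, so a later append must match the existing table's columns. A minimal sketch of the mismatch (the small inline frame stands in for the real data.csv; the column names are illustrative):

```python
import pandas as pd

# Stand-in for the real data.csv; column names are illustrative.
pd.DataFrame({"f0": [1, 2], "f1": [3, 4]}).to_csv("data.csv", index=False)

first = pd.read_csv("data.csv", usecols=[0])   # columns: ['f0']
second = pd.read_csv("data.csv", usecols=[1])  # columns: ['f1']

# HDFStore.append in table format appends rows, so the second frame's
# columns would have to match the table created from the first frame;
# here they do not, which is what the "cannot match existing table
# structure" ValueError is complaining about.
print(list(first.columns), list(second.columns))
```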
The stack trace and error message:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/pandas/io/pytables.py", line 658, in append
    self._write_to_group(key, value, table=True, append=True, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/pandas/io/pytables.py", line 923, in _write_to_group
    s.write(obj=value, append=append, complib=complib, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/pandas/io/pytables.py", line 2985, in write
    **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/pandas/io/pytables.py", line 2675, in create_axes
    raise ValueError("cannot match existing table structure for [%s] on appending data" % items)
ValueError: …