Han*_*ham · export · python-3.x · google-bigquery
I use this code to export data to a CSV file, and it works:
from google.cloud import bigquery

project_id = 'project_id'
dataset_id = 'dataset_id'
table_id = 'table_id'
bucket_name = 'bucket_name'

client = bigquery.Client()
destination_uri = 'gs://{}/{}'.format(bucket_name, 'file.csv')

dataset_ref = client.dataset(dataset_id, project=project_id)
table_ref = dataset_ref.table(table_id)

# Export the table to Cloud Storage as an uncompressed CSV file.
extract_job = client.extract_table(table_ref, destination_uri)
extract_job.result()  # wait for the job to finish
But I would prefer a GZ file, since my table is as large as 700M. Can anyone help me export the data to a GZ file instead?
You need to add a job config, like:
job_config = bigquery.job.ExtractJobConfig()
job_config.compression = 'GZIP'
Full code:
from google.cloud import bigquery

client = bigquery.Client()

project_id = 'fh-bigquery'
dataset_id = 'public_dump'
table_id = 'afinn_en_165'
bucket_name = 'your_bucket'
destination_uri = 'gs://{}/{}'.format(bucket_name, 'file.csv.gz')

dataset_ref = client.dataset(dataset_id, project=project_id)
table_ref = dataset_ref.table(table_id)

# Request GZIP compression for the exported file.
job_config = bigquery.job.ExtractJobConfig()
job_config.compression = 'GZIP'

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
)
extract_job.result()  # wait for the export to finish
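Once the extract job finishes and the file has been downloaded from the bucket (e.g. with `gsutil cp`), the compressed CSV can be read back with Python's standard library alone. A minimal sketch: the helper `read_gzipped_csv` and the synthetic sample file below are my own illustration, standing in for the real downloaded `file.csv.gz`.

```python
import csv
import gzip
import os
import tempfile

def read_gzipped_csv(path):
    """Read a GZIP-compressed CSV such as the one the extract job writes.

    BigQuery CSV exports include a header row by default.
    """
    with gzip.open(path, 'rt', newline='') as f:
        reader = csv.reader(f)
        header = next(reader)   # column names
        rows = list(reader)     # remaining data rows
    return header, rows

# Demo on a small synthetic file standing in for the downloaded file.csv.gz:
path = os.path.join(tempfile.mkdtemp(), 'file.csv.gz')
with gzip.open(path, 'wt', newline='') as f:
    csv.writer(f).writerows([['word', 'score'], ['good', '3'], ['bad', '-3']])

header, rows = read_gzipped_csv(path)
print(header)   # ['word', 'score']
print(rows)     # [['good', '3'], ['bad', '-3']]
```

Reading in text mode (`'rt'`) lets the `csv` module handle decoding and line splitting, so no manual decompression step is needed.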
Viewed: 1088 times