Miz*_*ury 6 python google-bigquery airflow
I need to run a BigQuery script in Python that outputs a CSV to Google Cloud Storage. Currently, my script triggers the BigQuery code and saves the result directly to my computer.
However, I need it to run in Airflow, so I can't have any local dependencies.
My current script saves the output to my local machine, and then I have to move it to GCS. I've searched online but couldn't figure it out. (P.S. I'm very new to Python, so apologies in advance if this has been asked before!)
```
import datetime  # needed for the timestamp logic below

import pandas as pd
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

def run_script():
    df = pd.read_gbq('SELECT * FROM `table/veiw` LIMIT 15000',
                     project_id='PROJECT',
                     dialect='standard'
                     )
    df.to_csv('XXX.csv', index=False)

def copy_to_gcs(filename, bucket, destination_filename):
    credentials = GoogleCredentials.get_application_default()
    service = discovery.build('storage', 'v1', credentials=credentials)
    body = {'name': destination_filename}
    req = service.objects().insert(bucket=bucket, body=body, media_body=filename)
    resp = req.execute()

current_date = datetime.date.today()
filename = (r"C:\Users\LOCALDRIVE\ETC\ETC\ETC.csv")
bucket = 'My GCS BUCKET'
str_prefix_datetime = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
destfile = 'XXX' + str_prefix_datetime + '.csv'
print('')
```
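As an aside, the `oauth2client`/`discovery` upload in the question can be written more simply with the `google-cloud-storage` client library. This is a sketch, not the asker's code: `build_object_name` mirrors the question's timestamped `destfile` logic, and the bucket and prefix names are placeholder assumptions.

```python
import datetime

def build_object_name(prefix):
    # Timestamped object name, e.g. 'XXX_20240101_120000.csv',
    # mirroring the question's str_prefix_datetime logic.
    stamp = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
    return '{}_{}.csv'.format(prefix, stamp)

def upload_csv(local_path, bucket_name, object_name):
    # Assumes `pip install google-cloud-storage` and application-default
    # credentials; bucket_name/object_name are placeholders.
    from google.cloud import storage
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.blob(object_name).upload_from_filename(
        local_path, content_type='text/csv')
```

Usage would be something like `upload_csv('XXX.csv', 'my-bucket', build_object_name('XXX'))` — but note that the Airflow answer below avoids the local file entirely.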
Airflow provides several operators for working with BigQuery.
You can see an example that runs a query and then exports the results to CSV in the Cloud Composer code samples.
```
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Query recent StackOverflow questions.
bq_recent_questions_query = bigquery_operator.BigQueryOperator(
    task_id='bq_recent_questions_query',
    sql="""
    SELECT owner_display_name, title, view_count
    FROM `bigquery-public-data.stackoverflow.posts_questions`
    WHERE creation_date < CAST('{max_date}' AS TIMESTAMP)
        AND creation_date >= CAST('{min_date}' AS TIMESTAMP)
    ORDER BY view_count DESC
    LIMIT 100
    """.format(max_date=max_query_date, min_date=min_query_date),
    use_legacy_sql=False,
    destination_dataset_table=bq_recent_questions_table_id)

# Export query result to Cloud Storage.
export_questions_to_gcs = bigquery_to_gcs.BigQueryToCloudStorageOperator(
    task_id='export_recent_questions_to_gcs',
    source_project_dataset_table=bq_recent_questions_table_id,
    destination_cloud_storage_uris=[output_file],
    export_format='CSV')
```
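The sample is an excerpt from a larger DAG file, so it relies on names defined earlier. A sketch of those definitions follows; every concrete value (dates, table ID, bucket URI) is a placeholder assumption, not from the original sample.

```python
# Names the excerpt above expects to exist earlier in the DAG file.
# All concrete values here are placeholders.
min_query_date = '2018-02-01'
max_query_date = '2018-02-28'
bq_recent_questions_table_id = 'my_project.my_dataset.bq_recent_questions'
output_file = 'gs://my-example-bucket/recent_questions.csv'

# The operator modules come from the Airflow 1.x contrib packages,
# e.g. (commented out so this fragment runs without Airflow installed):
# from airflow.contrib.operators import bigquery_operator
# from airflow.contrib.operators import bigquery_to_gcs
```

The key point for the question: `BigQueryToCloudStorageOperator` writes the table straight to the `gs://` URI, so no file ever touches the local machine.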
Viewed: 13,864 times