BigQuery Dataflow error: "Cannot read and write in different locations" when reading and writing in the EU

dan*_*era 2 python google-bigquery google-cloud-dataflow apache-beam

I have a simple Google Dataflow job. It reads from a BigQuery table and writes to another one, like this:

(p
 | beam.io.Read(beam.io.BigQuerySource(
        query='select dia, import from DS1.t_27k where true',
        use_standard_sql=True))
 | beam.io.Write(beam.io.BigQuerySink(
        output_table,
        dataset='DS1',
        project=project,
        schema='dia:DATE, import:FLOAT',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)))

I think the problem is that this pipeline needs a temporary dataset to do its work, and I cannot force the location of that temporary dataset. Because my DS1 dataset is in the EU (EUROPE-WEST1) while the temporary dataset ends up in the US (I guess), the job fails:

WARNING:root:Dataset m-h-0000:temp_dataset_e433a0ef19e64100000000000001a does not exist so we will create it as temporary with location=None
WARNING:root:A task failed with exception.
 HttpError accessing <https://www.googleapis.com/bigquery/v2/projects/m-h-000000/queries/b8b2f00000000000000002bed336369d?alt=json&maxResults=10000>: response: <{'status': '400', 'content-length': '292', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'expires': 'Sat, 14 Oct 2017 20:29:15 GMT', 'vary': 'Origin, X-Origin', 'server': 'GSE', '-content-encoding': 'gzip', 'cache-control': 'private, max-age=0', 'date': 'Sat, 14 Oct 2017 20:29:15 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'quic=":443"; ma=2592000; v="39,38,37,35"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
 "error": {
  "errors": [
   {
    "domain": "global",
    "reason": "invalid",
    "message": "Cannot read and write in different locations: source: EU, destination: US"
   }
  ],
  "code": 400,
  "message": "Cannot read and write in different locations: source: EU, destination: US"
 }
}
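For reference, the location of the source dataset can be verified with the BigQuery client library. This is a minimal sketch, assuming google-cloud-bigquery is installed and credentials are configured; the project and dataset names are taken from the snippets above:

from google.cloud import bigquery

# Print where the source dataset actually lives (expected: EU).
client = bigquery.Client(project='m-h')
dataset = client.get_dataset('DS1')
print(dataset.dataset_id, dataset.location)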

Pipeline options:

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    PipelineOptions, GoogleCloudOptions, StandardOptions)

options = PipelineOptions()

google_cloud_options = options.view_as(GoogleCloudOptions)
google_cloud_options.project = 'm-h'
google_cloud_options.job_name = 'myjob3'
google_cloud_options.staging_location = r'gs://p_df/staging'  # EUROPE-WEST1
google_cloud_options.region = r'europe-west1'
google_cloud_options.temp_location = r'gs://p_df/temp'  # EUROPE-WEST1
options.view_as(StandardOptions).runner = 'DirectRunner'  # 'DataflowRunner'

p = beam.Pipeline(options=options)

What can I do to avoid this error?

Note that I only get this error when I run the pipeline with DirectRunner.

Mar*_*cki 5

The error Cannot read and write in different locations is fairly self-explanatory, and it can happen because:

  • your BigQuery dataset is in the EU while you run Dataflow in the US, or
  • your GCS bucket is in the EU while you run Dataflow in the US.

As you state in your question, your temp location in GCS is in the EU and your BigQuery dataset is also in the EU, so you have to run the Dataflow job in the EU as well.

To achieve that, you need to specify the zone parameter in PipelineOptions, like this:

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    PipelineOptions, GoogleCloudOptions, StandardOptions, WorkerOptions)

options = PipelineOptions()

wo = options.view_as(WorkerOptions)  # type: WorkerOptions
wo.zone = "europe-west1-b"


# rest of your options:
google_cloud_options = options.view_as(GoogleCloudOptions)
google_cloud_options.project = 'm-h'
google_cloud_options.job_name = 'myjob3'
google_cloud_options.staging_location = r'gs://p_df/staging'  # EUROPE-WEST1
google_cloud_options.region = r'europe-west1'
google_cloud_options.temp_location = r'gs://p_df/temp'  # EUROPE-WEST1
options.view_as(StandardOptions).runner = 'DataflowRunner'

p = beam.Pipeline(options=options)
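The same settings can also be supplied as command-line style flags when constructing PipelineOptions. A minimal sketch with the values used above (the flag names are the standard Beam/Dataflow pipeline options; --zone maps to WorkerOptions):

options = PipelineOptions([
    '--project=m-h',
    '--job_name=myjob3',
    '--staging_location=gs://p_df/staging',
    '--temp_location=gs://p_df/temp',
    '--region=europe-west1',
    '--zone=europe-west1-b',
    '--runner=DataflowRunner',
])
p = beam.Pipeline(options=options)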