Calling BigQuery from the App Engine standard environment with Python

Cha*_*han 1 google-app-engine python-2.7 google-bigquery

I'm trying to deploy a simple App Engine standard application in Python and then run a BigQuery query from it via the Python BigQuery client.

The code is simple:

from __future__ import absolute_import
import webapp2
import os
from google.cloud import bigquery


class MainPage(webapp2.RequestHandler):
  def get(self):
    client = bigquery.Client(project="ancient-ceiling-125223")
    project_name = str(client.project)

    query_job = client.query("select 1")

    assert query_job.state == 'RUNNING'

    iterator = query_job.result(timeout=30)

    rows = list(iterator)

    self.response.write('nothing to see %s' % (project_name))


app = webapp2.WSGIApplication(
    [('/', MainPage)], debug=True)

Error log: the error occurs on the dummy query request.

('Connection broken: IncompleteRead(209 bytes read)', IncompleteRead(209 bytes read)) (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:1528)
Traceback (most recent call last):
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1511, in __call__
    rv = self.handle_exception(request, response, e)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1505, in __call__
    rv = self.router.dispatch(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1253, in default_dispatcher
    return route.handler_adapter(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1077, in __call__
    return handler.dispatch()
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 547, in dispatch
    return self.handle_exception(e, self.app.debug)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 545, in dispatch
    return method(*args, **kwargs)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/main.py", line 35, in get
    query_job = client.query("select 1")
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/bigquery/client.py", line 986, in query
    job._begin(retry=retry)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/bigquery/job.py", line 397, in _begin
    method='POST', path=path, data=self._build_resource())
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/bigquery/client.py", line 271, in _call_api
    return call()
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/api_core/retry.py", line 260, in retry_wrapped_func
    on_error=on_error,
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/api_core/retry.py", line 177, in retry_target
    return target()
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/_http.py", line 290, in api_request
    headers=headers, target_object=_target_object)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/_http.py", line 183, in _make_request
    return self._do_request(method, url, headers, data, target_object)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/cloud/_http.py", line 212, in _do_request
    url=url, method=method, headers=headers, data=data)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/google/auth/transport/requests.py", line 186, in request
    method, url, data=data, headers=request_headers, **kwargs)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/requests/sessions.py", line 502, in request
    resp = self.send(prep, **send_kwargs)
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/requests/sessions.py", line 652, in send
    r.content
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/requests/models.py", line 825, in content
    self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
  File "/base/data/home/apps/s~ancient-ceiling-125223/20171115t104156.405543624689752939/lib/requests/models.py", line 750, in generate
    raise ChunkedEncodingError(e)
ChunkedEncodingError: ('Connection broken: IncompleteRead(209 bytes read)', IncompleteRead(209 bytes read))

Cha*_*han 5

I couldn't get it to work with the Python BigQuery client library; this is what I found works in the App Engine standard environment:

from __future__ import absolute_import
import webapp2

from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials


class MainPage(webapp2.RequestHandler):
  def get(self):

    # Use the Application Default Credentials of the App Engine service account
    credentials = GoogleCredentials.get_application_default()
    # Build a BigQuery v2 client via the Google API discovery service
    service = build('bigquery', 'v2', credentials=credentials)

    datasets = service.datasets().list(projectId="ancient-ceiling-125223").execute()

    self.response.write('datasets: %s' % datasets)


app = webapp2.WSGIApplication(
    [('/', MainPage)], debug=True)
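Since the original goal was running a query rather than listing datasets, a minimal sketch of the same approach using the BigQuery v2 REST API's synchronous jobs().query() call could look like the following (the QueryPage handler and the /query route are just placeholders for illustration; the project id is the one from the question):

from __future__ import absolute_import
import webapp2

from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials


class QueryPage(webapp2.RequestHandler):
  def get(self):
    credentials = GoogleCredentials.get_application_default()
    service = build('bigquery', 'v2', credentials=credentials)

    # jobs().query() runs the query synchronously and waits up to timeoutMs
    # for the result, so no separate polling loop is needed for short queries.
    body = {'query': 'SELECT 1', 'useLegacySql': False, 'timeoutMs': 30000}
    result = service.jobs().query(projectId="ancient-ceiling-125223",
                                  body=body).execute()

    rows = result.get('rows', [])
    self.response.write('rows: %s' % rows)


app = webapp2.WSGIApplication(
    [('/query', QueryPage)], debug=True)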

  • There you go! That's why I asked you about the environment; I also couldn't get the google-cloud API to work on Standard, only googleapiclient. What I recommend is creating your service outside the request (there's an example of how I do it in this [repo](https://github.com/WillianFuks/example_dataproc_twitter/tree/master/gae/exporter)), as sketched below. Also note that this approach won't work if your query takes longer than 60 seconds; in that case you may need to schedule your query. (4 upvotes)
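A minimal sketch of the commenter's suggestion, building the credentials and service once at module level instead of inside the request handler, could look like this (same googleapiclient calls as in the answer above, only moved out of get()):

from __future__ import absolute_import
import webapp2

from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

# Built once at module import time; every request then reuses the same
# service object instead of rebuilding credentials and client per request.
credentials = GoogleCredentials.get_application_default()
service = build('bigquery', 'v2', credentials=credentials)


class MainPage(webapp2.RequestHandler):
  def get(self):
    datasets = service.datasets().list(projectId="ancient-ceiling-125223").execute()
    self.response.write('datasets: %s' % datasets)


app = webapp2.WSGIApplication(
    [('/', MainPage)], debug=True)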