Google Cloud Dataflow can't import 'google.cloud.datastore'
Here is my import code:
from __future__ import absolute_import
import datetime
import json
import logging
import re
import apache_beam as beam
from apache_beam import combiners
from apache_beam.io.gcp.bigquery import parse_table_schema_from_json
from apache_beam.io.gcp.datastore.v1.datastoreio import ReadFromDatastore
from apache_beam.pvalue import AsDict
from apache_beam.pvalue import AsSingleton
from apache_beam.options.pipeline_options import PipelineOptions
from google.cloud.proto.datastore.v1 import query_pb2
from google.cloud import datastore
from googledatastore import helper as datastore_helper, PropertyFilter
# datastore entities that we need to perform the mapping computations
#from models import UserPlan, UploadIntervalCount, RollingMonthlyCount
Here is what my requirements.txt file looks like:
$ cat requirements.txt
Flask==0.12.2
apache-beam[gcp]==2.1.1
gunicorn==19.7.1
google-cloud-dataflow==2.1.1
six==1.10.0
google-cloud-datastore==1.3.0
google-cloud
All of this is in the /lib directory. The /lib directory contains the following:
$ ls -1 lib/google/cloud
__init__.py
_helpers.py
_helpers.pyc
_http.py
_http.pyc
_testing.py
_testing.pyc
bigquery
bigtable
client.py
client.pyc
datastore
dns
environment_vars.py
environment_vars.pyc
error_reporting
exceptions.py
exceptions.pyc
gapic
iam.py
iam.pyc
language
language_v1
language_v1beta2
logging
monitoring
obselete.py
obselete.pyc
operation.py
operation.pyc
proto
pubsub
resource_manager
runtimeconfig
spanner
speech
speech_v1
storage
translate.py
translate.pyc
translate_v2
videointelligence.py
videointelligence.pyc
videointelligence_v1beta1
vision
vision_v1
Note that both google.cloud.datastore and google.cloud.proto are present in the /lib folder. However, this import line works fine:
from google.cloud.proto.datastore.v1 import query_pb2
But this one fails:
from google.cloud import datastore
Here is the exception (taken from the online Google Cloud Dataflow console):
(9b49615f4d91c1fb): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
work_executor.execute()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 166, in execute
op.start()
File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10607)
def start(self):
File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10501)
with self.scoped_start_state:
File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9702)
pickler.loads(self.spec.serialized_fn))
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
return dill.loads(s)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
return load(file)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
obj = pik.load()
File "/usr/lib/python2.7/pickle.py", line 858, in load
dispatch[key](self)
File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
value = func(*args)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
return getattr(__import__(module, None, None, [obj]), obj)
File "/usr/local/lib/python2.7/dist-packages/dataflow_pipeline/counters_pipeline.py", line 25, in <module>
from google.cloud import datastore
ImportError: No module named datastore
Why can't the package be found?
External dependencies must be installed via setup.py, and this file should be passed to the pipeline as the --setup_file argument. In setup.py you can install packages with a custom command, such as

pip install google-cloud-datastore==1.3.0

or by adding your package to REQUIRED_PACKAGES:

REQUIRED_PACKAGES = ["google-cloud-datastore==1.3.0"]
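A minimal setup.py along these lines could look as follows (a sketch, not a verified configuration; the package name "dataflow_pipeline" and the version number are placeholders for your own values):

```python
# Sketch of a minimal setup.py for the pipeline package.
# REQUIRED_PACKAGES lists what the Dataflow workers must install
# before they can run the pipeline code.
import setuptools

REQUIRED_PACKAGES = [
    "google-cloud-datastore==1.3.0",
]

setuptools.setup(
    name="dataflow_pipeline",   # placeholder package name
    version="0.0.1",            # placeholder version
    install_requires=REQUIRED_PACKAGES,
    packages=setuptools.find_packages(),
)
```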
The reason you need to specify it in setup.py is that the libraries in your appengine_config are not used during Dataflow execution. App Engine acts only as a scheduler here: it merely submits the job to the Dataflow service. Dataflow then creates worker machines to execute your pipeline, and these workers are not connected to App Engine in any way. The Dataflow workers must have every package needed to execute the pipeline, which is why you specify the required packages in the setup.py file. The workers use this file to "set themselves up".
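Launching the pipeline with the setup file attached might then look like this (a sketch; the project and bucket names are placeholders, and counters_pipeline.py is the pipeline module from the traceback above):

```shell
# Hypothetical invocation; replace my-project and my-bucket with
# your own GCP project and staging bucket.
python counters_pipeline.py \
  --runner DataflowRunner \
  --project my-project \
  --temp_location gs://my-bucket/tmp \
  --setup_file ./setup.py
```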