
module google.cloud has no attribute storage

I'm trying to run a Beam script in Python on GCP, following this tutorial:

https://levelup.gitconnected.com/scaling-scikit-learn-with-apache-beam-251eb6fcf75b

but I keep getting the following error:

AttributeError: module 'google.cloud' has no attribute 'storage'

I have google-cloud-storage in my requirements.txt, so I'm really not sure what I'm missing here.

My full script:

import apache_beam as beam
import json

query = """
    SELECT 
    year, 
    plurality, 
    apgar_5min, 
    mother_age, 
    father_age,
    gestation_weeks,
    ever_born,
    case when mother_married = true then 1 else 0 end as mother_married,
    weight_pounds as weight,
    current_timestamp as time,
    GENERATE_UUID() as guid
    FROM `bigquery-public-data.samples.natality` 
    order by rand()
    limit 100    
""" 

class ApplyDoFn(beam.DoFn):
    def __init__(self):
        self._model = None
        from google.cloud import storage
        import pandas as pd
        import pickle as pkl
        self._storage = storage
        self._pkl = pkl
        self._pd = pd
    
    def process(self, element):
        if self._model is None:
            bucket = self._storage.Client().get_bucket('bqr_dump')
            blob = bucket.get_blob('natality/sklearn-linear')
            self._model = self._pkl.loads(blob.download_as_string())
            
        new_x = self._pd.DataFrame.from_dict(element,
                                            orient='index').transpose().fillna(0)
        pred_weight = self._model.predict(new_x.iloc[:, 1:8])[0]
        return [ {'guid': element['guid'],
                 'predicted_weight': pred_weight,
                 'time': str(element['time'])}]



# set up pipeline options
options = {'project': 'my-project-name',  # placeholder GCP project ID
           'runner': 'DataflowRunner',
           'temp_location': 'gs://bqr_dump/tmp',
           'staging_location': 'gs://bqr_dump/tmp'
           }

pipeline_options = beam.pipeline.PipelineOptions(flags=[], **options)

with beam.Pipeline(options=pipeline_options) as pipeline:
    (
        pipeline
        | 'ReadTable' >> beam.io.Read(beam.io.BigQuerySource(
            query=query,
            use_standard_sql=True))
        | 'Apply Model' >> beam.ParDo(ApplyDoFn())
        | 'Save to BigQuery' >> beam.io.WriteToBigQuery(
            'pzn-pi-sto:beam_test.weight_preds', 
            schema='guid:STRING,weight:FLOAT64,time:STRING', 
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

And my requirements.txt:

google-cloud==0.34.0
google-cloud-storage==1.30.0
apache-beam[GCP]==2.20.0

This issue is usually related to two main causes: either the module was not installed properly, meaning something went wrong during installation, or the module's import was not done correctly.

To fix this, if the cause is a broken installation, the solution is to reinstall the module or check it inside a virtual environment. As shown here in a case similar to yours, this should resolve your situation.
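
If it helps, here is a quick sanity check (a minimal sketch, not part of the question's pipeline) that can be run inside that same virtual environment to confirm google-cloud-storage is actually importable:

# minimal import check, assuming google-cloud-storage is installed in the
# active environment; this should print the Client class rather than raise
# AttributeError
from google.cloud import storage
print(storage.Client)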

For the second cause, try changing the code so that all modules are imported at the beginning of the file, as shown in the official examples here. Your code should look like this:

import apache_beam as beam
import json
import pandas as pd
import pickle as pkl

from google.cloud import storage
...
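
For reference, a sketch of how the DoFn from the question might look once those imports live at module level (same bucket and blob names as in the question; this illustrates the suggestion above and is not a tested pipeline):

class ApplyDoFn(beam.DoFn):
    def __init__(self):
        self._model = None

    def process(self, element):
        # lazily download and unpickle the model on the first element
        if self._model is None:
            bucket = storage.Client().get_bucket('bqr_dump')
            blob = bucket.get_blob('natality/sklearn-linear')
            self._model = pkl.loads(blob.download_as_string())

        # build a one-row DataFrame from the record and predict the weight
        new_x = pd.DataFrame.from_dict(element, orient='index').transpose().fillna(0)
        pred_weight = self._model.predict(new_x.iloc[:, 1:8])[0]
        return [{'guid': element['guid'],
                 'predicted_weight': pred_weight,
                 'time': str(element['time'])}]

Note that with imports at module level on DataflowRunner, you may also need save_main_session=True in the PipelineOptions so the workers can see them.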

Let me know if this information was helpful to you!

