
Executing Google Cloud Shell commands from jupyter

I need to upload a file from FTP to Google Cloud Storage on a daily basis. I have managed to do this in Python by downloading the file from FTP and uploading it to Google Cloud Storage, but it seems too heavy. So, I am curious whether it is possible to do this by executing commands in Google Cloud Shell, or whether there is any other, more optimal way to do it.

import ftplib
import os

from gcloud import storage
from oauth2client.service_account import ServiceAccountCredentials

# Download the file from the FTP server
ftp = ftplib.FTP("ftp_url")
ftp.login('login', 'password')
ftp.cwd("/")

with open('file', 'wb') as f:
    ftp.retrbinary('RETR ' + 'file', f.write)

# credentials_dict is the service account key (JSON) loaded as a dict
credentials = ServiceAccountCredentials.from_json_keyfile_dict(
    credentials_dict
)

client = storage.Client(credentials=credentials, project='project_id')

# Upload the downloaded file to the bucket
bucket = client.get_bucket('bucket')
blob = bucket.blob('file')
blob.upload_from_filename('file')

Maybe you could use Cloud Storage FUSE to mount Cloud Storage buckets as file systems directly on your FTP server, which would help with the issue described in the question.
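
As a rough sketch of that approach, assuming gcsfuse is already installed on the FTP server (the bucket name and mount point below are placeholders):

# Mount the bucket so files written by the FTP server land straight in Cloud Storage
mkdir -p /mnt/my-bucket
gcsfuse my-bucket /mnt/my-bucket

# Unmount when it is no longer needed
fusermount -u /mnt/my-bucket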

Now, answering the title of the question, you can use this code as an example:

Pip install:

!pip install google-cloud
!pip install google-api-python-client
!pip install oauth2client
!pip install google-cloud-bigquery

Code:

import subprocess
import logging
from google.cloud import storage

logger = logging.getLogger('catch_all')

def execute_bash(parameters):
    """Run a shell command and return its output, logging any failure."""
    try:
        return subprocess.check_output(parameters)
    except Exception as e:
        logger.error(e)
        logger.error('ERROR: look in the jupyter console for more information')

def example_list_bucket_gcs():
    # List buckets by shelling out to gsutil
    list_bucket = execute_bash(['gsutil', 'ls']).decode("utf-8").split('\n')
    for bucket in list_bucket:
        print(bucket)

def example_list_bucket_api(client_gcs):
    # List buckets through the Cloud Storage client library
    list_bucket = client_gcs.list_buckets()
    for bucket in list_bucket:
        print(bucket.name)

JSON_FILE_NAME = 'sa_bq.json'
client_gcs = storage.Client.from_service_account_json(JSON_FILE_NAME)
example_list_bucket_api(client_gcs)
example_list_bucket_gcs()
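
Applied back to the original question, the same execute_bash helper could shell out to gsutil cp to push the file downloaded from FTP into the bucket. This is only a sketch; 'file' and 'bucket' are the placeholder names from the question:

# Upload the locally downloaded FTP file with gsutil instead of the client library
execute_bash(['gsutil', 'cp', 'file', 'gs://bucket/file'])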
