
Authenticating and transferring CSV files from GCS to a remote network drive in Python

I want to use a containerised Python app to connect and authenticate to an AD-managed network drive on a physical file server, and then transfer some CSV files to it from a Google Cloud Storage bucket. What are the best options for doing this?

So far I have established that I can see the server using:

    import smbclient

    try:
        # Open an SMB session against the file server
        smbclient.register_session("xx.xx.xx.xx", username="user", password="pass")
    except Exception as err:  # avoid shadowing the built-in "exec"
        print(err)
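
For context, what I'm ultimately trying to end up with is something along these lines, combining smbclient with the google-cloud-storage client. This is an untested sketch; the bucket name, blob name and UNC path are just placeholders for illustration:

    # Untested sketch: download a CSV from GCS and write it to the share
    # through the session registered above. Names and paths are placeholders.
    import smbclient
    from google.cloud import storage

    def copy_csv_to_share(bucket_name, blob_name, unc_path):
        # Pull the CSV out of the bucket into memory
        client = storage.Client()
        data = client.bucket(bucket_name).blob(blob_name).download_as_bytes()

        # Write it to the SMB share using the registered session
        with smbclient.open_file(unc_path, mode="wb") as remote_file:
            remote_file.write(data)

    copy_csv_to_share("my-bucket", "exports/data.csv", r"\\xx.xx.xx.xx\share\data.csv")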

But this leads me to the question of the best way to authenticate against it. I'm unfortunately not a very experienced programmer; is there some sort of token exchange that I could implement?

I used:

    import os

    # The "password=" key was missing and the space after the comma breaks the option string
    mount_string = (
        f'mount -t cifs -o username={secret_dict["user"]},password={secret_dict["password"]} '
        f'//<serverip>/<share> {mount}'
    )

    # os.system does not raise on failure, so check the exit status instead
    exit_code = os.system(mount_string)
    if exit_code != 0:
        print(f"mount failed with exit code {exit_code}")
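
If the mount approach is the right way to go, I assume the actual copy would then just be writing into the mounted path, roughly like this (untested; the bucket name and mount path are placeholders):

    # Untested sketch: copy every CSV in the bucket onto the mounted share.
    import os
    from google.cloud import storage

    def copy_bucket_csvs_to_mount(bucket_name, mount_path):
        client = storage.Client()
        for blob in client.list_blobs(bucket_name):
            if blob.name.endswith(".csv"):
                # Download each CSV straight onto the mounted share
                destination = os.path.join(mount_path, os.path.basename(blob.name))
                blob.download_to_filename(destination)

    copy_bucket_csvs_to_mount("my-bucket", "/mnt/share")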
