Script to save files from Mac (iCloud) to S3 bucket (AWS)

So I guess first we need to write a script to save all the files from my iCloud, which I assume would require an API call? Then back those files up to my AWS S3 bucket. The next question is: do I have to run the script manually, or can we automate it to run every hour?

Thanks!

If you don't mind, you can use a Python script that does a combination of the following:

  1. pyicloud - To interact with your iCloud services and download your files locally.
from pyicloud import PyiCloudService

api = PyiCloudService('jappleseed@apple.com', 'password')

# Some more setup if you have two-factor (or two-step) authentication

api.files.dir()

# Download the files you want locally, you can also open up a file stream if needed
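
For the download step, a minimal sketch could look like the following, assuming a recent pyicloud version and that your files live in iCloud Drive (the api.drive interface) rather than the older api.files storage; backup_dir is just an illustrative local staging path, and the loop only handles top-level files:

import os
from shutil import copyfileobj

from pyicloud import PyiCloudService

api = PyiCloudService('jappleseed@apple.com', 'password')

# Handle two-factor authentication if your account requires it
if api.requires_2fa:
    code = input("Enter the 2FA code sent to one of your devices: ")
    api.validate_2fa_code(code)

backup_dir = '/tmp/icloud_backup'   # placeholder staging folder
os.makedirs(backup_dir, exist_ok=True)

# Walk the top level of iCloud Drive and stream each file to local disk
for name in api.drive.dir():
    node = api.drive[name]
    if node.type == 'file':         # skip folders in this simple sketch
        with node.open(stream=True) as response:
            with open(os.path.join(backup_dir, name), 'wb') as file_out:
                copyfileobj(response.raw, file_out)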
  2. boto3 - To interact with the S3 service configured in your AWS account and upload (or back up) the files to your bucket.
import os

import boto3

client = boto3.client(
    's3',
    aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY')
)

# Loop over your local files to back them up to S3

# Note: upload_file returns None and raises an exception on failure
client.upload_file(local_filename, bucket_name, s3_filename)
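
To tie the two steps together, the backup loop might look something like this rough sketch; my-icloud-backup is a placeholder for a bucket you already own, and backup_dir is the same staging folder used in the download step above:

import os

import boto3

client = boto3.client('s3')          # credentials picked up from environment variables
bucket_name = 'my-icloud-backup'     # placeholder: use your own bucket name
backup_dir = '/tmp/icloud_backup'

# Upload every file under backup_dir, using its relative path as the S3 key
# so the folder structure is preserved in the bucket
for root, _dirs, files in os.walk(backup_dir):
    for filename in files:
        local_path = os.path.join(root, filename)
        s3_key = os.path.relpath(local_path, backup_dir)
        client.upload_file(local_path, bucket_name, s3_key)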
  3. Finally, you can set up a cron job to schedule your script to run daily if you want to manage the box it runs on yourself. An alternative is a more cloud-native solution: place your entire Python code in a Lambda function that is triggered by a CloudWatch event scheduled daily; a rough sketch of the Lambda entry point is shown below.
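
If you go the Lambda route, the entry point is just a handler function. The sketch below is a skeleton only: run_backup is a hypothetical wrapper around the download and upload steps above, and the schedule expressions mentioned in the comment (for example rate(1 hour) or rate(1 day)) are the kind you would attach to a CloudWatch Events / EventBridge rule:

def run_backup():
    # Placeholder for the pyicloud download + boto3 upload steps sketched above.
    # Note: a Lambda function can only write to /tmp, so stage the files there.
    pass


# Entry point invoked by a scheduled CloudWatch Events / EventBridge rule,
# e.g. one with a schedule expression such as rate(1 hour) or rate(1 day)
def lambda_handler(event, context):
    run_backup()
    return {'status': 'backup complete'}

If you stay with cron instead, a crontab entry along the lines of 0 * * * * python3 /path/to/backup.py (with the path replaced by wherever you keep the script) would run it at the top of every hour.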
