
Sync two buckets through boto3

Is there any way to use boto3 to loop over the contents of two different buckets (source and destination) and, if any key is found in the source that does not exist in the destination, upload it to the destination bucket? Note that I don't want to use aws s3 sync. I am currently using the following code to do this job:

import boto3

s3 = boto3.resource('s3')
src = s3.Bucket('sourcenabcap')
dst = s3.Bucket('destinationnabcap')
objs = list(dst.objects.all())
for k in src.objects.all():
    if k.key != objs[0].key:
        pass  # copy the k.key to target

In case you decide not to use boto3: the sync command is still not available in boto3, so you can simply call it directly:

# python 3
import os

sync_command = "aws s3 sync s3://source-bucket/ s3://destination-bucket/"
os.system(sync_command)
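A slightly more defensive variant of the same idea (my addition, not part of the original answer) calls the CLI through subprocess so a failed sync raises an exception instead of being silently ignored; the bucket URLs are the same placeholders as above:

import subprocess

# Run the AWS CLI sync; check=True raises CalledProcessError on a non-zero exit.
subprocess.run(
    ["aws", "s3", "sync", "s3://source-bucket/", "s3://destination-bucket/"],
    check=True,
)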

I just implemented a simple class for this (syncing a local folder to a bucket). I'm posting it here hoping it helps anyone with the same problem.

You could modify S3Sync.sync to take file size into account; a sketch of that is shown after the class.

from bisect import bisect_left
from pathlib import Path
from typing import List

import boto3


class S3Sync:
    """
    Class that holds the operations needed to synchronize local dirs to a given bucket.
    """

    def __init__(self):
        self._s3 = boto3.client('s3')

    def sync(self, source: str, dest: str) -> None:
        """
        Sync source to dest: every element that exists in
        source but does not exist in dest will be copied to dest.

        No element will be deleted.

        :param source: Source folder.
        :param dest: Destination folder.

        :return: None
        """

        paths = self.list_source_objects(source_folder=source)
        objects = self.list_bucket_objects(dest)

        # Getting the keys and ordering to perform binary search
        # each time we want to check if any paths is already there.
        object_keys = [obj['Key'] for obj in objects]
        object_keys.sort()
        object_keys_length = len(object_keys)
        
        for path in paths:
            # Binary search for the key among the sorted bucket keys.
            index = bisect_left(object_keys, path)
            # The path is missing if the insertion point is past the end
            # or the key at the insertion point is not an exact match.
            if index == object_keys_length or object_keys[index] != path:
                self._s3.upload_file(str(Path(source).joinpath(path)), Bucket=dest, Key=path)

    def list_bucket_objects(self, bucket: str) -> List[dict]:
        """
        List all objects for the given bucket.

        :param bucket: Bucket name.
        :return: A list of dicts describing the objects in the bucket.

        Example of a single object.

        {
            'Key': 'example/example.txt',
            'LastModified': datetime.datetime(2019, 7, 4, 13, 50, 34, 893000, tzinfo=tzutc()),
            'ETag': '"b11564415be7f58435013b414a59ae5c"',
            'Size': 115280,
            'StorageClass': 'STANDARD',
            'Owner': {
                'DisplayName': 'webfile',
                'ID': '75aa57f09aa0c8caeab4f8c24e99d10f8e7faeebf76c078efc7c6caea54ba06a'
            }
        }

        """
        try:
            # Note: list_objects returns at most 1000 keys per response;
            # see the paginator-based variant after the class for larger buckets.
            contents = self._s3.list_objects(Bucket=bucket)['Contents']
        except KeyError:
            # No Contents Key, empty bucket.
            return []
        else:
            return contents

    @staticmethod
    def list_source_objects(source_folder: str) -> List[str]:
        """
        :param source_folder:  Root folder for resources you want to list.
        :return: A list of file paths relative to source_folder.

        Example:

            /tmp
                - example
                    - file_1.txt
                    - some_folder
                        - file_2.txt

            >>> sync.list_source_objects("/tmp/example")
            ['file_1.txt', 'some_folder/file_2.txt']

        """

        path = Path(source_folder)

        paths = []

        for file_path in path.rglob("*"):
            if file_path.is_dir():
                continue
            # Store the path relative to the source folder.
            str_file_path = str(file_path.relative_to(path))
            paths.append(str_file_path)

        return paths


if __name__ == '__main__':
    sync = S3Sync()
    sync.sync("/temp/some_folder", "some_bucket_name")
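One caveat: list_objects returns at most 1000 keys per response, so very large buckets would be listed only partially. A paginator-based drop-in for list_bucket_objects, as a sketch of mine rather than part of the original answer:

    def list_bucket_objects(self, bucket: str) -> List[dict]:
        """
        List all objects for the given bucket, paginating past the
        1000-key-per-response limit of the underlying API.
        """
        paginator = self._s3.get_paginator('list_objects_v2')
        contents = []
        for page in paginator.paginate(Bucket=bucket):
            # 'Contents' is absent on empty pages, hence the default.
            contents.extend(page.get('Contents', []))
        return contents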

Replace if file_path.is_dir(): with if not file_path.is_file(): so it also skips unresolvable links and other such nonsense; thanks to @keithpjolley for pointing this out.
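As for the file-size suggestion above, a minimal sketch (my variation, not the original author's code) that re-uploads whenever the local size differs from the stored Size:

    def sync(self, source: str, dest: str) -> None:
        paths = self.list_source_objects(source_folder=source)
        # Map each bucket key to its stored size for O(1) lookups.
        sizes = {obj['Key']: obj['Size'] for obj in self.list_bucket_objects(dest)}
        for path in paths:
            local_size = Path(source).joinpath(path).stat().st_size
            # Upload when the key is missing or its stored size differs.
            if sizes.get(path) != local_size:
                self._s3.upload_file(str(Path(source).joinpath(path)), Bucket=dest, Key=path)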

If you only want to compare by key (ignoring differences within the objects themselves), you can use something like this:

import boto3

s3 = boto3.resource('s3')
source_bucket = s3.Bucket('source')
destination_bucket = s3.Bucket('destination')
# Use a set of keys for fast membership checks.
destination_keys = {obj.key for obj in destination_bucket.objects.all()}
for obj in source_bucket.objects.all():
    if obj.key not in destination_keys:
        pass  # copy obj.key to destination
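To fill in the copy step (my addition, not part of the original answer), Bucket.copy performs a managed copy between buckets, so the loop becomes:

for obj in source_bucket.objects.all():
    if obj.key not in destination_keys:
        # Copy the object from the source bucket into the destination.
        destination_bucket.copy({'Bucket': source_bucket.name, 'Key': obj.key}, obj.key)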
To sync between buckets in two different AWS accounts:

  1. Get the destination account ID DEST_ACCOUNT_ID.

  2. Create the source bucket and add this policy (a boto3 sketch for applying both policies follows the list):

     {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Sid": "DelegateS3Access",
                 "Effect": "Allow",
                 "Principal": {
                     "AWS": "arn:aws:iam::DEST_ACCOUNT_ID:root"
                 },
                 "Action": [
                     "s3:ListBucket",
                     "s3:GetObject"
                 ],
                 "Resource": [
                     "arn:aws:s3:::s3-copy-test/*",
                     "arn:aws:s3:::s3-copy-test"
                 ]
             }
         ]
     }

  3. Create the files to be copied.

  4. Create a user on the destination account and configure the AWS CLI with that user.

  5. Create the destination bucket on the destination account.

  6. Attach this policy to the IAM user on the destination account:

     {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:ListBucket",
                     "s3:GetObject"
                 ],
                 "Resource": [
                     "arn:aws:s3:::s3-copy-test",
                     "arn:aws:s3:::s3-copy-test/*"
                 ]
             },
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:ListBucket",
                     "s3:PutObject",
                     "s3:PutObjectAcl"
                 ],
                 "Resource": [
                     "arn:aws:s3:::s3-copy-test-dest",
                     "arn:aws:s3:::s3-copy-test-dest/*"
                 ]
             }
         ]
     }

  7. Perform the file sync:

aws s3 sync s3://s3-copy-test s3://s3-copy-test-dest --source-region eu-west-1 --region eu-west-1
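If you would rather apply the two policies from Python instead of the console, here is a minimal boto3 sketch (my addition, not part of the original answer; the user name and policy name are hypothetical, and DEST_ACCOUNT_ID is the same placeholder as above):

import json

import boto3

# Policy from step 2, expressed as a Python dict.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DelegateS3Access",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::DEST_ACCOUNT_ID:root"},
        "Action": ["s3:ListBucket", "s3:GetObject"],
        "Resource": ["arn:aws:s3:::s3-copy-test/*", "arn:aws:s3:::s3-copy-test"],
    }],
}

# Attach it to the source bucket (run with source-account credentials).
boto3.client('s3').put_bucket_policy(
    Bucket='s3-copy-test',
    Policy=json.dumps(bucket_policy),
)

# Policy from step 6, attached as an inline policy to the IAM user
# (run with destination-account credentials).
user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetObject"],
            "Resource": ["arn:aws:s3:::s3-copy-test", "arn:aws:s3:::s3-copy-test/*"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:PutObject", "s3:PutObjectAcl"],
            "Resource": ["arn:aws:s3:::s3-copy-test-dest", "arn:aws:s3:::s3-copy-test-dest/*"],
        },
    ],
}

boto3.client('iam').put_user_policy(
    UserName='copy-user',  # hypothetical name for the user from step 4
    PolicyName='S3CopyTestAccess',  # hypothetical inline policy name
    PolicyDocument=json.dumps(user_policy),
)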
