
Moving files to and from an Amazon S3 bucket key using Python

I do not have access to the root bucket, but I do have access to a key (KEY NAME) within the bucket.

Example: I cannot access 'BUCKET NAME' but I can access 'BUCKET NAME/KEY NAME'

I have been trying to move files within 'KEY NAME'. In the code below, the only call I've managed to get working is list_objects_v2.

upload_file gives me the following error:

An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

download_file gives me the following error:

PermissionError: [WinError 5] Access is denied: 'C/Users/username/Desktop'

I'm very new to the AWS environment. What can I do on my end to get the access I need?

import logging
import sys

import boto3


def main():

    arguments = len(sys.argv) - 1

    if arguments < 1:
        print("You must supply a folder name")
        return

    bucket_name = 'BUCKET NAME'
    key_name = 'KEY NAME'
    folder = sys.argv[1]


    s3 = boto3.client('s3')
    objects = s3.list_objects_v2(Bucket=bucket_name,
                                 Prefix=key_name + '/' + folder + '/',
                                 Delimiter='/')
    i = 1

    #
    # Print the bucket's objects within 'KEY NAME'
    #
    if objects is not None:
        # List the object names
        logging.info(f'Objects in {bucket_name}')
        print("Length of Objects: " + str(len(objects)))
        for obj in objects:
            print("......\n")
            print(i)
            print("....\n")
            print(obj)
            print("..\n")
            print(objects[obj])
            i += 1
    else:
        # Didn't get any keys
        logging.info(f'No objects in {bucket_name}')

    #
    # Test to see if we can isolate a folder within 'KEY NAME'
    #
    print("\n")
    print("Common Prefixes: " + str(objects['CommonPrefixes']) + "\n")
    keys = objects['CommonPrefixes']
    print("Object 0: " + str(keys[0]) + '\n')

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file('C:/Users/username/Desktop/Test/Test.txt',
                               bucket_name,
                               key_name)
    # s3.meta.client.download_file(bucket_name,
    #                              key_name + '/' + folder + '/' + 'Test.txt',
    #                              'C:/Users/username/Desktop')

if __name__ == '__main__':
    main()

The most important part is to ensure that you have been given adequate permissions to upload/download/list the prefix.

Here is an example policy that grants access to a prefix of special/:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowUserToSeeBucketListInTheConsole",
            "Action": [
                "s3:ListAllMyBuckets",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::*"
            ]
        },
        {
            "Sid": "AllowListingOfPrefix",
            "Action": [
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::my-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "s3:prefix": [
                        "special/"
                    ],
                    "s3:delimiter": [
                        "/"
                    ]
                }
            }
        },
        {
            "Sid": "UploadDownload",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::my-bucket/special/*"
        }
    ]
}

Then, you can run code like this:

import boto3

s3_client = boto3.client('s3')

# Upload a file to S3
s3_client.upload_file('/tmp/hello.txt', 'my-bucket', 'special/hello.txt')

# Download an object
s3_client.download_file('my-bucket', 'special/hello.txt', '/tmp/hello2.txt')

# List objects using Client method
response = s3_client.list_objects_v2(Bucket='my-bucket', Delimiter='/', Prefix='special/')
for obj in response['Contents']:
    print(obj['Key'], obj['Size'])

# List objects using Resource method
s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('my-bucket')

for obj in bucket.objects.filter(Delimiter='/', Prefix='special/'):
    print(obj.key, obj.size)
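One more detail relevant to the WinError 5 in the question: download_file needs a full file path as its destination, not a directory. A small sketch (the helper name is my own) that builds a valid destination by joining the key's basename onto the target directory:

```python
import os


def download_target(directory, key):
    """download_file raises PermissionError ([WinError 5] on Windows)
    when given a directory; join the key's final path segment onto it
    to get a proper file path. S3 keys always use '/' as a separator."""
    return os.path.join(directory, key.rsplit('/', 1)[-1])


# e.g. s3_client.download_file('my-bucket', 'special/hello.txt',
#                              download_target('C:/Users/username/Desktop',
#                                              'special/hello.txt'))
```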
