
Boto3 - Python script to view all directories and files

I tried to follow the Boto3 examples, but I can only manage the very basic listing of all my S3 buckets via the example they give:

import boto3
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)

I cannot find documentation that explains how to traverse or change into folders and then access individual files.

I'm trying to get to my SNS delivery reports, which are stored in a folder for each day of the month, so it is a pain to manually download each file for the month and then concatenate the contents of each file to get the count of all SMS messages sent in a month.

Does anyone have an example of a script that can help me with this, or pointers to really basic documentation/examples that would help me do this?

I have 3 S3 buckets, and all the files are located in sub folders in one of them:

bucketname
|->Year
  |->Month
     |->Day1
     |->Day2
     |->Day3 
     |->Day4

and so on. Underneath each "Day" folder is a single text file called 001.txt. So I am trying to concatenate all the 001.txt files for each day of a month and then find the row count of the concatenated text file, which would give me the count of all SMS sent, successful and failed.

Any help, much appreciated.

There are no folders in S3, only object keys; the "/" characters are simply part of the key name.

Using the Bucket resource interface, you can filter the list of objects in a bucket with the objects collection's filter() method.

You can also use the Client interface to call list_objects() with a suitable prefix and delimiter to retrieve subsets of objects.

See Listing Keys Hierarchically for a high-level description.
