
File Migration from EC2 to S3

We are currently building a website that is essentially an upgrade of an old existing one. We would like to keep the old posts (which include images) on the new website. The old files are stored on an EC2 instance, while the new website is serverless and keeps all its files in S3.

My question: is there any way I could transfer the old files from EC2 to the new S3 bucket using Python? I would also like to rename and relocate the files according to the new filename/filepath pattern that we devs decided on.

There is boto3, the AWS SDK for Python.

https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html

import logging
import boto3
from botocore.exceptions import ClientError


def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """

    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True

You can write a script around the `upload_file` function shown above, then run it locally on your EC2 instance.
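To handle the rename/relocate requirement, you can walk the old site's upload directory and compute the new S3 key for each file before uploading. Here is a minimal sketch; the paths `/var/www/old-site/uploads` and the `posts/images/...` key scheme are placeholders for whatever pattern your team decided on, and boto3 is imported lazily inside the upload function so the key-mapping logic can be tested on its own.

```python
import os
import logging

# Hypothetical locations -- replace with your real source directory
# and the key pattern your team agreed on.
OLD_ROOT = "/var/www/old-site/uploads"
NEW_PREFIX = "posts/images"


def new_object_key(local_path):
    """Map an old local file path to the new S3 key pattern.

    This example scheme flattens the directory structure and
    normalizes the filename (lowercase, spaces to hyphens);
    adjust it to match your actual naming convention.
    """
    filename = os.path.basename(local_path)
    normalized = filename.lower().replace(" ", "-")
    return f"{NEW_PREFIX}/{normalized}"


def migrate_directory(root, bucket):
    """Walk `root` and upload every file under its remapped key."""
    # Imported here so new_object_key() works without boto3 installed.
    import boto3
    from botocore.exceptions import ClientError

    s3_client = boto3.client("s3")
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            key = new_object_key(local_path)
            try:
                s3_client.upload_file(local_path, bucket, key)
            except ClientError as exc:
                logging.error("Failed to upload %s: %s", local_path, exc)
```

Running `migrate_directory(OLD_ROOT, "my-new-site-bucket")` on the EC2 instance would then push everything across under the new naming scheme. Keeping the mapping in one small function makes it easy to review and unit-test before touching the bucket.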
