Unzip file content hosted in S3 to multiple CloudFront URLs through a single Lambda function
Is there a specific way to unzip the contents of a single file from S3 and expose them at multiple CloudFront URLs by triggering Lambda once?
Say a zip file containing multiple jpg/png files has already been uploaded to S3. The intention is to run the Lambda function only once to unzip all of its contents and make them available at multiple CloudFront URLs.
In the S3 bucket:
archive.zip
a.jpg
b.jpg
c.jpg
Through CloudFront:
https://1232.cloudfront.net/a.jpg
https://1232.cloudfront.net/b.jpg
https://1232.cloudfront.net/c.jpg
I am looking for a solution where the Lambda function is triggered whenever an S3 upload happens and makes all the files inside the zip available through multiple CloudFront URLs.
Hello Prathap Parameswar,
I think you can resolve your problem like this. This is the Lambda Python function:
import boto3
from io import BytesIO
import zipfile

def lambda_handler(event, context):
    s3_resource = boto3.resource('s3')
    source_bucket = 'upload-zip-folder'
    target_bucket = 'upload-extracted-folder'

    my_bucket = s3_resource.Bucket(source_bucket)
    for file in my_bucket.objects.all():
        if str(file.key).endswith('.zip'):
            # Read the whole archive into memory
            zip_obj = s3_resource.Object(bucket_name=source_bucket, key=file.key)
            buffer = BytesIO(zip_obj.get()["Body"].read())
            z = zipfile.ZipFile(buffer)
            for filename in z.namelist():
                try:
                    # Upload each archive member to the target bucket,
                    # keyed by its name inside the archive (the key was
                    # garbled in the source, so the member name is assumed)
                    s3_resource.meta.client.upload_fileobj(
                        z.open(filename),
                        Bucket=target_bucket,
                        Key=filename
                    )
                except Exception as e:
                    print(e)
        else:
            print(file.key + ' is not a zip file.')
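The function above rescans the whole source bucket on every invocation. Since the question asks for the Lambda to fire once per upload, a variant driven by the S3 event payload avoids reprocessing old archives. This is a minimal sketch, assuming the function is wired to an s3:ObjectCreated:* notification and that the target bucket name `upload-extracted-folder` matches your setup:

```python
from io import BytesIO
import zipfile

TARGET_BUCKET = 'upload-extracted-folder'  # assumed bucket name

def objects_from_event(event):
    """Pull (bucket, key) pairs out of an S3 put-notification payload."""
    return [(r['s3']['bucket']['name'], r['s3']['object']['key'])
            for r in event.get('Records', [])]

def lambda_handler(event, context):
    import boto3  # imported here so the parsing helper stays testable offline
    s3 = boto3.resource('s3')
    for bucket, key in objects_from_event(event):
        if not key.endswith('.zip'):
            continue
        # Buffer the uploaded archive in memory, then re-upload each
        # member under its own key so CloudFront can serve it directly
        buffer = BytesIO(s3.Object(bucket, key).get()['Body'].read())
        with zipfile.ZipFile(buffer) as z:
            for name in z.namelist():
                with z.open(name) as member:
                    s3.meta.client.upload_fileobj(
                        member, Bucket=TARGET_BUCKET, Key=name)
```

With a CloudFront distribution whose origin is the target bucket, each extracted key (a.jpg, b.jpg, ...) then resolves to its own URL. Note that buffering the archive in memory limits this approach to zips that fit within the Lambda memory allocation.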
Hope this helps.