
AWS S3 serving gzipped files but not readable

I'm using AWS S3 to host a static webpage; almost all assets are gzipped before being uploaded.

During the upload the "content-encoding" header is correctly set to "gzip" (and this is also reflected when actually loading the file from AWS).

The thing is, the files can't be read: they are still served as raw gzip data even though the correct headers are set...
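
One way to narrow this down is to check what S3 actually returns for one of the objects (a minimal sketch; the bucket name and key below are placeholders for your own object URL):

# Show only the response headers and look for "Content-Encoding: gzip":
curl -sI https://BUCKET.s3.amazonaws.com/js/app.js

# Fetch the body and let curl decompress it only when the server declares
# Content-Encoding: gzip; readable output here means the stored bytes and
# the headers agree:
curl -s --compressed https://BUCKET.s3.amazonaws.com/js/app.js | head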

The files are uploaded using npm s3-deploy; here's a screenshot of what the request looks like:

(screenshot: the request for the file)

and the contents of the file in the browser:

(screenshot: the file contents)

If I upload the file manually and set the content-encoding header to "gzip", it works perfectly. Sadly I have a couple of hundred files to upload for every deployment and cannot do this manually all the time (I hope that's understandable ;) ).

Does anyone have an idea of what's going on here? Has anyone worked with s3-deploy who can help?

I use my own bash script for S3 deployments; you could try something like this:

webpath='path'
BUCKET='BUCKETNAME'

# Upload every pre-gzipped JS asset with the correct Content-Encoding header.
for file in "$webpath"/js/*.gz; do
        aws s3 cp "$file" "s3://$BUCKET/js/" --content-encoding 'gzip' --region 'eu-west-1'
done
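
If the assets also need the right Content-Type, or should be served under their original names rather than with a .gz suffix, the same loop can be extended. This is only a sketch, assuming the directory holds pre-gzipped JavaScript files; adjust the content type per file type:

for file in "$webpath"/js/*.gz; do
        # Drop the .gz suffix so the object key matches the URL the page requests.
        key="js/$(basename "$file" .gz)"
        aws s3 cp "$file" "s3://$BUCKET/$key" \
                --content-encoding 'gzip' \
                --content-type 'application/javascript' \
                --region 'eu-west-1'
done

Checking one of the uploaded objects afterwards (for example with curl -sI) should show both Content-Encoding: gzip and the expected Content-Type.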
