
GZIP Compression on static Amazon S3 files


I would like to implement GZIP compression on my site. I've implemented it on IIS, and the HTML page is compressed successfully as expected.

Now the issue is with the CSS and JS files, which I get from Amazon S3. They are not compressed at all. I would like to compress them too.

Please guide me on how to do it. Sharing links for it would help me a lot.

Update: I've added a meta header on the S3 files as "Content-Encoding: gzip", and it now shows in the response headers. Still, the file size is the same, the particular CSS has no effect on the page, and I can't even open it in the browser. Here is the [link][1] to the particular CSS.

Thanks

Files should be compressed before being uploaded to Amazon S3.

For some examples, see:
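A minimal sketch, assuming boto3 credentials are already configured and using placeholder bucket and file names, that gzips a CSS file locally and uploads it with the matching metadata:

    import gzip
    import shutil

    import boto3

    # Compress the file locally before uploading (placeholder paths).
    with open("site.css", "rb") as f_in, gzip.open("site.css.gz", "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

    # Upload the gzipped file. Content-Encoding tells browsers to decompress it,
    # while Content-Type keeps the original CSS type.
    s3 = boto3.client("s3")
    s3.upload_file(
        "site.css.gz",
        "my-bucket",      # placeholder bucket name
        "css/site.css",   # keep the original key so existing links still work
        ExtraArgs={"ContentType": "text/css", "ContentEncoding": "gzip"},
    )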

If you use CloudFront in front of your S3 bucket, there is no need to manually compress HTML resources (CloudFront will compress them on the fly). Please note that CloudFront only compresses with gzip (no deflate or brotli) and only CSS / JS / HTML (based on content-type). See https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html#compressed-content-cloudfront-file-types . To make it work, you have to forward some HTTP headers from CloudFront to S3 (see the documentation).
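To check whether CloudFront actually serves a given resource compressed, a quick sanity check (the URL below is a placeholder) is to request it with an Accept-Encoding header and inspect the response headers:

    import urllib.request

    # Placeholder CloudFront URL; replace with your distribution and object path.
    url = "https://d111111abcdef8.cloudfront.net/css/site.css"

    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        # "gzip" here means the response was served compressed.
        print(resp.headers.get("Content-Encoding"))
        print(resp.headers.get("Content-Type"))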

If your S3 bucket has resources not supported by CloudFront (generic "binary/octet-stream" MIME type, like an "hdr" texture or an "nds" ROM), you need to compress them yourself before uploading to S3, then set the "Content-Encoding" HTTP metadata on the resource. Note that only browsers supporting gzip encoding will be able to download and decompress the file.

If you don't want to compress the files one by one by hand, you can use a Lambda function (see the sketch below) that:

  • is triggered on each PUT of an object (a file) in the bucket
  • if the file is not already compressed and compression is useful, replaces the original uploaded file with the compressed version
  • sets the Content-Encoding HTTP header to gzip

I wrote a GIST for this; it can inspire you to create your own process. See https://gist.github.com/psa-jforestier/1c74330df8e0d1fd6028e75e210e5042
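As a rough illustration (this is not the GIST's code; the event handling, size check, and metadata handling below are simplified assumptions), such a function could look like:

    import gzip
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Triggered by an S3 PUT notification; process each uploaded object.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            obj = s3.get_object(Bucket=bucket, Key=key)
            if obj.get("ContentEncoding") == "gzip":
                continue  # already compressed, nothing to do

            body = obj["Body"].read()
            compressed = gzip.compress(body)
            if len(compressed) >= len(body):
                continue  # compression is not useful for this file

            # Replace the original object with the compressed version and set
            # Content-Encoding so browsers decompress it transparently.
            s3.put_object(
                Bucket=bucket,
                Key=key,
                Body=compressed,
                ContentType=obj.get("ContentType", "binary/octet-stream"),
                ContentEncoding="gzip",
            )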

And don't forget to invalidate (= purge) CloudFront to apply your changes.
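A minimal invalidation call with boto3 (the distribution ID and path are placeholders):

    import time

    import boto3

    cloudfront = boto3.client("cloudfront")
    cloudfront.create_invalidation(
        DistributionId="E1234567890ABC",  # placeholder distribution ID
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/css/*"]},
            # CallerReference must be unique for each invalidation request.
            "CallerReference": str(time.time()),
        },
    )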

If you simply want to gzip the existing files in your S3 bucket, you can write a Lambda function for it. Read the files into a buffer, then use the gzip library to compress them and re-upload them to S3. Something like this should work:

    import gzip
    import io

    gzipped_content = gzip.compress(f_in.read())  # f_in: source file object
    destinationbucket.upload_fileobj(
        io.BytesIO(gzipped_content),
        final_file_path,
        # Content-Encoding lets browsers decompress the object transparently.
        ExtraArgs={"ContentType": "text/plain", "ContentEncoding": "gzip"},
    )

There's a full tutorial here: https://medium.com/p/f7bccf0099c9
