Django with S3 staticfiles optimization
I am trying to optimize my static files using Django with S3. I am using django-compressor to compress and cache JS and CSS files. Here are my settings:
AWS_ACCESS_KEY_ID = access_key
AWS_SECRET_ACCESS_KEY = secret_key
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_QUERYSTRING_AUTH = False
S3_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = S3_URL + "media/"
STATIC_URL = S3_URL + "static/"
ADMIN_MEDIA_PREFIX = STATIC_URL + "admin/"
STATICFILES_DIRS = (
    os.path.join(BASE_DIR, "static", "static_dirs"),
    # '/var/www/static/',
)
AWS_HEADERS = {
    'Cache-Control': 'public,max-age=86400',
}
STATIC_ROOT = os.path.join(BASE_DIR, "static", "static_root")
STATICFILES_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
DEFAULT_FILE_STORAGE = 'lafabrique.settings.s3utils.MediaRootS3BotoStorage'
COMPRESS_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
COMPRESS_URL = S3_URL
and in another file:
# s3utils.py
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.GzipCompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
What I don't understand is that when I test my page on https://developers.google.com/speed/pagespeed/insights/, Google still tells me that I should use gzip and caching on my static files. Also, in the HTTP response from Amazon I get: Cache-Control: max-age=0 (the actual website is lafabrique.io, just in case).
Does somebody know what I did wrong? Thanks a lot.
Are you using django-storages? Try adding this to your settings:
AWS_IS_GZIPPED = True
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
    'application/x-javascript',
    'text/javascript',
)
It looks like you're using gzipped storage on your local machine, but not for the files that you upload to S3.
For the caching issue, try the solution here: Trouble setting Cache-Control header for an Amazon S3 key using boto