
Django-storages not detecting changed static files

I am using django-storages with Amazon S3 for my static files. Following the documentation, I put these settings in my settings.py:

STATIC_URL = 'https://mybucket.s3.amazonaws.com/'

ADMIN_MEDIA_PREFIX = 'https://mybucket.s3.amazonaws.com/admin/'

INSTALLED_APPS += (
    'storages',
)

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'mybucket_key_id'
AWS_SECRET_ACCESS_KEY = 'mybucket_access_key'
AWS_STORAGE_BUCKET_NAME = 'mybucket'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

The first time I run collectstatic everything works fine and my static files are uploaded to my S3 bucket.

However, after making changes to my static files and running python manage.py collectstatic again, the command reports the following even though the static files have been modified:

-----> Collecting static files
    0 static files copied, 81 unmodified.

However, if I rename the changed static files, they are copied correctly to my S3 bucket.

Why isn't django-storages uploading my changed static files? Is this a configuration problem, or is the issue deeper?

collectstatic skips files when the "target" file is "younger" than the source file. It looks like the Amazon S3 storage backend is returning the wrong date for your files.

You could investigate the [code][1] and debug the server response. Perhaps there is a problem with the time zone.

Or you can pass the --clear option to collectstatic so that all files on S3 are deleted before collecting.
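For reference, here is a simplified sketch of the comparison collectstatic makes (paraphrased, not Django's exact code, and assuming a recent Django where storage backends expose get_modified_time):

def should_copy(source_storage, target_storage, path):
    if not target_storage.exists(path):
        return True  # nothing on S3 yet, so copy it
    try:
        target_mtime = target_storage.get_modified_time(path)
        source_mtime = source_storage.get_modified_time(path)
    except (OSError, NotImplementedError):
        return True  # the backend cannot report a time, copy to be safe
    # If S3 reports a timestamp that is not older than the local file
    # (for example because of a time-zone offset), the file is skipped.
    return source_mtime > target_mtime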

https://github.com/antonagestam/collectfast

From the readme.txt: a custom management command that compares the MD5 sum with the ETag from S3 and skips the file copy if the two are the same. This makes running collectstatic MUCH faster if you are using git as a source control system, which updates timestamps.
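To give a feel for the idea, here is a minimal sketch (not collectfast's actual code) of comparing a local file's MD5 with the S3 ETag, assuming a plain non-multipart upload where the ETag is just the object's MD5:

import hashlib
import boto3

def is_unchanged(bucket_name, key, local_path):
    # fetch the ETag S3 stores for the object (surrounding quotes stripped)
    etag = boto3.client('s3').head_object(Bucket=bucket_name, Key=key)['ETag'].strip('"')
    with open(local_path, 'rb') as f:
        local_md5 = hashlib.md5(f.read()).hexdigest()
    return etag == local_md5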

Create a settings file used only for collectstatic syncing, with this configuration:

TIME_ZONE = 'UTC'

Run collectstatic with those specific settings using the following line:

python manage.py collectstatic --settings=settings.collectstatic
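For reference, settings/collectstatic.py can be as small as this (a sketch that assumes your regular settings live in a module it can import, e.g. settings/base.py):

# settings/collectstatic.py -- used only when running collectstatic
from .base import *  # assumption: your regular settings live in settings/base.py

# Compare local modified times against S3's UTC timestamps consistently
TIME_ZONE = 'UTC'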

This question is a bit old, but in case it helps someone in the future, I thought I would share my experience. Following suggestions in other threads, I confirmed that for me this was indeed caused by a time-zone difference. My Django time was not incorrect, but it was set to EST while S3 was set to GMT. In testing, I reverted to django-storages 1.1.5, which did seem to get collectstatic working. Partly due to personal preference, I was unwilling to a) roll back three versions of django-storages and lose any potential bug fixes, or b) alter the time zones of my project's components for what essentially boils down to a convenience feature (albeit an important one).
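To make the mismatch concrete, here is a tiny illustration with made-up times: a file edited at 12:00 EST (17:00 UTC) after an upload at 16:00 UTC still looks older than the S3 copy once its naive local mtime is treated as UTC:

from datetime import datetime
import pytz

local_mtime = datetime(2015, 6, 1, 12, 0)                  # naive EST value from os.path.getmtime
s3_mtime = pytz.UTC.localize(datetime(2015, 6, 1, 16, 0))  # S3 reports UTC
print(pytz.UTC.localize(local_mtime) < s3_mtime)           # True -> collectstatic skips the file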

I wrote a short script to do the same job as collectstatic without the alterations above. It will need some modification for your app, but it should work for standard cases if it is placed at the app level and 'static_dirs' is replaced with the names of your project's apps. It is run from the terminal with 'python whatever_you_call_it.py -e environment_name' (set this to your aws bucket).

import sys, os, subprocess
import boto3
import botocore
from boto3.session import Session
import argparse
import os.path, time
from datetime import datetime, timedelta
import pytz

utc = pytz.UTC
DEV_BUCKET_NAME = 'dev-homfield-media-root'
PROD_BUCKET_NAME = 'homfield-media-root'
static_dirs = ['accounts', 'messaging', 'payments', 'search', 'sitewide']

def main():
    try: 
        parser = argparse.ArgumentParser(description='Homfield Collectstatic. Our version of collectstatic to fix django-storages bug.\n')
        parser.add_argument('-e', '--environment', type=str, required=True, help='Name of environment (dev/prod)')
        args = parser.parse_args()
        vargs = vars(args)
        if vargs['environment'] == 'dev':
            selected_bucket = DEV_BUCKET_NAME
            print "\nAre you sure? You're about to push to the DEV bucket. (Y/n)"
        elif vargs['environment'] == 'prod':
            selected_bucket = PROD_BUCKET_NAME
            print "Are you sure? You're about to push to the PROD bucket. (Y/n)"
        else:
            raise ValueError

        acceptable = ['Y', 'y', 'N', 'n']
        confirmation = raw_input().strip()
        while confirmation not in acceptable:
            print "That's an invalid response. (Y/n)"
            confirmation = raw_input().strip()

        if confirmation == 'Y' or confirmation == 'y':
            run(selected_bucket)
        else:
            print "Collectstatic aborted."
    except Exception as e:
        print type(e)
        print "An error occurred. S3 staticfiles may not have been updated."


def run(bucket_name):

    #open session with S3
    session = Session(aws_access_key_id='{aws_access_key_id}',
        aws_secret_access_key='{aws_secret_access_key}',
        region_name='us-east-1')
    s3 = session.resource('s3')
    bucket = s3.Bucket(bucket_name)

    # loop through static directories
    for directory in static_dirs:
        rootDir = './' + directory + "/static"
        print('Checking directory: %s' % rootDir)

        #loop through subdirectories
        for dirName, subdirList, fileList in os.walk(rootDir):
            #loop through all files in subdirectory
            for fname in fileList:
                try:
                    if fname == '.DS_Store':
                        continue

                    # find and qualify file last modified time
                    full_path = dirName + "/" + fname
                    last_mod_string = time.ctime(os.path.getmtime(full_path))
                    # shift the naive local (EST) mtime forward 5 hours to approximate UTC,
                    # then attach UTC tzinfo so it can be compared with S3's timestamp
                    file_last_mod = datetime.strptime(last_mod_string, "%a %b %d %H:%M:%S %Y") + timedelta(hours=5)
                    file_last_mod = utc.localize(file_last_mod)

                    # truncate path for the S3 loop; find the object, delete and re-upload it if it has been updated
                    s3_path = full_path[full_path.find('static'):]
                    found = False
                    for key in bucket.objects.all():
                        if key.key == s3_path:
                            found = True 
                            last_mode_date = key.last_modified
                            if last_mode_date < file_last_mod:
                                key.delete()
                                s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'r'), ContentType=get_mime_type(full_path))
                                print "\tUpdated : " + full_path
                    if not found:
                        # if file not found in S3 it is new, send it up
                        print "\tFound a new file. Uploading : " + full_path
                        s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'r'), ContentType=get_mime_type(full_path))
                except:
                    print "ALERT: Big time problems with: " + full_path + ". I'm bowin' out dawg, this shitz on u." 


def get_mime_type(full_path):
    try:
        last_index = full_path.rfind('.')
        if last_index < 0:
            return 'application/octet-stream'
        extension = full_path[last_index:]
        return {
            '.js' : 'application/javascript',
            '.css' : 'text/css',
            '.txt' : 'text/plain',
            '.png' : 'image/png',
            '.jpg' : 'image/jpeg',
            '.jpeg' : 'image/jpeg',
            '.eot' : 'application/vnd.ms-fontobject',
            '.svg' : 'image/svg+xml',
            '.ttf' : 'application/octet-stream',
            '.woff' : 'application/x-font-woff',
            '.woff2' : 'application/octet-stream'
        }[extension]
    except KeyError:
        # unknown extension: log it and fall back to a generic content type
        print 'ALERT: Couldn\'t match mime type for ' + full_path + '. Sending to S3 as application/octet-stream.'
        return 'application/octet-stream'

if __name__ == '__main__':
    main()
