
How to create a backup script to compare dates and delete oldest file

I have a backup script I'm working on for a Minecraft server. I can already copy all the files into one folder, name it with the current date and time, and compress it into a single .zip file that World Edit recognizes as a backup. The problem is, I want this script to notice when it reaches 4 backups and start deleting the oldest one by comparing dates. I also need it not to break when there are fewer than 4 backup files. How do I do this? Here is my script.

#!/bin/bash
DIR="$(dirname "$0")"
cd "$DIR"
clear
while true
do
# Set $DATE to the current date and time
DATE="$(date +%Yy-%mm-%dd_%Hh-%Mm)"
# Make a directory named with the date and time
mkdir "$DATE"
# Copy all files into one folder named with the date and time
cp ~/Desktop/Test.command ~/Desktop/Test2.command "$DIR/$DATE"
sleep 1
# Compress the folder into $DATE.zip and remove the previous files (-m moves them into the archive)
zip "$DATE" -rm "$DATE"
# Wait for round 2 in 1 hour
echo "Waiting 1 hour"
sleep 3600
done
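One way to get the four-backup cap the question asks for (a minimal sketch, assuming the archives keep the DATE naming pattern above and the names contain no spaces or newlines): after creating each new zip, list the backups newest-first and delete from the fifth one on, which is a no-op while four or fewer exist.

# Sketch of a pruning step, to run right after the zip command above.
# ls -t lists newest first; tail -n +5 prints nothing until a fifth
# backup appears, so this is safe when fewer than 4 backups exist.
ls -t ????y-??m-??d_??h-??m.zip 2>/dev/null | tail -n +5 |
while IFS= read -r OLD
do
    rm -f -- "$OLD"
done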

Here is a semi-related bash script I made recently for a cron job. It watches a directory where automatically generated backups are stored and, if the directory's size exceeds a user-specified limit, deletes the oldest backups, all while maintaining a .log file.

#!/bin/bash

# Set IFS to newline only so file names containing spaces are handled correctly
IFS=$'\n'

# Backup directory
BACKUPDIR="your directory here"

# Backup directory size limit (in kB)
DIRLIMIT=20971520

# Start logging
LOGFILE="where you want the log file stored"
echo -e "Cron job started at $(date +"%m/%d/%Y %r")\r" >> $LOGFILE
echo -e "Directory size limit: $DIRLIMIT [kB]\r" >> $LOGFILE

# If $BACKUPDIR exists, find its size
if [ -d "$BACKUPDIR" ]
then
    DIRSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
    echo -e "Current directory size: $DIRSIZE [kB]\r" >> $LOGFILE
else
    echo -e "$BACKUPDIR does not exist!\r" >> $LOGFILE
    exit 1
fi

# Check if $BACKUPDIR's size is greater than $DIRLIMIT. If so, delete
# old files. If not, exit.
if [ $DIRSIZE -gt $DIRLIMIT ]
then
    echo -e "Directory size over limit! Attempting to delete oldest log backups...\r" >> $LOGFILE
    LOOPSIZE=$DIRSIZE
    while  [ $LOOPSIZE -gt $DIRLIMIT ]
        do
        # The find command below lists only the files (-maxdepth 1 -type f) in the top level of
        # $BACKUPDIR and prints each one's last-modified time and path followed by a newline
        # (-printf "%T@ %p\n"). sort -n then orders them by timestamp and head -1 selects the
        # file modified furthest in the past. The awk command strips the timestamp (field $1).
        # Finally, xargs -I{} rm -f {} deletes the file even though spaces may be present in
        # the full file name.
        echo -e "Deleting file: $(find "$BACKUPDIR" -type f -maxdepth 1 -printf "%T@ %f\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }')\r" >> $LOGFILE
        find "$BACKUPDIR" -type f -maxdepth 1 -printf "%T@ %p\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }' | xargs -I{} rm -f {} 
        # Recalculate the $BACKUPDIR size so the loop can tell when it is permissible to exit.
        LOOPSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
        echo -e "Directory size is now: $LOOPSIZE [kB]\r" >> $LOGFILE
        done
    echo -e "Operation completed successfully! Final directory size is: $LOOPSIZE [kB]\r" >> $LOGFILE
else
    echo -e "Directory size is less than limit. Exiting.\r" >> $LOGFILE
fi

# Finish logging
echo -e "\r" >> $LOGFILE
echo -e "Cron job exiting at $(date +"%m/%d/%Y %r")\r" >> $LOGFILE
echo -e "----------------------------------------------------------------------\r" >> $LOGFILE

To make sure the .log files themselves don't become too large, I made a cron script to trim the oldest entries:

#!/bin/bash

# Set IFS to newline only so file names containing spaces are handled correctly
IFS=$'\n'

# .log files directory
LOGFILESDIR="your directory here"

# .log file size limit (in kB)
LOGFILELIMIT=35

# Find .log files in $LOGFILESDIR and remove the earliest logs if
# the filesize is greater than $LOGFILELIMIT.
for FILE in "$LOGFILESDIR"/*.log
do
    LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    while [ $LOGSIZE -gt $LOGFILELIMIT ]
    do
        # The sed command deletes lines from the top of the file
        # through the first line containing a run of "-" characters
        # (the separator between entries in the log files)
        sed -i '1,/----------/d' "$FILE"
        LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    done    
done
exit
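To see what the sed expression does, here is a quick throwaway demonstration (the /tmp path is just for illustration). The address range 1,/----------/ deletes from the first line of the file through the first separator line, i.e. exactly one entry per pass of the while loop:

# Build a demo log with two entries separated by a dashed line
printf 'old entry\n----------------------\nnew entry\n' > /tmp/sedtrim_demo.log
sed -i '1,/----------/d' /tmp/sedtrim_demo.log
cat /tmp/sedtrim_demo.log    # prints only: new entry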

Hi, I wrote a script quite a while ago that does something similar: it deletes 7-day-old backup files, removing them from AWS S3 as well. Hope this works for you.

#!/bin/bash
NOWDATE=$(date -I -d '7 days ago')
BACKUPNAME="$NOWDATE.sql.gz"
/usr/local/bin/s3cmd del "s3://path/to/bucket/$BACKUPNAME"
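For local backups the same seven-day cutoff can be expressed with find (a sketch, not part of the original answer; the ~/backups path and name pattern are assumptions): -mtime +7 matches files last modified more than seven days ago, and -delete removes them.

# Sketch: delete local backup archives older than seven days
find ~/backups -maxdepth 1 -type f -name '*.sql.gz' -mtime +7 -delete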

rm -f $(ls -t ????y-??m-??d_??h-??m.zip | sed 1,4d)

or

rm -f $(ls -r ????y-??m-??d_??h-??m.zip | sed 1,4d)

will delete all but the four newest zip files (based on modification time or file name, respectively).
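A slightly more defensive spelling of the same idea (a sketch; safe here because the generated names contain no whitespace for xargs to mis-split on): GNU xargs's -r flag skips rm entirely when the pipeline produces no file names, so it also behaves when fewer than four backups exist.

# 2>/dev/null hides the ls error when no zips exist yet;
# -- stops rm from treating any file name as an option.
ls -t ????y-??m-??d_??h-??m.zip 2>/dev/null | sed 1,4d | xargs -r rm -f --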
