
How to create a backup script to compare dates and delete oldest file

I have a backup script for a Minecraft server I am running. I can already copy all the files into one folder, name it with the current date and time, and compress it into a single .zip file so that World Edit recognizes it as a backup. The problem is that I want the script to recognize when it reaches 4 backups and then start deleting the oldest one by comparing dates. I also need it not to fail once 4 backup files exist. How can I do this? Here is my script.

#!/bin/bash
DIR="$(dirname "$0")"
cd "$DIR"
clear
while true
do
# Set $DATE to the current date and time
DATE="$(date +%Yy-%mm-%dd_%Hh-%Mm)"
# Make a directory named with the date and time
mkdir "$DATE"
# Copy all files into one folder named with the date and time
cp ~/Desktop/Test.command ~/Desktop/Test2.command "$DIR/$DATE"
sleep 1
# Compress the folder into $DATE.zip and remove the previous files
zip "$DATE" -rm "$DATE"
# Wait for round 2 in 1 hour
echo "Waiting 1 hour"
sleep 3600
done
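
A minimal sketch of the rotation the question asks for, keeping only the four newest date-stamped .zip backups, could run right after the zip step. It assumes the naming scheme above, so the generated names contain no spaces or newlines:

#!/bin/bash
# Rotation sketch: keep the 4 newest backups, delete anything older.
# Assumes backups follow the naming scheme above (e.g. 2024y-01m-02d_03h-04m.zip)
# and live in the script's own directory.
cd "$(dirname "$0")" || exit 1

# List backups newest-first by modification time, skip the first 4,
# and remove whatever is left over.
ls -t ????y-??m-??d_??h-??m.zip 2>/dev/null | tail -n +5 | while read -r OLD
do
    echo "Removing old backup: $OLD"
    rm -f "$OLD"
done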

Here is a semi-related bash script I made recently for a cron job. It looks at a given directory where automatically generated backups are stored and, if the directory's size exceeds a user-specified limit, deletes the oldest backups, all while maintaining a .log file.

#!/bin/bash

# IFS is needed for bash to recognize spaces
IFS=$'\n'

# Backup directory
BACKUPDIR="your directory here"

# Backup directory size limit (in kB)
DIRLIMIT=20971520

# Start logging
LOGFILE="where you want the log file stored"
echo -e "Cron job started at $(date +"%m/%d/%Y %r")\r" >> $LOGFILE
echo -e "Directory size limit: $DIRLIMIT [kB]\r" >> $LOGFILE

# If $BACKUPDIR exists, find its size
if [ -d "$BACKUPDIR" ]
then
    DIRSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
    echo -e "Current directory size: $DIRSIZE [kB]\r" >> $LOGFILE
else
    echo -e "$BACKUPDIR does not exist!\r" >> $LOGFILE
    exit 1
fi

# Check if $BACKUPDIR's size is greater than $DIRLIMIT. If so, delete
# old files. If not, exit.
if [ $DIRSIZE -gt $DIRLIMIT ]
then
    echo -e "Directory size over limit! Attempting to delete oldest log backups...\r" >> $LOGFILE
    LOOPSIZE=$DIRSIZE
    while  [ $LOOPSIZE -gt $DIRLIMIT ]
        do
        # The find command below finds files (-type f) in the $BACKUPDIR directory only
        # (-maxdepth 1) and prints their last modified time and path followed by a newline (-printf "%T@ %p\n").
        # The output is then sorted by time (sort -n) and the file modified furthest in the past is selected (head -1).
        # The awk command removes the timestamp (which is in field $1). Finally, xargs -I{} rm -f {} deletes the file
        # even though spaces are present in the full file name.
        echo -e "Deleting file: $(find "$BACKUPDIR" -maxdepth 1 -type f -printf "%T@ %f\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }')\r" >> $LOGFILE
        find "$BACKUPDIR" -maxdepth 1 -type f -printf "%T@ %p\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }' | xargs -I{} rm -f {}
        # This command recalculates the $BACKUPDIR size and is used in the loop to see when it is permissible to exit.
        LOOPSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
        echo -e "Directory size is now: $LOOPSIZE [kB]\r" >> $LOGFILE
        done
    echo -e "Operation completed successfully! Final directory size is: $LOOPSIZE [kB]\r" >> $LOGFILE
else
    echo -e "Directory size is less than limit. Exiting.\r" >> $LOGFILE
fi

# Finish logging
echo -e "\r" >> $LOGFILE
echo -e "Cron job exiting at $(date +"%m/%d/%Y %r")\r" >> $LOGFILE
echo -e "----------------------------------------------------------------------\r" >> $LOGFILE

To make sure the .log file does not grow too large, I made a cron script that trims the oldest entries from the top:

#!/bin/bash

# IFS is needed for bash to recognize spaces
IFS=$'\n'

# .log files directory
LOGFILESDIR="your directory here"

# .log file size limits (in KB)
LOGFILELIMIT=35

# Find .log files in $LOGFILESDIR and remove the earliest logs if
# the filesize is greater than $LOGFILELIMIT.
for FILE in "$LOGFILESDIR"/*.log
do
    LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    while [ $LOGSIZE -gt $LOGFILELIMIT ]
    do
        # The sed command deletes the rows from the top
        # until it encounters a bunch of "-"'s in a row
        # (which separates the logs in the log files)
        sed -i '1,/----------/d' "$FILE"
        LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    done    
done
exit
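
To see what the sed command does, here is a throwaway test (assuming GNU sed, since the script above uses sed -i without a backup suffix):

# Build a sample log with two entries separated by a dashed line,
# then delete everything up to and including the first separator.
printf 'old entry\n----------\nnew entry\n' > /tmp/sample.log
sed -i '1,/----------/d' /tmp/sample.log
cat /tmp/sample.log    # prints only: new entry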

Hi, I wrote a script a long time ago that does something similar; it deletes 7-day-old files and removes them from AWS S3. I hope it is useful to you.

#!/bin/bash
NOWDATE=`date -I -d '7 days ago'`
BACKUPNAME="$NOWDATE.sql.gz"
/usr/local/bin/s3cmd del s3://path/to/bucket/$BACKUPNAME  

rm -f `ls -t ????y-??m-??d_??h-??m.zip|sed 1,4d`

or

rm -f `ls -r ????y-??m-??d_??h-??m.zip|sed 1,4d`

will delete all but the four newest zip files (based on modification time or file name, respectively).
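
A slightly more defensive sketch of the same idea avoids parsing ls output and relies on the fact that the date-stamped names sort chronologically; the 4 is just the number of backups to keep:

#!/bin/bash
# Collect matching backups into an array. Glob expansion sorts the names,
# and with this naming scheme alphabetical order is chronological order.
BACKUPS=( ????y-??m-??d_??h-??m.zip )
COUNT=${#BACKUPS[@]}
if [ "$COUNT" -gt 4 ]
then
    # Delete the oldest entries, leaving only the 4 newest.
    rm -f -- "${BACKUPS[@]:0:COUNT-4}"
fi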
