Bash Script to run a backup and upload to AWS S3 Failing
Here is the code:
#!/bin/bash
# -------------------------------------------------
# -------------------------------------------------
# Use a comma-delimited list of single-quoted
# strings of the usernames to batch
# if it's empty it will backup all user directories
# in /home
# -------------------------------------------------
# -------------------------------------------------
USER_ACCOUNT=();
# -------------------------------------------------
# -------------------------------------------------
# Make sure the aws script is installed on the
# server, and the bucket name to upload these too
# are exact... case-sensitive
# -------------------------------------------------
# -------------------------------------------------
S3BUCKET='Kevs-Bucket/Test';
# Loop through the user array
# If it's empty, then get all users in the /home
# directory, based on each folder
# do not include root user
if [ ${#USER_ACCOUNT[@]} -eq 0 ]; then
    # turn off dotglob and nullglob
    cd /home;
    shopt -s dotglob;
    shopt -s nullglob;
    DIRARR=(*/);
    # we have our directories, now loop them and grab the user
    # once we have the user, skip the root user
    for d in ${!DIRARR[@]}; do
        # Assign an account variable
        ACCT=stat -c '%U' ${DIRARR[$i]}; #NOT WORKING HERE
        if [ "$ACCT" == "root" ]; then
            echo "ROOT";
        else
            run_backup $ACCT $S3BUCKET;
        fi;
    done;
else
    # we have our list, now loop through them all
    for i in ${!USER_ACCOUNT[@]}; do
        # Assign an account variable
        ACCT=${USER_ACCOUNT[$i]};
        run_backup $ACCT $S3BUCKET;
    done;
fi;
# -------------------------------------------------
# -------------------------------------------------
# Run the actual backup
run_backup(){
    LOGFILE=/batch-move.log
    # Package the account
    ./scripts/pkgacct $1;
    echo '##########################################' >> $LOGFILE;
    echo "# Start: date +'%T'" >> $LOGFILE;
    echo "# Backing Up: $1" >> $LOGFILE;
    echo '##########################################' >> $LOGFILE;
    # Upload it to S3
    s3put $2/cpmove-$1.tar.gz /home/cpmove-$1.tar.gz;
    echo '##########################################' >> $LOGFILE;
    echo "# Uploading Backup: $1" >> $LOGFILE;
    echo '##########################################' >> $LOGFILE;
    # Remove the file from the server
    /bin/rm -f /home/cpmove-$1.tar.gz;
    echo '##########################################' >> $LOGFILE;
    echo "# Removing Backup Up: $1" >> $LOGFILE;
    echo "# Finish: date +'%T'" >> $LOGFILE;
    echo '##########################################' >> $LOGFILE;
}
I am getting an error on this line:
ACCT=stat -c '%U' ${DIRARR[$i]}; #NOT WORKING HERE
The error says that -c is not a valid option for stat on my CentOS server. I have verified by other means that stat -c does work, so I believe the code that tries to capture the folder's owner into a variable is at fault. Can you help me fix it?
The line that does not work (below) uses the variable $i, which is never defined, and it lacks the $() notation; see the added code below under "Edit - try this".
ACCT=stat -c '%U' ${DIRARR[$i]}; #NOT WORKING HERE
You are looping over the array in an unusual way. Here is an example of how to iterate over the elements of an array of file names in Bash:
files=( "/home/User/FileName1" "/home/User/FileName2" "/home/User/FileName3" )
for fileName in "${files[@]}" ; do
echo "$fileName"
done
Similarly, rather than building an array from a glob – DIRARR=(*/); – you may want to loop over the files directly, for example:
for fileName in /home/* ; do
echo "$fileName"
done
Hope this helps.
Edit - try this:
Note: on my system, the following ignores "." and "..".
# To avoid confusion 'ACCT' would be better named as 'OWNER'.
# Loop through the files in: /home/
for filename in /home/* ; do
    # Get the owner of $filename.
    ACCT=$(stat -c '%U' "$filename")
    # If the file is a directory NOT owned by root, run backup.
    if [ -d "$filename" -a "$ACCT" != "root" ]; then
        # Uncomment when satisfied
        # run_backup "$ACCT" "$S3BUCKET"
        echo "Run backup - not owned by root:"
        echo "filename: $filename owner: $ACCT"
    # If the file is a directory owned by root, DO NOT run backup.
    elif [ -d "$filename" -a "$ACCT" = "root" ]; then
        # Remove elif clause when satisfied.
        echo "Do not run backup - owned by root:"
        echo "filename: $filename owner: $ACCT"
    fi
done
Note the use of $() on the "ACCT=" line.
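One further point, not raised above but visible in the posted script: Bash resolves a function name only when the call executes, so run_backup must be defined before the loop that invokes it (in the posted script the definition comes after the loops). A minimal sketch:

```shell
#!/bin/bash
# Define the function first; if a call runs before the definition has
# been read, bash fails with "run_backup: command not found".
run_backup() {
    echo "backing up: $1 to $2"
}
# ...later in the script the call succeeds:
run_backup "alice" "Kevs-Bucket/Test"
```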