
Linux: Move log files from a mounted dir to a local dir recursively while maintaining directory structure in the local directory

So I've done a fair bit of Googling and I haven't found the answer to this yet. I have one server that needs to be live and respond quickly - I don't want my visitors/customers waiting. However, I've had enough server issues to know that I need a good backup, not the whenever-I-feel-like-it backups I've got now.

When I've tried mysqldump and tar on this server, it works but makes the server quite slow for hours. It can also fill up the hard drive (if I forget to remove old backups for too long), which takes down MySQL and, by extension, my sites. Slowing down the server is unacceptable, so my backups to date only happen about once a week at the most common slow day/time. However, I'm not fond of this solution.
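As an aside, the disk-filling problem can be bounded by pruning old archives on a schedule instead of by hand. A minimal sketch, assuming the archives land as .gz files under a hypothetical /backup directory with an arbitrary 14-day retention (adjust both to taste):

```shell
# Assumed layout: compressed backups (*.gz) under /backup; the path
# and the 14-day retention window are placeholders, not a prescription.
# -mtime +14 matches files last modified more than 14 days ago.
find /backup -type f -name '*.gz' -mtime +14 -delete
```

Run daily from cron, this keeps disk usage bounded so a forgotten cleanup can no longer take MySQL down.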

So I now have another server set up as my primary server's near-real-time backup. I am setting up MySQL replication and a daily rsync. Then a daily mysqldump and tar can be run from this backup server without affecting response times on the primary. Great, except I'd like to take it a step further.

I have a number of data files that get captured throughout the day - like log files and basic visitor tracking files (IP, referrer, user agent, nothing evil) - that don't need to be processed in real time, and I've decided it's best to have the backup server process them rather than tie up the primary server's resources. This creates an issue for me, though. I want to get these files off the primary server and onto the backup server for processing. The files are scattered across a lot of directories, and that list of directories will only grow over time. So I want to avoid having to run mv on several directories now and maintaining that list going forward; some new directory will get forgotten eventually.

So I've mounted the necessary directory from the primary server under the /media folder on the backup server. If I just wanted to copy the files over I'd use scp, but I want to move them. So I'd like a command that does something like:

mv /media/primary/*.log /backup/.

where the mv command recursively looks into the /media/primary folder, descends into each subfolder, and finds any file with the .log extension. It then moves each file from /media/primary/ to the equivalent folder under /backup. So, for example, if I had log files in /media/primary/tool1/logs and /media/primary/tool2/logs, I'd like them moved to /backup/tool1/logs and /backup/tool2/logs respectively. I'd like (but don't require) the command to create the destination folder if it is missing. Once properly moved, the files can be processed and renamed/deleted on the backup server's local hard drive.
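For reference, the behaviour described above can be sketched with find alone, no extra tooling. The paths are the ones from the question; the loop rebuilds each file's relative directory under /backup (creating it if missing) before moving the file. Treat it as a sketch, not a drop-in script:

```shell
SRC=/media/primary   # mounted dir from the primary server
DST=/backup          # local destination on the backup server

# Walk SRC for *.log files; for each one, recreate its relative
# directory under DST (mkdir -p creates missing folders), then move it.
find "$SRC" -type f -name '*.log' | while IFS= read -r f; do
    rel=${f#"$SRC"/}                    # e.g. tool1/logs/access.log
    mkdir -p "$DST/$(dirname "$rel")"
    mv "$f" "$DST/$rel"
done
```

This handles filenames with spaces; for truly arbitrary names (embedded newlines), switch to `find ... -print0` with `read -d ''`.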

Both machines run Ubuntu 12.04 Server. I work primarily in PHP, so I've been writing a PHP script on the backup server to be executed daily. The script can execute system commands via SSH (phpseclib).

A single command for the CLI would be great but I understand that may not be possible.

Thanks

How about something like

rsync -azrR --include='*.log' -f 'hide,! */' /media/primary /backup

If you open man rsync and type 2343g it should take you to the line where this filter is explained. Basically, the filter hides everything that's not a directory (i.e. every file) from the pattern matching; however, since the --include='*.log' rule comes first, it takes precedence, so .log files are still matched and transferred. You can also add the -nv flags to do a dry run and see what would happen.
