I have 3 DigitalOcean droplets and want to automate copying files from one droplet to the other two, so I wrote an scp shell script. Do you have any other suggestions, or an easier way to transfer files between two droplets?
#!/usr/bin/env bash
# Variables
SRCPATH=/etc/var/html/www/
SERVER1USER=root
SERVER1PSWD=killergirls
SERVER1IP=123.123.123.123
SERVER1PATH=/etc/var/html/www/
SERVER2USER=root
SERVER2PSWD=killerboys
SERVER2IP=121.121.121.121
SERVER2PATH=/etc/var/html/www/

echo -e "\n--- Transferring files to first server... ---\n"
# Note the $ before the user/IP variables -- without it scp would try to
# resolve the literal host "SERVER1IP".
sshpass -p "$SERVER1PSWD" scp -r "$SRCPATH" "$SERVER1USER@$SERVER1IP:$SERVER1PATH"

echo -e "\n--- Transferring files to second server... ---\n"
sshpass -p "$SERVER2PSWD" scp -r "$SRCPATH" "$SERVER2USER@$SERVER2IP:$SERVER2PATH"
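Since you asked for alternatives: rsync over SSH is usually a better fit than scp for keeping droplets in sync, because it only transfers changed files and can mirror deletions. A minimal sketch, assuming rsync is installed on both ends and key-based SSH auth is set up (the user, IPs, and paths are copied from your script; the DEBUG toggle just prints the commands so you can review them before running for real):

```shell
#!/usr/bin/env bash
set -euo pipefail

# DEBUG=1 prints each command instead of executing it.
DEBUG=1

SRC=/etc/var/html/www/
# Destinations taken from the question -- replace with your real droplets.
DESTS=("root@123.123.123.123:/etc/var/html/www/"
       "root@121.121.121.121:/etc/var/html/www/")

for dest in "${DESTS[@]}"; do
    # -a preserves permissions/times, -z compresses, --delete mirrors deletions
    cmd=(rsync -az --delete -e ssh "$SRC" "$dest")
    if [ "$DEBUG" = 1 ]; then
        echo "${cmd[@]}"
    else
        "${cmd[@]}"
    fi
done
```

With SSH keys in place you also drop the sshpass dependency and stop keeping plaintext passwords in the script.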
I prefer to write things like this in Python. Below is a sample of a script I use to deploy Hadoop to a cluster. It can be expanded easily by adding more files, commands, or IPs.
Note that I like being able to print everything that will be executed before I actually run it; this is controlled by the DEBUG variable.
Also note that this could be done in Bash, but since I am not an accomplished Bash programmer, Python is much easier and less error-prone for me.
import subprocess

call = subprocess.call
DEBUG = 1

ips = []
for i in range(0, 10):
    ips.append("10.0.0.%d" % i)

commands = 'ls; echo "Hello"'

def hadoop_setup(ip):
    # Run each setup command on the remote host over ssh.
    sshTemplate = "ssh hdfs@%s '%s;'"
    for eachCommand in commands.split(";"):
        sshCall = sshTemplate % (ip, eachCommand)
        if DEBUG:
            print(sshCall)
        else:
            call(sshCall, shell=True)

for ip in ips:
    hadoop_setup(ip)

# Copy config files out to every host.
copyFileTemplate = 'scp %s hdfs@%s:/home/hdfs'
for file in ['.bashrc']:
    for ip in ips:
        copyFileCall = copyFileTemplate % (file, ip)
        if DEBUG:
            print(copyFileCall)
        else:
            call(copyFileCall, shell=True)
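One design note: the shell=True string calls above work, but passing argument lists to subprocess sidesteps shell-quoting pitfalls entirely (a command containing quotes or spaces can't break the ssh invocation). A rough Python 3 sketch of the same pattern, with the hdfs user, IP range, and commands borrowed from the script above:

```python
import subprocess

DEBUG = True  # print commands instead of running them

ips = ["10.0.0.%d" % i for i in range(10)]
commands = ["ls", 'echo "Hello"']

def run(argv):
    """Print the command in DEBUG mode, otherwise execute it."""
    if DEBUG:
        print(" ".join(argv))
    else:
        subprocess.run(argv, check=True)

# Build every ssh/scp invocation as an argument list -- no shell
# quoting to get wrong on our side.
built = []
for ip in ips:
    for command in commands:
        built.append(["ssh", "hdfs@%s" % ip, command])
    built.append(["scp", ".bashrc", "hdfs@%s:/home/hdfs" % ip])

for argv in built:
    run(argv)
```

check=True also makes a failed transfer raise immediately instead of being silently ignored, which the return value of subprocess.call makes easy to miss.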