
Add ssh key in bash script (jenkins pipeline)

I have a Jenkins job which needs to execute a shell script that connects to a remote computer via an SSH key. Here is the pipeline job:

    stage ('Run') {
        try {
            sh "chmod +x \$(find . -name '*.sh')"
            wrap([$class: 'AnsiColorBuildWrapper', 'colorMapName': 'XTerm']) {
                sh "./run-ansible-playbook.sh -f ansible-playbook.yml"
            }
etc...

The .sh file runs the ssh-agent and ssh-add commands:

sshAgentCount=$(pgrep ssh-agent | wc -l)

if [[ $sshAgentCount -eq 0 ]]; then
    echo "# run ssh-agent #"
    eval `ssh-agent -s`
    ssh-add /var/lib/jenkins/id_rsa_ansible
fi

The problem is that sometimes this works and sometimes it does not. I think it is because the tty of the user (here, jenkins) changes every time, and the ssh-agent process is linked to a tty. I do not want to call eval `ssh-agent -s` on every run, because over time it causes an out-of-memory condition on the machine.

Here is the ps aux | grep ssh-agent output:

jenkins   1243  0.0  0.0  11140   320 ?        Ss   17:20   0:00 ssh-agent -s
jenkins   1397  0.0  0.0  11140   320 ?        Ss   17:23   0:00 ssh-agent -s
jenkins   1435  0.0  0.0  11140   320 ?        Ss   17:23   0:00 ssh-agent -s

Do you have an elegant solution to this problem? (one ssh-agent, started only if needed)

Thank you a lot :)

I think it is because of the tty of the user (here jenkins) which changes all time and the ssh-agent process is linked to a tty.

No. The SSH agent is not linked to a tty. The connection to the SSH agent is stored in the environment variable $SSH_AUTH_SOCK, so once you close the original shell which started the ssh-agent, you lose the connection to the agent.

But you can preserve the connection by storing the environment variable in a file and loading it in subsequent shells (if it is still usable), or something similar.
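As a minimal sketch of that file-based approach: the agent's environment is written to a file, subsequent runs source it, and a new agent is started only when no agent is reachable. The path `agent.env` is a hypothetical choice, and the key path is the one from the question.

```shell
# Hypothetical location for the saved agent environment.
AGENT_ENV="${HOME}/.ssh/agent.env"
mkdir -p "${HOME}/.ssh"

start_agent() {
    # Start a fresh agent and write its environment to the file.
    (umask 077; ssh-agent -s > "$AGENT_ENV")
    . "$AGENT_ENV" > /dev/null
    # Key path from the question; ignore failure here only so the
    # sketch stays runnable without that key present.
    ssh-add /var/lib/jenkins/id_rsa_ansible || true
}

# Load the saved environment, if any, then probe the agent.
[ -f "$AGENT_ENV" ] && . "$AGENT_ENV" > /dev/null

# `ssh-add -l` exits with status 2 when no agent is reachable.
ssh-add -l > /dev/null 2>&1
if [ "$?" -eq 2 ]; then
    start_agent
fi
```

The `ssh-add -l` probe is what makes this idempotent: if the saved socket is stale (machine rebooted, agent killed), the exit status 2 triggers a new agent; otherwise the existing one is reused and no extra processes pile up.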

Jenkins has an SSH Agent plugin that solves this without having to use the ssh-agent commands directly:

https://jenkins.io/doc/pipeline/steps/ssh-agent/#-sshagent-%20ssh%20agent

node {
    sshagent (credentials: ['deploy-dev']) {
        sh 'ssh -o StrictHostKeyChecking=no -l cloudbees 192.168.1.106 uname -a'
    }
}

Thank you for your answer. Indeed, it works with a file.

But I found something easier: just kill the process at the end of the job:

# add ssh key
echo "# run ssh-agent #"
eval `ssh-agent -s`
ssh-add /var/lib/jenkins/id_rsa_ansible

# ... do the work ...

kill $SSH_AGENT_PID
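One caveat with a plain kill at the end: if a step in between fails and aborts the script, the kill line is never reached and agents accumulate again. A slightly more robust variant of the same idea uses a trap, so the agent is killed on any exit path:

```shell
# Start a single agent for this job run.
eval "$(ssh-agent -s)"

# Guarantee the agent is killed when the script exits,
# even if a later step fails, instead of relying on
# execution reaching a final `kill` line.
trap 'kill "$SSH_AGENT_PID"' EXIT

# Key path from the question; ignore failure here only so the
# sketch stays runnable without that key present.
ssh-add /var/lib/jenkins/id_rsa_ansible || true

# ... do the work ...
```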

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. For any questions, please contact: yoyou2525@163.com.
