
Run Ansible playbook on OVH cloud instance with Terraform Cloud

I have a Terraform+Ansible combination that sets up an OVH cloud instance, and then runs an Ansible playbook on it using provisioners. When I run this locally, I can supply the public and private keys directly via the command line (not using file paths), and the terraform apply works perfectly.

On Terraform Cloud, I supply the keys as variables. When I run the plan, the remote-exec provisioner works and connects to the instance as it should. However, the local-exec provisioner fails with "Permission denied (publickey)". What am I missing?

My provisioner blocks:

# Dummy resource to hold the provisioner that runs ansible
resource "null_resource" "run_ansible" {
  provisioner "remote-exec" {
    inline = ["sudo apt update", "sudo apt install python3 -y", "echo Done!"]

    connection {
      host        = openstack_compute_instance_v2.test_instance.network[0].fixed_ip_v4
      type        = "ssh"
      user        = "ubuntu"
      private_key = var.pvt_key
    }
  }

  provisioner "local-exec" {
    command = "python3 -m pip install --no-input ansible; ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -u ubuntu -i '${openstack_compute_instance_v2.test_instance.network[0].fixed_ip_v4},' '--private-key=${var.pvt_key}' -e 'pub_key=${var.pub_key}' ansible/setup.yml"
  }
}

Terraform cloud run error:

TASK [Gathering Facts] *********************************************************
fatal: [xx.xxx.xxx.xx]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Warning: Permanently added 'xx.xxx.xxx.xx' (ECDSA) to the list of known hosts.\r\nno such identity: /home/tfc-agent/.tfc-agent/component/terraform/runs/run-AhaANkduM9YXJVoC/config/<<EOT\n-----BEGIN OPENSSH PRIVATE KEY-----<private-key>-----END OPENSSH PRIVATE KEY-----\nEOT: No such file or directory\r\nubuntu@xx.xxx.xxx.xx: Permission denied (publickey).", "unreachable": true}

EDIT: SOLVED

I solved the problem by creating (sensitive) key files on the Terraform Cloud host, and passing the paths to them to Ansible instead.

The variables are still supplied via TFCloud, but without the heredoc syntax.

I had to append an extra newline (\n) to the end of each key to work around the trailing newline being stripped. See the following issue: https://github.com/ansible/awx/issues/9082 .

resource "local_sensitive_file" "key_file" {
  content              = "${var.pvt_key}\n"
  filename             = "${path.root}/.ssh/key"
  file_permission      = "600"
  directory_permission = "700"
}

resource "local_sensitive_file" "pubkey_file" {
  content              = "${var.pub_key}\n"
  filename             = "${path.root}/.ssh/key.pub"
  file_permission      = "644"
  directory_permission = "700"
}
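With the key files in place, the local-exec command can pass a file path to --private-key instead of the raw key material. A minimal sketch of the adjusted provisioner, assuming the playbook still expects the public key content in the pub_key extra-var (the depends_on is there so the key files exist before the playbook runs):

```hcl
resource "null_resource" "run_ansible" {
  # Make sure the key files are written before the playbook runs
  depends_on = [local_sensitive_file.key_file, local_sensitive_file.pubkey_file]

  provisioner "local-exec" {
    # --private-key now receives a path, which is what ssh expects
    command = "ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -u ubuntu -i '${openstack_compute_instance_v2.test_instance.network[0].fixed_ip_v4},' --private-key '${local_sensitive_file.key_file.filename}' -e 'pub_key=${var.pub_key}' ansible/setup.yml"
  }
}
```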

