
Why can't I copy SSH keys to Vagrant VM?

When I execute my cp-sshkey.yml playbook (logged in as myself, not the vagrant user) from my top-level Vagrantfile directory...

ansible-playbook cp-sshkey.yml

I'm getting this error:

TASK: [authorized_key user=vagrant key="{{ lookup('file', './files/id_rsa_vagrant.pub') }}"] *** 
fatal: [web1] => Failed to template user=vagrant key="{{ lookup('file', './files/id_rsa_vagrant.pub') }}": could not locate file in lookup: ./files/id_rsa_vagrant.pub

I don't understand why this error is occurring. It's a very simple playbook and the public key file is where I say it is:

.
├── .vagrant
│   └── machines
├── Vagrantfile
├── ansible.cfg
├── bootstrap-mgmt.sh
├── files
│   └── id_rsa_vagrant.pub
├── inventory.ini
├── secrets.yml
├── site.yml
├── website
└── cp-sshkey.yml

Here's my config and host files and the playbook:

# ansible.cfg
[defaults]
hostfile = inventory.ini
remote_user = vagrant
private_key_file = .vagrant/machines/default/virtualbox/private_key
host_key_checking = False

# inventory.ini
[local]
localhost ansible_connection=local

[web]
web1 ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222

# cp-sshkey.yml
- name: Install vagrant's public key on VM
  hosts: web1
  sudo: True
  tasks:
    - authorized_key: user=vagrant key="{{ lookup('file', './files/id_rsa_vagrant.pub') }}"

What am I doing wrong here? Thanks.

Quick answer, which I'll refine; I'm playing with this myself and still learning:

I assume you are trying to add your own public key (or another key on your Ansible control machine that is not related to the Vagrant-generated keys) to the Vagrant machine, to allow you to SSH into it without vagrant ssh.
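To make that goal concrete: once the key is installed, a direct login like this should succeed (a sketch, assuming Vagrant forwards SSH to port 2222 and your key lives at ~/.ssh/id_rsa; both may differ on your machine):

# hypothetical direct login with your own key instead of Vagrant's generated one
ssh -i ~/.ssh/id_rsa -p 2222 vagrant@127.0.0.1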

I assume that you have checked all the file permissions etc., that you aren't juggling multiple instances, that you have tried both 127.0.0.1 and localhost, and that you have tried the full file path instead of one relative to the working directory. My examples use files in subfolders with templates, although not in the working snippet below.
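One way to take the working directory out of the picture is to build the path from Ansible's built-in playbook_dir variable, which points at the directory containing the playbook (a sketch against your layout above, not something I have run against your tree):

# hypothetical variant of your task with a playbook-relative path
- authorized_key: user=vagrant key="{{ lookup('file', playbook_dir + '/files/id_rsa_vagrant.pub') }}"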

  • Are you able to vagrant ssh and perhaps check the .ssh/authorized_keys file?
  • Are you able to confirm that Ansible can connect, with something like ansible web -a df?
  • Are you able to ssh into the Vagrant machine using ssh -i .vagrant/machines/default/virtualbox/private_key vagrant@127.0.0.1 -p 2222? (These checks are collected as a runnable snippet after this list.)
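As one shell session (a sketch; the key path and forwarded port 2222 are taken from your config above, and vagrant ssh -c runs a single command on the box):

# check what is already in the vagrant user's authorized_keys
vagrant ssh -c 'cat ~/.ssh/authorized_keys'

# confirm Ansible can reach the web group (df runs via the default command module)
ansible web -a df

# connect manually with the Vagrant-generated private key
ssh -i .vagrant/machines/default/virtualbox/private_key vagrant@127.0.0.1 -p 2222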

In my role task file I have this task.

- name: Copy origin public key to auth keys
  authorized_key: user=vagrant key="{{ lookup('file', lookup('env','HOME') + '/.ssh/id_rsa.pub') }}"
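Standalone, that task sits in a play like the following (a sketch, assuming the web1 host and vagrant remote user from your ansible.cfg; become is the Ansible 2.x replacement for the deprecated sudo keyword):

# hypothetical minimal playbook around the task above
- name: Copy origin public key to auth keys
  hosts: web1
  become: True
  tasks:
    - authorized_key: user=vagrant key="{{ lookup('file', lookup('env','HOME') + '/.ssh/id_rsa.pub') }}"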

Also my host definition sets the SSH user:

web1 ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user=vagrant

but I assume the config you use should work.

This play is working for me, although my directory structure is different; I expect you're comfortable that yours is fine.

Some things to watch out for that caught me:

  • when you rebuild the Vagrant machine you will need to flush the stale entry from your .ssh/known_hosts if you have already SSH'd in; you can remove it with ssh-keygen -R [localhost]:2222
  • it makes me uneasy seeing localhost used as a machine tag

My Setup:

  • ansible 2.2.0.0
  • vagrant Version: 1.9.1
  • Mac OSX
  • VBox 5.1.6
  • Vagrant Instance - ubuntu/trusty64
