Hi,
I’m currently testing Ansible for deploying instances on AWS. To reach hosts in private subnets, I’m using a bastion host. When deploying new instances, I have a task that waits for the SSH port of the new instance to come up before proceeding.
During the creation of the instance I also update my local inventory file with the IP of the instance and ansible_private_key_file (pointing at the AWS-generated key pair), like this:
x.x.x.x ansible_private_key_file=~/.ssh/instance.pem
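Something like the following task does that inventory update (a sketch for context, not necessarily the exact task; lineinfile and the ./hosts path are illustrative):

- name: Add the new instance to the local inventory file
  lineinfile:
    dest: ./hosts
    line: "{{ item.private_ip }} ansible_private_key_file=~/.ssh/instance.pem"
    create: yes
  with_items: "{{ ec2.instances }}"
  delegate_to: localhost   # the inventory file lives on the control machine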
However, the task that waits for the SSH port to come up times out. This is the wait_for task:
- name: Wait for SSH
  wait_for:
    host: "{{ item.private_ip }}"
    port: 22
    delay: 60
    timeout: 320
    state: started
  with_items: "{{ ec2.instances }}"
The host variable resolves to the correct IP of the new instance.
In my ssh config file I have configured the bastion host for each subnet, and ssh'ing into the new instance works without issues. I can also run ad-hoc commands against the new instance without any problem.
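For reference, the per-subnet entries in my ssh config follow this pattern (a sketch; the host pattern and the bastion address are placeholders, not the real values):

Host 10.0.1.*
    User centos
    IdentityFile ~/.ssh/instance.pem
    ProxyCommand ssh -W %h:%p -q centos@bastion-subnet1.example.com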
This is the ansible.cfg I’m using:
[defaults]
inventory=./hosts
ProxyCommand="ssh -W %h:%p -q centos@"
host_key_checking=False
[ssh_connection]
ssh_args=-o ForwardAgent=yes
I also tried delegating the wait task to the bastion host, but that results in the same timeout (the delegated variant is sketched below). Is there something I’m missing?
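The delegated variant looked roughly like this (a sketch; "bastion" is a placeholder for whatever the bastion host is called in the inventory):

- name: Wait for SSH (delegated to the bastion)
  wait_for:
    host: "{{ item.private_ip }}"
    port: 22
    delay: 60
    timeout: 320
    state: started
  with_items: "{{ ec2.instances }}"
  delegate_to: bastion   # placeholder inventory name for the bastion host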
Vincent