I asked this a while back and I did not find a good solution back then… Hopefully something has changed!
I am running Ansible outside of multiple VPCs, and have a set of configs in my ~/.ssh/config that get picked up when using a static inventory file.
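For reference, those per-VPC entries look roughly like this (the subnet pattern, bastion hostname, and key paths below are just placeholders):

```
# One Host block per VPC, hopping through that VPC's bastion
Host 10.20.*
  User ansible
  IdentityFile ~/.ssh/ansible
  ProxyCommand ssh -i ~/.ssh/bastion_key.pem ec2-user@bastion-a.example.com -W %h:%p
```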
Is there a way to utilize that config when using a dynamic inventory file, or to specify proxy commands elsewhere, to enable connections through a bastion to each of those VPCs?
Thanks for any help!
If you are dynamically building new VPCs and want your SSH config file to be updated dynamically, I would have a role that deploys the bastion host in the new VPC and, right before the role exits, updates your ~/.ssh/config file. (Not sure if this is what you are looking for.) Example below…
```yaml
- name: Provision EC2 Bastion server instances
  ec2:
    region: "{{ aws_region }}"
    keypair: "{{ key_name }}"
    group_id: "{{ bastion_sg.group_id }}"
    instance_type: "{{ instance_type }}"
    image: "{{ ami_id }}"
    exact_count: 1
    count_tag:
      Name: bastion
    instance_tags:
      Name: bastion
    assign_public_ip: True
    wait: yes
    #vpc_subnet_id: "{{ item.id }}"
    vpc_subnet_id: "{{ vpc.results[0].subnets | parse_subnets_by_tag('Tier', 'public', return_count=False) | first }}"
  register: bastion

- debug: var=bastion

- name: Add bastion instances to host group
  add_host: name={{ item }} groups=bastion
  with_items:
    - "{{ bastion | parse_results(key='public_ip') }}"

- name: update ssh config
  blockinfile:
    dest: /home/foo/.ssh/config
    block: |
      Host 10.111.*
        StrictHostKeyChecking no
        ProxyCommand ssh -i ~/.ssh/my_key.pem foo@10.10.10.10 -W %h:%p
        User ansible
        IdentityFile ~/.ssh/ansible
```
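If you would rather not touch ~/.ssh/config at all, another option is to attach the proxy command to the dynamic-inventory group itself via `ansible_ssh_common_args`. A minimal sketch, assuming your inventory script puts the VPC hosts into a group called `tag_Name_web` and the bastion's public IP is the 10.10.10.10 from the example above:

```yaml
# group_vars/tag_Name_web.yml -- the group name is an example; use whatever
# group your dynamic inventory script actually produces.
ansible_user: ansible
ansible_ssh_private_key_file: ~/.ssh/ansible
# Hop through the bastion; -W forwards the connection to the target host/port.
ansible_ssh_common_args: '-o ProxyCommand="ssh -i ~/.ssh/my_key.pem -W %h:%p foo@10.10.10.10"'
```

Since `ansible_ssh_common_args` is appended to every ssh/scp/sftp invocation for hosts in that group, it works the same whether the inventory is static or dynamic.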