Custom ssh_config Not Working As Expected

I am having some issues with a custom ssh_config file and not sure if I’m doing something wrong, if it’s an ssh bug, or if it’s an Ansible bug.

I currently have a custom ssh_config file placed next to my ansible.cfg file. In the ansible.cfg file I have the line ssh_args = -F ssh_config under the [ssh_connection] section. Inside my ssh_config file is the following:

  Host bastion
    HostName xxx.xxx.xxx.xxx
    User ubuntu
  Host app01
    HostName xxx.xxx.xxx.xxx
    ProxyCommand ssh bastion nc %h %p
    User ubuntu
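
For reference, the [ssh_connection] section of my ansible.cfg contains just:

  [ssh_connection]
  ssh_args = -F ssh_config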

If I then run the command ssh app01 -F ssh_config I receive the following error:

  ssh: Could not resolve hostname bastion: nodename nor servname provided, or not known

However, if instead of referencing the ssh_config file I place the same contents into ~/.ssh/config and run the command ssh app01, all is well and it is able to resolve the hostname bastion.

Is there a setting I am missing somewhere or is this potentially a bug in either ssh or Ansible?

Thanks for your help and if you need any more info, please ask!

  • James

Could this possibly be because ssh can't find your specified ssh_config relative to the CWD you're executing ansible from?

Maybe try fully pathing it for starters?
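
i.e. something along these lines in ansible.cfg (the path here is just a placeholder):

  [ssh_connection]
  ssh_args = -F /full/path/to/ssh_config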

I tried fully pathing it to no avail. It does seem to be picking up the custom ssh_config, because if I change the ProxyCommand from ssh bastion nc %h %p to ssh ubuntu@x.x.x.x nc %h %p then it works fine. It's almost like it's not referencing the custom ssh_config for the ProxyCommand and is defaulting back to the config in ~/.ssh or /etc/ssh.

Hi

That makes sense.

When you run "ssh app01 -F ssh_config", it will obviously use the given SSH config file. But in order to connect to app01, it needs to run the ProxyCommand, and the ssh it spawns there will use the *default* ssh config file, not the one in the current directory.

As far as SSH is concerned, the ProxyCommand is simply a shell command - even if that command happens to invoke ssh.
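
One way around that - just a sketch, assuming the relative path still resolves from wherever the outer ssh is invoked - is to pass -F explicitly inside the ProxyCommand:

  Host app01
    HostName xxx.xxx.xxx.xxx
    ProxyCommand ssh -F ssh_config bastion nc %h %p
    User ubuntu

Using an absolute path after -F avoids depending on the working directory at all.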

Thanks for your reply, Karl!

So given that is the case, is there a way to get it to behave the way I'm expecting when using a custom ssh_config file?

On a separate but related note, I’m trying to use a wildcard entry in my ssh_config for various servers. So for example, my inventory file has the entry app01-stg ansible_ssh_host=10.0.11.195 and my ssh_config file is as follows:

  Host app*-stg
    ProxyCommand ssh ubuntu@... nc %h %p
    User ubuntu
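
For context, the relevant line in inventory/staging.ini looks like this (the group name is just illustrative):

  [staging]
  app01-stg ansible_ssh_host=10.0.11.195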

When I then run the command

  ansible all -m ping -i inventory/staging.ini -l app01-stg

I receive the error

  app01-stg | FAILED => SSH Error: data could not be sent to the remote host. Make sure this host can be reached over ssh.

However, all works fine if I remove the ansible_ssh_host address from the inventory file and rerun the command with the following ssh_config file:

  Host app01-stg
    HostName 10.0.11.195
    ProxyCommand ssh ubuntu@... nc %h %p
    User ubuntu

Thanks in advance!!

  • James