ssh setup questions

Hi all,
I'm just getting started with ansible and need to run my
client's ssh daemon on a different port (let's say 222). I tried this
against the master and integration branches. It looked like I could just
edit lib/ansible/constants.py and set "DEFAULT_REMOTE_PORT = 222",
but when I do that, I end up with the following stacktrace (when it's
set to 22 it works fine):

client_system | FAILED => Traceback (most recent call last):
  File "/ansible/ansible/lib/ansible/runner.py", line 534, in _executor
    (host, ok, data, err) = self._executor_internal(host)
  File "/ansible/ansible/lib/ansible/runner.py", line 573, in _executor_internal
    result = self._execute_normal_module(conn, host, tmp, module_name)
  File "/ansible/ansible/lib/ansible/runner.py", line 346, in _execute_normal_module
    module = self._transfer_module(conn, tmp, module_name)
  File "/ansible/ansible/lib/ansible/runner.py", line 197, in _transfer_module
    outpath = self._copy_module(conn, tmp, module)
  File "/ansible/ansible/lib/ansible/runner.py", line 643, in _copy_module
    conn.put_file(in_path, out_path)
  File "/ansible/ansible/lib/ansible/connection.py", line 161, in put_file
    sftp = self.ssh.open_sftp()
  File "/usr/lib/python2.6/site-packages/paramiko/client.py", line 399, in open_sftp
    return self._transport.open_sftp_client()
  File "/usr/lib/python2.6/site-packages/paramiko/transport.py", line 828, in open_sftp_client
    return SFTPClient.from_transport(self)
  File "/usr/lib/python2.6/site-packages/paramiko/sftp_client.py", line 105, in from_transport
    chan.invoke_subsystem('sftp')
  File "/usr/lib/python2.6/site-packages/paramiko/channel.py", line 240, in invoke_subsystem
    self._wait_for_event()
  File "/usr/lib/python2.6/site-packages/paramiko/channel.py", line 1084, in _wait_for_event
    raise e
SSHException: Channel closed.

The other thing I ran across is that I seem to need to be running an
ssh-agent even if I'm using passwordless keys... if I'm not using an
agent, I get the following:

client_system | FAILED => FAILED: Private key file is encrypted

Is that expected? I guess maybe paramiko is expecting to find an
agent with a set of keys added that it can try for authentication? It
would be nice if I could specify a static key to use somehow (e.g.
/root/.ssh/ansible) and have paramiko connect with that instead of
requiring an agent...
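
In case it helps anyone else reading this: plain paramiko can already do
what I'm after. Something like the following works when I test it by hand
(host name, port, and key path are just placeholders for my setup):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# authenticate with an explicit private key instead of an ssh-agent
ssh.connect('client_system', port=222, username='root',
            key_filename='/root/.ssh/ansible',
            allow_agent=False, look_for_keys=False)
stdin, stdout, stderr = ssh.exec_command('uptime')
print stdout.read()
ssh.close()

So I'm guessing it's mostly a question of how ansible decides which key to
hand to paramiko.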

thanks!
matt

Matt,

If you pass --ask-pass (or -k) to ansible on the CLI, it will prompt you
for the SSH password. While you don't need to run an ssh-agent, it surely
does help. Also realize that ansible is trying to log in as the root user,
unless you specify "-u <user>" as well. I had a headache for a short while
not realizing it was going in as root :wink: when everything was set up
correctly on my end with keys, agents, etc.

I want to say if I remember correctly that running ansible over different
SSH ports is currently only in the integration branch (0.3) and you specify
the port in the inventory file when you declare the host, i.e.
host.example.com:222
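
So your inventory entry would look something like this (the group name
here is just an example):

[webservers]
host.example.com:222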

Dave

Hi Dave,
Thanks for the reply!

Matt,

If you pass --ask-pass (or -k) to ansible on the CLI, it will prompt you
for the SSH password. While you don't need to run an ssh-agent, it surely
does help. Also realize that ansible is trying to log in as the root user,
unless you specify "-u <user>" as well. I had a headache for a short while
not realizing it was going in as root :wink: when everything was set up
correctly on my end with keys, agents, etc.

The -k option was not what I was looking for since I've already set up
a passwordless keypair to let me authenticate. It looks like -k
forces password auth and bypasses pubkey auth altogether. I'm happily
using ssh-agent now, but having to use an ssh-agent when I've already
set up a passwordless pubkey seems unnecessary (maybe there's
something I'm missing).

I want to say if I remember correctly that running ansible over different
SSH ports is currently only in the integration branch (0.3) and you specify
the port in the inventory file when you declare the host, i.e.
host.example.com:222

Cool, thanks for the pointer. That did indeed work on the command line
(./ansible client_system:222 -m ping). It occurred to me that the
host:port syntax may make more sense in the hosts file than on the
command line; when someone specifies a different port, I think most of
the time they intend it to be permanent, and if they have to specify
it on the command line they lose the ability to use "./ansible all",
for example, and still hit that alternate port. That said, I just saw
the note from Michael about Rodney Quillo's patch to have Ansible read
an ssh_config file, so I went in that direction as it solves the
problem in an even better way. It worked as expected except that I had
to cast the port as an int (submitted a pull request for the one-line
fix).

Finally, for anyone in the future who's reading this thread because
they saw a stacktrace similar to the one I posted earlier: it happened
because I had failed to enable the sftp subsystem on my alternate sshd
running on port 222. whoops :slight_smile:
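
If anyone hits the same thing, the fix is just to give the second sshd an
sftp Subsystem line in its config, something like the following (the
sftp-server path varies by distro; /usr/libexec/openssh/sftp-server is the
RHEL/CentOS location, Debian uses /usr/lib/openssh/sftp-server):

# sshd_config for the sshd instance listening on port 222
Port 222
Subsystem sftp /usr/libexec/openssh/sftp-server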

matt

Hi Matt,

Thanks for testing and patching the .ssh/config :slight_smile:

Cheers,
Rodney

Fix now merged, if people want to try the latest on devel.

Thanks folks!

–Michael


Exactly what I was looking for, and it works!

-Erno

On Tuesday, 24 April 2012 at 15:03:24 UTC+3, Michael DeHaan wrote:

Hi guys! I'm new with Ansible so I have to ask. Currently I'm playing with playbooks, and it looks like they don't use .ssh/config when I run a playbook.
I have specified my ssh key and user in .ssh/config, and an ad-hoc command works fine, but in a playbook I had to give the user value explicitly.
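
For example, my ~/.ssh/config has something along these lines (the user
name and key path here are made up):

Host *
    User deploy
    IdentityFile ~/.ssh/ansible_key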

Thanks!

-Erno

On Tuesday, 24 April 2012 at 19:39:07 UTC+3, Erno Aapa wrote:

Hi Erno,

I think playbooks set the user primarily to root (if user is not set
inside the playbook yml file)
without first checking the contents of ~/.ssh/config.

I've tested it, and this might be a new feature or a bug... Hmm, well,
it depends on how the maintainers see it. :slight_smile:
Can you please file it in github issues?

Thanks,
cocoy

Note: This email pertains to the devel branch only. There is no SSH Config file reading on master.

Hi Erno,

I think playbooks set the user primarily to root (if user is not set
inside the playbook yml file)
without first checking the contents of ~/.ssh/config.

Hardly a bug, since the SSHConfig reading stuff is crazy new :slight_smile:

I’m actually wondering if we really WANT the SSHConfig reading stuff at all.

Important: I want to make sure the DEFAULT user is set from SSHConfig, but it does NOT override any preferences set by -u or the "user: " line in a playbook.

I've tested it, and this might be a new feature or a bug... Hmm, well,
it depends on how the maintainers see it. :slight_smile:
Can you please file it in github issues?

cocoy, if you like I can assign you to the project so that you can request tickets to be assigned to you.

That goes for anyone else who would like this; email me your github ID if you want it.

(It does not, however, give commit access… as we still review those… but it’s quite useful)

Hardly a bug, since the SSHConfig reading stuff is crazy new :slight_smile:

Hmm.. crazy indeed.

I'm actually wondering if we really WANT the SSHConfig reading stuff at all.

Michael, exactly the same question I want to raise. :slight_smile:

I agree sourcing an ssh config may not be the long term way to do it,
but we want to be able to control various parameters around ssh
behavior, right? In my case, I need a way to have ansible connect to
sshd on a port other than 22. I don't think that sourcing a user's
default ssh config is necessarily appropriate for configuration
management... I have an ssh config that I use that I _don't_
want ansible using, so for my requirements I've hacked the feature to
read the static file /etc/ansible/ssh_config. Maybe that's an OK
short-term solution to support this flexibility without unintended
consequences for people who don't want it?

One nice thing about the ssh_config implementation is that it can set
global and host-specific behavior in a standard way that everyone
already knows. In Ansible, we have the hosts file, but I don't think
ssh settings really fit into that model, and I don't think there is
currently support for global settings there either (I may be wrong?).
Maybe a larger question here is how to flexibly override Ansible's
default behavior...
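
To make that concrete, the kind of thing I'm putting in my static
/etc/ansible/ssh_config is roughly this (host name is made up, and I
haven't checked which options beyond Port the parser honors yet):

Host *
    Port 222
    IdentityFile /root/.ssh/ansible

Host legacy.example.com
    Port 22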

matt

Hardly a bug, since the SSHConfig reading stuff is crazy new :slight_smile:

Hmm.. crazy indeed.

I'm actually wondering if we really WANT the SSHConfig reading stuff at all.

Michael, exactly the same question I want to raise. :slight_smile:

I agree sourcing an ssh config may not be the long term way to do it,
but we want to be able to control various parameters around ssh
behavior, right? In my case, I need a way to have ansible connect to
sshd on a port other than 22.

YAML host file already does this.

ansible_ssh_port variable

or in the INI format file

host:port
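
For example (host name illustrative; see the inventory docs for the exact
YAML layout):

# INI format
host.example.com:222

# YAML format
- host: host.example.com
  vars:
    ansible_ssh_port: 222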

I don't think that sourcing a user's
default ssh config is necessarily appropriate for configuration
management... I have an ssh config that I use that I _don't_
want ansible using, so for my requirements I've hacked the feature to
read the static file /etc/ansible/ssh_config. Maybe that's an OK
short-term solution to support this flexibility without unintended
consequences for people who don't want it?

Sounds confusing to me.

I'd rather rip the SSH config file stuff out and make things use
Ansible's own host file.

One nice thing about the ssh_config implementation is that it can set
global and host-specific behavior in a standard way that everyone
already knows.

Not everyone :slight_smile: I definitely don't want it to be the only way, and if
it overrides things configured in Ansible,
that would be confusing.

In Ansible, we have the hosts file, but I don't think
ssh settings really fit into that model, and I don't think there is
currently support for global settings there either (I may be wrong?).
Maybe a larger question here is how to flexibly override Ansible's
default behavior...

It supports groups though, which is a pretty nice way to assign things.

Override how?

Hardly a bug, since the SSHConfig reading stuff is crazy new :slight_smile:

Hmm.. crazy indeed.

I'm actually wondering if we really WANT the SSHConfig reading stuff at all.

Michael, exactly the same question I want to raise. :slight_smile:

I agree sourcing an ssh config may not be the long term way to do it,
but we want to be able to control various parameters around ssh
behavior, right? In my case, I need a way to have ansible connect to
sshd on a port other than 22.

YAML host file already does this.

ansible_ssh_port variable

Sorry, I had missed this; I see it's right there in the documentation
about the inventory file format...

So, I had incorrectly assumed that variables in the YAML host file
were only custom things passed in by the user for use in templates,
etc., and that was the source of my comment that 'I don't think ssh
settings really fit into that model'. Thinking about it a bit more,
it seems reasonable, though, and it answers my question about overriding
Ansible's default behavior; as requirements to override other settings
come up, they can be turned into variables read from the inventory.
Out of curiosity, is there anything other than ansible_ssh_port at
this point that we can set in the inventory to override defaults (is
looking through inventory.py for _set_variable calls the correct way
for me to figure this out for myself)?

The only thing I think I'm missing at this point is how I'd set
ansible_ssh_port (or any variable) globally within the YAML inventory,
instead of having to apply it individually to each system or group.

Ok, I can't say anything about the technical stuff because I'm not so familiar with Ansible… yet :).

But I just want to point out that I think many users in the future will expect their .ssh/config settings to work when they use Ansible, or at least that's what I expected :slight_smile:

People have old servers and have already set up ssh to use a specific user and key file, and already have their public key on the target server. It would be a super fast start with Ansible if the user doesn't need to do anything other than add the host to the ansible hosts file. I think the step to test and start using Ansible would be super small, and that's one key feature of Ansible, because the user doesn't need any agent installation etc., just ssh access.
Definitely this same configuration should also be available via the Ansible host/config file, so Ansible is not dependent on the ssh config.

Just a reminder that users might have a different ssh key, a different user, and a different password for each host.

-Erno

On Wednesday, 25 April 2012 at 19:03:30 UTC+3, Matt Coddington wrote:

–Michael

Ok, I can't say anything about the technical stuff because I'm not so familiar with Ansible… yet :).

But I just want to point out that I think many users in the future will expect their .ssh/config settings to work when they use Ansible, or at least that's what I expected :slight_smile:

I can’t help users who don’t RTFM, but we ARE reading some variables from SSH config WRT key file, etc.

Reading "User" doesn't make sense because we're not interacting with systems via specific user accounts.

Since this is ridiculously easy to specify in each playbook play ("user:"), this is not really a problem.

People have old servers and have already set up ssh to use a specific user and key file, and already have their public key on the target server. It would be a super fast start with Ansible if the user doesn't need to do anything other than add the host to the ansible hosts file. I think the step to test and start using Ansible would be super small, and that's one key feature of Ansible, because the user doesn't need any agent installation etc., just ssh access.
Definitely this same configuration should also be available via the Ansible host/config file, so Ansible is not dependent on the ssh config.

Just a reminder that users might have a different ssh key, a different user, and a different password for each host.

I'm not ripping SSH config out; we haven't done that. What we have done is selectively read parts of it that are relevant to ansible.

This includes the port, user, and key pair.