How to encrypt the ssh_pass password without being asked for it every time

Hello Team,

How can I encrypt the ssh_pass password so I am not asked for it any more? I don't want to type the password every time I run an ad hoc command like

ansible -i hosts.yaml all -m shell -a "ulimit -a". I don't want to put in the password every time; I just want to hide, obscure, or salt the password below.

hosts.yaml has the following [all:vars] section:

[all:vars]

ansible_ssh_common_args='-o StrictHostKeyChecking=no'

ansible_connection=ssh

ansible_port=22

ansible_user=sam

ansible_ssh_pass=abc@123

Now I want to hide the ansible_ssh_pass variable, or encrypt/salt this value. That's it.

How do I do that? I'm OK even if I store it in plaintext somewhere else and reference it here.

I'm fine even if it's base64; it just should not be in plain text in hosts.yaml.

I apologize for this type of response, because they bug me a lot. It’s when someone says, “I want to do X. How do I do X?” And the responder says, “You should want Y!”

I don’t see an option in Ansible to read the ssh_pass from a file. However, native ssh itself can be made to run the script named in the SSH_ASKPASS environment variable, using its output as the ssh password. Perhaps that can be made to do what you describe. I haven’t done that myself. I don’t know if the paramiko ssh implementation supports that either.
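
To illustrate that SSH_ASKPASS idea, here is an untested sketch (the script and password-file paths are made up for illustration, and I have not verified it with Ansible):

#!/bin/sh
# /usr/local/bin/askpass.sh (hypothetical path): print the ssh password on stdout.
# Keep both this script and the password file readable only by your user.
cat /root/.ssh_pass_file

# In the shell that runs ssh:
export SSH_ASKPASS=/usr/local/bin/askpass.sh
export DISPLAY=:0                     # ssh only falls back to SSH_ASKPASS when DISPLAY is set
setsid ssh sam@somehost 'ulimit -a'   # setsid drops the controlling tty so ssh cannot prompt on it
# OpenSSH 8.4+ can force this instead with: export SSH_ASKPASS_REQUIRE=force

Whether Ansible's ssh connection plugin picks that environment up cleanly is, as said, untested.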

But what you describe — “I don’t want to type a password every time…” — is better achieved by setting up and using ssh keys. I encourage you to look into that solution.

Picky unrelated stuff:

  • Your “hosts.yaml” inventory isn’t a YAML file. Looks like the “ini” format to me.
  • People seem to enjoy sticking host variables into inventories. There are other ways to associate such data with hosts. My personal preference is to limit inventories to listing hosts and associating them through groups, and to put everything else somewhere else. But maybe that’s just me.

According to the conversation at http://forum.ansible.com/t/ansible-inventory-as-json-and-inline-vaulted-data-does-not-work-works-with-yaml-inventory/2909,
true .yaml inventories permit vaulted values, so that should work for your requirements.
You’ll need to convert your inventory from .ini format to .yml and vault your password with “ansible-vault encrypt_string …”.
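
A rough sketch of what that conversion could look like (the host names are placeholders, and the vaulted block is whatever ansible-vault encrypt_string prints for your password):

all:
  hosts:
    host1:
    host2:
  vars:
    ansible_connection: ssh
    ansible_port: 22
    ansible_user: sam
    ansible_ssh_common_args: '-o StrictHostKeyChecking=no'
    # Replace the lines below with the output of:
    #   ansible-vault encrypt_string 'abc@123' --name 'ansible_ssh_pass'
    ansible_ssh_pass: !vault |
      $ANSIBLE_VAULT;1.1;AES256
      ...ciphertext placeholder...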

Use an SSH key; then no password is needed. A quick sketch of that setup follows.
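
This is roughly what the key-based setup looks like (the key path and host name are just examples):

ssh-keygen -t ed25519 -f ~/.ssh/id_ansible          # generate a key pair, optionally with a passphrase
ssh-copy-id -i ~/.ssh/id_ansible.pub sam@host1      # install the public key on each managed host
ansible -i hosts.yaml all -m shell -a 'ulimit -a' --private-key ~/.ssh/id_ansible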

Alternatively, create a separate vars file, P_vars or something,
with p_ansible_ssh_pass: yourpassword in it,
and encrypt that file with ansible-vault.

Then reference that in your [all:vars]:

ansible_ssh_pass={{ p_ansible_ssh_pass }}

You can then see the file and the non-secure vars without having to decrypt the vault, while still seeing that a password stored in the vault is used and what its name is.
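
A sketch of that layout (the group_vars location and file name are my assumptions, not something prescribed above):

ansible-vault create group_vars/all/p_vars.yml      # opens an editor; put the line below in it
#   p_ansible_ssh_pass: abc@123

# The ini inventory stays readable; only the indirection is visible:
# [all:vars]
# ansible_ssh_pass={{ p_ansible_ssh_pass }}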

You then have the problem that the Ansible Vault password is needed every time you run your playbooks or ad hoc commands, but you can include a reference to it in your .ansible.cfg and have it point at a file somewhere on your machine that isn’t included in your source control and is protected so only your user can read it.
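
The ansible.cfg setting for that is vault_password_file; a sketch (the path is an assumption):

# ~/.ansible.cfg
[defaults]
vault_password_file = ~/.ansible_vault_pass      # keep this outside source control, chmod 600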

Here is how I do it. I think this is what you want.

The way I did it is using an SSH keyring. In ansible.cfg I set private_key_file = /etc/ansible/id_rsa_ansible.
Next, share the public key out to all clients; then on the Ansible server I created a shell script that invokes
#!/usr/bin/env bash
ssh-agent /usr/bin/sshinit.opt

In my opt file I have a simple if clause:

if [ -s "/etc/ansible/id_rsa_ansible" ]; then
    # Key file exists and is non-empty: load it into the agent
    /usr/bin/ssh-add /etc/ansible/id_rsa_ansible
    echo "SSH KEY added and initialized"
    cd /etc/ansible && $SHELL
else
    echo "id_rsa_ansible was not found or is zero size."
fi
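
For reference, the ansible.cfg entry mentioned above sits under [defaults]:

# /etc/ansible/ansible.cfg
[defaults]
private_key_file = /etc/ansible/id_rsa_ansible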

So on my Ansible control node (server) I run sshinit.
It prompts me for the private key password I created; after that I can log in or run any Ansible command or playbook with no password prompts, once the public key is in each client's .ssh/authorized_keys file.
If I want to schedule a cron job that runs a playbook on a bunch of clients, I have to export the SSH_AUTH_SOCK environment variable in the cron job's playbook script.

env | grep -i ssh
SSH_CONNECTION=10.50.10.22 51630 10.50.3.71 22
SSH_AUTH_SOCK=/tmp/ssh-FwQS5lko0os0/agent.9366
SSH_AGENT_PID=9367
SSH_CLIENT=10.50.10.22 51630 22
SSH_TTY=/dev/pts/0
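
A minimal sketch of such a cron wrapper (the socket path is the one from the env output above and changes each time the agent starts; the playbook name is an assumption):

#!/usr/bin/env bash
# Reuse the agent started by sshinit so its loaded key is available to ansible
export SSH_AUTH_SOCK=/tmp/ssh-FwQS5lko0os0/agent.9366
cd /etc/ansible && ansible-playbook site.yml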

SSH_ASKPASS also requires DISPLAY (a working X11 session), so it is not really a viable option for Ansible, as it is often run from a configuration server/instance that rarely has X11 installed.

But if you have such an environment you should be able to set it up that way.