How do I get the Ansible plugin to work with Jenkins?

My Ansible (version 1.9.5) installation is on its own dedicated CentOS 7 server. My Jenkins (version 1.6) installation is on its own dedicated CentOS 7 server. I installed the Ansible plugin for Jenkins and created a New Item that invokes an ad-hoc Ansible command. In the Ansible Installation field, I entered the DNS name of the Ansible server. For the host pattern, I chose the group of servers that I want the Ansible playbook to run against. The group name is defined in the …/ansible/hosts file.

The console output of this new Jenkins job that should invoke an ansible command (on a separate server with Ansible installed) says this:

“Building in workspace /var/lib/jenkins/jobs/… $ sshpass ****** /ansible … DNSnameOfAnsibleServer -i … FATAL: command execution failed hudson.AbortException: Ansible Ad-Hoc command execution failed at org.jenkinsci.plugins.ansible.AnsibleAdHocCommandBuilder.perform(AnsibleAdHocCommandBuilder.java:176) at …”

Does the Ansible server need sshpass? The Jenkins server has sshpass.

How do I get Jenkins to invoke an Ansible playbook on a separate Ansible server? This error makes me think something is wrong with sshpass. I can view the sshpass man page, so it is installed. Is there a particular version that I need?

Hi Kiran,

I suspect that the Ansible plugin for Jenkins only looks for a locally installed Ansible binary. See the example at the plugin page: https://wiki.jenkins-ci.org/display/JENKINS/Ansible+Plugin
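If you go that route, the simplest fix is to install Ansible on the Jenkins machine itself so the plugin can find the binary. A sketch, assuming CentOS 7 with the EPEL repository enabled:

# on the Jenkins server
sudo yum install -y ansible
which ansible    # use this path when configuring the plugin's Ansible installation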

Cheers,

//Jon

With the setup you describe, you have essentially two Jenkins-aware options.

The first, which is closer to what you are currently attempting, is to give Jenkins SSH access to the Ansible server. In that case you will also need one of the plugins that lets Jenkins execute SSH commands on a different node (ssh-plugin or similar) and use it instead of the standard shell command box. Note that the remote node has no access to anything that lives on the Jenkins server; it only sees what exists on the Ansible server unless you explicitly copy files over (it's just SSH, after all).

The second approach, which is the one that will probably behave as you expect, is to make the Ansible server a Jenkins slave and configure all Ansible jobs to execute on that slave with the standard shell command box. You can then drop the SSH plumbing and run the slave as a user capable of running Ansible; the security implications are not really greater than granting SSH access (except perhaps firewall ports). In this setup the workspace lives on the Ansible (slave) server, but that is transparent to you because everything else is stored on the Jenkins (master) server.
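For the second option, once the Ansible server is connected as a slave (say, with the label "ansible"), you restrict the job to that label and run the playbook from the standard shell command box. A minimal sketch, where the label, inventory path and playbook name are all assumptions:

# shell build step, executed on the slave labelled "ansible"
ansible-playbook -i /etc/ansible/hosts site.yml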

Hi Kiran,

I am having the same issue… could you please help me resolve this?

I just dealt with this last week. With the caveat that every time I
use Jenkins I feel like I've been assaulted by aliens, here's the only
solution I found that actually worked:

- the jenkins slave has a passwordless ssh key provided to it that
*only* enables ssh to the ansible box. It cannot be used elsewhere in
the infrastructure. The key lives in $HOME of the jenkins user,
$HOME/.ssh/id_ed25519 (or id_rsa, whichever key type you use), as
putting the private key directly into the Jenkins config did not work.

ssh-keygen -o -t ed25519 -f /home/jenkins/.ssh/id_ed25519 -C "jenkins@example.org"

- the ansible server has an authorized_keys file that restricts the
above key to run a pre-selected command with one parameter

# /home/ansible/.ssh/authorized_keys (the entry is a single line in the real file)
command="/home/ansible/src/ansible/jenkins.sh $SSH_ORIGINAL_COMMAND",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-ed25519 <pubkey> jenkins@example.org

- the jenkins job calls ssh and passes in one parameter, which is
handed through to ansible

# Jenkinsfile

    stage('Deploy') {
      steps {
        echo 'Deploy step ...'
        script {
          switch(env.BRANCH_NAME){
            // break matters here: Groovy switch statements fall through without it
            case "master" : sh 'ssh ansible@example.org production'; break
            // pass "development" so it matches the case statement in jenkins.sh below
            case "develop" : sh 'ssh ansible@example.org development'; break
            default : echo "... ignored in this branch"
          }
        }
      }
    }

- this is the script spawned by ssh

#!/bin/sh -xe
# /home/ansible/src/ansible/jenkins.sh (shebang must be the first line of the real file)
# restricted command for ssh to run ansible via make targets
cd /home/ansible/src/ansible
case $1 in
    production) make production ;;
    development) make development ;;
esac

- there is a Makefile in the root of the ansible dir that picks up the
parameter and uses it as a Makefile target (recipe lines below must be
tab-indented in the real file)

# /home/ansible/src/ansible/Makefile

clean::
  @git reset --hard
  @git clean -fdx
  @git pull --ff-only

production:: clean
  ansible-playbook app.yml --limit prod,lb --diff
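Since jenkins.sh also calls make development, the real Makefile
presumably has a matching target. A sketch of what it could look like
(the --limit group is an assumption):

development:: clean
  ansible-playbook app.yml --limit dev --diff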

There are also ssh_config and ansible.cfg files that specify ports,
the private key to use, and a few other useful parameters.
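On the jenkins side, the ssh_config entry might look something like
this (user, host, port and key path are assumptions based on the
examples above):

# /home/jenkins/.ssh/config
Host example.org
    User ansible
    Port 22
    IdentityFile /home/jenkins/.ssh/id_ed25519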

There are a few things you can do to tighten security, making sure that
there is no way for jenkins to do anything other than ssh in and run
make (see the sketch after this list):

- make jenkins.sh immutable
- move the authorized_keys file to somewhere like /etc/ssh/ to ensure
the command restriction cannot be trimmed
- find a better way to handle the ansible-side ssh key. I use
HashiCorp's Vault for that, but that's a story for another day
- get the Jenkins HashiCorp plugin to work; it doesn't for me
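The first two items could look like this (the sshd_config pattern is a
common convention, not something from the setup above):

# on the ansible server, as root: make the script immutable
chattr +i /home/ansible/src/ansible/jenkins.sh

# in /etc/ssh/sshd_config: read keys from a root-owned directory instead of $HOME
AuthorizedKeysFile /etc/ssh/authorized_keys/%u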

I'd really like to tell ansible to display output in ANSI colour even
though it's running as a background task.

A+
Dave

You can
export ANSIBLE_FORCE_COLOR=TRUE
and, if you have the AnsiColor plugin installed in your Jenkins, you will see the colours in the Console Output of your jobs.
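In the setup above, the variable needs to be set on the ansible side (for example at the top of jenkins.sh), since ssh will not forward it by default. On the Jenkins side, the Pipeline step provided by the AnsiColor plugin can wrap the call (a sketch):

// 'xterm' selects the colour map
ansiColor('xterm') {
    sh 'ssh ansible@example.org production'
}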
Jon