local_action rsync

Further to that last post on recursive file copying:

I am now trying to get a directory of files copied onto a host via rsync (instead of copying over a single .zip bundle and then decompressing it and changing the permissions, which I can do, but it takes four tasks: (i) copy, (ii) unzip, (iii) change owner, (iv) delete the zip file, and it is quite awkward).

I have been playing around with this for a while now, trying to hit the magic rsync / sudo / ssh incantation to make this go smoothly, and I have had no luck so far.

So, as a simple task, I would like to install Sublime Text on a server under /opt/sublime2. I have the sources on my machine (as well as some config files, etc., all of which I would like to copy over), so the first step is getting the app over there.

So question (1), on rsync itself: from a normal shell command line, how do I rsync files into a protected directory via a user who has sudo privileges? I.e. I have root login disabled on the host machine, but a user on there called ‘jason’ who has sudo privileges. I have googled around for this but I must have missed how to do this…

So, failing to find out how to make (1) work, I created a target directory with completely open permissions as follows:

  • name: create sublime install directory
    file: path=/opt/sublime2/ state=directory owner=jason group=jason mode=777

Now I can rsync via the command line into this directory:

rsync --rsh='ssh' --delete -rua app/SublimeText2/ cupertino12043:/opt/sublime2/

And that works.

However, if I try to do this via Ansible with:

  • name: UpLoad the app

local_action: command rsync --rsh='ssh' --delete -rua app/SublimeText2/ {{ inventory_hostname }}:/opt/sublime2/

I get:

[arcbat:Installers/Linux/ansible/Setup] Setup 5(5) ⌘ ansible-playbook -i "host_cupertino.ini" cupertino.yml -K -v
sudo password:

PLAY [cupertino] **************************************************************

GATHERING FACTS ***************************************************************
ok: [cupertino12043]

TASK: [create sublime install directory] **************************************
changed: [cupertino12043] => {"changed": true, "gid": 1000, "group": "jason", "item": "", "mode": "0777", "owner": "jason", "path": "/opt/sublime2/", "size": 4096, "state": "directory", "uid": 1000}

TASK: [UpLoad the app] ********************************************************
failed: [cupertino12043] => {"failed": true, "item": "", "parsed": false}
invalid output was: Sorry, try again.
[sudo via ansible, key=nigpbfaxyragiumbhfmpsvuwlttixxih] password:
Sorry, try again.
[sudo via ansible, key=nigpbfaxyragiumbhfmpsvuwlttixxih] password:
Sorry, try again.
sudo: 3 incorrect password attempts

FATAL: all hosts have already failed -- aborting

PLAY RECAP ********************************************************************
to retry, use: --limit @/Users/jason/cupertino.retry

cupertino12043 : ok=2 changed=1 unreachable=0 failed=1

It looks like it is asking for my password even though it is passwordless from the command line. What am I missing here? Is there a good guide on how to do this canonically?
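One plausible explanation (a hedged guess, since the full play is not shown): if sudo: yes is set at the play level, Ansible wraps the local_action in sudo on the control machine as well, and that local sudo is what is prompting. A sketch of the workaround, reusing the task from above:

```yaml
- name: UpLoad the app
  # sudo: false keeps Ansible from wrapping this local command in sudo
  # on the control machine; the remote copy still lands in /opt/sublime2
  # because that directory was opened up by the earlier task.
  sudo: false
  local_action: command rsync --rsh='ssh' --delete -rua app/SublimeText2/ {{ inventory_hostname }}:/opt/sublime2/
```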

Naively, it would seem to me that synchronizing a directory with files in it onto a server is a fairly basic thing to want to do in a configuration management system. I guess I am missing something since so far this is decidedly not a smooth process for me! :)

Is there a beta version of the synchronize command which I could install? Any recommended plugin to make this smoother in the interim?

Thanks,
Jas

“So question (1) on rsync itself: From a normal shell command line, how do I rsync files into a protected directory via a user who has sudo privileges.”

rsync won’t help you with this unless you want to invoke rsync in the reverse direction (and can).

Let’s withhold discussion of what’s basic or not.

I’m not against recursive copy support, but rsync and packaging and git are going to be better solutions.

“So question (1) on rsync itself: From a normal shell command line, how do I rsync files into a protected directory via a user who has sudo privileges.”

rsync won’t help you with this unless you want to invoke rsync in the reverse direction (and can).

Hmmm… I saw some pages on this but none of it appeared simple and clean.

Let’s withhold discussion of what’s basic or not.

Hmm… maybe synchronization of a directory is an advanced feature of a configuration management system, then? I wouldn’t have thought so, but I guess it doesn’t matter what we call it. I just mean that I would think a lot of people would want this functionality. It seems strange that it is not yet available, even as of v1.3, so I guess the implementation must be highly non-trivial?

Maybe I am doing things the wrong way? If so, what is the correct way? Using a DVCS or something to store the files would definitely be a bit of a hassle during the development stage, and final deployment could be quite slow. That is, if you wanted to be clean about it and decided not to pull down the repos but just pull down snapshots of the tip/head, then synchronizing might be slow as well?

I’m not against recursive copy support, but rsync and packaging and git are going to be better solutions.

Is this due to the implementation being currently difficult to do, or something fundamental?

That said, has anyone written some sort of plugin to make this tidier?

It still feels like I am missing something here…

Thanks,
Jason

I’ve solved the ‘rsync to a protected directory from ansible’ problem a couple of ways now.

The easiest involves first turning off ‘requiretty’ in sudoers on the target machines (it’s on by default on RHEL/CentOS). On my systems I allow users in the group wheel to sudo without a password. Here’s how I set all that up:

# do this first so that early failure does not lock us out

  • name: ensure wheel users can sudo without a password
    lineinfile: >
      dest=/etc/sudoers
      state=present
      regexp="^%wheel"
      line="%wheel ALL=(ALL) NOPASSWD: ALL"

# wheel users do not sudo with passwords so tty is not required

  • name: tty not required by wheel users to sudo
    lineinfile: >
      dest=/etc/sudoers
      state=present
      regexp="^Defaults:%wheel"
      line="Defaults:%wheel !requiretty"
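As an aside (an untested sketch; validate support may need a newer lineinfile than the one in this thread): it is safer to let visudo check the result before it lands in /etc/sudoers, since a syntax error there can lock you out entirely:

```yaml
  • name: ensure wheel users can sudo without a password
    lineinfile: >
      dest=/etc/sudoers
      state=present
      regexp="^%wheel"
      line="%wheel ALL=(ALL) NOPASSWD: ALL"
      validate="visudo -cf %s"
```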

Now I can rsync directly to a protected directory using the rsync-path trick. Here’s an actual example:

  • name: deploy common files to target host
    sudo: false
    local_action: >
      command
      /usr/bin/rsync -Ca --stats
      --rsync-path="sudo -n rsync"
      --exclude="*.t*"
      --exclude=".*.swp"
      --include="/root"
      --include="/lib"
      --include="/script"
      --include="/script/init.sh"
      --exclude="/script/*"
      --exclude="/*"
      --delete
      --no-group
      --no-owner
      --chmod=u=rwX,go=rX
      {{ lookup('env','DEALMAX_SRC_PATH') }}/common/
      {{ inventory_hostname }}:/var/www/dealmax/common
    register: rsync
    changed_when: "'Number of files transferred: 0' not in rsync.stdout"
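For what it’s worth, the same --rsync-path trick answers question (1) from a plain shell as well; this one-liner is an untested sketch using the host and paths from earlier in the thread:

```shell
# Run rsync on the remote end under sudo; -n makes sudo fail outright
# instead of prompting, so this needs NOPASSWD sudo for the login user.
rsync -rua --delete --rsync-path="sudo -n rsync" \
    app/SublimeText2/ jason@cupertino12043:/opt/sublime2/
```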

Prior to getting the requiretty change in place, I was running a two-step process: one ‘local_action: rsync’ to stage the files in the sudo user’s home directory on the target system, then a second ‘command: rsync’ on the target to deploy the files. If you want the details, let me know and I can dig the old playbook out of git.
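For reference, the two-step process described above might look roughly like this (a sketch reconstructed from the prose, not the original playbook; the task names and staging paths are illustrative):

```yaml
# Step 1: stage the files in the sudo user's home directory (no sudo needed).
- name: stage the files on the target host
  sudo: false
  local_action: command rsync -Ca --delete src/common/ {{ inventory_hostname }}:staging/common/

# Step 2: deploy from the staging area into the protected directory under sudo.
- name: deploy the staged files into the protected directory
  sudo: true
  command: rsync -Ca --delete /home/jason/staging/common/ /var/www/dealmax/common/
```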

Hope this helps.

K

Hi Kahlil,

Thanks! I got this to work using rsync. But my next question: isn’t this a security problem? Wouldn’t it be better to turn off the password requests a bit more selectively?

Or do you add these lineinfile tasks at the beginning of your playbook and then remove them at the end?

Update 1: I had this going for a bit but now can’t seem to get it back. It all feels a little “un-robust” somehow. Still, I do appreciate the suggestion!

Also, with rsyncing to a staging directory, then rsyncing on the final machine to the target directory, and then cleaning up the staging directory, things won’t be idempotent… Still, it is better than nothing…

So are there any at least semi-stable plugins out there that automate this a little better?

Thanks!
Jas

Thanks! I got this to work and use rsync. But my next question though
isn't this a security problem? Wouldn't it be better to turn off the
password requests a bit more selectively?

In general, it would be a security problem, but if you do it right, it can
be good. This is part of a global security policy, extending well beyond
this particular case. If I get a chance this week, I'll start a thread
explaining and soliciting feedback/critique.

Or do you add these lineinfile's at the beginning of your playbook and
then remove them at the end?

Interesting idea, and I do something similar elsewhere, but no, not in this
case.

Update1. I had this going for a bit but now can't seem to get it back. It
all feels a little "un-robust" somehow. Still I do appreciate the
suggestion!

I've used this approach many times in different contexts without issue,
but it does require getting a number of pieces exactly right. A good case
for any python hackers out there to help with the synchronize module :)

Also with rsyncing to a staging directory and then rsynching on the final
machine to the target directory and then cleaning up the staging directory
things won't be idempotent... Still it is better than nothing...

Why clear out the staging directory? You are probably going to run the
sync again at some point, so unless you have very tight disk requirements,
just leave the staged content there. Either way, it will still be
idempotent: you get the same result no matter how many times you run the
play (without changing the source).

K

Kahlil (Kal) Hodgson GPG: C9A02289
Head of Technology (m) +61 (0) 4 2573 0382
DealMax Pty Ltd (w) +61 (0) 3 9008 5281

Suite 1415
401 Docklands Drive
Docklands VIC 3008 Australia

"All parts should go together without forcing. You must remember that
the parts you are reassembling were disassembled by you. Therefore,
if you can't get them together again, there must be a reason. By all
means, do not use a hammer." -- IBM maintenance manual, 1925

It seems better to just give the users ACLs on the content you need to rsync to.
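That suggestion might look like the following (an untested sketch; it assumes the acl tools are installed and the filesystem is mounted with ACL support):

```shell
# Grant 'jason' write access to the tree without making it world-writable.
sudo setfacl -R -m u:jason:rwX /opt/sublime2
# Also set a default ACL so files created later inherit the same access.
sudo setfacl -R -d -m u:jason:rwX /opt/sublime2
```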