fetch with backup (done with rsync)

All I wanted to do was to fetch files for OTAP purposes. Well …

I tried fetch, combining it with copying files to a backup folder and using fdupes (not installed by default). But I needed the backup files to include a timestamp. Well, you can imagine my playbook grew and grew with set_facts and what not (I did not give up without a fight). The playbook filled my screen with output I didn't need, and on top of it all, it did not work the way I wanted it to.

So I resorted back to old trusty rsync, and now I have a nice, simple single task in my playbook:

  • How are others implementing this?
  • Is anything missing, by any chance?

```yaml
vars:
  files_to_sync:
    - file1
    - file2
  fetchfiles: /root/ansible/fetchfiles

tasks:
  - name: sync file to master
    local_action: shell rsync -abci --backup-dir=backup --suffix=_$(date '+%Y%m%d.%H%M%S') {{ ansible_user_id }}@{{ ansible_fqdn }}:'{{ files_to_sync | join(" ") }}' {{ fetchfiles }}/{{ inventory_hostname }}
    register: mastersync
    changed_when: mastersync.stdout
```

I use the -c option of rsync because I dump SQL tables to temporary files, so the timestamp gets changed even when the content hasn't.
I use the -i option to itemize changes, so that actual changes are reported.
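With rsync's -b option, each file that would be overwritten is kept as a backup, and --backup-dir plus --suffix control where it goes and what it is called. A minimal shell sketch of how that timestamp suffix is produced (the filename here is made up for illustration):

```shell
# Sketch: the --suffix value from the task above. date runs once on the
# control machine, so all backups from one rsync pass share a timestamp.
suffix="_$(date '+%Y%m%d.%H%M%S')"
# A backed-up file then ends up named like <name>_<YYYYMMDD.HHMMSS>:
echo "table_dump.sql${suffix}"
```

Since the suffix is generated when the task runs, every rsync invocation creates a distinct set of backup names rather than overwriting the previous backup.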

Sorry, I don't understand what "useless information" is; you will need to be more specific.

I also don't understand the set_fact part of the above commentary, or what you wanted it to do when you say it didn't work like you wanted it to.

The synchronize module is available if you’d like a nice wrapper around rsync, though I’m not clear on what you want to do that it does not provide.

Let me know.

Thanks!

Sorry, let me elaborate :slight_smile:

I had a playbook that did:

  • set_fact to create a datetime stamp to add to my filename
  • stat the remote file
  • stat the local file
  • rename the local file if remote.md5 != localfile.md5
  • fetch the remote file

So for each file I would get 5 tasks of 4-5 lines each, scrolling over my screen and creating somewhat of a mess, especially since I want to fetch 20+ files. The reason I abandoned this path is that I couldn't include the playbook using with_items (a deprecated feature), and I didn't feel like rewriting my playbook because of the number of tasks repeated for each file. It was a fun exercise though, getting to know the nooks and crannies of Ansible.
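For context, the abandoned per-file pattern looked roughly like this. This is a hypothetical reconstruction, not my actual playbook: task names, paths, and the md5 comparison are illustrative only (the stat module in older Ansible exposes an md5 field when checksumming):

```yaml
# Hypothetical sketch of the 5 tasks that were repeated per file.
- set_fact:
    stamp: "{{ lookup('pipe', 'date +%Y%m%d.%H%M%S') }}"
- stat:
    path: /etc/app/file1                          # made-up remote path
  register: remote_file
- local_action: stat path=fetchfiles/{{ inventory_hostname }}/etc/app/file1
  register: local_file
- local_action: command mv fetchfiles/{{ inventory_hostname }}/etc/app/file1 fetchfiles/{{ inventory_hostname }}/etc/app/file1_{{ stamp }}
  when: local_file.stat.exists and remote_file.stat.md5 != local_file.stat.md5
- fetch:
    src: /etc/app/file1
    dest: fetchfiles/
```

Multiply this by 20+ files and the appeal of a single rsync task becomes obvious.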

Back to my use case. Normally Ansible works by pushing files to a "slave", and the copy module has a backup argument. However, for OTAP purposes I want to import data from one server and push it to another, and let the Ansible master machine work out the details when pushing configs and data to a new test machine.
So I needed the fetch module, but with a backup argument, which it doesn't have.

Synchronize is a nice module, but it works like copy, pushing from the control machine to the remote host; I wanted it in reverse.
Using local_action: synchronize syncs two paths on the control machine, according to the examples given.

So there you have it :slight_smile: neither of the synchronize use cases nor the fetch module covered my use case.

From the synchronize docs, here’s an example of how to make it synchronize to your central host:

```yaml
# Synchronization of src on delegate host to dest on the current inventory host
- synchronize: src=some/relative/path dest=/some/absolute/path
  delegate_to: delegate.host
```

In this case, "delegate_to: localhost" or equivalent.
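The synchronize module also has a mode parameter; mode=pull copies from the remote host back to the control machine, which is roughly "fetch with rsync". A hedged sketch (paths are made up for illustration, and note it does not take a backup argument either):

```yaml
# Hypothetical sketch: pull files from each inventory host back to the
# control machine. src is remote, dest is local when mode is pull.
- name: pull files back to the control machine
  synchronize:
    mode: pull
    src: /etc/app/config
    dest: /root/ansible/fetchfiles/{{ inventory_hostname }}
```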