Fetching directories

Usually Ansible is used for managing remote hosts, but I thought I’d use it to “monitor” some hosts I have. In normal circumstances I’d just loop over a list of hosts and run a simple script over ssh, but I have quite an annoying setup (access via a jump host, different authentication methods on different hosts), which makes automating the task with Ansible quite a reasonable choice (since I’m using Ansible anyway, I can also gather facts, which helps even more, but that’s out of scope here).
But the issue is: I need to fetch some configuration directories from remote hosts and have them mirrored on the local node.

As we all know, the fetch module does not work with directories.
I have at least two different “solutions” for this.
One is to:

  • create a temporary file on the remote host

  • archive the directory to that file

  • fetch the file

  • unarchive it locally

  • remove the remote temporary file

I don’t like this approach very much since it has some issues with idempotency, especially if the playbook fails for any reason before I’m able to remove the remote temporary file.
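
Roughly what I mean, with made-up group name and paths, assuming the archive/unarchive modules are available; even wrapped in block/always so the cleanup runs after a failed task, a dropped connection still leaves the temp file behind:

    - hosts: monitored
      tasks:
        - block:
            - name: Create a temporary file on the remote host
              tempfile:
                state: file
                suffix: .tar.gz
              register: tmp_archive

            - name: Archive the configuration directory into it
              archive:
                path: /etc/myapp            # made-up directory to mirror
                dest: "{{ tmp_archive.path }}"
                format: gz

            - name: Fetch the archive to the control node
              fetch:
                src: "{{ tmp_archive.path }}"
                dest: "/tmp/fetched/{{ inventory_hostname }}.tar.gz"
                flat: yes

            - name: Ensure the local mirror directory exists
              file:
                path: "/tmp/mirror/{{ inventory_hostname }}"
                state: directory
              delegate_to: localhost

            - name: Unpack the archive locally
              unarchive:
                src: "/tmp/fetched/{{ inventory_hostname }}.tar.gz"
                dest: "/tmp/mirror/{{ inventory_hostname }}"
              delegate_to: localhost

          always:
            - name: Remove the remote temporary file
              file:
                path: "{{ tmp_archive.path }}"
                state: absent
              when: tmp_archive.path is defined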

Another one is to just run tar | base64 on the remote host (for some reason it wouldn’t work without base64), register the stdout, and pass it as stdin to a local base64 -d | tar. This is pretty much what I’d go for with normal ssh-based scripting, but the problem with this approach is that I can’t stream the remote task’s output to the local task in real time: I have to run the remote tar first and capture the resulting base64-encoded blob in memory. That’s not much of an issue for a small directory, but I’d like a general solution that wouldn’t crash if I wanted to transfer a big payload.
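
Again with made-up paths, the second approach looks something like this; the shell module’s stdin parameter is what carries the registered blob to the local command, and that whole blob has to sit in memory between the two tasks:

    - hosts: monitored
      tasks:
        - name: Tar and base64-encode the directory on the remote host
          shell: tar czf - -C /etc myapp | base64
          register: payload            # the whole encoded blob ends up in memory here

        - name: Ensure the local mirror directory exists
          file:
            path: "/tmp/mirror/{{ inventory_hostname }}"
            state: directory
          delegate_to: localhost

        - name: Decode and unpack on the control node
          shell: base64 -d | tar xzf - -C /tmp/mirror/{{ inventory_hostname }}
          args:
            stdin: "{{ payload.stdout }}"
          delegate_to: localhost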

Any other reasonable approaches to this problem?
The synchronize module doesn’t seem to fit since I don’t have a direct connection between the remote node and the local one.
Developing an action plugin seems like overkill…

Regards,
MK