Hi,
For performance (faster first deployment), safety (in case the third party goes down), and offline use (no internet connectivity), I’d like to cache downloads made with get_url.
I know this can be done manually, but I wonder if there’s a module to do it. Would a cache=yes property make sense in the get_url module?
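To make it concrete, something like this is what I have in mind (purely hypothetical: cache= is not an existing get_url parameter, it’s the feature I’m asking about, and the URL is just an example):

- name: download tarball, reusing a cached copy when possible
  get_url: url=http://downloads.example.com/app-1.2.tar.gz dest=/tmp/app-1.2.tar.gz cache=yes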
Regards,
Warren.
Are you running get_url as a local action (on the Ansible master rather than on your managed machines)?
I’d be tempted to try using something like varnish myself.
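For example, something along these lines (just a sketch: cache.example:3128 is a placeholder for whatever Varnish/Squid instance you put in front of the downloads, and as far as I know get_url picks up the standard http_proxy environment variable unless you set use_proxy=no):

- name: download tarball via the caching proxy
  environment:
    http_proxy: http://cache.example:3128
  get_url: url=http://downloads.example.com/app-1.2.tar.gz dest=/root/app-1.2.tar.gz

Every managed machine then talks to the proxy, and only the first request goes out to the third party.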
Jon
No, I’m not using a local action. That would involve sending the data back to the managed machines at some point, which is basically the same as downloading the file directly on the managed machine.
Varnish could make sense, though it might be a bit overkill. I would have thought of a simple cache directory where all downloaded files would automatically go, as in the sketch below.
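Roughly what I have in mind, done by hand today (a sketch; /var/cache/ansible-downloads and the URL are placeholders I made up):

- name: make sure the cache directory exists
  file: path=/var/cache/ansible-downloads state=directory

- name: fetch into the cache (skipped when the file is already there, since force defaults to no)
  get_url: url=http://downloads.example.com/app-1.2.tar.gz
           dest=/var/cache/ansible-downloads/app-1.2.tar.gz

That works, but it’s exactly the kind of boilerplate I was hoping a cache=yes option could replace.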
I use a pattern like this to only get the tarball when the
'grafana_version' var changes.
Seems to work pretty well (NB: this happens up on the server, no local caching).
- name: download grafana tarball
  get_url: url=http://grafanarel.s3.amazonaws.com/grafana-{{grafana_version}}.tar.gz
           dest=/root/.grafana-{{grafana_version}}.tar.gz

- name: extract tarball to docroot
  unarchive: src=/root/.grafana-{{grafana_version}}.tar.gz dest=/opt/
             copy=no creates=/opt/grafana-{{grafana_version}}/build.txt
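The trick is that the version is baked into the dest filename: get_url leaves an existing dest alone (force defaults to no), so the download only happens when grafana_version changes, and the creates= on unarchive keeps the extraction step idempotent as well.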
Ok, simple and straightforward.