Use get_url to download all files in a location

I'm writing a playbook to install the latest version of a piece of software. The latest version will always be at a URL like http://www.example.com/software/latest, but the files in there will have different names, with the software version appended. So the content of that URL location will look like this:

http://www.example.com/software/latest

software.1.1.zip

Every so often a new version of the software will be released and the file name will change. I don't want to have to change the playbook every time the file name changes, and I don't want to keep track of this in a variable. I just want it to pull the latest version every time it runs.

Is there a way to recursively pull all the files there, or maybe use a wildcard like *.zip?

I know I can do it with wget by shelling out, but I was wondering if there is a way to do this with the get_url module. The wget equivalent would be:

wget -r https://www.example.com/files/latest/ -nd -P ~/Desktop/test/ -A zip
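
For reference, the shell-out version as a task would look roughly like this (same command as above, just wrapped; the destination path is only my local test dir):

- name: Grab every zip from the latest dir (shell-out workaround)
  shell: wget -r https://www.example.com/files/latest/ -nd -P ~/Desktop/test/ -A zip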

Thanks in advance

+1
I’m interested in something like this as well. Thanks!

Andreas,

How about writing a wrapper script that Ansible can call, and then populating the result into a variable to use with get_url? Something like:

wget -O - http://www.example.com/software/latest/ | grep zip | cut -d'>' -f2 | sed -e 's!</a!!g' | sort -u | tail -1
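
Wired into a playbook, it might look something like this (a rough sketch; /tmp as the destination is just an example):

- name: Find the newest zip on the index page
  shell: wget -O - http://www.example.com/software/latest/ | grep zip | cut -d'>' -f2 | sed -e 's!</a!!g' | sort -u | tail -1
  register: latest_zip
  changed_when: false   # this task only looks up the name, it changes nothing

- name: Download the latest zip with get_url
  get_url:
    url: "http://www.example.com/software/latest/{{ latest_zip.stdout }}"
    dest: "/tmp/{{ latest_zip.stdout }}"   # example destination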

This is very nice. I think I will use it for now. It would be nice if it were built into the get_url module. Thanks again!