We’re feature-creeping here.
Your original ask was, “How can I pass the fact variable set using set_fact to jq pipeline in ansible and transform my json?”
The second best answer is echo "{{ data | to_json }}" | jq …
The first answer is, “No.”
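For completeness, here's roughly what that second best answer looks like as a task. Treat it as a sketch, not a recommendation: data and the jq program ('.') are placeholders, and the whole thing falls over as soon as the rendered JSON contains anything the shell cares about.

```yaml
# Sketch only: assumes a dict-valued fact named "data" and jq installed
# on the target host. The shell quoting is the weak point here: quotes,
# "$", or backticks in the rendered JSON will break or mangle the command.
- name: Pipe a fact through jq (the "second best" answer)
  ansible.builtin.shell: |
    echo "{{ data | to_json }}" | jq '.'
  register: jq_out
  changed_when: false
```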
Then it was, “I want to transform the json obtained from previous Ansible task to a key value pair separated by comma.”
Farther up the thread, Walter gave some excellent suggestions that will require some work on your part to fully appreciate. The most valuable part of that, believe it or not, is the work, not the final result.
This ask is a little shakier than the first, because such a string has no intrinsic value. Presumably you want that format because you’ve got another part of your solution that requires such input. However, I’m suspicious of a process that would consume such a string, and expect that a broader understanding of the problem would suggest another approach.
Now we get to, “Why this is not a good idea?” Frankly, this is the deepest question so far, and it gets right at the issue of how to think about data in Ansible. There’s nothing absolutely right or wrong about passing some json through jq in a shell script. When you’ve been solving problems that way, it’s natural when learning Ansible to use it as a wrapper around such techniques. But as you gain more experience with Ansible filters, Jinja, etc., you’ll find that sometimes a better solution to such a task is to not put yourself in a situation where you need such a task in the first place. Increased experience gives insight into “the Ansible way” of structuring and manipulating data, to solve data problems within Ansible itself, rather than merely using Ansible as a bridge to a string of external solutions.
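To make that concrete, the “key value pairs separated by comma” transformation from your second ask doesn't need jq, a shell, or a remote host at all. Here's a minimal sketch; it assumes data is a flat dict of scalars, and kv_string is just a name I made up for the result.

```yaml
# Evaluated by the templating engine on the controller: no shell, no jq.
# Assumes "data" is a flat dict of scalars, e.g. {"a": 1, "b": 2}.
- name: Build "key=value,key=value" with Jinja filters alone
  ansible.builtin.set_fact:
    kv_string: "{{ data.items() | map('join', '=') | join(',') }}"

- name: Show the result
  ansible.builtin.debug:
    var: kv_string
```

It's not the only way to spell it (dict2items plus a few filters works too), but the point is the same: the data never leaves Ansible.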
This is why I wanted to follow up on your original post, even though that isn’t how an experienced Ansible user would approach the problem: understanding why that playbook didn’t work is fundamental to stepping up to the next level of Ansible proficiency. It was the wrong solution with a broken implementation. Now you have the wrong solution with a working implementation. Whether that matters depends on context; if this is a one-off, it’s no big deal either way. But if it’s going to run thousands of times a day, it’s reaching out to a remote machine each time to do trivial - perhaps unnecessary - data manipulations that could be done much more efficiently on the Ansible controller itself. It’s easy to choose the less expensive solution when it’s pointed out, but only experience allows one to recognize such traps before falling into them. This is why exploring the techniques Walter pointed to is so valuable. Such exploration is a stepping stone to better solutions on the rest of your Ansible journey.
To step back down a few levels (and then I’ll stop preaching), if you need to pass some json to a tool like jq, the solutions are (1) don’t; do something else instead; (2) use "{{ data | to_json }}", which is fragile but you may get away with it (a slightly sturdier variant is sketched below); (3) this other trick (echo "{{ data }}" | sed …), which is even more fragile and pulls in another executable for questionable purposes.
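And if you really are stuck handing a fact to jq, the sturdier variant of option (2) I mentioned (my suggestion, not something from earlier in the thread) is to drop echo entirely and pass the JSON to jq on stdin via the command module's stdin parameter:

```yaml
# Variant of option (2): no echo, no shell, the JSON reaches jq on stdin.
# The jq program shown is just an example of the key=value,... transform;
# "data" is assumed to be a flat dict of scalars.
- name: Feed a fact to jq via stdin instead of echo
  ansible.builtin.command:
    cmd: jq -r 'to_entries | map("\(.key)=\(.value)") | join(",")'
    stdin: "{{ data | to_json }}"
  register: jq_out
  changed_when: false
```

It still ships the work off to jq; it just takes one layer of shell quoting out of the blast radius.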
Sorry if I got a little preachy there. Hope you can take it in the spirit intended.