How can I pass a JSON variable set by set_fact to a jq pipeline in an Ansible shell task?

Hello,
I have an Ansible playbook that sets a JSON variable using the set_fact module. I’m trying to use the jq command in the playbook to transform that JSON into a particular format, but got parse error: Invalid numeric literal at line , column

My Playbook:

- name: get data
  set_fact:
    data: "{{ value.stdout | from_json }}"

- name: get key value pairs
  shell: echo "{{ data }}" | jq 'to_entries | map((.key) + "=" + .value)|join(",")'
  register: key_value

Error:

TASK [utils : debug] *************************************************************************************

ok: [localhost] => {
    "data": {
        "key1": "keyval1",
        "key2": "keyval2"
    }
}

TASK [utils : get key value pairs] ***************************************************************
FAILED! => {"changed": true, "cmd": "echo \"{'key1': 'keyval1', 'key2': 'keyval2'}\" | jq 'to_entries | map((.key) + \"=\" + .value)|join(\",\")'", "stderr": "parse error: Invalid numeric literal at line 1, column 22", "stderr_lines": ["parse error: Invalid numeric literal at line 1, column 22"], "stdout": "", "stdout_lines": []}

PS: I had tried using the > YAML construct described in How to use jq in ansible shell tasks, but got the same error:

- name: get key value pairs
  shell: >
    echo "{{ data }}" |
    jq 'to_entries | map((.key) + "=" + .value)|join(",")'
  register: key_value

My data variable set using set_fact module:

"data": { "key1": "keyval1", "key2": "keyval2" }

Working jq command and expected output: jqPlay

How can I pass the fact variable set using set_fact to the jq pipeline in Ansible and transform my JSON?

Thank you

You can use Ansible itself to transform data. You don’t need to shell out to jq to achieve this.

https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html

Walter

I want to transform the JSON obtained from a previous Ansible task into a string of key/value pairs separated by commas.

Json Input:

"data": {
    "key1": "keyval1",
    "key2": "keyval2",
    "key3": "keyval3"
}

Expected output:
"key1: keyval1,key2: keyval2,key3: keyval3"

Is this transformation of json possible using ansible filters?

Here is my test playbook. You can use Jinja2 templates on the right-hand side of assignments. The dict2items filter turns a dictionary into a list of key/value items.
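The test playbook itself doesn’t seem to have survived in this post; a minimal sketch of the dict2items approach might look like the following (the exact loop and output format are my guess, not necessarily Walter’s original):

```yaml
---
- name: transform dict with ansible filters
  hosts: localhost
  gather_facts: no
  vars:
    data:
      key1: keyval1
      key2: keyval2
      key3: keyval3
  tasks:
    - name: build a key=value string from the dict
      set_fact:
        key_value: >-
          {%- for item in data | dict2items -%}
          {{ item.key }}={{ item.value }}{{ ',' if not loop.last else '' }}
          {%- endfor -%}

    - name: show the result
      debug:
        var: key_value
```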

I will add … this is documented well on the Ansible filters page. You need to get comfortable reading their documentation. It is very good.

https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html#transforming-dictionaries-into-lists

Walter

Thank you

There’s good advice further up thread about doing data manipulations within Ansible using the tools it provides, and I heartily agree. Still, the original post raises some interesting questions, and it’s instructive to understand why the playbook snippet there doesn’t work.

First, we aren’t shown what value.stdout looks like, but since it made it through the from_json filter let’s assume it looks something like

{"app": "myApp", "env": "myEnv"}

In the “get key value pairs” step, jq is expecting json, but data isn’t a json string, so it needs to be filtered through to_json. And since that json is going to contain double-quotes, you’ll need to single-quote the whole thing for the shell to handle it properly. (And that’s assuming none of your json values contains single quotes!)

Here’s the closest I could come up with that does what the original post was attempting to do. Not that it is a good idea; this is just an exercise in fixing stuff as asked. There are other, better ways to do this.

---
- name: json to jq
  hosts: localhost
  gather_facts: no
  vars:
    value:
      stdout: '{"app": "myApp", "env": "myEnv"}'
  tasks:
#
    - name: get data
      set_fact:
        data: "{{ value.stdout | from_json }}"
#
    - name: get key value pairs
      shell: |
        echo '{{ data | to_json }}' | jq 'to_entries | map((.key) + "=" + .value)|join(",")'
      register: key_value

Cheers,

Thanks Todd! That worked!

The error seems to be jq complaining about the JSON passed to it. As mentioned in the question, the JSON printed in the error message isn’t valid JSON: somehow the double quotes were replaced by single quotes, even though debug prints valid JSON, so invalid JSON was passed to jq.

Replacing the single quotes with double quotes also solved the issue:

shell: echo "{{ data }}" | sed "s/'/\"/g" | jq 'to_entries | map((.key) + "=" + .value)|join(",")'

Having said that, you said “Not that it is a good idea; this is just an exercise in fixing stuff as asked.” Could you elaborate on the other, better ways you know, and why this is not a good idea?

We’re feature-creeping here.

Your original ask was, “How can I pass the fact variable set using set_fact to jq pipeline in ansible and transform my json?”
The second best answer is echo "{{ data | to_json }}" | jq ….
The first answer is, “No.”

Then it was, “I want to transform the json obtained from previous Ansible task to a key value pair separated by comma.”
Farther up the thread, Walter gave some excellent suggestions that will require some work on your part to fully appreciate. The most valuable part of that, believe it or not, is the work, not the final result.

This ask is a little shakier than the first, because such a string has no intrinsic value. Presumably you want that format because you’ve got another part of your solution that requires such input. However, I’m suspicious of a process that would consume such a string, and expect that a broader understanding of the problem would suggest another approach.

Now we get to, “Why this is not a good idea?” Frankly, this is the deepest question so far, and it gets right at the issue of how to think about data in Ansible. There’s nothing absolutely right or wrong about passing some json through jq in a shell script. When you’ve been solving problems that way, it’s natural when learning Ansible to use it as a wrapper around such techniques. But as you gain more experience with Ansible filters, Jinja, etc., you’ll find that sometimes a better solution to such a task is to not put yourself in a situation where you need such a task in the first place. Increased experience gives insight into “the Ansible way” of structuring and manipulating data, to solve data problems within Ansible itself, rather than merely using Ansible as a bridge to a string of external solutions.

This is why I wanted to follow up on your original post, even though that isn’t how an experienced Ansible user would approach the problem: understanding why that playbook didn’t work is fundamental to stepping up to the next level of Ansible proficiency. It was the wrong solution with a broken implementation. Now you have the wrong solution with a working implementation. Whether that matters depends; if this is a one-off, it’s no big deal either way. But if it’s going to run thousands of times a day, it’s reaching out to a remote machine each time to do trivial - perhaps unnecessary - data manipulations that could be done much more efficiently on the Ansible controller itself. It’s easy to choose the less expensive solution when it’s pointed out, but only experience allows one to recognize such traps before falling into them. This is why exploring the techniques Walter pointed to is so valuable. Such exploration is a stepping stone to better solutions on the rest of your Ansible journey.

To step back down a few levels (and then I’ll stop preaching), if you need to pass some json to a tool like jq, the solutions are (1) don’t; do something else instead; (2) use "{{ data | to_json }}", which is fragile enough but you may get away with it; (3) this other trick (echo "{{ data }}" | sed …) which is even more fragile and pulls in another executable for questionable purposes.

Sorry if I got a little preachy there. Hope you can take it in the spirit intended.

Or just skip the tasks and assign to the variable directly:
vars:
  myvar: "{{ lookup('pipe', 'echo ' + data | to_json + ' | jq \"to_entries ...

Thank you Todd and Walter for the wonderful explanations of my question. These awesome explanations excite me to gain more expertise in Ansible. I’m a newbie to Ansible, in my first week of using it. As a beginner, I was unaware that this kind of data transformation could be done on the Ansible controller itself. To be frank, I didn’t even know it could be done in Ansible at all; I thought it could only be achieved using some JSON processing tool. I’ve started learning the Ansible filters. As you mentioned, now I know how expensive this operation is going to be.

I’m trying to automate the management of some applications in Kubernetes. I’m using the Kubernetes collection for Ansible, and the actual problem statement is finding certain Kubernetes applications based on match criteria. The initial input I receive from the previous task is JSON. This JSON-to-key/value-pair transformation is done to obtain the match criteria that are passed to the next task, which runs a kubectl command to query applications.
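For what it’s worth, if those match criteria are Kubernetes label selectors, you may not need the intermediate string or kubectl at all: the kubernetes.core.k8s_info module accepts label selectors directly. A hedged sketch (the kind, namespace, and label keys here are made up for illustration):

```yaml
- name: find apps matching the label criteria
  kubernetes.core.k8s_info:
    kind: Deployment            # hypothetical resource kind
    namespace: my-namespace     # hypothetical namespace
    label_selectors:
      # builds "key1=keyval1,key2=keyval2,..." straight from the dict
      - "{{ data.items() | map('join', '=') | join(',') }}"
  register: matching_apps
```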

Happy to help. One thing that I came to appreciate and that has served me well is that Ansible is written in Python. Many of the filters and features of data manipulation in Python are exposed in Ansible. When seeking a way to achieve something in Ansible it sometimes can be helpful to think about how it would be done in Python then look for a corresponding filter or method or module in Ansible.
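As a concrete illustration of that Python-to-Ansible mapping (the Jinja2 expression here is my sketch, not a canonical idiom):

```yaml
# Python:          ",".join("=".join(kv) for kv in data.items())
# Ansible/Jinja2:  the same transformation, built from filters
- name: key=value string via filters
  debug:
    msg: "{{ data.items() | map('join', '=') | join(',') }}"
```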

Walter

Woah, that’s interesting! Thank you