File lookup plugin behavior concerning quotes @abadger, @bcoca

Hi all,

This is mostly aimed at the core team…

Can you explain why the file lookup plugin changes the contents of the file as far as quotes are concerned? That is, it changes all double quotes to single quotes.

I am not sure if it used to behave this way, but all I know is that modules that rely on loading JSON into a parameter through lookup('file', 'my.json') have recently started to fail, and digging around has led me here.

Obviously the problem is that single quotes are not valid JSON.

Can we not have the file lookup plugin return exactly what was passed in?

Example:

Passed in:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

Passed out:

{'Version': '2012-10-17', 'Statement': [{'Action': 'sts:AssumeRole', 'Principal': {'Service': 'ec2.amazonaws.com'}, 'Effect': 'Allow', 'Sid': ''}]}
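
A minimal sketch that reproduces this (using the same my.json as above):

```yaml
- hosts: localhost
  gather_facts: no
  tasks:
    # The file contains the JSON shown under "Passed in", but what comes out
    # of the template is the single-quoted form shown under "Passed out".
    - debug:
        msg: "{{ lookup('file', 'my.json') }}"
```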

Hi,

What happens if you pipe it to the to_json filter?

It’s even worse…

"{\n \"Version\": \"2012-10-17\",\n \"Statement\": [\n {\n \"Sid\": \"\",\n \"Effect\": \"Allow\",\n \"Principal\": {\n \"Service\": \"ec2.amazonaws.com\"\n },\n \"Action\": \"sts:AssumeRole\"\n }\n ]\n}"

:frowning:

Here are the bugs I know of because of this:

https://github.com/ansible/ansible-modules-core/issues/3404
https://github.com/ansible/ansible-modules-extras/issues/1813

This is not a simple problem. Aside from an issue with Ansible internals, modules are inconsistent in what they expect when declaring these fields: some expect an actual JSON string, others expect a data structure to be converted (the better ones detect what they got and adjust).

The lookup itself does no transformation; the issue is with Jinja2 templating, which returns a string by default, so Ansible tries to detect the 'type'. This conflicts with JSON because any string starting with "[" or "{" is considered a list or dictionary, respectively. This is done in an effort to preserve existing types in declared variables, and we cannot effectively just 'turn it off' without creating many other issues.

To avoid this 'typing' there are a few things that can be done (see the sketch after this list):

  • in all versions of Ansible, put in a preceding space (" {{ lookup… "); this will skip type detection.

  • in the latest versions, a |to_json and certain other filters at the end will also bypass the automatic type casting.
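
For example, a minimal sketch of the leading-space workaround (the file and variable names here are just for illustration):

```yaml
- hosts: localhost
  gather_facts: no
  tasks:
    # The leading space means the rendered value no longer starts with "{",
    # so Ansible skips its list/dict type detection and the raw JSON text
    # of the file is preserved.
    - name: read the policy file as a plain string
      set_fact:
        policy_json: " {{ lookup('file', 'my.json') }}"

    - debug:
        var: policy_json
```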

We are also adding a new feature for modules in 2.2: a type='json' which will accept both JSON strings and Python data structures, the latter being automatically transformed to JSON. This will effectively be the same as 'the better modules' I mention above and will be 'transparent' to users, eventually eliminating the problem and confusion.

Hi,

Tried both with the latest version of devel: a space in " {{ … }}" and the |to_json filter, but they both return:

TASK [Create S3 bucket] ********************************************************
fatal: [127.0.0.1]: FAILED! => {"changed": false, "failed": true, "msg": "Policies must be valid JSON and the first byte must be '{'"}
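
For reference, the failing task looks roughly like this (assuming the s3_bucket module; the bucket name is a placeholder):

```yaml
- name: Create S3 bucket
  s3_bucket:
    name: my-example-bucket                      # placeholder name
    policy: " {{ lookup('file', 'my.json') }}"   # leading-space workaround
```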

Sigh, that is a new one … the module is doing a check it shouldn't; we need to fix that in the module itself.

Interesting, I'm trying to do something similar with Ansible 2.0.0.2: looking up the JSON from a file and then POSTing it to a Solr instance with uri.

The message I'm getting back from Solr implies there is something wrong with the JSON I'm sending it.

Can anybody point me at a working example of this?

many thanks

Jon

No, the check is correct: the policy string must start with a {.

One can easily confirm this in the AWS Console.

uri should work and I can help diagnose that. What are you sending to it and what’s the error message?

-Toshio

The Ansible module should be able to take the user input and make it conform to what AWS needs. So, for instance, since one of the workarounds for the mixture of YAML and Jinja2 templating doing the wrong thing is to prepend a space to your JSON string, the Ansible module should probably be stripping leading and trailing whitespace (that looks fine under the JSON specification to me).

-Toshio

  - in all versions of Ansible, put in a preceding space " {{lookup... ", this will skip type detection.
  - in the latest versions a `|to_json` and certain other filters at the end will also bypass the automatic type casting.

Note on to_json -- I think that this is a bit confusing, as what you'd want to do is pass a dictionary or list to to_json. But when combining (at least in a single variable) with lookup, what you end up doing is passing a string to to_json. That is problematic because you then end up escaping that to make it a single JSON string.
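
A small sketch of that difference (my.json stands in for any JSON file; the from_json step is one way to hand to_json a data structure instead of a string):

```yaml
- hosts: localhost
  gather_facts: no
  tasks:
    # A string piped to to_json is JSON-encoded again: the result is a
    # quoted string with every inner quote escaped.
    - debug:
        msg: "{{ lookup('file', 'my.json') | to_json }}"

    # Parsing the file contents first means to_json receives a dict and
    # emits the policy as ordinary JSON text.
    - debug:
        msg: "{{ lookup('file', 'my.json') | from_json | to_json }}"
```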

We are also adding a new feature for modules in 2.2 a type='json' which will accept both JSON strings and/or python data structures which will automatically be transformed to JSON. This will effectively be the same as 'the better modules' I mention above and will be 'transparent' to users, eventually eliminating the problem and confusion.

I called the new type jsonarg. Making use of it will require modules that have a parameter which is supposed to be a JSON string to change the parameter's type from "str" to "jsonarg". It is better than the current hodgepodge where each module attempts to deal with this entirely on its own, sometimes leaving out certain corner cases, as we're seeing here.
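
In playbook terms, the idea is that a parameter declared with the new type can be fed either a JSON string or a data structure; a hypothetical example (the module and parameter names are made up for illustration):

```yaml
# Hypothetical module whose 'policy' parameter is declared with the new
# jsonarg type. Whether the templating layer hands the module a string or
# a dict, the type check normalizes it to a JSON string for the module.
- some_cloud_module:
    policy: "{{ lookup('file', 'policy.json') }}"

- some_cloud_module:
    policy: "{{ lookup('file', 'policy.json') | from_json }}"
```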

-Toshio

For people experiencing this, can you test out https://github.com/ansible/ansible-modules-extras/pull/2126? I need to know whether that's sufficient or whether more code is needed to fix this. I also need to know whether it introduces any regressions.

-Toshio

Thank you.

The JSON file I was sending contained the following (which I have also tried on a single line, and on a single line without spaces):

```json
{
  "add-field": {
    "name": "recordNumber",
    "type": "string",
    "indexed": true,
    "stored": true,
    "docValues": true,
    "omitNorms": true,
    "omitTermFreqAndPositions": true,
    "sortMissingLast": true,
    "multiValued": false
  },
  "add-field": {
    "name": "characterization",
    "type": "text_general",
    "indexed": true,
    "stored": true,
    "omitNorms": true,
    "omitTermFreqAndPositions": true,
    "sortMissingLast": true,
    "multiValued": false
  },
  "add-field": {
    "name": "primaryname",
    "type": "text_general",
    "indexed": true,
    "stored": true,
    "omitNorms": true,
    "omitTermFreqAndPositions": true,
    "sortMissingLast": true,
    "multiValued": false
  },
  "add-field": {
    "name": "originator",
    "type": "text_general",
    "indexed": true,
    "stored": true,
    "omitNorms": true,
    "omitTermFreqAndPositions": true,
    "sortMissingLast": true,
    "multiValued": false
  },
  "add-field": {
    "name": "createdOn",
    "type": "tdate",
    "indexed": true,
    "stored": true,
    "docValues": true,
    "omitNorms": true,
    "omitTermFreqAndPositions": true,
    "multiValued": false
  },
  "add-field": {
    "name": "report",
    "type": "text_en_splitting",
    "indexed": true,
    "stored": true,
    "omitNorms": false,
    "omitTermFreqAndPositions": false,
    "multiValued": false
  }
}
```

This is the error I was getting

```
ok: [malpdwfftsla001 -> localhost] => {"changed": false, "content": "{\n "responseHeader":{\n "status":0,\n "QTime":0},\n "errors":[{"errorMessages":"Error parsing schema operations :The JSON must be an Object of the form {\"command\": {…},…"}]}\n", "content_length": "185", "content_type": "text/plain;charset=utf-8", "redirected": false, "status": 200}
```

I guess it would have been helpful if I had used Wireshark or similar to capture what uri was actually POSTing; I didn't think to do that at the time.

I tried both with and without the leading space when fetching the json at various points.

```yaml
- name: fetch schema json
  set_fact:
    schema: " {{ lookup('file', 'schema.json') }}"
```

I tried using vars_files to load the contents, but because there are multiple "add-field" keys I only got the last defined "add-field" command. I played with converting to a list of objects and using with_items on the uri action (below), but still got the above error for each iteration of the list.

```yaml
- name: add the schema
  uri:
    url: "http://{{ solr_node }}:8983/solr/evaluation/schema"
    method: POST
    body: "{{ item }}"
    body_format: json
    return_content: yes
  with_items: schema
  delegate_to: localhost
```
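
One variation I have not tried would be to POST the file contents untouched as a string, so the duplicate "add-field" keys are never collapsed into a dict (the leading space prevents Ansible's type casting; whether Solr tolerates the leading whitespace is an assumption):

```yaml
- name: add the schema as raw JSON text
  uri:
    url: "http://{{ solr_node }}:8983/solr/evaluation/schema"
    method: POST
    body: " {{ lookup('file', 'schema.json') }}"
    body_format: json      # sets Content-Type: application/json; a string body is sent as-is
    return_content: yes
  delegate_to: localhost
```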

For now I have abandoned modifying solr via the rest api and am pushing the schema files I want to use out directly from ansible.