Inconsistent variable dereferences depending on number of forks

Dear all,

I have recently upgraded to ansible 1.3.2 from the release/1.3.2 branch. This upgrade broke my playbooks.

git bisect identified commit 576962d as the first bad commit. This commit changes the number of forks in relation to the number of hosts. As it turns out, this commit did not introduce the bug I encountered, but merely unmasked it.

Unfortunately, the most reduced example I managed to create is still not really minimal. At the end of the post there's a script that creates the files mentioned here, so that you can reproduce the problem easily. Note that you will need to set hash_behaviour = merge in your ansible.cfg:
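For completeness, the setting belongs in the [defaults] section:

[defaults]
hash_behaviour = merge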

hosts:

[vm_servers]
127.0.0.2
[vm_guests]
127.0.0.3
127.0.0.4

host_vars/127.0.0.3:

name: srv
vm:
  host_system: 127.0.0.2

host_vars/127.0.0.4:

name: srx
vm:
  host_system: 127.0.0.2

site.yml:

- include: role.vm.yml

role.vm.yml:

- hosts: vm_guests
  vars_files:
    - "roles/vm/vars/{{ name }}.yml"
  roles:
    - vm

roles/vm/vars/srv.yml:

vm:
  foo: bar

roles/vm/vars/srx.yml:

vm:
  foo: baz

roles/vm/tasks/main.yml:

- name: delegation pass
  action: shell hostname -f
  delegate_to: "{{ vm.host_system }}"

- name: template upload
  action: template src=test.j2 dest=/tmp/test.txt

- name: delegation fail
  action: shell hostname -f
  delegate_to: "{{ vm.host_system }}"

roles/vm/templates/test.j2:

{%- for host in groups['vm_guests'] -%}
{{ hostvars[host]['name'] }}
{%- endfor %}

With ansible 1.3.2, the playbook works as expected when called with:

$ ansible-playbook site.yml -i hosts

If I limit the number of forks to exactly 1, execution aborts with an error:

$ ansible-playbook site.yml -i hosts -vvvv --forks 1

TASK: [delegation pass]

See https://github.com/ansible/ansible/blob/devel/CONTRIBUTING.md for information about how to report bugs.

In the above, you have

- "roles/vm/vars/{{ name }}.yml"

This is a vars_file whose filename is chosen per host, based on the host's name variable.

This is a bad idea because all variables in the vars/ directory are going to get loaded.

What you should do instead is put the variables that are host-specific in the 'host_vars/<host>' file, or store them outside of the roles path.
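For this reproduction, one way to read that advice is to fold the role vars into the host's host_vars file (a sketch using only the data from above; with everything in one file, the two vm hashes are combined right there):

host_vars/127.0.0.3:

name: srv
vm:
  host_system: 127.0.0.2
  foo: bar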

In my original playbook I have something like

vars_files:
  - [ "roles/vm/vars/{{ name }}.yml", "roles/vm/vars/default.yml" ]

as mentioned in the docs. Is this still bad practice? Is the referenced example not applicable to roles?

Thanks. I created a new issue #4239.

Michale, your last sentence just registered :slight_smile:

So this is bad:

vars_files:
  - [ "roles/vm/vars/{{ name }}.yml", "roles/vm/vars/default.yml" ]

while the following is ok:

vars_files:
  - [ "vars/{{ name }}.yml", "vars/default.yml" ]

Correct?

Oops, typo. Sorry, Michael.

I looked and the feature I think we had about loading everything in vars/ under a role isn’t actually implemented yet :slight_smile:

So, yeah, we need to look into this.

I imagine something is not doing a copy operation, so the main variables are getting tweaked.
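If that guess is right, the failure mode would look something like this toy Python sketch (purely illustrative, not Ansible source code; all names here are made up):

# Illustrative only: merging into a shared dict without taking a
# copy "tweaks" the original for every host that references it.
shared = {"vm": {"host_system": "127.0.0.2"}}

def merge_without_copy(host_specific):
    merged = shared                            # no copy taken here
    merged["vm"].update(host_specific["vm"])   # mutates the shared dict
    return merged

srv_vars = merge_without_copy({"vm": {"foo": "bar"}})
srx_vars = merge_without_copy({"vm": {"foo": "baz"}})

# Both hosts now alias the same polluted dict:
print(shared)  # {'vm': {'host_system': '127.0.0.2', 'foo': 'baz'}}

Running each host in its own fork would mask a bug like this, since every child mutates only its own copy of the process memory; with --forks 1 the mutation survives from one host to the next, which could explain the fork-dependent behaviour reported above.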

Sorry about the noise – hard to remember what gets implemented sometimes!

Thanks. Should we move the discussion to github?

Yeah we have the ticket open now, so we can track it on the ticket.

Thanks!