Why does delegate_to over a group with include_role run commands on the local machine?

I am trying to debug a playbook I’ve written which uses a couple of roles to spin up and then configure an AWS instance.

The basic structure is one playbook (new-server.yml) that imports two roles, roles/ec2_instance and roles/start_env. The ec2_instance role should be run on localhost with my AWS tokens, and then the start_env role gets run on the servers generated by the first role.

My playbook new-server.yml starts off like this:

    - name: provision new instance
      include_role:
        name: ec2_instance
        public: yes
      vars:
        instance_name: "{{ item.host_name }}"
        env: "{{ item.git_branch }}"
        env_type: "{{ item.env_type }}"
      loop:
        - { host_name: 'prod', git_branch: 'master', env_type: 'prod' }
        - { host_name: 'test', git_branch: 'test', env_type: 'devel' }

This role builds an EC2 instance, updates Route 53, and uses add_host to add the new host to the in-memory inventory in the just_created group.
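At this stage the add_host task keys each new host by its raw IP; reconstructed from the description below, it would look something like this (a sketch; ec2_private_ip stands in for whatever variable holds the new instance’s address):

    - name: Add new instance to inventory
      add_host:
        hostname: "{{ ec2_private_ip }}"   # the IP doubles as the inventory hostname here
        groups: just_created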

Next, I have this in the new_server.yml playbook. Both of my IPs show up in the group just fine; localhost does not.

    - name: debug just_created group
      debug: msg="{{ groups['just_created'] }}"

Finally, again in new_server.yml, I try to do the last mile configuration and start my application on the new instance:

    - name: Configure and start environment on new instance
      include_role:
        name: start_env
        apply:
          become: yes
          delegate_to: "{{ item }}"
      with_items:
        - "{{ groups['just_created'] }}"

However, it doesn’t look like the task is delegating properly, because I have this task in roles/start_env/main.yml:

    - name: debug hostname
      debug: msg="{{ ansible_hostname }}"

And what I’m seeing in my output is

    TASK [start_env : debug hostname] ************************************************************************************************************************************
    Monday 11 January 2021  12:00:05 -0800 (0:00:00.111)       0:00:37.374 ********
    ok: [localhost -> 10.20.15.225] => {
        "msg": "My-Local-MBP"
    }

    TASK [start_env : debug hostname] ************************************************************************************************************************************
    Monday 11 January 2021  12:00:05 -0800 (0:00:00.043)       0:00:37.417 ********
    ok: [localhost -> 10.20.31.35] => {
        "msg": "My-Local-MBP"
    }

I’ve read a lot about delegate_to, include_role, and loops this morning. It sounds like Ansible has made things pretty complicated when you want to combine them, but it also seems like the way I am invoking them should be right. Any idea what I’m doing wrong, or if there is a smarter way to do this? I found https://medium.com/opsops/ansible-2-5-delegate-to-and-include-role-20cd7e67008e, and while it’s a clever workaround, it doesn’t quite fit what I’m seeing, and I’d like to avoid creating another tasks file in my roles; that’s not how I want to manage something like this. Most of the information I’ve been going off of has been this thread:

https://github.com/ansible/ansible/issues/35398

The output shows [localhost -> 10.20.31.35], which indicates it is delegating from localhost to 10.20.31.35. However, the delegation only affects the connection: any templating done in the task definition still uses the variables of the inventory host the task runs for, which here is localhost. ansible_hostname is a fact of that host, which is why it expands to your local machine’s name.
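If you need the delegated host’s facts, one option (a sketch, using the standard delegate_facts keyword) is to gather facts on each new instance first and then read them out of hostvars:

    - name: gather facts on each new instance
      setup:
      delegate_to: "{{ item }}"
      delegate_facts: true   # store the gathered facts under the delegated host, not localhost
      with_items: "{{ groups['just_created'] }}"

    - name: debug the delegated hostname
      debug:
        msg: "{{ hostvars[item].ansible_hostname }}"
      with_items: "{{ groups['just_created'] }}"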

I see. Thanks for your response. Are you aware of a good way to use the vars within the role? It seems like import_role would do that, but would not permit the loop.
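One way to get per-host values into the role without a separate tasks file (a sketch; target_name is a hypothetical role variable) is to template them from hostvars of the loop item and pass them in as role vars, since those are evaluated per loop iteration:

    - name: Configure and start environment on new instance
      include_role:
        name: start_env
        apply:
          become: yes
          delegate_to: "{{ item }}"
      vars:
        # hypothetical variable; falls back to the inventory name if facts
        # for the delegated host have not been gathered yet
        target_name: "{{ hostvars[item].ansible_hostname | default(item) }}"
      with_items: "{{ groups['just_created'] }}"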

I figured out something on my own that lets me keep most of what I’ve already written the same. I modified my add_host task to use the instance_name var as the inventory hostname and the EC2 private IP as the ansible_host host var, and then updated my last task:

roles/aws.yml:

    - name: Add new instance to inventory
      add_host:
        hostname: "{{ instance_name }}"
        ansible_host: "{{ ec2_private_ip }}"
        ansible_user: centos
        ansible_ssh_private_key_file: …/keys/my-key.pem
        groups: just_created

new_servers.yml:

    tasks:

      - name: provision new instance
        include_role:
          name: ec2_instance
          public: yes
        vars:
          instance_name: "{{ item.host_name }}"
          env: "{{ item.git_branch }}"
          env_type: "{{ item.env_type }}"
        loop:
          - { host_name: 'prod', git_branch: 'master', env_type: 'prod' }
          - { host_name: 'test', git_branch: 'test', env_type: 'devel' }

      - name: Configure and start environment on new instance
        include_role:
          name: start_env
          apply:
            become: yes
            delegate_to: "{{ item }}"
        vars:
          instance_name: "{{ item }}"
        with_items:
          - "{{ groups['just_created'] }}"

Works well enough and lets me avoid adding that dict to every subsequent included role. Since each inventory name now carries ansible_host, delegate_to: "{{ item }}" still connects to the right machine, while "{{ item }}" doubles as a meaningful instance_name inside the role.