Ansible not connecting to all hosts when using a common Azure cloud service for multiple VMs

I am using a single cloud service in Azure, which acts as a load balancer for two backend hosts. The public SSH port is 22 for VM1 and 26 for VM2.

My Ansible inventory file looks like this:

[webservers]
web.cloudapp.net ansible_ssh_port=22
web.cloudapp.net ansible_ssh_port=26

When I ping the VMs with

ansible webservers -m ping

I get a response only once (I expected two, one per VM):

web.cloudapp.net | success >> {
    "changed": false,
    "ping": "pong"
}

I tried commenting out VM1 in the inventory and got a response from VM2 (commenting out VM2 instead likewise gave a ping response from VM1).

Then I ran ansible-playbook against the same inventory file (with both VMs) and saw changes only on the second VM, the one on port 26.

I tried this with a different set of VMs, and each time only the port-26 VM was changed, never the one on port 22.

How can I get Ansible to run against both VMs when they are defined at the same time?

Hosts must have a unique name in the inventory, and both of your entries use the same name, so the second one overrides the first.

Instead, give each host a unique alias and use ansible_ssh_host to point the alias at the shared hostname/IP.

Something like:

[webservers]
web1 ansible_ssh_host=web.cloudapp.net ansible_ssh_port=22
web2 ansible_ssh_host=web.cloudapp.net ansible_ssh_port=26
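Side note: on Ansible 2.0 and later, the `ansible_ssh_host` and `ansible_ssh_port` variables are deprecated in favor of `ansible_host` and `ansible_port` (the old names still work as aliases). An equivalent inventory on a 2.x install would be:

```ini
# Two aliases pointing at the same load-balanced hostname,
# distinguished only by their SSH port.
[webservers]
web1 ansible_host=web.cloudapp.net ansible_port=22
web2 ansible_host=web.cloudapp.net ansible_port=26
```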

Thanks! Let me try this option!