Failed to connect to the host via ssh: Permission denied (publickey,password)

I have a playbook that can be run in two different ways:

  1. When there is a single host, run prompt.yml to execute against that host.
  2. When there are a large number of hosts, keep them in an inventory and run against all of them.

The playbook works fine when executed with site_prompt.yml, but fails when executed with site_inventory.yml:

TASK [upgrade : Retrieve the current firmware] **************************************************************************************************************************************
task path: /root/brocade/ansible-fos-command/devopsweb1/roles/brocade_upgrade/tasks/main.yaml:1
<7.5.5.154> ESTABLISH SSH CONNECTION FOR USER: None
<7.5.5.154> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/c6fd0bb33a 7.5.5.154 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<7.5.5.155> ESTABLISH SSH CONNECTION FOR USER: None
<7.5.5.155> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/900bdb6d71 7.5.5.155 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<7.5.5.154> (255, b'', b'Permission denied (publickey,password).\r\n')
fatal: [7.5.5.154]: UNREACHABLE! => {
    "changed": false,
    "msg": "Failed to connect to the host via ssh: Permission denied (publickey,password).",
    "unreachable": true
}
<7.5.5.155> (255, b'', b'Permission denied (publickey,password).\r\n')
fatal: [7.5.5.155]: UNREACHABLE! => {
    "changed": false,
    "msg": "Failed to connect to the host via ssh: Permission denied (publickey,password).",
    "unreachable": true
}
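
One detail visible in the debug output above: the connection is attempted as `USER: None` with `PasswordAuthentication=no`, so the `username`/`password` variables from the inventory are never used by the SSH layer — they are module parameters, not connection settings. If SSHing to the hosts were actually intended, the connection credentials would have to be supplied separately, along these lines (a sketch; the values are assumed from the inventory shown later in the thread):

```yaml
switches:
  vars:
    # Connection-layer credentials for Ansible's SSH transport
    # (distinct from the module's switch_login/switch_password).
    ansible_user: admin
    ansible_ssh_pass: password   # using a password here requires sshpass on the control node
```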
With the amount of information provided, the only thing that can be said is that the issue lies with site_inventory.yml.

Playbook: site_inventory.yml

```yaml
- hosts: switches
  gather_facts: false
  vars:
    ftp_server: 2.19.16.15

  tasks:
    - name: Include the brocade upgrade role
      include_role:
        name: brocade_upgrade
      vars:
        ip_addr: "{{ inventory_hostname }}"
```

Role tasks: main.yaml

```yaml
- name: Retrieve the current firmware
  brocade_fos_command:
    switch_login: "{{ username }}"
    switch_password: "{{ password }}"
    switch_address: "{{ ip_addr }}"
    command_set:
      - command: firmwareshow
  register: firmwareshow_output

- name: Parse the firmwareshow output
  ansible.netcommon.cli_parse:
    text: "{{ firmwareshow_output['messages'] | join('\n') }}"
    parser:
      name: ansible.netcommon.native
      template_path: "{{ role_path }}/templates/brocade_firmware.yaml"
  register: parsed_firmwareshow
```

Inventory file:

```yaml
all:
  vars:
    upgrade_path:
      v7.4.2d: "v7.4.2g"
      v8.2.1d: "v8.2.2d"
  children:
    switches:
      hosts:
        #10.1.1.2:
        #10.1.1.3:
      vars:
        username: admin
        password: password
```

Just a shot in the dark: could it be that the permissions on the files in .ssh on the target are wrong? SSH is very picky here.

Robert

I am running this playbook against switches that do not have a built-in Python to execute modules on. The tasks have to execute on localhost and then fetch the output from the switch.
That works fine with the prompt.yml file, but not with inventory.yml.
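
Since the brocade_fos_command module opens its own session to `switch_address` and nothing needs to run on the switch itself, one way to express this (a sketch, assuming that is the intended behavior) is to tell Ansible not to SSH to the inventory hosts at all:

```yaml
- hosts: switches
  gather_facts: false
  # Run all tasks on the control node; the module connects to the
  # switch via switch_address, so no SSH/Python is needed on the host.
  connection: local
```

Equivalently, `ansible_connection: local` could be set as a group variable on `switches` in the inventory, which would leave the play itself unchanged.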