XCP-NG and Ansible - Error creating a VM on XCP-NG 8.2.1

Hi, everyone!
I have two XCP-NG environments: one with XCP-NG 8.2.0, which I use for tests, and another with XCP-NG 8.2.1, which I use for production.
I wrote a playbook to create VMs on XCP-NG. In the test environment (8.2.0) I didn't have any problems, but when I ran the playbook in the production environment, Ansible showed an error when it tried to create a VM.
This is the error:

fatal: [localhost]: FAILED! => {"changed": false, "msg": "XAPI ERROR: ['HANDLE_INVALID', 'host', 'OpaqueRef:45ceb5df-0d7f-4485-98b5-af8439f6cbd2']"}

In the logs I found these lines with the error messages:

Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] host.get_record D:74f5c816f7e9 failed with exception Db_exn.DBCache_NotFound("missing row", "host", "OpaqueRef:45ceb5df-0d7f-4485-98b5-af8439f6cbd2")
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] Raised Db_exn.DBCache_NotFound("missing row", "host", "OpaqueRef:45ceb5df-0d7f-4485-98b5-af8439f6cbd2")
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 1/9 xapi Raised at file ocaml/database/db_cache_impl.ml, line 135
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 2/9 xapi Called from file ocaml/xapi/db_actions.ml, line 8160
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 3/9 xapi Called from file ocaml/xapi/rbac.ml, line 197
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 4/9 xapi Called from file ocaml/xapi/rbac.ml, line 205
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 5/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 92
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 6/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 113
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 7/9 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 8/9 xapi Called from file map.ml, line 135
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace] 9/9 xapi Called from file src/sexp_conv.ml, line 156
Apr 29 15:32:53 xcp-ng-r740 xapi: [error||1967426 :::80||backtrace]

Can someone help me with this?

Below is the playbook I used.

```yaml
---
- name: Tentativa de criação VM
  hosts: localhost
  vars_files:
    - /home/teste/ansible/vars/xen_localhost_vars.yml
  tasks:
    - name: Criando VM
      community.general.xenserver_guest:
        hostname: "{{ xenserver }}"
        username: "{{ xenserver_username }}"
        password: "{{ xenserver_password }}"
        #validate_certs: false
        #home_server:
        folder: /teste/vm
        name: teste
        state: poweredon
        #template: Debian11_Disk30G
        template_uuid: "{{ templete_vm }}"
        disks:
          - size_gb: 40
            sr_uuid: xxxxxx-xxxx-xxxxx-xxxxx-xxxxxxxxxxxxx
        hardware:
          num_cpus: 1
          num_cpu_cores_per_socket: 1
          memory_mb: 1024
        #cdrom:
        #  type: iso
        #  iso_name: guest-tools.iso
        networks:
          - name: network0
            #mac: aa:bb:dd:aa:00:18
        #wait_for_ip_address: true
      #delegate_to: localhost
      register: deploy

    - name: deploy
      debug:
        msg: "{{ deploy }}"
...
```

Sorry about my English; I'm still learning.


Hi Thiago,

Bojan here, the author of xenserver_* Ansible modules. I rarely roam the Ansible forums so I stumbled upon this thread only by accident.

There doesn't seem to be any problem with your Ansible playbook or the module itself. What the module is showing is just an error returned by the XenAPI, triggered by an internal error on the host.

What seems to be the case here is that your VM has a reference to a host that does not exist (or does not exist any more). A VM can reference a host if you use the home_server option.

References are unique internal XenAPI identifiers. Each object, including hosts and VMs, has one. XenAPI internally deals with objects using references instead of names like xcp-ng-r740. Anyway, it seems that the Ansible module at some point got a corrupted or invalid reference to a host, or the host is missing.
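As a toy illustration of what a dangling reference means (this is an invented simulation, not the real XAPI code, and the reference strings are made up): XAPI keeps rows keyed by opaque references, and dereferencing a reference with no row behind it is exactly what surfaces as HANDLE_INVALID:

```python
# Toy model of XenAPI's reference-keyed database (illustrative only).
# The reference strings below are invented, not real pool data.

HOSTS = {
    "OpaqueRef:1111": {"name_label": "xcp-ng-r740"},
}

def host_get_record(ref):
    """Mimic host.get_record: return the row or fail with HANDLE_INVALID."""
    try:
        return HOSTS[ref]
    except KeyError:
        # XenAPI reports this as ['HANDLE_INVALID', 'host', ref]
        raise LookupError(["HANDLE_INVALID", "host", ref])

# A live reference resolves to the host's record...
print(host_get_record("OpaqueRef:1111")["name_label"])

# ...while a dangling one fails, just like in the error above.
try:
    host_get_record("OpaqueRef:dead")
except LookupError as exc:
    print(exc.args[0])
```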

A few causes for this come to my mind:

  1. There is corruption in your XenAPI database or database cache. This can be caused by a hardware issue - bad RAM, a bad disk, an improper shutdown that corrupted the file system, etc. - but it can also be caused by a bug in XenAPI.
  2. A host was removed from the pool, possibly forcefully, at some point, but the VM kept a reference to it.
  3. You have hosts with duplicated IP addresses (an IP address collision), so the Ansible module sometimes connects to one host and sometimes to the other. References on one host are unknown to the other.

Is this something that occurred only once, or is this something that happens often for you?

If none of the above cases apply to you, then this could be a sign of a serious bug in the xapi daemon and should be reported as a bug to Vates and/or Citrix.


Hi Bojan Vitnik.
Regarding the "home_server" parameter: I had already used this parameter in my playbook, both with the IP and with the server name, but the result is the same.
I noticed that the difference is that the test environment is not in a pool.
My production environment is new, so the other scenarios never happened.
After you suggested using home_server, I tried it here, but the problem persists.
Regarding this problem always occurring: yes, it has happened every time I have run this playbook in my production environment.
Thanks for answering my topic.

Do you still need assistance with this? At this moment I can’t think of a simple way to debug this. This requires some serious digging.

One more idea is that this could be related to the host being standalone or in a pool. Maybe the host is standalone now but was part of a pool at some point? I didn't understand which of your environments is the pool and which one is the standalone host.

Hi. Yes, I do. What information do you need to help me?

Which of your environments is the standalone host and which one is the pool?

It is a pool. 2 servers.

Is the pool the test or the production environment?

Production environment.

Can you run the Ansible playbook that does not work, but with the "-vvvv" option (high verbosity), and send me the output?

ansible@ansible:~/ansible/playbooks_apresentacao/producao$ ansible-playbook --ask-vault-pass teste-criacao-vm-prod.yml -vvv
ansible-playbook [core 2.15.8]
config file = /etc/ansible/ansible.cfg
configured module search path = [‘/home/ansible/.ansible/plugins/modules’, ‘/usr/share/ansible/plugins/modules’]
ansible python module location = /usr/lib/python3/dist-packages/ansible
ansible collection location = /home/ansible/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible-playbook
python version = 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0] (/usr/bin/python3)
jinja version = 3.0.3
libyaml = True
Using /etc/ansible/ansible.cfg as config file
Vault password:
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match ‘all’
[WARNING]: While constructing a mapping from /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml, line 8, column 9, found a duplicate dict key (home_server). Using last defined
value only.
Skipping callback ‘default’, as we already have a stdout callback.
Skipping callback ‘minimal’, as we already have a stdout callback.
Skipping callback ‘oneline’, as we already have a stdout callback.

PLAYBOOK: teste-criacao-vm-prod.yml **************************************************************************************************************************************************************************
1 plays in teste-criacao-vm-prod.yml
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’

PLAY [Tentativa de criação VM] *******************************************************************************************************************************************************************************
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’

TASK [Gathering Facts] ***************************************************************************************************************************************************************************************
task path: /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml:2
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: ansible
<127.0.0.1> EXEC /bin/sh -c ‘echo ~ansible && sleep 0’
<127.0.0.1> EXEC /bin/sh -c ‘( umask 77 && mkdir -p “echo /home/ansible/.ansible/tmp”&& mkdir “echo /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580” && echo ansible-tmp-1717689258.0953054-684098-48375431404580=“echo /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580” ) && sleep 0’
Using module file /usr/lib/python3/dist-packages/ansible/modules/setup.py
<127.0.0.1> PUT /home/ansible/.ansible/tmp/ansible-local-68409469c84n_7/tmpyrscm6cz TO /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580/AnsiballZ_setup.py
<127.0.0.1> EXEC /bin/sh -c ‘chmod u+x /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580/ /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580/AnsiballZ_setup.py && sleep 0’
<127.0.0.1> EXEC /bin/sh -c ‘/usr/bin/python3 /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580/AnsiballZ_setup.py && sleep 0’
<127.0.0.1> EXEC /bin/sh -c ‘rm -f -r /home/ansible/.ansible/tmp/ansible-tmp-1717689258.0953054-684098-48375431404580/ > /dev/null 2>&1 && sleep 0’
ok: [localhost]
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’

TASK [Criando VM] ********************************************************************************************************************************************************************************************
task path: /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml:6
Read vars_file ‘/home/ansible/ansible/vars/xen_localhost_vars.yml’
ESTABLISH LOCAL CONNECTION FOR USER: ansible
EXEC /bin/sh -c ‘echo ~ansible && sleep 0’
EXEC /bin/sh -c ‘( umask 77 && mkdir -p “echo /home/ansible/.ansible/tmp”&& mkdir “echo /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116” && echo ansible-tmp-1717689259.163136-684180-164259893967116=“echo /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116” ) && sleep 0’
Using module file /home/ansible/.ansible/collections/ansible_collections/community/general/plugins/modules/xenserver_guest.py
PUT /home/ansible/.ansible/tmp/ansible-local-68409469c84n_7/tmp_k5ro415 TO /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116/AnsiballZ_xenserver_guest.py
EXEC /bin/sh -c ‘chmod u+x /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116/ /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116/AnsiballZ_xenserver_guest.py && sleep 0’
EXEC /bin/sh -c ‘/usr/bin/python3 /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116/AnsiballZ_xenserver_guest.py && sleep 0’
EXEC /bin/sh -c ‘rm -f -r /home/ansible/.ansible/tmp/ansible-tmp-1717689259.163136-684180-164259893967116/ > /dev/null 2>&1 && sleep 0’
The full traceback is:
File “/tmp/ansible_community.general.xenserver_guest_payload_u18fmorf/ansible_community.general.xenserver_guest_payload.zip/ansible_collections/community/general/plugins/module_utils/xenserver.py”, line 347, in gather_vm_params
vm_affinity = xapi_session.xenapi.host.get_record(vm_params[‘affinity’])
File “/usr/local/lib/python3.10/dist-packages/XenAPI/XenAPI.py”, line 260, in call
return self.__send(self.__name, args)
File “/usr/local/lib/python3.10/dist-packages/XenAPI/XenAPI.py”, line 154, in xenapi_request
result = _parse_result(getattr(self, methodname)(*full_params))
File “/usr/local/lib/python3.10/dist-packages/XenAPI/XenAPI.py”, line 234, in _parse_result
raise Failure(result[‘ErrorDescription’])
fatal: [localhost]: FAILED! => {
“changed”: false,
“invocation”: {
“module_args”: {
“cdrom”: null,
“custom_params”: null,
“disks”: [
{
“name”: null,
“name_desc”: null,
“size”: null,
“size_b”: null,
“size_gb”: “30”,
“size_kb”: null,
“size_mb”: null,
“size_tb”: null,
“sr”: null,
“sr_uuid”: “nnnnnnnn-nnnn-nnnn-nnnn-nnnnnnnn”
}
],
“folder”: “/ansible”,
“force”: false,
“hardware”: {
“memory_mb”: 2096,
“num_cpu_cores_per_socket”: 1,
“num_cpus”: 1
},
“home_server”: “xcp-ng-server1”,
“hostname”: “z.z.z.z”,
“is_template”: false,
“linked_clone”: false,
“name”: “Deletar_vm_criada_ansible”,
“name_desc”: null,
“networks”: [
{
“gateway”: null,
“gateway6”: null,
“ip”: null,
“ip6”: null,
“mac”: null,
“name”: “Server2-MGMT”,
“netmask”: null,
“type”: null,
“type6”: null
}
],
“password”: “VALUE_SPECIFIED_IN_NO_LOG_PARAMETER”,
“state”: “poweredon”,
“state_change_timeout”: 0,
“template”: null,
“template_uuid”: “aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa”,
“username”: “user”,
“uuid”: null,
“validate_certs”: true,
“wait_for_ip_address”: false
}
},
“msg”: “XAPI ERROR: [‘HANDLE_INVALID’, ‘host’, ‘OpaqueRef:45ceb5df-0d7f-4485-98b5-af8439f6cbd2’]”
}

PLAY RECAP ***************************************************************************************************************************************************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0

ansible@ansible:~/ansible/playbooks_apresentacao/producao$

Your comment would be a lot more readable if you edited it to use fenced code blocks, eg:

```bash
echo “hello world!”
```

Results in:

echo "hello world!"

OK, I think I know what is happening here. It's definitely associated with the home_server parameter, or in XenServer terms, the "affinity". The module fails at this line:

vm_affinity = xapi_session.xenapi.host.get_record(vm_params['affinity'])

It fails when trying to get info for the host the VM is bound to. That means the VM (or template) has a home server ("affinity") set to some host, but that host does not exist.

Here is the catch: it's most probably not a problem with your VM but with your template. Your template has some home server set, but that home server does not exist any more. Either the host was removed from the pool, or you copied/migrated the template from one pool to another. As a rule, your templates should not have any home server set, to avoid these kinds of errors.
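If it helps, one could also scan a pool for templates with a dangling affinity from a script. This is a sketch I have not run against a live pool: the check itself is pure, and the wiring (commented out, since it needs a live session and credentials) would use the standard XenAPI VM.get_all_records and host.get_all calls:

```python
NULL_REF = "OpaqueRef:NULL"  # XenAPI's "no object" sentinel

def dangling_affinity(vm_records, host_refs):
    """Return {ref: (name_label, affinity)} for VMs/templates whose
    affinity points at a host that is no longer in the pool.

    vm_records: dict of ref -> record, as from VM.get_all_records()
    host_refs:  set of valid host references, as from host.get_all()
    """
    bad = {}
    for ref, rec in vm_records.items():
        aff = rec.get("affinity", NULL_REF)
        if aff != NULL_REF and aff not in host_refs:
            bad[ref] = (rec.get("name_label", ""), aff)
    return bad

# Wiring this up against a live pool would look roughly like this
# (URL and credentials are placeholders):
#
#   import XenAPI
#   session = XenAPI.Session("https://pool-master")
#   session.xenapi.login_with_password("root", "password")
#   vms = session.xenapi.VM.get_all_records()
#   hosts = set(session.xenapi.host.get_all())
#   for ref, (name, aff) in dangling_affinity(vms, hosts).items():
#       print(name, "->", aff)
```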

We should check this out to verify my doubts. Please run this command from host console:

xe template-param-list uuid="$(xe template-list name-label='your-template-name' | head | awk '{print $5}')"

Please replace your-template-name with the name of your template.

Last login: Wed May 15 10:53:25 2024 from 10.0.15.250
[09:10 xcp-ng-1 ~]# xe template-param-list uuid="$(xe template-list name-label=Debian11_Disk30G | head | awk '{print $5}')"
uuid ( RO)                                  : af0718ed-1a75-d1fc-b349-d3d696ae12d8
                            name-label ( RW): Debian11_Disk30G
                      name-description ( RW):
                          user-version ( RW): 1
                         is-a-template ( RW): true
                   is-default-template ( RW): false
                         is-a-snapshot ( RO): false
                           snapshot-of ( RO): <not in database>
                             snapshots ( RO):
                         snapshot-time ( RO): 19700101T00:00:00Z
                         snapshot-info ( RO):
                                parent ( RO): <not in database>
                              children ( RO):
                     is-control-domain ( RO): false
                           power-state ( RO): halted
                         memory-actual ( RO): 0
                         memory-target ( RO): 0
                       memory-overhead ( RO): 73400320
                     memory-static-max ( RW): 8489271296
                    memory-dynamic-max ( RW): 8489271296
                    memory-dynamic-min ( RW): 8489271296
                     memory-static-min ( RW): 536870912
                      suspend-VDI-uuid ( RW): <not in database>
                       suspend-SR-uuid ( RW): <not in database>
                          VCPUs-params (MRW):
                             VCPUs-max ( RW): 4
                      VCPUs-at-startup ( RW): 2
                actions-after-shutdown ( RW): Destroy
                  actions-after-reboot ( RW): Restart
                   actions-after-crash ( RW): Restart
                         console-uuids (SRO):
                                   hvm ( RO): false
                              platform (MRW): timeoffset: 1; videoram: 8; hpet: true; secureboot: false; device-model: qemu-upstream-compat; apic: true; device_id: 0001; vga: std; nx: true; pae: true; viridian: false; acpi: 1; cores-per-socket: 2
                    allowed-operations (SRO): changing_NVRAM; changing_dynamic_range; changing_shadow_memory; changing_static_range; migrate_send; provision; destroy; export; clone; copy
                    current-operations (SRO):
                    blocked-operations (MRW):
                   allowed-VBD-devices (SRO): 1; 2; 4; 5; 6; 7; 8; 9; 10; 11; 12; 13; 14; 15; 16; 17; 18; 19; 20; 21; 22; 23; 24; 25; 26; 27; 28; 29; 30; 31; 32; 33; 34; 35; 36; 37; 38; 39; 40; 41; 42; 43; 44; 45; 46; 47; 48; 49; 50; 51; 52; 53; 54; 55; 56; 57; 58; 59; 60; 61; 62; 63; 64; 65; 66; 67; 68; 69; 70; 71; 72; 73; 74; 75; 76; 77; 78; 79; 80; 81; 82; 83; 84; 85; 86; 87; 88; 89; 90; 91; 92; 93; 94; 95; 96; 97; 98; 99; 100; 101; 102; 103; 104; 105; 106; 107; 108; 109; 110; 111; 112; 113; 114; 115; 116; 117; 118; 119; 120; 121; 122; 123; 124; 125; 126; 127; 128; 129; 130; 131; 132; 133; 134; 135; 136; 137; 138; 139; 140; 141; 142; 143; 144; 145; 146; 147; 148; 149; 150; 151; 152; 153; 154; 155; 156; 157; 158; 159; 160; 161; 162; 163; 164; 165; 166; 167; 168; 169; 170; 171; 172; 173; 174; 175; 176; 177; 178; 179; 180; 181; 182; 183; 184; 185; 186; 187; 188; 189; 190; 191; 192; 193; 194; 195; 196; 197; 198; 199; 200; 201; 202; 203; 204; 205; 206; 207; 208; 209; 210; 211; 212; 213; 214; 215; 216; 217; 218; 219; 220; 221; 222; 223; 224; 225; 226; 227; 228; 229; 230; 231; 232; 233; 234; 235; 236; 237; 238; 239; 240; 241; 242; 243; 244; 245; 246; 247; 248; 249; 250; 251; 252; 253; 254
                   allowed-VIF-devices (SRO): 0; 1; 2; 4; 5; 6
                        possible-hosts ( RO): 68b81b95-14c7-4f83-bd19-23c704ec5ed6; f30f6a89-e846-415d-96a5-03e73c1359cf
                           domain-type ( RW): hvm
                   current-domain-type ( RO): hvm
                       HVM-boot-policy ( RW): BIOS order
                       HVM-boot-params (MRW): firmware: bios; order: cdn
                 HVM-shadow-multiplier ( RW): 1.000
                             PV-kernel ( RW):
                            PV-ramdisk ( RW):
                               PV-args ( RW):
                        PV-legacy-args ( RW):
                         PV-bootloader ( RW):
                    PV-bootloader-args ( RW):
                   last-boot-CPU-flags ( RO): vendor: GenuineIntel; features: 1fcbfbff-f7fa3223-2c100800-00000121-0000000f-019c07ab-00000008-00000000-00001000-9c000400-00000000-00000000-00000000-00000000-00000000-00000000-00000000-00000000
                      last-boot-record ( RO): ''
                           resident-on ( RO): <not in database>
                              affinity ( RW): <not in database>
                          other-config (MRW): instant: true; base_template_name: Debian Bullseye 11; import_task: OpaqueRef:b3f0053b-3b67-4ec8-a0ed-ce9b73cdde08; mac_seed: 4f2c6ac2-6fcb-3dac-3ce7-5ec2f61c944e; install-methods: cdrom,nfs,http,ftp; linux_template: true
                                dom-id ( RO): -1
                       recommendations ( RO): <restrictions><restriction field="memory-static-max" max="1649267441664"/><restriction field="vcpus-max" max="32"/><restriction field="has-vendor-device" value="false"/><restriction field="allow-gpu-passthrough" value="1"/><restriction field="allow-vgpu" value="1"/><restriction field="allow-network-sriov" value="1"/><restriction field="supports-bios" value="yes"/><restriction field="supports-uefi" value="no"/><restriction field="supports-secure-boot" value="no"/><restriction max="255" property="number-of-vbds"/><restriction max="7" property="number-of-vifs"/></restrictions>
                         xenstore-data (MRW): vm-data/mmio-hole-size: 268435456; vm-data:
            ha-always-run ( RW) [DEPRECATED]: false
                   ha-restart-priority ( RW):
                                 blobs ( RO):
                            start-time ( RO): 20231220T00:58:10Z
                          install-time ( RO): 20231220T00:40:50Z
                          VCPUs-number ( RO): 2
                     VCPUs-utilisation (MRO):
                            os-version (MRO): name: Debian GNU/Linux 11 (bullseye); uname: 5.10.0-26-amd64; distro: debian; major: 11; minor: 11
                    PV-drivers-version (MRO): major: 7; minor: 30; micro: 0; build: 11
    PV-drivers-up-to-date ( RO) [DEPRECATED]: true
                                memory (MRO):
                                 disks (MRO):
                                  VBDs (SRO): 2bf84290-dbaf-1003-5d18-aa006912e551; 548cbe30-420f-f59b-bed1-efedbdf3401e
                              networks (MRO): 3/ipv6/0: fe80::dcce:9cff:fec8:7077; 3/ip: 10.0.9.89; 3/ipv4/0: 10.0.9.89
                   PV-drivers-detected ( RO): true
                                 other (MRO): feature-balloon: 1; feature-vcpu-hotplug: 1; feature-reboot: 1; feature-poweroff: 1; feature-suspend: 1; has-vendor-device: 0; platform-feature-xs_reset_watches: 1; platform-feature-multiprocessor-suspend: 1
                                  live ( RO): true
            guest-metrics-last-updated ( RO): 20231220T00:59:30Z
                   can-use-hotplug-vbd ( RO): unspecified
                   can-use-hotplug-vif ( RO): unspecified
              cooperative ( RO) [DEPRECATED]: true
                                  tags (SRW):
                             appliance ( RW): <not in database>
                     snapshot-schedule ( RW): <not in database>
                      is-vmss-snapshot ( RO): false
                           start-delay ( RW): 0
                        shutdown-delay ( RW): 0
                                 order ( RW): 0
                               version ( RO): 0
                         generation-id ( RO):
             hardware-platform-version ( RO): 0
                     has-vendor-device ( RW): false
                       requires-reboot ( RO): false
                       reference-label ( RO): debian-11
                          bios-strings (MRO): bios-vendor: Dell Inc.; bios-version: 1.11.2; system-manufacturer: Dell Inc.; system-product-name: PowerEdge R750; system-version: ; system-serial-number: 3XGG7T3; baseboard-manufacturer: Dell Inc.; baseboard-product-name: 04V528; baseboard-version: A01; baseboard-serial-number: .3XGG7T3.CNIVC002750267.; oem-1: Xen; oem-2: MS_VM_CERT/SHA1/bdbeb6e0a816d43fa6d3fe8aaef04c2bad9d3e3d; oem-3: Dell System; oem-4: 5[0000]; oem-5: 14[1]; oem-6: 17[3BE53CE653CECF38]; oem-7: 17[3C6DA5BDB553A4F8]; oem-8: 18[0]; oem-9: 19[1]; oem-10: 19[1]; oem-11: 26[0]; oem-12: 31[1]; oem-13: 30[0000000069BD6000;00100000]; hp-rombios:


Bvitnik,
Do you think I should try executing this playbook with another template? In that case, I would create a new template and run the playbook to see whether the problem repeats with the new template.

You can try with another template if you want, but let's try this experiment first. Please type this into the CLI:

xe vm-param-clear param-name=affinity uuid=af0718ed-1a75-d1fc-b349-d3d696ae12d8

And then try to provision some new VM from the “Debian11_Disk30G” template using Ansible.

For more in depth explanation, please continue reading.

Looking at the params of your template, the "affinity" parameter definitely has an improper value (note the <not in database>). Unfortunately, even for proper templates, this parameter shows the same value in the CLI. The difference can only be seen through XenAPI, but we will not delve into that for now. Let's first try the experiment above.

Here is an example of how params of one of my templates are seen through XenAPI:

[
    {
        "OpaqueRef:3033de62-ab20-4459-8896-0c2030281391": {
            "HVM_boot_params": {
                "firmware": "bios",
                "order": "dcn"
            },
            "HVM_boot_policy": "BIOS order",
            "HVM_shadow_multiplier": 1.0,
            "NVRAM": {},
            "PCI_bus": "",
            "PV_args": "",
            "PV_bootloader": "",
            "PV_bootloader_args": "",
            "PV_kernel": "",
            "PV_legacy_args": "",
            "PV_ramdisk": "",
            "VBDs": [
                "OpaqueRef:af90e70f-a115-40b3-9a75-4a025e12d3c4",
                "OpaqueRef:84312b7a-be9c-4fca-bff5-031f6fac583d",
                "OpaqueRef:faa5306c-02ab-4d29-a7df-218871bf13da"
            ],
            "VCPUs_at_startup": "16",
            "VCPUs_max": "16",
            "VCPUs_params": {
                "weight": "256"
            },
            "VGPUs": [],
            "VIFs": [
                "OpaqueRef:28a087fe-ed4e-4cc4-82f3-fc140dff978d",
                "OpaqueRef:16c919fc-77dd-46ec-bc68-59dddf555f7a"
            ],
            "VTPMs": [],
            "VUSBs": [],
            "actions_after_crash": "restart",
            "actions_after_reboot": "restart",
            "actions_after_shutdown": "destroy",
            "affinity": "OpaqueRef:NULL",
            "allowed_operations": [
                "changing_NVRAM",
                "changing_dynamic_range",
                "changing_shadow_memory",
                "changing_static_range",
                "make_into_template",
                "migrate_send",
                "destroy",
                "export",
                "start_on",
                "start",
                "clone",
                "copy",
                "snapshot"
            ],
.
.
.

Note that "affinity" has a value of "OpaqueRef:NULL", which is the proper value, meaning the template is not associated with any host (no home server set). Your template has a value of "OpaqueRef:45ceb5df-0d7f-4485-98b5-af8439f6cbd2", which is a reference to a host that does not exist. Both are unfortunately shown as <not in database> in the CLI.

The command provided above will reset "affinity" to the "OpaqueRef:NULL" value.
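For reference, the XenAPI equivalent of that xe command would look roughly like this (VM.get_by_uuid, VM.get_affinity and VM.set_affinity are standard XenAPI VM calls; the session setup is assumed and this sketch is untested against a live pool):

```python
NULL_REF = "OpaqueRef:NULL"  # XenAPI's null reference sentinel

def clear_affinity(session, uuid):
    """Reset a VM/template's home server (affinity) to the null reference.

    `session` is assumed to be a logged-in XenAPI.Session.
    Returns True if a change was made, False if affinity was already null.
    """
    ref = session.xenapi.VM.get_by_uuid(uuid)
    if session.xenapi.VM.get_affinity(ref) != NULL_REF:
        session.xenapi.VM.set_affinity(ref, NULL_REF)
        return True
    return False

# Usage against a live pool would be along these lines:
#   clear_affinity(session, "af0718ed-1a75-d1fc-b349-d3d696ae12d8")
```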


Hi Bvitnik.
I ran the command you gave me. Now everything is correct. I also did the test with the other template that had the same error. Now Ansible creates the VM correctly.

ansible@ansible:~/ansible/playbooks_apresentacao/producao$ ansible-playbook --ask-vault-pass teste-criacao-vm-prod.yml -vvv
ansible-playbook [core 2.15.8]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3/dist-packages/ansible
  ansible collection location = /home/ansible/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0] (/usr/bin/python3)
  jinja version = 3.0.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Vault password:
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
[WARNING]: While constructing a mapping from /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml, line 8, column
9, found a duplicate dict key (home_server). Using last defined value only.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: teste-criacao-vm-prod.yml ************************************************************************************************************
1 plays in teste-criacao-vm-prod.yml
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'

PLAY [Tentativa de criação VM] *****************************************************************************************************************
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'

TASK [Gathering Facts] *************************************************************************************************************************
task path: /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml:2
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: ansible
<127.0.0.1> EXEC /bin/sh -c 'echo ~ansible && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/ansible/.ansible/tmp `"&& mkdir "` echo /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982 `" && echo ansible-tmp-1718380185.9401252-839001-241008283097982="` echo /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982 `" ) && sleep 0'
Using module file /usr/lib/python3/dist-packages/ansible/modules/setup.py
<127.0.0.1> PUT /home/ansible/.ansible/tmp/ansible-local-8389974fywt08r/tmpbxo8ibqt TO /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982/AnsiballZ_setup.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982/ /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982/AnsiballZ_setup.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982/AnsiballZ_setup.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/ansible/.ansible/tmp/ansible-tmp-1718380185.9401252-839001-241008283097982/ > /dev/null 2>&1 && sleep 0'
ok: [localhost]
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'

TASK [Criando VM] ******************************************************************************************************************************
task path: /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml:6
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ansible
<localhost> EXEC /bin/sh -c 'echo ~ansible && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/ansible/.ansible/tmp `"&& mkdir "` echo /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722 `" && echo ansible-tmp-1718380187.0203605-839083-111205290269722="` echo /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722 `" ) && sleep 0'
Using module file /home/ansible/.ansible/collections/ansible_collections/community/general/plugins/modules/xenserver_guest.py
<localhost> PUT /home/ansible/.ansible/tmp/ansible-local-8389974fywt08r/tmpyotu3qaa TO /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722/AnsiballZ_xenserver_guest.py
<localhost> EXEC /bin/sh -c 'chmod u+x /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722/ /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722/AnsiballZ_xenserver_guest.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python3 /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722/AnsiballZ_xenserver_guest.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /home/ansible/.ansible/tmp/ansible-tmp-1718380187.0203605-839083-111205290269722/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
    "changed": true,
    "instance": {
        "cdrom": {
            "type": "none"
        },
        "customization_agent": "custom",
        "disks": [
            {
                "name": "Ubuntu22.04_Disk30GB_Ram8GB 0",
                "name_desc": "Created by template provisioner",
                "os_device": "xvda",
                "size": 32212254720,
                "sr": "STORAGE2",
                "sr_uuid": "50075613-8d27-cc00-47f5-857275566265",
                "vbd_userdevice": "0"
            }
        ],
        "domid": "-1",
        "folder": "/ansible",
        "hardware": {
            "memory_mb": 2096,
            "num_cpu_cores_per_socket": 1,
            "num_cpus": 1
        },
        "home_server": "xcp-ng-server1,
        "is_template": false,
        "name": "Deletar_vm_criada_ansible",
        "name_desc": "",
        "networks": [
            {
                "gateway": "",
                "gateway6": "",
                "ip": "",
                "ip6": [],
                "mac": "da:97:83:46:22:87",
                "mtu": "1500",
                "name": "Server2-MGMT",
                "netmask": "",
                "prefix": "",
                "prefix6": "",
                "vif_device": "3"
            }
        ],
        "other_config": {
            "base_template_name": "Ubuntu Jammy Jellyfish 22.04",
            "folder": "/ansible",
            "import_task": "OpaqueRef:4a36ea24-37e9-47a7-a3bd-d227495739c4",
            "install-methods": "cdrom,nfs,http,ftp",
            "instant": "true",
            "linux_template": "true",
            "mac_seed": "1b771e63-1346-a651-8145-8653560d3b83"
        },
        "platform": {
            "acpi": "1",
            "apic": "true",
            "device-model": "qemu-upstream-compat",
            "device_id": "0001",
            "hpet": "true",
            "nx": "true",
            "pae": "true",
            "secureboot": "false",
            "timeoffset": "0",
            "vga": "std",
            "videoram": "8",
            "viridian": "false"
        },
        "state": "poweredon",
        "uuid": "e17e6e65-eb04-b040-d4b4-6af403222e3f",
        "xenstore_data": {
            "vm-data": "",
            "vm-data/mmio-hole-size": "268435456"
        }
    },
    "invocation": {
        "module_args": {
            "cdrom": null,
            "custom_params": null,
            "disks": [
                {
                    "name": null,
                    "name_desc": null,
                    "size": null,
                    "size_b": null,
                    "size_gb": "30",
                    "size_kb": null,
                    "size_mb": null,
                    "size_tb": null,
                    "sr": null,
                    "sr_uuid": "50075613-8d27-cc00-47f5-857275566265"
                }
            ],
            "folder": "/ansible",
            "force": false,
            "hardware": {
                "memory_mb": 2096,
                "num_cpu_cores_per_socket": 1,
                "num_cpus": 1
            },
            "home_server": "xcp-ng-server1",
            "hostname": "z.z.z.z",
            "is_template": false,
            "linked_clone": false,
            "name": "Deletar_vm_criada_ansible",
            "name_desc": null,
            "networks": [
                {
                    "gateway": null,
                    "gateway6": null,
                    "ip": null,
                    "ip6": null,
                    "mac": null,
                    "name": "R740-MGMT",
                    "netmask": null,
                    "type": null,
                    "type6": null
                }
            ],
            "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "state": "poweredon",
            "state_change_timeout": 0,
            "template": null,
            "template_uuid": "fc0280a3-706a-c97c-2821-03cbc53f0f13",
            "username": "user",
            "uuid": null,
            "validate_certs": true,
            "wait_for_ip_address": false
        }
    }
}
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'

TASK [deploy] **********************************************************************************************************************************
task path: /home/ansible/ansible/playbooks_apresentacao/producao/teste-criacao-vm-prod.yml:28
ok: [localhost] => {
    "msg": {
        "changed": true,
        "failed": false,
        "instance": {
            "cdrom": {
                "type": "none"
            },
            "customization_agent": "custom",
            "disks": [
                {
                    "name": "Ubuntu22.04_Disk30GB_Ram8GB 0",
                    "name_desc": "Created by template provisioner",
                    "os_device": "xvda",
                    "size": 32212254720,
                    "sr": "VMDATA-02-NEW",
                    "sr_uuid": "50075613-8d27-cc00-47f5-857275566265",
                    "vbd_userdevice": "0"
                }
            ],
            "domid": "-1",
            "folder": "/ansible",
            "hardware": {
                "memory_mb": 2096,
                "num_cpu_cores_per_socket": 1,
                "num_cpus": 1
            },
            "home_server": "xcp-ng-server2",
            "is_template": false,
            "name": "Deletar_vm_criada_ansible",
            "name_desc": "",
            "networks": [
                {
                    "gateway": "",
                    "gateway6": "",
                    "ip": "",
                    "ip6": [],
                    "mac": "da:97:83:46:22:87",
                    "mtu": "1500",
                    "name": "Server2-MGMT",
                    "netmask": "",
                    "prefix": "",
                    "prefix6": "",
                    "vif_device": "3"
                }
            ],
            "other_config": {
                "base_template_name": "Ubuntu Jammy Jellyfish 22.04",
                "folder": "/root",
                "import_task": "OpaqueRef:4a36ea24-37e9-47a7-a3bd-d227495739c4",
                "install-methods": "cdrom,nfs,http,ftp",
                "instant": "true",
                "linux_template": "true",
                "mac_seed": "1b771e63-1346-a651-8145-8653560d3b83"
            },
            "platform": {
                "acpi": "1",
                "apic": "true",
                "device-model": "qemu-upstream-compat",
                "device_id": "0001",
                "hpet": "true",
                "nx": "true",
                "pae": "true",
                "secureboot": "false",
                "timeoffset": "0",
                "vga": "std",
                "videoram": "8",
                "viridian": "false"
            },
            "state": "poweredon",
            "uuid": "e17e6e65-eb04-b040-d4b4-6af403222e3f",
            "xenstore_data": {
                "vm-data": "",
                "vm-data/mmio-hole-size": "268435456"
            }
        }
    }
}
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'
Read vars_file '/home/ansible/ansible/vars/xen_localhost_vars.yml'

PLAY RECAP *************************************************************************************************************************************
localhost                  : ok=3    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Thanks for helping me.

Great! Glad to be of help.

As a reminder, when creating VM templates, you should ensure that they don’t have a home server set. There are a few different ways to do it:

  1. Via GUI - XCP-ng Center (shown) or Xen Orchestra:

Right click on the VM template → Properties → Home Server → Don’t assign this VM a home server.

  2. Via CLI:
xe vm-param-clear param-name=affinity uuid=<your-template-uuid>
  3. Via Ansible:
tasks:
  - name: Clear affinity
    community.general.xenserver_guest:
      hostname: "{{ xenserver }}"
      username: "{{ xenserver_username }}"
      password: "{{ xenserver_password }}"
      uuid: "<your-template-uuid>"
      home_server: ""

I will consider whether this issue warrants some special handling in the xenserver_guest module. For example, I could fix the module by making it ignore invalid home server references.
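Such a guard could look something like the pure helper below. This is only an illustrative sketch of the idea; the function and variable names are hypothetical and are not the module's actual internals:

```python
# Illustrative sketch: treat a home-server (affinity) reference as valid
# only if it still exists in the pool's live host list; otherwise fall
# back to XenAPI's null reference. A template cloned on a host that later
# left the pool keeps a dangling OpaqueRef, and calling host.get_record
# on it raises HANDLE_INVALID -- the failure shown in the log above.
# All names here are hypothetical, not the module's real internals.

NULL_REF = "OpaqueRef:NULL"


def resolve_home_server(affinity_ref, live_host_refs):
    """Return affinity_ref if it points at a live host, else NULL_REF."""
    if affinity_ref != NULL_REF and affinity_ref not in live_host_refs:
        return NULL_REF
    return affinity_ref
```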

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.