System
Ubuntu 20.04 LTS
Ansible Version
ansible [core 2.13.11]
config file = /home/admin/appstack/ansible.cfg
configured module search path = ['/home/admin/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /home/admin/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Nov 22 2023, 10:22:35) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Playbook Task
- name: 'Project Cleanup and Purge'
  hosts: localhost
  gather_facts: false
  tasks:
    - name: '(DEVICE) - Bring the Docker Compose Stack Down and Purge Volumes'
      ansible.builtin.command:
        cmd: docker compose down --volumes
      args:
        chdir: "{{ configuration.deploy_dir | default('/home/admin/project/prod') }}"
      register: docker_compose_purge_results
      changed_when: docker_compose_purge_results.rc != 0
This playbook is supposed to run locally on the controller. The chdir argument is driven by a variable, and if that variable is not set, the default path /home/admin/project/prod should be used.
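To make that fallback behaviour concrete, here is a minimal sketch using plain Jinja2 (the templating library Ansible uses) rather than Ansible itself; the /tmp/remote value below is only illustrative, taken from the error output further down:

from jinja2 import Environment, ChainableUndefined

# ChainableUndefined allows attribute access on an undefined variable, mirroring
# how Ansible lets 'configuration.deploy_dir' reach the default() filter even
# when 'configuration' is not defined at all.
env = Environment(undefined=ChainableUndefined)
expr = env.from_string(
    "{{ configuration.deploy_dir | default('/home/admin/project/prod') }}"
)

# With no 'configuration' variable defined, the default path is used.
print(expr.render())                                             # /home/admin/project/prod

# If deploy_dir is defined anywhere (inventory, extra vars, env/), that value
# wins and the default is never applied.
print(expr.render(configuration={"deploy_dir": "/tmp/remote"}))  # /tmp/remote

So the default should only apply when configuration.deploy_dir is genuinely undefined for the host.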
Error
ansible-playbook [core 2.13.11]
config file = /home/admin/appstack/ansible.cfg
configured module search path = ['/home/admin/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /home/admin/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible-playbook
python version = 3.8.10 (default, Nov 22 2023, 10:22:35) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Using /home/admin/appstack/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /home/admin/appstack/inventory/hosts.yml as it did not pass its verify_file() method
script declined parsing /home/admin/appstack/inventory/hosts.yml as it did not pass its verify_file() method
Parsed /home/admin/appstack/inventory/hosts.yml inventory source with yaml plugin
Loading collection community.docker from /usr/local/lib/python3.8/dist-packages/ansible_collections/community/docker
redirecting (type: callback) ansible.builtin.yaml to community.general.yaml
Loading collection community.general from /usr/local/lib/python3.8/dist-packages/ansible_collections/community/general
Loading callback plugin community.general.yaml of type stdout, v2.0 from /usr/local/lib/python3.8/dist-packages/ansible_collections/community/general/plugins/callback/yaml.py
Loading callback plugin awx_display of type stdout, v2.0 from /usr/local/lib/python3.8/dist-packages/ansible_runner/display_callback/callback/awx_display.py
Skipping callback 'awx_display', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: local_stack_teardown.yml *********************************************
Positional arguments: local_stack_teardown.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/home/admin/appstack/inventory',)
vault_password_files: ('/home/admin/appstack/project/.vaultpass',)
forks: 5
1 plays in local_stack_teardown.yml
PLAY [PACEdge - DEVICE - PACEdge Project Cleanup and Purge] ********************
META: ran handlers
TASK [(DEVICE) - Bring the Docker Compose Stack Down and Purge Volumes] ********
task path: /home/admin/appstack/project/local_stack_teardown.yml:18
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: admin
<127.0.0.1> EXEC /bin/sh -c 'echo ~admin && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir "` echo /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849 `" && echo ansible-tmp-1715950067.7385032-180389-125912705660849="` echo /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849 `" ) && sleep 0'
Using module file /usr/local/lib/python3.8/dist-packages/ansible/modules/command.py
<127.0.0.1> PUT /home/admin/.ansible/tmp/ansible-local-180384ocaydl_s/tmpvyvxsro3 TO /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849/AnsiballZ_command.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849/ /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849/AnsiballZ_command.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1715950067.7385032-180389-125912705660849/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
File "/tmp/ansible_ansible.legacy.command_payload_4zjt8lfo/ansible_ansible.legacy.command_payload.zip/ansible/modules/command.py", line 336, in main
fatal: [localhost]: FAILED! => changed=true
  cmd:
  - docker
  - compose
  - down
  - --volumes
  delta: null
  end: null
  invocation:
    module_args:
      _raw_params: docker compose down --volumes
      _uses_shell: false
      argv: null
      chdir: /tmp/remote
      creates: null
      executable: null
      removes: null
      stdin: null
      stdin_add_newline: true
      strip_empty_ends: true
      warn: false
  msg: 'Unable to change directory before execution: [Errno 2] No such file or directory: b''/tmp/remote'''
  rc: null
  start: null
  stderr: ''
  stderr_lines: <omitted>
  stdout: ''
  stdout_lines: <omitted>
I am running this playbook with ansible-runner:

ansible-runner run . -p teardown.yml

and there are two things I am unable to understand:
- Why does the ansible.builtin.command module appear as a legacy command (ansible.legacy.command) in the module payload and traceback?
- Why is chdir being set to /tmp/remote? I have not explicitly specified such a path anywhere.
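For reference, my understanding is that the CLI call above is equivalent to the following ansible-runner Python invocation (a sketch only; I am actually using the CLI), with the current directory acting as the private data dir so that project/, inventory/, and env/ are picked up from there:

import ansible_runner

# Equivalent of 'ansible-runner run . -p teardown.yml': '.' is the private data
# dir, and 'teardown.yml' is resolved inside its project/ directory.
result = ansible_runner.run(private_data_dir=".", playbook="teardown.yml")
print(result.status, result.rc)  # e.g. 'failed', 2 for the run shown above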