Unable to access a file created in one AWX automation job pod from another automation job pod hosted on k8s

Hi,

I have the AWX Operator running on k8s, with the following pods:

NAME                                               READY   STATUS    RESTARTS      AGE
awx-operator-controller-manager-774c766fd4-274vn   2/2     Running   8 (38h ago)   70d
awx-postgres-13-0                                  1/1     Running   3             70d
awx-task-6dbfd8f6c9-2cvd2                          4/4     Running   12            70d
awx-web-64765969f4-gm4nl                           3/3     Running   9             70d

Scenario:
I have 2 AWX job templates, update_telegraf and generic_flow. When we launch these job templates, we get a separate automation-job pod per job execution. The issue is that the generic_flow template, running in the 2nd automation job pod, creates a JSON file in /tmp, and I then try to access this file from the 1st automation job pod. But this is not working.

We are using the AWX EE image for job execution.

How can I read, from the 1st automation job pod, a file that was created in the 2nd automation job pod?

I thought about creating a PV/PVC, but I don't know where these automation job pods get created from.

Please provide a solution, as well as some info on how these automation job pods get created and work.

Details:
AWX: 22.2.0
Ansible: ansible [core 2.14.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.9.16 (main, Dec 8 2022, 00:00:00) [GCC 11.3.1 20221121 (Red Hat 11.3.1-4)] (/usr/bin/python3)
jinja version = 3.1.2
libyaml = True

I'm not sure how the PV solution would work, if it's possible. Maybe someone else will weigh in on that.

Some other options:

  1. Upload the file to a remote storage location (S3 bucket, Azure blob, NFS share).
  2. Instead of storing the data in a file, pass the data from one job to the next as an extra var. You could use the AWX API to have the first job trigger the second job and pass the data along.
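For option 2, here is a minimal sketch of launching a downstream job template through the AWX API with the data passed as `extra_vars`. The host, token, and template ID are placeholders you would replace with your own; the endpoint used is AWX's standard `/api/v2/job_templates/{id}/launch/`.

```python
import json
import urllib.request

AWX_HOST = "https://awx.example.com"  # assumption: your AWX URL
TOKEN = "REDACTED"                    # assumption: an AWX OAuth2 token
TEMPLATE_ID = 42                      # assumption: ID of the downstream template


def build_launch_request(data):
    """Build a POST request that launches the template with `data` as extra_vars."""
    body = json.dumps({"extra_vars": data}).encode()
    return urllib.request.Request(
        f"{AWX_HOST}/api/v2/job_templates/{TEMPLATE_ID}/launch/",
        data=body,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # Data the first job wants to hand to the second job, instead of a /tmp file.
    req = build_launch_request({"telegraf_config": {"interval": "10s"}})
    # urllib.request.urlopen(req)  # uncomment to actually launch against a real AWX
```

The downstream playbook then reads the data as ordinary variables (e.g. `{{ telegraf_config.interval }}`), with no shared filesystem needed. Note the launched template must allow extra vars on launch ("Prompt on launch" for Variables).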

Hi @mikemorency ,

Thanks for the update.

Actually, I was able to fix this issue using the following approach:

Once the file gets created in the automation job pod, I scp it from there to /tmp on the local host. Then I wrote a task to read the file from /tmp, and it was able to read it.
So it worked.
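For reference, the copy-off-the-pod approach described above could look roughly like this as playbook tasks. This is only a sketch: `storage_host` and the file name are hypothetical, standing in for whatever persistent machine and path the file is copied to.

```yaml
# In the generic_flow job: copy the JSON off the ephemeral job pod
# (the EE pod's /tmp) to a host both templates can reach.
- name: Push generated JSON to a persistent host
  ansible.builtin.copy:
    src: /tmp/generic_flow_output.json   # created inside this job's pod
    dest: /tmp/generic_flow_output.json
  delegate_to: storage_host              # assumption: an inventory host

# In the update_telegraf job: read the file back from that host.
- name: Read the JSON from the persistent host
  ansible.builtin.slurp:
    src: /tmp/generic_flow_output.json
  delegate_to: storage_host
  register: flow_file

- name: Parse the file contents
  ansible.builtin.set_fact:
    flow_data: "{{ flow_file.content | b64decode | from_json }}"
```

Keep in mind the job pods are ephemeral, so anything left in the pod's own /tmp disappears when the job finishes; copying to a persistent host (or remote storage, as suggested above) is what makes the file survive between jobs.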