Variable registered from the find module seems to not be defined

- hosts: test
  remote_user: ec2-user
  sudo: yes
  tasks:
      # using the shell command because we need * expansion, otherwise if we know the exact directory we can use the command module instead
      - name: list log directory to find main directory name
        find:
            paths: [ "/var/log/" ]
            patterns: "file-main*"
            file_type: directory
        register: out_directories

      - debug: var="{{ item }}"
        with_items: "{{ out_directories }}"

The output is

ansible-playbook -vv -i hosts -u ec2-user directory_cleanup.yml
Using /Users/ggao/Projects/ansible-playbooks/test/ansible.cfg as config file
[DEPRECATION WARNING]: Instead of sudo/sudo_user, use become/become_user and make sure become_method is 'sudo' (default).
This feature will be removed
in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.

PLAYBOOK: directory_cleanup.yml ************************************************
1 plays in directory_cleanup.yml

PLAY [test] ********************************************************************

TASK [setup] *******************************************************************
ok: [10.81.108.140]

TASK [list log directory to find wasabi main directory name] *******************
task path: /Users/ggao/Projects/ansible-playbooks/test/directory_cleanup.yml:8
ok: [10.81.108.140] => {"changed": false, "examined": 30, "files": [{"atime": 1478891483.1474261, "ctime": 1478891479.107535, "dev": 51713, "gid": 50002, "inode": 264540, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mode": "0755", "mtime": 1478891479.107535, "nlink": 2, "path": "/var/log/file-main-20161108022522", "rgrp": true, "roth": true, "rusr": true, "size": 4096, "uid": 50002, "wgrp": false, "woth": false, "wusr": true, "xgrp": true, "xoth": true, "xusr": true}], "matched": 1, "msg": ""}

TASK [debug] *******************************************************************
task path: /Users/ggao/Projects/ansible-playbooks/test/directory_cleanup.yml:16
ok: [10.81.108.140] => (item=files) => {
"files": "VARIABLE IS NOT DEFINED!",
"item": "files"
}
ok: [10.81.108.140] => (item=msg) => {
"item": "msg",
"msg": "VARIABLE IS NOT DEFINED!"
}
changed: [10.81.108.140] => (item=changed) => {
"changed": "VARIABLE IS NOT DEFINED!",
"item": "changed"
}
ok: [10.81.108.140] => (item=examined) => {
"examined": "VARIABLE IS NOT DEFINED!",
"item": "examined"
}
ok: [10.81.108.140] => (item=matched) => {
"item": "matched",
"matched": "VARIABLE IS NOT DEFINED!"
}

PLAY RECAP *********************************************************************
10.81.108.140 : ok=3 changed=1 unreachable=0 failed=0

Using Python 2.7 on ansible-playbook 2.2.0.0.

Thanks

- hosts: test
  remote_user: ec2-user
  sudo: yes
  tasks:
      # using the shell command because we need * expansion, otherwise if we know the exact directory we can use the command module instead
      - name: list log directory to find main directory name
        find:
            paths: [ "/var/log/" ]
            patterns: "file-main*"
            file_type: directory
        register: out_directories

When you use register with find, the result is a list under files.
You can check the content of out_directories with
- debug: var=out_directories

      - debug: var="{{ item }}"

var takes the variable name, not the content of the variable; the correct use is var=item.

        with_items: "{{ out_directories }}"

When you register with find, the result is a list of dictionaries in out_directories.files, so you'll need to write it like this

- debug: var=item.path
  with_items: "{{ out_directories.files }}"
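A minimal Python sketch of why the original loop printed key names (the data below is made up, shaped like the registered result above): iterating a dict yields its keys, so with_items over out_directories hands debug the strings "files", "msg", "changed", and so on, while the useful per-file dicts live under the files key.

```python
# Hypothetical data shaped like the result registered by the find task.
out_directories = {
    "changed": False,
    "examined": 30,
    "matched": 1,
    "msg": "",
    "files": [{"path": "/var/log/file-main-20161108022522", "isdir": True}],
}

# Looping over the dict itself yields key names -- the "items" the debug task saw.
keys = sorted(out_directories)
print(keys)  # ['changed', 'examined', 'files', 'matched', 'msg']

# Looping over the "files" list yields the per-file dicts with a usable path.
paths = [f["path"] for f in out_directories["files"]]
print(paths)  # ['/var/log/file-main-20161108022522']
```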

Kai,

thanks for your explanation; I have made some progress, but I am now stuck at the next stage.
My problem is that a dynamically named directory receives log files in the following format
/var/log/app-YYYYMMDD-RANDOM_HASH/log_file_name-YYYYMMDDMMSS-[audit,console,access,].log and I am trying to find all the logs in that directory that are larger than 1 GB and then trim them.

I have these in my tasks

tasks:
    # using the shell command because we need * expansion, otherwise if we know the exact directory we can use the command module instead
    - name: list log directory to find wasabi main directory name
      find:
          paths: [ "/var/log/" ]
          patterns: "wasabi-intuit-main*"
          file_type: directory
      register: out_directories
      ignore_errors: True

    - name: list log files for wasabi intuit main
      find:
          paths: "{{ item.path }}"
          patterns: "wasabi-intuit-main*.log"
          file_type: file
      register: out_files
      with_items: "{{ out_directories.files }}"
      ignore_errors: True

But it seems the returned out_files variable is a dict: each key comes from an item of the previous task, and each value is a dict of find return values with added properties like "changed", "examined", and "msg". Now I am confused about how to iterate over that object so I can filter the result.

I have tried "{{ out_files.values().files }}" and "{{ out_files.results.files }}", but neither seems to work.

Can't you just do this in one find?

- name: Find file >=1GB
  find:
    path: /var/log
    patterns: "wasabi-intuit-main*.log"
    file_type: file
    size: 1g
  register: files_too_large

This is going to be somewhat complicated; I do recommend looking at doing it with fewer loops.

out_files.results.0.files will contain all files in the first directory from out_directories
out_files.results.1.files will contain all files in the second directory from out_directories
...

If you still want to iterate over out_files you will have to look at with_subelements.

Probably something like this
  - debug: var=item.1.path
    with_subelements:
      - "{{ out_files.results }}"
      - files
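In plain Python terms (the directory and file names below are made up), the registered shape and what with_subelements yields look roughly like this: one entry per loop iteration under results, each carrying its own files list, and with_subelements pairs each result (item.0) with each of its files (item.1).

```python
# Hypothetical sketch of what find registers when run inside a with_items loop.
out_files = {
    "results": [
        {"item": {"path": "/var/log/wasabi-intuit-main-a"},
         "files": [{"path": "/var/log/wasabi-intuit-main-a/app-1.log"}]},
        {"item": {"path": "/var/log/wasabi-intuit-main-b"},
         "files": [{"path": "/var/log/wasabi-intuit-main-b/app-2.log"},
                   {"path": "/var/log/wasabi-intuit-main-b/app-3.log"}]},
    ]
}

# with_subelements over (results, "files") is effectively this nested loop:
# each pair is (parent result, one file dict from its "files" list).
pairs = [(result, f) for result in out_files["results"] for f in result["files"]]
paths = [f["path"] for _, f in pairs]
print(paths)
```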

Thanks for the reply, I ended up using

- name: Find file >=1GB
  find:
    path: /var/log
    patterns: "wasabi-intuit-main*.log"
    file_type: file
    size: 1g
    recurse: true
  register: files_too_large

because the files sit one level deeper than /var/log and the nested loops seemed too complicated. I would really love it if find's paths could also take a regex, instead of only applying patterns to file-name matching.