Setup facts: ansible_default_ipv4 is empty even though the interface is active and has an IP address

I am writing a playbook where I want to name a file so that it includes the host's IP address. However, ansible_default_ipv4 in the gathered facts is empty, even though my interface is active and has a static IP. My target machine is CentOS 7.1 and my Ansible version is 1.9.3.

    "ansible_default_ipv4": {},
    "ansible_default_ipv6": {
        "address": "REDACTED",
        "gateway": "REDACTED",
        "interface": "enp5s0",
    --break--
    "ansible_enp5s0": {
        "active": true,
        "device": "enp5s0",
        "ipv4": {
            "address": "REDACTED",
            "netmask": "255.255.255.0",
            "network": "REDACTED"
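
A quick way to look at just that key is the setup module's filter option; for example (myhost is a placeholder for my target):

    ansible myhost -m setup -a 'filter=ansible_default_ipv4'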

This might just be a configuration issue with my host; I'm still working on bootstrapping it to be my new control host. I also just realized that AWS EC2 hosts report their internal addresses, not their public ones, so ansible_default_ipv4 won't always give me what I need.
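
For the EC2 case I am considering pulling the public address from instance metadata instead; below is a rough, untested sketch using the ec2_facts module (the ansible_ec2_public_ipv4 fact name is my assumption about its metadata output):

    # gather facts from the EC2 metadata service on the instance
    - name: gather EC2 metadata facts
      ec2_facts:

    # the public address should then be available as a fact
    - name: show the public address
      debug: msg="public ip is {{ ansible_ec2_public_ipv4 }}"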

My goal is to fetch the SSH public key files from a set of remote hosts and name each one with the user plus that host's public IP address. Is there a way to extract the IPs from a group of hosts? With the following task I end up with a single file rather than a file for each host:

My task:

    - name: copy new keys back to repo
      fetch: "src=/home/ansible/.ssh/id_rsa.pub dest=/home/ansible/public_keys/ansible_{{ item }} flat=yes"
      with_items:
        - "{{ hosts }}"

On the command line I pass --extra-vars "hosts=ipaservers", and I end up with one file named ansible_ipaservers containing only one key. I'm hoping to end up with files like ansible_52.x.x.x, ansible_192.x.x.x, and ansible_176.x.x.x instead, each containing the ansible user's key from that host.
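
What I think I ultimately want is to run the play against the group and have each host name its own fetched copy, roughly like the sketch below, but that depends on a per-host address fact (ideally the public one) actually being populated, which is where I'm stuck:

    - name: copy new keys back to repo
      fetch: "src=/home/ansible/.ssh/id_rsa.pub dest=/home/ansible/public_keys/ansible_{{ ansible_default_ipv4.address }} flat=yes"

Since fetch runs once per host in the play, flat=yes with a per-host fact in dest should give me one file per host, which is the behavior I'm after.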