I've been fighting with this issue for days. I've been trying to avoid roles, since my setup seems pretty simple; I'm not even sure roles are needed. (This question is also posted on Stack Overflow.)
My problem
Here’s my inventory:
[development]
web_server ansible_connection=docker
db_server ansible_connection=docker
[staging]
web_server ansible_host=20.20.20.20 ansible_user=tom ansible_connection=ssh
db_server ansible_host=20.20.20.20 ansible_user=tom ansible_connection=ssh
[production]
web_server ansible_host=10.10.10.10 ansible_user=tom ansible_connection=ssh
db_server ansible_host=10.10.10.10 ansible_user=tom ansible_connection=ssh
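One thing I suspect matters here: an alias like web_server is global to the whole inventory, not scoped to its group, so the three definitions above presumably get merged, with the last line parsed winning. The effective variables for an alias can be inspected with ansible-inventory (assuming the inventory file is named hosts):

ansible-inventory -i hosts --host web_server

If that shows production's 10.10.10.10 for web_server, it would explain why the attempts below keep hitting the wrong machine.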
My issue is that I can't figure out a way to have my three playbooks (production.yml, staging.yml, development.yml) resolve the web_server and db_server aliases to the appropriate machines. I want to use those aliases throughout my tasks and playbooks for consistency, but nothing seems to work.
Things I’ve tried
This approach doesn't work, since it runs every task twice (staging contains two hosts, and each task runs once per play host, delegated or not):
---
- hosts: staging
  tasks:
    - name: Setup web server
      command: uptime
      delegate_to: web_server
    - name: Setup db server
      command: ls
      delegate_to: db_server
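If I understand the failure correctly, hosts: staging puts both web_server and db_server in the play, and delegate_to only changes where a task executes, so each task still runs once per play host. A variant I sketched (untested) guards each task by inventory_hostname instead of delegating; the tasks are still evaluated on both hosts, just skipped where the condition is false:

---
- hosts: staging
  tasks:
    - name: Setup web server
      command: uptime
      when: inventory_hostname == 'web_server'
    - name: Setup db server
      command: ls
      when: inventory_hostname == 'db_server'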
This approach solves the double-run problem, but it prints the wrong alias (web_server even when running the db task):
---
- hosts: staging
  run_once: true
  tasks:
    - name: Setup web servers
      command: uptime
      delegate_to: web_server
    - name: Setup db servers
      command: ls
      delegate_to: db_server
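My reading is that delegate_to changes where the command runs but not the play context, so inventory_hostname (which is what the task output labels) stays whichever staging host run_once happened to pick. A throwaway debug task (hypothetical, added under tasks: in the play above) makes that visible:

    - name: Show play host vs. delegation target
      debug:
        msg: "play host is {{ inventory_hostname }}, delegated to db_server"
      delegate_to: db_server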
This approach, suggested by someone on Stack Overflow, doesn't work either (although it runs without errors): web_server and db_server resolve to the wrong machines.
---
- hosts: staging:web_server
  tasks:
    - name: Deploy to web server
      command: uptime

- hosts: staging:db_server
  tasks:
    - name: Deploy to db server
      command: ls
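As far as I can tell from the patterns documentation, : in a host pattern is a union, so staging:web_server means "all of staging, plus web_server", not a restriction. The intersection form with :& is probably what that answer meant, although with my merged inventory it would still pick up the wrong host vars:

---
- hosts: staging:&web_server
  tasks:
    - name: Deploy to web server
      command: uptime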
This approach would be ideal, but as far as I know Ansible does not support targeting an individual host from a group:
---
- hosts: web_server
  tasks:
    - name: Setup web server
      command: uptime

- hosts: db_server
  tasks:
    - name: Setup db server
      command: ls
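For what it's worth, targeting a single host by its alias does appear to be supported; what breaks in my setup seems to be the shared alias definitions. One layout I'm experimenting with keeps this last playbook unchanged and splits each environment into its own inventory file, so the aliases resolve per environment (paths are my own invention):

# inventories/staging/hosts
[staging]
web_server ansible_host=20.20.20.20 ansible_user=tom ansible_connection=ssh
db_server ansible_host=20.20.20.20 ansible_user=tom ansible_connection=ssh

# inventories/production/hosts
[production]
web_server ansible_host=10.10.10.10 ansible_user=tom ansible_connection=ssh
db_server ansible_host=10.10.10.10 ansible_user=tom ansible_connection=ssh

The environment is then chosen at run time, e.g.:

ansible-playbook -i inventories/staging/hosts site.yml
ansible-playbook -i inventories/production/hosts site.yml

(site.yml standing in for the playbook above.)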