Help needed with setting up a correct ansible structure for test and production

Note: I had to replace part of the hostnames used, as the forum sees them as links. I used an underscore instead of a dot.

I’ve been struggling to set up a correct Ansible structure for multiple servers and local test environments that I can extend without too much trouble. Let me try to explain it.

We have various servers that I want to maintain through Ansible. These include, but are not limited to, Node.js servers (with nginx), MariaDB servers and a few Plesk servers. All the Ansible-related files are kept under version control in a single git repository.

The idea is that all the servers use ansible-pull to pull this git repository and execute one of the playbooks at one or more set times a day. For testing purposes I want to be able to run the same playbooks on a VirtualBox VM using Vagrant: Vagrant creates the VMs in VirtualBox, executes an initial playbook to prepare the VM, and afterwards runs one of the playbooks from the Ansible repository.
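
For reference, the scheduled pull on each server boils down to a cron entry roughly like this (the repository URL is fictive, just like the hostnames further down):

# /etc/cron.d/ansible-pull (example)
0 6 * * * root ansible-pull -U https://git.example.com/infra/ansible.git -d /opt/ansible playbooks/nodejs/playbook.yml >> /var/log/ansible-pull.log 2>&1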

My current ansible structure is as follows:

 |-- inventory
 |  |-- group_vars
 |  |  |-> vagrant.yml
 |  |-- host_vars
 |  |  |-- nodejs
 |  |  |  |-> sites.yml
 |  |-> vagrant.yml
 |-- playbooks
 |  |-- nodejs
 |  |  |-> playbook.yml
 |  |  |-> vagrant.yml (can be the same as playbook.yml with vagrant checks) 
 |  |-- mariadb
 |  |  |-> playbook.yml
 |  |  |-> vagrant.yml
 |  |-- ...
 |-- roles
 |  |-- ansible
 |  |-- common
 |  |-- nginx (uses the 'sites' variable to create host files)
 |  |-- nodejs
 |  |-- sudo
 |  |-- users (uses the 'users' variable to create users)
 |  |-- ...
 |-> ansible.cfg

The following nodes are used (these are fictive hostnames):
production:
- nodejs01.infra_com (runs playbooks/nodejs/playbook.yml)
- nodejs02.infra_com (runs playbooks/nodejs/playbook.yml)
- nodejsne.infra_eu (runs playbooks/nodejs/playbook.yml)
- db01.infra_com (runs playbooks/mariadb/playbook.yml)
- db02.infra_com (runs playbooks/mariadb/playbook.yml)
testing:
- nodejs_test (runs playbooks/nodejs/vagrant.yml)
- db_test (runs playbooks/mariadb/vagrant.yml)

The Vagrant environment sets up the local VMs using a config file (created/updated by the developer) that defines which nodes he wants to create: their hostnames, the domains that should be created on them, and some other things. Based on that configuration file Vagrant creates a sites.yml file that contains all the information Ansible needs to create the nginx host files for that particular node. Each node (nodejs01.infra_com, nodejs02.infra_com, nodejsne.infra_eu, nodejs_test etc.) must have its own unique sites.yml file. For an example of a sites.yml file see below.

*** sites.yml ***

---
sites:
 - name: site1
   port: 3000
   fqdn: 'test1_test'
 - name: site2
   port: 3001
   fqdn: 'test2_test'
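
The nginx role then loops over this sites variable to create a host file per entry, roughly like this (template name and destination path are simplified):

# roles/nginx/tasks/main.yml (simplified)
- name: create nginx host files from the sites variable
  template:
    src: vhost.conf.j2
    dest: "/etc/nginx/sites-available/{{ item.name }}.conf"
  loop: "{{ sites }}"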

The problem I’m facing, or rather am confused about, is the hosts specified in the playbooks and their relation to the various configuration files needed for each node: where to put them and how to name them. Recall that I primarily use ansible-pull for the moment, but I might later on use push mode with the same git repository, so that I will be able to maintain all servers from a central Ansible host if I want/need to in the future.

I was thinking of having a folder somewhere with a subfolder for each node and putting all node-related data in those subfolders, inventory/host_vars for example, or maybe in the root of the Ansible repository. The idea is that this is easily maintainable.

For example:
(part of the git repository)

  |-- nodejs01.infra_com
  |  |-> sites.yml
  |-- nodejs02.infra_com
  |  |-> sites.yml
  |-- nodejsne.infra_com
  |  |-> sites.yml

(automatically created locally by Vagrant, based on the specified Vagrant config file)

  |-- nodejs_test
  |  |-> sites.yml
  |-- db_test
  |  |-> sites.yml

But I’m struggling to get this working in Ansible: the whole inventory file(s) and their relation to the hosts in the playbooks. I can’t seem to wrap my head around it. If anyone is able to shed some light on that or has an idea how I can structure it, I’m all ears.

Let me know if I’m missing the mark; I feel like I haven’t fully grasped your question yet.

I’ve found a good approach is to break inventory files into environments, and in the inventory files have groups that correspond to roles. For you, it might look like two different files:

# inventory/production.yml
all:
  children:
    nodejs:
      hosts:
        nodejs01.infra_com:
        nodejs02.infra_com:
    db:
      hosts:
        db01.infra_com:
        db02.infra_com:

# inventory/test.yml
all:
  children:
    nodejs:
      hosts:
        nodejs_test:
    db:
      hosts:
        db_test:

Then in your playbooks, you target those groups. For example:

# playbooks/mariadb/playbook.yml
---
- hosts: db
  tasks:
....

When you run ansible (push), you will specify the inventory you want to include:
ansible-playbook -i inventory/production.yml playbooks/mariadb/playbook.yml

For push setups, this structure means your local host or Ansible controller will connect to the hosts in the db group in the production.yml inventory over SSH.
Since you’re doing ansible-pull right now, you’re really only working with local connections. You could do a hairpin SSH connection (where the remote host SSHes to itself), but I think local connections will be better anyway.

Instead of your playbooks targeting localhost like this:

- hosts: localhost
  tasks:
......

You can use the same setup I described above, but set ansible_connection at the command line. You probably also need to limit the playbook to the host you’re running on.
ansible-playbook -i inventory/production.yml playbooks/mariadb/playbook.yml -e 'ansible_connection=local' --limit $(hostname)
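
The ansible-pull equivalent would be roughly the following (the repository URL is just a placeholder; ansible-pull already limits the run to the local hostname by default, so --limit shouldn’t be needed there):

ansible-pull -U https://git.example.com/infra/ansible.git -i inventory/production.yml -e 'ansible_connection=local' playbooks/mariadb/playbook.yml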

Finally, you can configure your group_vars or host_vars as needed, but set them up by the server role (inventory/group_vars/db.yml) or the hostname in the inventory file, even if the ansible_connection is local (inventory/host_vars/nodejs01.infra_com.yml). You can even do production or test groups, with a slight adjustment to the inventory file:

# inventory/production.yml
all:
  children:
    production:
      children:
        nodejs:
          hosts:
            nodejs01.infra_com:
            nodejs02.infra_com:
        db:
          hosts:
            db01.infra_com:
            db02.infra_com:
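
The var files themselves just hold whatever that group or host needs. As a rough illustration (the MariaDB variable is made up, the sites structure is the one from your question):

# inventory/group_vars/db.yml
mariadb_bind_address: 0.0.0.0   # made-up example variable

# inventory/host_vars/nodejs01.infra_com.yml
sites:
  - name: site1
    port: 3000
    fqdn: 'site1_infra_com'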

Some additional reading:
https://docs.ansible.com/ansible/latest/inventory_guide/intro_inventory.html#inventory-setup-examples
https://docs.ansible.com/ansible/2.8/user_guide/playbooks_best_practices.html

I hope this is helpful

Thanks for your quick response. Sorry for the truckload of text :wink:

Let me try to explain it from another angle, as I’m not sure I made my issue clear enough. It is hard to get things straight when your mind is as chaotic as mine is at the moment.

To simplify the issue I’ll just pick one server/node type. Basically I want to run one or more servers with the same Node.js, nginx and pm2 set-up. The only difference between those nodes is the domains running on them. It should not matter whether the node is a VirtualBox VM, a server at OVH or a server at one of the other providers.

So basically, if I set up a VirtualBox VM with Vagrant, generate a sites.yml file and run the nodejs playbook on it, I should end up with a working local Node.js server running the domains specified in the sites.yml. When I run that same playbook on an external server with the same or a different sites.yml, I should also end up with a working external server running the supplied list of domains.

The issue I’m facing is what I should use as “hosts:” in the playbook.
If I’m correct this can be localhost, an IP, a hostname, a group or some kind of pattern. I think that in order to be able to use a playbook for more than one server it will need to be a pattern or a group.

If that is the case, how can I make it so that each and every node I create using that playbook has different content for the variable “sites”?

And what inventory file do I need to use for that? An INI file is, I think, easier to understand than a YAML file.

I’m thinking

[nodejs-servers]  -> the group name for the hosts variable in the playbook ?
nodejs01.infra_com
nodejs02.infra_com

Found the answer, thanks to your explanation and some brain farts :wink:

inventory file (inventory/nodejs)

[nodejs_servers]
nodejs.test  -> local hostname

Top part of playbook vagrant.yml

---
- hosts: nodejs_servers
  become: true
  connection: local
  any_errors_fatal: true

  tasks:
    ...

Inventory folder host_vars

host_vars
  -> nodejs.test
    -> sites.yml
    -> test.yml
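
Vagrant then runs the playbook against that inventory with something like this (paths depend on where the repository is checked out inside the VM):

ansible-playbook -i inventory/nodejs playbooks/nodejs/vagrant.yml

Because the host_vars folder sits next to the inventory file, Ansible automatically picks up every yml file in host_vars/nodejs.test/ for that host.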
