A development, staging, and production environment:
- Development is a local Vagrant box.
- Staging would be a DigitalOcean droplet.
- Production would be several DigitalOcean droplets.
All of the DigitalOcean droplets are fetched with a dynamic inventory script.
Currently, all my variables are stored within the roles, but that doesn't work well once multiple environments are involved, and I'm confused about how to do this right.
I have one playbook for everything. My initial thought was to split the three environments into separate playbooks, but how do I go about grouping the variables correctly?
What do you mean, all of your variables are in roles? As in role defaults? If so, that's fine; use inventory variables to override them.
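For instance (the file and variable names here are just placeholders), a role default gets overridden by an inventory group variable for a given environment:

```yaml
# roles/foo/defaults/main.yml  (role default, lowest precedence)
app_port: 8080
---
# group_vars/staging.yml  (inventory group var, overrides the role default)
app_port: 8081
```

Hosts in the staging group get 8081, everything else falls back to the role default.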
Your three-environment scenario is fairly typical; which approach to take is up to you.
Here are a couple of rough options (it's hard to be more specific based on your description):
One option is to have a separate inventory for each environment. The playbook stays the same, but you point at a different inventory for each run (using -i); see the sketch below.
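A minimal sketch of that layout (the paths and the dynamic inventory script name are my assumptions):

```
inventories/
├── dev/
│   ├── hosts                 # static: the local Vagrant box
│   └── group_vars/all.yml    # dev-only variables
├── staging/
│   ├── digital_ocean.py      # your dynamic inventory script
│   └── group_vars/all.yml
└── production/
    ├── digital_ocean.py
    └── group_vars/all.yml

# each run points at one environment:
ansible-playbook -i inventories/staging playbook.yml
```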
Another option is to have each environment be a group (dev, stage, prod), with environment-specific vars set as inventory group_vars. Have each machine belong to the group that corresponds to its environment and it will pick up the proper variables. Then use an intersection host pattern to target the group in your playbook:
Assuming env is defined as an extra var, and you ran:
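```
# (inventory path is illustrative; the part that matters is -e env=dev)
ansible-playbook -i inventory playbook.yml -e env=dev
```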
and playbook.yml started with something like this:
```yaml
- hosts: "web:&{{ env }}"
  roles:
    - foo
    - bar
```
then it would target your web servers in your dev environment.
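On the inventory side, that could look roughly like this (host names are made up, and with a dynamic inventory you'd map droplets into these groups rather than listing them by hand):

```ini
# hosts -- every machine is in its functional group and its environment group
[web]
web-dev-1
web-prod-1
web-prod-2

[dev]
web-dev-1

[prod]
web-prod-1
web-prod-2
```

Environment-specific values then go in group_vars/dev.yml and group_vars/prod.yml, and the web:&{{ env }} pattern selects only the hosts that are in both groups.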
Some folks will even keep completely separate cloud credentials for dev/staging/prod. It all comes down to what level of complexity you're comfortable with.
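For example, if your dynamic inventory script picks up its API token from the environment (if I remember right, the stock digital_ocean.py reads DO_API_TOKEN), keeping one credential set per environment is just a matter of exporting the right token before the run; the token file paths here are made up:

```bash
# hypothetical token files, one credential set per environment
export DO_API_TOKEN="$(cat ~/.do/staging.token)"
ansible-playbook -i inventories/staging playbook.yml
```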