Inventory and group_vars layout advice

Seeking some best-practice advice about where to store variables within the following structure:

  • multiple inventory folders ./inventory/x, ./inventory/y, ./inventory/z
  • inside each inventory folder, combinations of dynamic and static inventory sources
  • inside each inventory folder, environment specific group vars

Currently I am copy-pasting variables common to all of the inventories (environments) between the all.yml group_vars files.

My question: is there a way to have a ‘global’ group variables file I can use for this type of variable?

I know I could source a file with include_vars: at the start of each of the multiple plays inside the playbooks, but hoped there might be something more elegant and/or automatic (like group_vars) available. Maybe even a simple restructure I can’t think of.
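For concreteness, the include_vars approach would mean repeating something like this at the top of every play (the path and use of the `playbook_dir` magic variable are illustrative, not from the original post):

```yaml
- hosts: all
  pre_tasks:
    # illustrative path; this task must be repeated in every play
    - name: Load variables shared across all inventories
      include_vars: "{{ playbook_dir }}/common_vars.yml"
```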

```
.
├── inventory
│   ├── aws-dev
│   │   ├── aws-dev
│   │   ├── ec2.ini
│   │   ├── ec2.py
│   │   └── group_vars
│   │       ├── all
│   │       │   ├── all.yml
│   │       │   └── secrets.yml
│   │       ├── security_group_app.yml
│   │       └── security_group_util
│   │           ├── secrets.yml
│   │           └── security_group_util.yml
│   ├── aws-prod
│   │   ├── aws-prod
│   │   └── group_vars
│   │       └── all
│   │           ├── all.yml
│   │           └── secrets.yml
│   └── vmware-dev
│       ├── group_vars
│       │   ├── all
│       │   │   ├── all.yml
│       │   │   └── secrets.yml
│       │   └── tag_role_app.yml
│       ├── vmware-dev
│       ├── vmware.ini
│       └── vmware.py
├── roles
├── aws-configure.yml
├── aws-provision.yml
├── vmware-configure.yml
└── vmware-provision.yml
```

Have you thought about using role default variables instead? If all environments have the same setting, this would be simpler. Role default variables can then be overridden for specific groups in the group vars file.
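A minimal sketch of that suggestion (role and variable names are invented): the shared value lives in the role's defaults, and any environment can override it in its own group_vars:

```yaml
# roles/myapp/defaults/main.yml -- lowest precedence, shared by every environment
myapp_log_level: info

# inventory/aws-prod/group_vars/all/all.yml -- overrides the default for prod only
myapp_log_level: warning
```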

Already doing that where possible, but for things like microservices whose templated configs share the same variable, keeping it in defaults still requires copy-paste, just at that (potentially larger) role level instead.

Might it be possible to add an Ansible configuration param that forces a particular file to always be loaded, in addition to the normal ones?

Abuse a vars_plugin to always return the vars in a file…?

I’d like to hear advice on this as well. I do have one idea, though I don’t love it: you should be able to use group_vars relative to your playbooks as well. So put a group_vars dir outside of your inventories, next to your playbooks, and it will be evaluated for all envs.

Thanks Hagai,
Confirming that works, unbelievably. A group_vars/all.yml next to the playbooks, as you say, is evaluated along with the inner group_vars.

I’m warming to the idea of writing a vars plugin that is essentially just a pointer to a file somewhere. Advantage is that the target could be anywhere you like.

Problem with a vars plugin is that you’ll have to specify it in each playbook. New idea: you could put another inventory script in your dir that evaluates your common file.

So vars plugins have to be specified per play, like ‘include_vars:’ does? If so, no savings there.

An inventory script would be pretty good, as it would be written once and then stay the same for every extra inventory.

Can you clarify how it might work?

This command will output JSON from an Ansible .yml file:

python -c 'import sys, yaml, json; json.dump(yaml.load(sys.stdin), sys.stdout, indent=4)' < all.yml > all.json

The other inventory scripts return groups. Would I have to wrap that output in an ‘all’ group for it to work? Is that enough?

Some of the variables in the all.yml are also templated from other variables - would they be parsed properly if coming in from an inventory script?

You need to wrap it in an ‘all’ group, yes. The output should look like this:

```
{
    "all": {
        "vars": {
            ...your vars here...
        }
    }
}
```

And templates are evaluated when used, so the vars can be templated.
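To see why that works, here is a quick sketch (the variable names are invented): a Jinja2-templated value survives the YAML-to-JSON trip as a plain string, because Ansible only renders the `{{ }}` when the variable is actually used:

```shell
# a templated var passes through the converter untouched, as an ordinary string;
# requires PyYAML (same dependency as the thread's one-liner)
printf 'app_port: 8080\napp_url: "http://{{ inventory_hostname }}:{{ app_port }}"\n' \
  | python3 -c 'import sys, yaml, json; json.dump({"all": {"vars": yaml.safe_load(sys.stdin)}}, sys.stdout, indent=4)'
```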

I have been, and still am, struggling with this as well. Our inventory is very similar to yours. We have three AWS environments and dozens of other on-prem environments. For AWS we need to share a bunch of variables. Putting those in playbook-level group_vars does not work because they are only for AWS, not the other environments.

Hence I have resorted to soft-linking an aws-common.yml into each of the AWS group_vars. Since git handles the links it’s really no pain, just a bit of confusion when searching.
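A runnable sketch of that linking, done here in a scratch copy of the layout (the `aws-common.yml` name is from the post; the variable inside it is invented):

```shell
# recreate the relevant layout in a scratch dir, then soft-link one shared
# aws-common.yml into each AWS inventory's group_vars/all
mkdir -p scratch/inventory/aws-dev/group_vars/all scratch/inventory/aws-prod/group_vars/all
echo "aws_region: eu-west-1" > scratch/inventory/aws-common.yml   # invented example var

for env in aws-dev aws-prod; do
  # relative target: all -> group_vars -> <env> -> inventory/aws-common.yml
  ln -sf ../../../aws-common.yml "scratch/inventory/$env/group_vars/all/aws-common.yml"
done
```

git stores the symlink itself, so the link survives clone and checkout on any POSIX system.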

I do a similar thing in the playbooks for other reasons. But now that I read that Ansible will look up the tree for group_vars under the playbooks, I may not need those links.

Thanks to Hagai I have a good solution worth sharing.

  • add a common_vars.sh bash script to each of the inventories that should share variables. Make it executable. Point it at your common Ansible YAML variables file (in my case inventory/common_vars.yml).
  • add common_vars.yml and populate it

Run your playbooks with the specified inventory path as normal (inventory/aws-dev, inventory/aws-prod etc)

Benefits

  • inventory files are all kept in the inventory folder
  • normal Ansible group_vars are left alone
  • only need one extra file per inventory, and no copy paste
  • if you need several top-level common vars files, you can point different common_vars.sh scripts at onprem_common.yml, cloudstack_common.yml, or whatever you like
  • feels like a proper Ansible way to do things without hacks

New dir structure below (compared to original). Bash script code after.

```
.
├── inventory
│   ├── common_vars.yml
│   ├── aws-dev
│   │   ├── aws-dev
│   │   ├── common_vars.sh
│   │   ├── ec2.ini
│   │   ├── ec2.py
│   │   └── group_vars
│   │       ├── all
│   │       │   ├── all.yml
│   │       │   └── secrets.yml
│   │       ├── security_group_app.yml
│   │       └── security_group_util
│   │           ├── secrets.yml
│   │           └── security_group_util.yml
│   ├── aws-prod
│   │   ├── aws-prod
│   │   ├── common_vars.sh
│   │   ├── ec2.ini
│   │   ├── ec2.py
│   │   └── group_vars
│   │       └── all
│   │           ├── all.yml
│   │           └── secrets.yml
│   └── vmware-dev
│       ├── group_vars
│       │   ├── all
│       │   │   ├── all.yml
│       │   │   └── secrets.yml
│       │   └── tag_role_app.yml
│       ├── common_vars.sh
│       ├── vmware-dev
│       ├── vmware.ini
│       └── vmware.py
├── roles
├── aws-configure.yml
├── aws-provision.yml
├── vmware-configure.yml
└── vmware-provision.yml
```

common_vars.sh

```bash
#!/usr/bin/env bash

# http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

# http://stackoverflow.com/questions/27382552/converting-yaml-to-json-with-python-block-end-found
# note: nothing else may be printed -- an inventory script must emit pure JSON
python -c 'import sys, yaml, json; json.dump({"all": {"vars": yaml.load(sys.stdin)}}, sys.stdout, indent=4)' < "$DIR/../common_vars.yml"
```

This accomplishes everything required without complexity.

Thanks Hagai!

Thanks for sharing! Glad I could help. :slight_smile:

This has worked incredibly well in practice. We’ve deduped all our variables and only have a handful of inventory specific vars. Perfect.

Any idea how this might be extended to use ansible-vault encrypted files?

Is the vault password available somewhere accessible, so that a modified script could do the same JSON-ification as it does with the clear-text YAML vars?

Hm, can’t think of anything. I think we’d have to have Ansible pass it as an environment variable for the script to use.
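One hedged possibility, not from the thread: if the password lives in a file (ansible-playbook’s `--vault-password-file` option implies many teams keep one), the script can decrypt with `ansible-vault view` itself and feed the plaintext into the same converter. The secrets file name and password-file path here are assumptions:

```shell
# sketch: decrypt a vaulted common file, then wrap it as inventory JSON
# (requires ansible-vault on PATH; paths are assumptions -- adapt to your setup)
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
ansible-vault view --vault-password-file "$HOME/.vault_pass.txt" "$DIR/../common_secrets.yml" \
  | python3 -c 'import sys, yaml, json; json.dump({"all": {"vars": yaml.safe_load(sys.stdin)}}, sys.stdout, indent=4)'
```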

Nice concept.

If you prefer not to mix python and bash, you could create a common_vars.py instead. Something like:

```python
#!/usr/bin/env python

import os, sys, yaml, json

srcfile = os.path.abspath(os.path.join(os.path.dirname(sys.argv[0]), '..', 'common_vars.yml'))
with open(srcfile) as f:
    json.dump({"all": {"vars": yaml.load(f)}}, sys.stdout, indent=4)
```

Remember to make it executable with chmod +x as with the bash script.