Hey!
I’ve recently been struggling with the following use case:
A server has to be provisioned with system packages and a new user account, using the root account. Afterwards, a Python virtualenv has to be created in the new user’s home directory, packages installed into it, and so on.
My first, obvious try was to split the above task list into roles:
- A system_packages role that would install all system packages.
- A user_account role that would set up a new user, copy keys and whatnot.
- A virtualenv role that would set up the virtualenv for the newly created user.
An example playbook might look like this:
```yaml
- hosts: target
  remote_user: root
  roles:
    - system_packages
    - user_account

- hosts: target
  remote_user: target_user
  roles:
    - virtualenv
```
The problem begins with the virtualenv role: it requires the python-virtualenv system package, but that requirement shouldn’t have to be listed in the project’s system requirements (the developer should be freed from knowing which system packages a role needs in order to work). The target_user has no sudo access, so I can’t install the package in the second play, only in the first one. And since the virtualenv requirement is not part of the project’s system requirements, it won’t be installed.

I was thinking about turning system_packages into a dummy role that depends on instances of a system_package role, each installing a single system requirement. But here is the blocker: I can declare direct dependencies for roleA, but I can’t specify in roleB’s declaration that roleA should also depend on {role: system_package, package: python-virtualenv}.
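To make the blocker concrete: the only dependency mechanism Ansible’s roles give me is a direct one, declared in the depending role’s own meta/main.yml. A sketch of that (the parameterized system_package role and its variable name are hypothetical), which illustrates why it doesn’t help here:

```yaml
# roles/virtualenv/meta/main.yml — sketch, not a working solution.
# This makes virtualenv itself depend on the package role, but the
# dependency runs inside the same play as virtualenv, i.e. the second
# play as target_user, who has no sudo — so the install would fail.
dependencies:
  - { role: system_package, package: python-virtualenv }

# roles/system_package/tasks/main.yml — hypothetical single-package role
# - apt: name={{ package }} state=present
```

What I can’t express is the reverse direction: having virtualenv push its package requirement into the system_packages role that runs in the first (root) play.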
What I’m really looking for is a mechanism similar to require_in in SaltStack. Or, even better: has anyone struggled with a similar use case and has a neat solution to share?
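For reference, this is roughly what the Salt mechanism I have in mind looks like: require_in lets a state inject itself as a requirement of another state, declared from the providing side rather than the consuming one (the state IDs below are made up for illustration):

```yaml
# Salt states — python-virtualenv declares that make_venv requires it,
# without make_venv having to know about the package at all.
python-virtualenv:
  pkg.installed:
    - require_in:
      - cmd: make_venv

make_venv:
  cmd.run:
    - name: virtualenv /home/target_user/venv
```

An equivalent in Ansible would let the virtualenv role inject python-virtualenv into whatever role installs system packages in the root play.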
Cheers