I’ve been building my AWX environment and NetDevOps skills from the ground up for a few months now.
I have AWX running in a pretty stable lab.
My AWX uses GitLab for project sources.
I use tiered branches in GitLab: main (Production), PreProd, Dev, and feature branches. When I start a new feature or bug fix, I branch off Dev, do the development in that branch, and then merge it back into Dev. After testing the merge into Dev, I merge Dev into PreProd, and then PreProd into main (with manual tests and validations in between).
Now, I’m at the point where I would like to start doing some CI/CD and automated testing at the branch merge steps to help me get consistent QA testing done on each branch and release of my projects. I have no experience with this piece of development. Can someone help me get pointed in the right direction to start building this out?
@Dustin Hey! I’m assuming the changes you are making, building, deploying and testing are for another project (not AWX itself).
It sounds like you have one Job Template that runs the automation to build and test your webapp/project and you want to add in a step that runs integration tests against your deployment once it is up.
I would probably start by creating a workflow in AWX for each of the steps:
Job Template 1: build images/compile code
Job Template 2: deploy app/code
Job Template 3: run tests against that running instance
You can pass information between jobs within a workflow using the set_stats ansible module.
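For example, here is a minimal sketch of how Job Template 1 could publish a value for the later nodes (the `image_tag` name and value are placeholders I made up):

```yaml
# Playbook behind Job Template 1 (build)
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Publish the built image tag for downstream workflow nodes
      ansible.builtin.set_stats:
        data:
          image_tag: "myapp:1.2.3"   # hypothetical value produced by the build step
```

AWX collects set_stats output as workflow artifacts and injects it into the later nodes as extra vars, so the deploy and test playbooks can reference `image_tag` directly.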
As far as what that last “integration test” job template does, there are many approaches. I would lean towards a simple Ansible playbook that calls Python tests via pytest, but this will depend on what you are testing and which languages you know well.
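As a rough sketch (the `tests/integration` path, `app_url` variable, and localhost host pattern are assumptions on my part), that last job template could be as small as:

```yaml
# Playbook behind Job Template 3 (integration tests)
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Run pytest against the freshly deployed instance
      ansible.builtin.command:
        cmd: pytest tests/integration --junitxml=results.xml
      environment:
        APP_URL: "{{ app_url | default('https://app.example.com') }}"  # hypothetical
      register: pytest_result
      changed_when: false

    - name: Show the pytest output
      ansible.builtin.debug:
        var: pytest_result.stdout_lines
```

If pytest exits non-zero, the command task fails, which fails the job template and stops the workflow, which is exactly the behaviour you want from a test gate.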
I’m talking about developing the playbooks our engineers will be running. This request is less about AWX’s involvement and more about how to do DevOps with CI/CD pipelines where the AWX playbooks themselves are the software being developed.
I created some playbooks to achieve specific outcomes for engineers so they don’t have to process things manually. What I want is a way to:
Push my playbook updates to my feature branch
GitLab (or whatever) detects the changes and starts a validation process (CI/CD?); see the rough pipeline sketch at the end of this post
The validation process runs the updated playbooks multiple times with different sets of inputs to validate that they handle known user behaviors.
One validation for user providing incorrect data
One validation for user providing correct data
One validation for user providing incomplete data
etc.
If all of the feature branch validation tests pass, do the items below; otherwise stop the pipeline:
Merge the GitLab feature branch into Dev
Tell AWX to re-sync the Dev project
Run the validations again against the Dev branch
If all of the Dev branch validation tests pass:
Merge the GitLab Dev branch into main
Tell AWX to re-sync the Prod project
Run the validation tests against the main/Prod branch
As a beginner, when I research this all I see are references to Jenkins and other tools, but I can’t quite figure out what those tools do and how they come together to achieve what I’m describing above.
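To make it a bit more concrete, this is roughly what I imagine a GitLab pipeline for this would look like, pieced together from examples I’ve found. Every image, path, project ID, and variable name below is a placeholder, and I don’t know yet whether this is how people actually structure it:

```yaml
# .gitlab-ci.yml (sketch) – images, paths, project ID 42, and $AWX_TOKEN are placeholders
stages:
  - validate
  - sync

.validate: &validate
  stage: validate
  image: python:3.11
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "Dev" || $CI_COMMIT_BRANCH == "main"'

ansible_lint:
  <<: *validate
  before_script:
    - pip install ansible-core ansible-lint
  script:
    - ansible-lint playbooks/

validate_correct_input:
  <<: *validate
  before_script:
    - pip install ansible-core
  script:
    # Placeholder for "run the playbook with known-good inputs"; --check keeps it a dry run
    - ansible-playbook playbooks/site.yml --check -e @tests/inputs/correct.yml

validate_incorrect_input:
  <<: *validate
  before_script:
    - pip install ansible-core
  script:
    # The playbook should fail cleanly on bad data, so a "pass" here means something is wrong
    - if ansible-playbook playbooks/site.yml --check -e @tests/inputs/incorrect.yml; then echo "playbook should have rejected this input"; exit 1; fi

resync_awx_dev_project:
  stage: sync
  image: curlimages/curl:latest
  rules:
    - if: '$CI_COMMIT_BRANCH == "Dev"'
  script:
    # POST /api/v2/projects/<id>/update/ tells AWX to re-sync that project from SCM
    - 'curl -fsS -X POST -H "Authorization: Bearer $AWX_TOKEN" https://awx.example.com/api/v2/projects/42/update/'
```

The idea being that merge request pipelines cover the feature-branch validations, pushes to Dev/main re-run the same checks, and the last job calls AWX’s project update endpoint so the project re-syncs. Is that the right general shape, or is a dedicated tool like Jenkins the more usual way to glue these steps together?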
So, I’m in a similar boat right now. All of my content goes in an on-premise GitHub Enterprise org that does not have Actions enabled.
What I do have, though, is branch protection and webhooks. Whenever a pull request is created/modified, it triggers a webhook (CI) that sends event data to AWX and kicks off a job run. I tailored a role around the webhook payload to check out the PR source branch and run ansible-lint against it. AWX streams job status updates back to the PR and reports success if there are no errors.
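In simplified form, the key part boils down to two tasks along these lines (a stripped-down sketch, not the exact role; the payload fields follow GitHub’s pull_request webhook, and awx_webhook_payload is the extra var AWX injects into webhook-launched jobs):

```yaml
- name: Check out the PR source branch
  ansible.builtin.git:
    repo: "{{ awx_webhook_payload.pull_request.head.repo.clone_url }}"
    dest: /tmp/pr-checkout
    version: "{{ awx_webhook_payload.pull_request.head.ref }}"
  # In reality the clone needs credentials for a private GHE repo

- name: Run ansible-lint against the PR branch
  ansible.builtin.command:
    cmd: ansible-lint /tmp/pr-checkout
  changed_when: false
```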
The branch protection settings I have enabled require all CI checks to pass before the PR can be merged. There are also other requirements, such as requiring the PR to be reviewed and approved by someone other than the author.
If I could use GitHub Actions, I would use them in much the same way to run ansible-lint and other CI checks.
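Probably something as simple as this (untested sketch; the Python version and workflow name are arbitrary):

```yaml
# .github/workflows/lint.yml
name: ansible-lint
on:
  pull_request:

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install ansible-lint
        run: pip install ansible-lint
      - name: Lint the repository
        run: ansible-lint
```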
Since this is still a WIP for me, I also plan to add webhooks that trigger dry runs of the related jobs in AWX as a further validation step after the linting passes.
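One way I’m thinking of doing that dry run is launching the related job template in check mode through the AWX API. This assumes “Prompt on launch” is enabled for Job Type on that template, and the URL, template ID 99, and awx_token variable below are placeholders:

```yaml
- name: Launch the related job template as a dry run (check mode)
  ansible.builtin.uri:
    url: "https://awx.example.com/api/v2/job_templates/99/launch/"
    method: POST
    headers:
      Authorization: "Bearer {{ awx_token }}"
    body_format: json
    body:
      job_type: check
    status_code: 201
```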