Some thoughts from CfgMgmtCamp about collection testing tools

Hi everyone,

I can’t believe it’s already been over a month since CfgMgmtCamp. There were many great discussions and ideas exchanged. One of the gems, from my view, was the session @felixfontein gave on the antsibull-nox project.

Felix’s talk was a real standout because I think it shows a way to move forward despite technical challenges. The very nature of open source, and maybe of IT fundamentally, is to find a way around obstacles when things get stuck.

And, as tools go, antsibull-nox provides a really elegant solution to problems that seem to plague collection testing. In his session, Felix covers how many of the existing tools for testing collections are brittle and weren’t even designed specifically for collection testing.

So if you still haven’t watched it, I highly recommend checking out Felix’s session: How to use antsibull-nox to test your collection.

Another thing that came up at CfgMgmtCamp was why Red Hat’s Community engineering and Partner engineering teams were combined. There was a lot of overlap between the two, given their shared focus on automation content.

If you’re not familiar, the partner engineering team (now the community and partner engineering team) owns the process for certifying and validating collections from Red Hat partners. As you might have noticed from some other recent posts, we’ve been talking about how to bring those partners closer to the rest of the community to share expertise and provide more transparency. There are a lot of benefits to doing that. One of them happens to be solving the same sort of frustrations that Felix described so well and that antsibull-nox is well suited to address.

antsibull-nox provides a common interface to multiple collection testing tools. It also offers a very convenient way to test your content: just add a straightforward, human-friendly TOML file to your project along with a couple of dependencies.
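To give a rough idea of what that looks like, here is a minimal sketch of an `antsibull-nox.toml`. The session names and option keys below are assumptions based on my reading of the project, so please treat them as illustrative and check the antsibull-nox documentation for the actual schema:

```toml
# antsibull-nox.toml -- illustrative sketch only; the session names and
# option keys here are assumptions, consult the antsibull-nox docs.
[sessions.lint]
run_isort = true
run_black = true
run_flake8 = true
run_pylint = true

[sessions.docs-check]
validate_collection_refs = "all"
```

With a file like this in place (plus nox and antsibull-nox as dependencies), running the configured nox sessions takes care of invoking the individual linters and checkers for you.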

On the CPE team, we see antsibull-nox as a great way to increase the quality of content while also making it easy for collection developers and maintainers, such as Red Hat partners, to get past some of the headaches and obstacles caused by wrangling different test tools. In the specific case of partners, antsibull-nox also has the potential to accelerate onboarding for the certification process.

This actually came up in a conversation with @Leo recently. He has been doing a lot of work with developer tools and was wondering how antsibull-nox and tox-ansible relate to each other, which brings me to my point.

My team and I have done a bunch of work to reduce fragmentation across the Ansible ecosystem. While it’s great to have more tools and additional flexibility, it can be confusing for folks trying to figure out which tool they should use when presented with options that seemingly overlap. So I’ve been thinking that it might be a worthwhile discussion to have in the forum.

As part of the Ansible Development Tools (ADT) suite of packages, tox-ansible is a good choice when following the Red Hat opinionated path of content development. Some experts out there may also prefer ini-file configuration or have technical reasons for choosing tox-ansible.
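For anyone who hasn’t seen it, tox-ansible configuration lives in an ini file rather than TOML. The fragment below is a hedged sketch from my recollection of the tox-ansible README, so the section and key names are assumptions worth verifying against the project docs:

```ini
; tox-ansible.ini -- illustrative sketch; section/key names are assumptions,
; check the tox-ansible documentation for the supported options.
[ansible]
; skip generating environments for these Python/ansible-core combinations
skip =
    py3.9
    2.14
```

tox-ansible then generates test environments from the collection’s metadata, which you can list and run through the usual tox workflow.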

On the other hand, if you have a collection that has already been around for some time; if you scaffolded a collection from the community template; or if you took over collection maintenance after a team handover, then antsibull-nox provides a convenient option that you can basically drop into your project. It seems like a solid option for folks who are domain experts in a given technology but not necessarily experts in Ansible collection testing.

At the end of the day, both projects are tools with different approaches to the same job. While choice is good and healthy, it’s useful to have enough context to make informed decisions. So I’m hoping to kick off a bit of a discussion here to hear what other people think. Does the above make sense for the most part? Does anyone disagree? Has anyone used antsibull-nox and/or tox-ansible and switched from one to the other?

Looking forward to hearing some other points of view. Cheers.


Thanks a lot for this text!

I haven’t used tox-ansible yet, since it didn’t seem that useful in the past: it didn’t take care of collection setup, for example. This apparently changed in recent weeks (since version 26.3.0); it now uses ade to set up the environments. This definitely makes tox-ansible more useful to beginners, and a friendlier tool for collection contributors.

(On the one hand I like the new separation of concerns between tox-ansible and ade, on the other hand this also introduces inefficiencies. I guess what’s better is a personal matter of taste…)

I also personally don’t like the way it invokes ansible-test sanity. Doing that without --docker, but for many different --python versions, is usually a huge waste of CPU time, and it often also misses some test coverage, since most users don’t have all the mentioned Python versions installed. (Missing Python versions are also a problem for unit and integration tests. And for integration tests, the lack of encapsulation in containers or VMs can also be problematic.)
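For readers less familiar with ansible-test, the contrast being described looks roughly like this; the specific Python versions below are just examples:

```console
# One containerized run; the default image ships the supported Pythons:
$ ansible-test sanity --docker default

# Without --docker, each requested Python must already exist on the host,
# and every version means another full pass over the sanity tests:
$ ansible-test sanity --python 3.11
$ ansible-test sanity --python 3.12
```

If a listed interpreter isn’t installed locally, the tests that depend on it simply can’t run, which is the coverage gap mentioned above.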

Out of curiosity, is this “Red Hat opinionated path of content development” documented anywhere? (Just because I didn’t see it so far doesn’t mean it’s not documented, I also haven’t searched that intensively for it…)


Cheers @felixfontein, thanks for the thoughtful response about tox-ansible! I’ll admit that I’m a little rusty and haven’t really looked at it in a while. Last time I tried using it to test a collection, I struggled a bit, so it’s nice to hear that there have been some improvements.

For that Red Hat opinionated path I was referring to, probably the best docs on the community side are these: Building a Collection - Ansible Development Tools Documentation

Last year the community engineering team did some upskilling exercises on Dev Tools. As part of that, we gave the Dev Tools team some feedback about improving the docs around that opinionated path. It looks like some of that feedback was incorporated into the docs above, because it honestly does seem like an improvement over the previous docs experience. You can always improve docs further, though.

I also checked the docs.redhat.com content and maybe this guide is the best candidate for describing that opinionated path: Developing automation content | Red Hat Ansible Automation Platform | 2.6 | Red Hat Documentation

There’s one more thing I can share: @samccann and I worked with folks from Dev Tools on a presentation that we gave at AnsibleFest last year.
Creating sustainable automation content.pdf (1.6 MB)

In that talk we explored some of the patterns and good practices that the “opinionated path” is meant to reinforce.