Policies for module maintenance

We've now completed a comprehensive review of every old pull request
in both Core and Extras repos. Each PR should now be in one of the
following states, represented by a label:

* community_review -- this means that a PR needs to be reviewed by the
community for potential inclusion. This is the default state for all
PRs for new modules, or for any PR to any module maintained by a
community member.

* core_review -- this means that a PR needs to be reviewed by the
Ansible core team. It's reserved for PRs to modules specifically owned
by the Ansible core team, or for PRs that require escalation for
whatever reason.

* needs_revision -- this means that a PR has been reviewed and found
to need changes from the submitter.

* needs_rebase -- this means that a PR fails its merge test and must
therefore be rebased by the submitter.

* needs_info -- this means that it's unclear what state a PR is in,
and we need to follow up to figure out what state it should be in.

* priority_reviewed -- this means that a PR has been upvoted by its
reviewers and will be considered by the core team for immediate
inclusion.

* action_pending -- this is reserved for PRs that have been stalled
due to an inactive participant (no action in 30 days), and action is
about to be taken. If the PR is stuck in needs_revision, needs_rebase
or needs_info and there's no response within a week, the PR will be
closed as inactive. If the PR is stuck in community_review, and
there's no response from the reviewer within a week, the PR will be
reassigned to core_review and we will look for a new
maintainer/reviewer for the module.

We will continue to monitor PRs to ensure that each one is properly
tagged, and to make sure that our workflow is as effective as
possible. We will also be updating the contributor guidelines to
reflect this new policy.
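
(For anyone who wants to watch these queues along with us: below is a
minimal sketch, not an official tool, that counts the open PRs in each
state via the GitHub API. The repo names are the real Core and Extras
repos; everything else is illustrative and assumes you have the
requests library installed.)

import requests

LABELS = ["community_review", "core_review", "needs_revision",
          "needs_rebase", "needs_info", "priority_reviewed",
          "action_pending"]
REPOS = ["ansible/ansible-modules-core", "ansible/ansible-modules-extras"]

for repo in REPOS:
    for label in LABELS:
        # PRs are issues as far as the GitHub API is concerned, so the
        # issues endpoint covers both; counts are capped at one page (100).
        resp = requests.get("https://api.github.com/repos/%s/issues" % repo,
                            params={"labels": label, "state": "open",
                                    "per_page": 100})
        resp.raise_for_status()
        print("%-40s %-17s %3d" % (repo, label, len(resp.json())))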

This also brings us to some issues we are still working through:

* Module maintenance. We are now asking module authors to be
responsible for the maintenance of those modules over time, including
management of issues and pull requests that pertain to their modules.
Some module authors are not willing or able to perform this
maintenance. That's ok, and no one should feel obliged to own their
modules forever -- but we also want to have a policy by which we can
hand off the maintainership of modules to people who use them and care
about them. We're already seeing a number of unresponsive module
maintainers, so we will assign maintenance of these modules to the
core team while we work on our unresponsive maintainer process.

* Reviews of new modules. New modules continue to be a particular
challenge. We want to bring modules in as quickly as possible, but we
also want to ensure that modules continue to be of the highest
quality, and we've implemented the common "2 +1s, 0 -1s" policy for
acceptance of new modules. One possibility: weekly IRC sessions
explicitly for the review of new modules, with the intention of
working down the backlog from oldest to newest. The more hands we
have, the better; it's my hope that we can work through the backlog
quickly with a concerted effort. Of course, as we become more
successful, more modules will come in the door, so this will be an
ongoing process. Keep your eyes open for ideas and improvements here.

Comments on the above process are welcome and actively encouraged. No
process is set in stone, and we're happy to keep iterating to improve
it for everyone.

Finally, thanks to everyone for your continued patience as we work to
improve the throughput of one of the most active communities in the
entire open source world. It's inspiring to work with all of you.

--g

I think this process could be enhanced by having some way of submitting unit tests for modules. Perhaps not all modules are easily unit testable, but many are. It would help reviewers to have more confidence if the module in question has a decent suite of tests and the PR passes them.
I have a PR for a rewrite of the ini_file module, and while I was developing it, I ran it through a number of ad-hoc tests. A way to automate them would have been welcome, and whoever finally ends up reviewing my PR would have a much easier time of it if a set of unit tests could demonstrate that it works as designed. I'm perfectly willing to write these tests, but there needs to be some kind of framework or process decided by the Ansible team with regard to how we write them and where to put them in the repo.
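
To make that concrete, here's a rough sketch of the kind of test I
mean, using Python's stdlib unittest. The do_ini() helper below is a
hypothetical stand-in for the module's real entry point; only the
shape of the test matters:

import os
import tempfile
import unittest

try:
    import configparser                  # Python 3
except ImportError:
    import ConfigParser as configparser  # Python 2

def do_ini(path, section, option, value):
    # Hypothetical stand-in for the ini_file module's core routine.
    parser = configparser.RawConfigParser()
    parser.read(path)
    if not parser.has_section(section):
        parser.add_section(section)
    parser.set(section, option, value)
    with open(path, "w") as f:
        parser.write(f)

class IniFileTest(unittest.TestCase):
    def setUp(self):
        fd, self.path = tempfile.mkstemp()
        os.close(fd)

    def tearDown(self):
        os.unlink(self.path)

    def test_set_creates_section_and_option(self):
        do_ini(self.path, "defaults", "retries", "3")
        with open(self.path) as f:
            content = f.read()
        self.assertIn("[defaults]", content)
        self.assertIn("retries = 3", content)

if __name__ == "__main__":
    unittest.main()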

Chris

Tests go in the test/ directory in the main repo. You can find many
tests for modules there, as well as for general Ansible functionality.
Most modules are not covered, but the most important core modules and
many cloud modules are.

Wouldn't it make more sense to be able to include the tests in the same PR as the modules?

Chris

Hi Greg

The process looks good to me. Two little questions from me:

1st: How to mark modules as "stalled"?

How can we see a module is not maintained? I remember my proposal was to
add a "maintainer" field in the docs; should we come back to this?

# Author covers maintenance; no maintainers field.
author: <authors names>

# Stalled module
author: <authors names>
maintainer: None

# Maintainers cover maintenance.
# If the author also covers maintenance, the author's
# name appears in the maintainers field as well.
authors: <authors names>
maintainers: <maintainers names>

Any thoughts?

2. How should I inform Ansible Inc. that a module has been reviewed and
upvoted/downvoted?

Should I add a GitHub mention, e.g. @gregdek, or do you have a
technical solution for parsing comments for +1/-1? How can I help you
get informed?

Yours
René

> Hi Greg
>
> The process looks good to me. Two little questions from me:
>
> 1st: How to mark modules as "stalled"?
>
> How can we see a module is not maintained? I remember my proposal was to
> add a "maintainer" field in the docs; should we come back to this?
>
> # Author covers maintenance; no maintainers field.
> author: <authors names>
>
> # Stalled module
> author: <authors names>
> maintainer: None
>
> # Maintainers cover maintenance.
> # If the author also covers maintenance, the author's
> # name appears in the maintainers field as well.
> authors: <authors names>
> maintainers: <maintainers names>
>
> Any thoughts?

Yes. You have picked out two of the biggest pain points that I have
not yet sorted out in my mind. :slight_smile:

I actually had almost this exact discussion with the core team guys.
The thinking, for now, is that we just use the Author: field for this.
Whoever is listed as the first author is the presumed maintainer. If the
first author listed is "Core Team", that means the module is essentially
orphaned and the Core Team has taken over interim responsibility for
it. We're still getting a sense of which modules are basically
orphaned, and the next thing we'll need is a mechanism to recruit
new maintainers for orphaned modules. The mechanism for now is to ask
people who are submitting changes if they'd be willing to take over
maintainership. I think that will work in many cases.

The author ordering technique is perhaps not perfect, but it avoids
the need to add more metadata into the module documentation, which I'd
like to avoid for now -- even though I do believe the split you
describe is ultimately more accurate.
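
As a rough illustration (not actual Ansible tooling) of how that
convention could be checked mechanically, something like this could
pull the presumed maintainer out of a module's DOCUMENTATION block; it
assumes PyYAML is installed:

import re
import yaml

def presumed_maintainer(module_path):
    # Per the convention above: the first author listed in DOCUMENTATION
    # is the presumed maintainer; "Core Team" listed first means the
    # module is effectively orphaned and core-maintained for now.
    with open(module_path) as f:
        source = f.read()
    match = re.search(r"DOCUMENTATION\s*=\s*(?:'''|\"\"\")(.*?)(?:'''|\"\"\")",
                      source, re.DOTALL)
    if not match:
        return None
    doc = yaml.safe_load(match.group(1)) or {}
    author = doc.get("author")
    if isinstance(author, list):  # author may be a string or a list
        author = author[0] if author else None
    return author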

> 2. How should I inform Ansible Inc. that a module has been reviewed and
> upvoted/downvoted?
>
> Should I add a GitHub mention, e.g. @gregdek, or do you have a
> technical solution for parsing comments for +1/-1? How can I help you
> get informed?

Yeah, we definitely need something better for this. We need something
parseable, because the flood of data can be overwhelming to pick
through. And "+1" is infuriatingly hard to pick out with web search
tools.

My current practice is to go through the various queues one day a
week, but I think that's insufficient, especially in the case of
active reviewers. What we really want to happen is to know
*immediately* when a reviewer has reviewed one way or the other, so we
can change its state as needed and not keep it waiting. So we need
parseable text that people aren't likely to use elsewhere (since
people use "thumbsup" spuriously).

So. I guess I'll make this proposal:

Reviewers who think a PR is good should upvote with the text: ":shipit:".

Reviewers who think a PR is not good should downvote with the text
":warning:", enumerating the changes needed.

Then I can set email filters to pull in only PRs with that text.
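
The nice thing about unambiguous tokens is that the scan becomes
trivial. Here's a minimal sketch, assuming the GitHub API and the
requests library; not an official bot, just an illustration of why
parseable tokens help:

import requests

def tally_reviews(repo, pr_number):
    # Issue comments and PR comments share the issues endpoint.
    url = "https://api.github.com/repos/%s/issues/%d/comments" % (repo,
                                                                  pr_number)
    resp = requests.get(url)
    resp.raise_for_status()
    upvotes, downvotes = [], []
    for comment in resp.json():
        who = comment["user"]["login"]
        if ":shipit:" in comment["body"]:
            upvotes.append(who)
        elif ":warning:" in comment["body"]:
            downvotes.append(who)
    return upvotes, downvotes

# e.g. tally_reviews("ansible/ansible-modules-extras", 800)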

Here's a test issue: can you try it out?
https://github.com/ansible/ansible-modules-extras/issues/800

--g