New process for acceptance of new modules in Extras

The backlog of New Modules in Extras is here:

https://github.com/ansible/ansible-modules-extras/labels/new_plugin

The original intention of the Extras module split was to allow us to
be more generous with acceptance criteria of new modules, to help grow
our functionality and community more quickly. I don't think we're
making as much progress as we could be making in merging these
modules, and I believe it's because our current process is too
restrictive.

Here are the baseline criteria for acceptance of modules into Extras,
as I see them:

1. New modules have been tested in good faith by users who care about them;
2. New modules adhere to the module guidelines;
3. The submitter of the module is willing and able to maintain the
module over time.

What do these three criteria have in common? Trust.

We must trust that the people who say "I have tested this module" have
actually done so. We trust that people who say "this module passes
guidelines" have checked against those guidelines carefully. We trust
that the submitters of these modules will fix issues as they arise,
and evaluate and merge pull requests as needed.

So that's what I'd like to do. No longer will we require approvals
from a small set of individuals; instead, we'll open up the review
process to everybody. Here's how it will work:

* All new modules will require two approvals:
  + One "worksforme" approval from someone who has
      thoroughly tested the module, including all parameters
      and switches;
  + One "passes_guidelines" approval from someone who
      has vetted the code according to the module guidelines.

* Either approval can be given, in a comment, by anybody
  (except the submitter, of course).

* Any module that has both of these, and no "needs_revision"
  votes (which can also be given by anybody), will be approved
  for inclusion. (A rough sketch of this rule follows the list.)

* The core team will continue to be the point of escalation for
  any issues that may arise (duplicate modules, disagreements
  over guidelines, etc.)
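
To put the decision rule another way, purely as an illustration
(there is no bot or script behind this; the function and its inputs
below are made up), it boils down to something like:

    # Illustrative sketch only: there is no bot and no API here; reviews
    # are plain comments on the pull request that a human reads. The
    # function and its inputs are hypothetical.
    def module_accepted(comments, submitter):
        """comments: list of (author, text) pairs from the pull request."""
        votes = {"worksforme": set(), "passes_guidelines": set(),
                 "needs_revision": set()}
        for author, text in comments:
            if author == submitter:
                continue  # submitters can't vote on their own module
            for label in votes:
                if label in text:
                    votes[label].add(author)
        # Two distinct approvals, and no outstanding needs_revision votes.
        return bool(votes["worksforme"] and votes["passes_guidelines"]
                    and not votes["needs_revision"])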

Does this new policy increase the risk of poor reviews, thus
increasing the risk of buggy modules? It might. But I think that's
okay. Modules are modular for a reason; if a module doesn't work
perfectly, the pain is limited only to the flawed module itself --
which is not the end of the world. So long as we have maintainers who
are committed to improving their modules over time, I think we'll be
fine.

Note that inclusion of a module in Extras does not imply permanence in
the same way that inclusion in Core does. If modules in Extras go
unmaintained, we will seek new maintainers, and if we don't find new
maintainers, we will ultimately deprecate them.

Our new policy is effective immediately, and we will start going
through the backlog of new modules ASAP. If there's a module you've
been waiting to see, you can start testing and reviewing it right
away, adding the text "passes_guidelines" or "worksforme" or
"needs_revision" as appropriate.

Module guidelines can be found here:

http://docs.ansible.com/ansible/developing_modules.html#module-checklist

Thanks, as always, for your support and your patience.

--g

Hi Greg

I really appreciate your engagement to improve the process!

There is a little thing I wanted to point out about new modules and "worksforme".

For cloud modules, e.g. cloudstack, there likely won't be someone who can test them, because you need a test environment set up and/or root admin permissions for some modules.

That is why I write integration tests and add them to https://github.com/ansible/ansible/tree/devel/test/integration/roles. Even these tests don't cover all possibilities of the extensive cloudstack API, but with them we can make sure that everything that worked keeps working in the future, and they can be extended.

So, in short and in general: if a new module shows up with integration tests in ansible/test/integration, and those tests cover the new functionality and pass, IMHO "worksforme" could be equivalent to: the test results provided look good and cover the new functionality.

Thank you for your feedback.

Yours
@resmo

Greg, thanks very much for your work here. Lowering the barrier to merge for the extras modules is a big win. I know I myself have skipped submitting several homegrown modules after looking at the list of PRs—it just seems like I’d be creating work for others, when there are clearly enough modules to review.

On that tack, it sounds like it'd be well worth my time to peruse the existing PRs and post reviews for those that would help me. Rene, thanks for linking to the test examples—very helpful!

> Hi Greg
>
> I really appreciate your engagement to improve the process!
>
> There is a little thing I wanted to point out about new modules and
> "worksforme".
>
> For cloud modules, e.g. cloudstack, there likely won't be someone who
> can test them, because you need a test environment set up and/or root
> admin permissions for some modules.

Yep, this is a good point.

> That is why I write integration tests and add them to
> https://github.com/ansible/ansible/tree/devel/test/integration/roles.
> Even these tests don't cover all possibilities of the extensive
> cloudstack API, but with them we can make sure that everything that
> worked keeps working in the future, and they can be extended.

Good idea. Ultimately, we should break Extras tests out of
ansible/ansible and make them part of the Extras repo, because having
to submit PRs to two separate repos will be a high barrier for many --
but as a criterion for merging modules in lieu of community testing, I
think it's a good one.

> So, in short and in general: if a new module shows up with integration
> tests in ansible/test/integration, and those tests cover the new
> functionality and pass, IMHO "worksforme" could be equivalent to: the
> test results provided look good and cover the new functionality.

Yes. I guess I'd say this would be the one exception to "you can't
give worksforme to your own modules" -- if you have integration tests
that pass, you can safely give "worksforme" to your own module. Let's
give it a try!

--g

> IMHO, this is something that should have happened a long time ago, and
> which should be a high priority now.

I would like that too, but for right now the highest priority is
wrapping up 2.0. Everything we add to the plate pushes that out a
little longer. Maybe someone on the core team can comment further.

--g

FWIW it would be nice to have some tool which could check the module guidelines automatically.
Less manual work and less room for disagreement.

Just my two cents,
Dennis Benzinger

> FWIW it would be nice to have some tool which could check the module
> guidelines automatically.
> Less manual work and less room for disagreement.

You are so, so right about this. :)

--g

Toshio and I started on this a little while ago, but haven’t gotten it fully integrated yet.

https://github.com/sivel/ansible-testing

One problem is all of the current modules that don’t yet pass.

> Toshio and I started on this a little while ago, but haven't gotten it
> fully integrated yet.
>
> https://github.com/sivel/ansible-testing

Shiny. :)

> One problem is all of the current modules that don't yet pass.

I presume you're saying this is a problem in the context of
integrating into Travis. We could mitigate that by setting up an
exclude list containing all modules that don't currently pass, with
the intention of whittling down that list over time.
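
Something along these lines, maybe (a purely hypothetical sketch; the
file name and helper are made up, and this is not the actual interface
of the ansible-testing tool):

    # Hypothetical sketch of the exclude-list idea: keep a plain-text list
    # of modules that are known not to pass yet, and only fail for new
    # offenders. File name and function are illustrative, not real.
    from pathlib import Path

    def new_failures(failing_modules, exclude_file="known-failures.txt"):
        """Drop failures that are grandfathered in via the exclude list."""
        path = Path(exclude_file)
        excluded = set(path.read_text().split()) if path.exists() else set()
        return [m for m in failing_modules if m not in excluded]

    # Anything returned here is a new failure; the exclude list itself
    # gets whittled down over time as modules are fixed.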

I love the idea of using this as a starting point.

--g

It would be nice to add this to the hacking/ dir. Also, the
module_utils import at the bottom should be only a warning, as there
are cases where it is needed at the top (subclassing).

That is correct. As of now there are 17 failing modules in extras with a total of 19 errors:

https://gist.github.com/sivel/53489d707a010bf8fad3

I’ll look into getting it integrated in extras soon, which may necessitate a couple of changes to provide an exclusion list at run time.

The module_utils import is not always an error -- well, it is, but
there is an exclusion list of module_utils that are OK to appear
elsewhere.

Most, if not all, of the ones failing due to that have the import
needlessly placed at the top.
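
For anyone following along who hasn't seen the convention being
discussed, here is a trimmed sketch of the typical layout (illustration
only; not a real or complete module):

    #!/usr/bin/python
    # Illustration only: in Extras-era modules the module_utils import
    # conventionally sits at the bottom of the file, just above main().

    DOCUMENTATION = '''
    module: example_module
    short_description: Placeholder documentation for illustration.
    '''

    def main():
        module = AnsibleModule(argument_spec=dict(name=dict(required=True)))
        module.exit_json(changed=False, name=module.params['name'])

    # import module snippets
    from ansible.module_utils.basic import *

    main()

    # The subclassing exception mentioned above: a module that subclasses
    # AnsibleModule needs the name at class-definition time, so the import
    # has to move to the top of the file, which is why a warning fits
    # better than an error there.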