Unit Tests for Modules and Filters.

I had an interesting conversation on this Kinesis Stream PR.

I would like to know the opinions of the module development community when it comes to documenting modules and writing tests for them.

Here are some of my opinions… (I understand that some of you, or maybe none of you, may agree with them.)

Some things I understand (but may not always agree with):

  1. Some modules are so small, that there is no point in documenting them.
  2. Since some of these modules are so small, there may not be a point in writing tests for them.

Here are some issues that I have personally run into when moving from Ansible 1.9 to 2.0.

  1. Some modules are not explicitly setting the type in argument_spec.
       - Not doing so allows people to pass, for example, a dictionary where a list is expected. That may have worked at one point, but after updates to the module it can silently stop working.
       - Not being explicit, in turn, allows these minor bugs to be introduced.

  2. Some modules have either one extremely large function or just a main function. This can introduce new issues, such as:
       - Not being able to write tests easily.
       - A function performing 5+ actions, which not only makes writing detailed tests harder, but also makes the code difficult to debug.

  3. For modules that do include more than one function:
       - When a function is not documented and is longer than 20 lines, it is a bit tedious to debug.
       - Not documenting what the function is supposed to return is also a tad tedious to debug.

  4. A portion of the modules do not support check_mode. Supporting check_mode is especially useful when writing tests.
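To illustrate the first point, here is a minimal sketch of an argument_spec that is explicit about types. The parameter names are hypothetical, not from any real module:

```python
# A sketch of being explicit about parameter types in argument_spec.
# With type set, Ansible coerces or rejects mismatched values instead of
# silently accepting, say, a dict where a list or int is expected.
argument_spec = dict(
    name=dict(type='str', required=True),
    shards=dict(type='int', default=1),
    tags=dict(type='dict', default=dict()),
    wait=dict(type='bool', default=True),
)
```

With `type` omitted, whatever object the user passes flows straight into the module code, which is how the dictionary-versus-list bugs described above sneak in.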

What I personally would like to see (though I understand it may never happen):

  1. Since tests are accepted, they should be run automatically as well. If the tests do not pass, the PR is rejected or updated with further comments.
  2. New module developers should be encouraged to write tests.
  3. New module developers should be encouraged to write documentation.
  4. The ansible-validate-modules command should complain about modules that take in parameters without assigning a type to the parameters.
  5. Allow two types of integration tests: the playbook style and the API style. (I personally prefer to write the API style.)
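As a sketch of what I mean by the API style: the test calls a module helper directly against a stubbed client, with no playbook involved. All names here are illustrative, not real Ansible APIs:

```python
import unittest


def stream_status(client, stream_name):
    """Hypothetical module helper: return the status string of a stream."""
    desc = client.describe_stream(StreamName=stream_name)
    return desc['StreamDescription']['StreamStatus']


class FakeKinesis(object):
    """Stub standing in for a boto3 Kinesis client, so no AWS calls are made."""
    def describe_stream(self, StreamName, Limit=1):
        return {
            'StreamDescription': {
                'StreamName': StreamName,
                'StreamStatus': 'ACTIVE',
            }
        }


class TestStreamStatus(unittest.TestCase):
    def test_status_of_existing_stream(self):
        self.assertEqual(stream_status(FakeKinesis(), 'test-stream'), 'ACTIVE')
```

Because the helper takes the client as a parameter, the test can swap in the stub without any monkey-patching, which is exactly what one huge main function makes impossible.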

Reality… (Or my Reality :P)

  1. I understand that even with Unit tests in place, we can still introduce bugs into a module.

  2. We can limit the number of minor bugs in each release of each module with the proper amount of testing.

  3. We can reduce the amount of time that a module needs in order to receive a shipit, since we would no longer be relying on the community to test the modules for us. I know that, as a user, I have spent hours debugging a module only to find a minor bug in it.

In the mean time…

  1. I started my own repository called ld-ansible-modules, integrated with Travis CI and coveralls.io. I am including in it all modules that I have written that have tests associated with them, and I am in the process of writing tests for each module I have written. I encourage anyone to contribute to this repository, and I would like to see the Ansible module development community begin writing well-documented modules with accompanying unit tests…

Responses in-line:

Thank you for the quick response.

What I meant by documentation, is the following.

  1. Function documentation.
  2. Method documentation.
  3. Class documentation
  4. What each Method/Function Returns.

I know Ansible does enforce a strict set of documentation requirements, but for the users, not for the developers.
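For example, the kind of developer-facing docstring I mean, on a hypothetical helper:

```python
def get_tags(client, stream_name):
    """Hypothetical example of the docstring style I am describing.

    Args:
        client (botocore.client.Kinesis): Boto3 Kinesis client.
        stream_name (str): Name of the Kinesis stream.

    Returns:
        dict mapping tag keys to tag values for the stream.
    """
    tags = client.list_tags_for_stream(StreamName=stream_name)['Tags']
    return dict((t['Key'], t['Value']) for t in tags)
```

Arguments, their types, and the return shape are all stated, so the next person debugging does not have to reverse-engineer them from the call sites.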

As for tests being run automatically on commit: from what I understood from this conversation https://github.com/ansible/ansible-modules-extras/pull/1901, unit tests are not run during a commit for modules in the ansible-modules-extras GitHub repository. If this is not the case, then I will indeed submit my tests. I just do not see the point of submitting tests that will not be run during the commit process.

So this is something that should work soon; the tests that run automatically are in the ansible/ansible repo. I hope to soon trigger the ones in the extras repo as well (currently just the one).

I will begin adding my tests to the ansible-modules-extra/tests/unit/cloud/amazon/ folder for the modules I am working on or have PRs open for. Brian, or anyone on the Ansible team, I would not mind working with someone on getting the modules in the Amazon directory to be Boto3 compliant, as well as writing tests for them.

We have tests for aws modules in core, but they only run when credentials are supplied (not in public tests).

As for boto3 the only thing to keep in mind is that existing functionality should still support boto, as many users cannot upgrade that easily.
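One common way to keep boto support while adding boto3 is the import-guard pattern. This is a sketch; actual modules vary in the flag names they use:

```python
# Guarded imports: the module can then fail gracefully (e.g. via
# module.fail_json) or fall back to the boto code path when boto3 is
# not installed, instead of crashing with an ImportError.
try:
    import boto3
    HAS_BOTO3 = True
except ImportError:
    HAS_BOTO3 = False

try:
    import boto
    HAS_BOTO = True
except ImportError:
    HAS_BOTO = False
```

The module checks the flags at startup and picks whichever code path the user's environment supports.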

What about mocking the APIs' return values when check_mode is passed?
An example is below. This is what I am doing in order to fully test each function I write that makes API calls to AWS using Boto3.


```python
import botocore.exceptions


def find_stream(client, stream_name, limit=1, check_mode=False):
    """Retrieve a Kinesis Stream.

    Args:
        client (botocore.client.Kinesis): Boto3 Kinesis client.
        stream_name (str): Name of the Kinesis stream.

    Kwargs:
        limit (int): Limit the number of shards to return within a stream.
            default=1
        check_mode (bool): When True, skip the AWS API call and return a
            mocked response instead.
            default=False

    Basic Usage:
        >>> client = boto3.client('kinesis')
        >>> stream_name = 'test-stream'
        >>> find_stream(client, stream_name)

    Returns:
        Tuple (bool, str, dict)
    """
    err_msg = ''
    success = False
    params = {
        'StreamName': stream_name,
        'Limit': limit
    }
    results = dict()
    try:
        if not check_mode:
            results = (
                client.describe_stream(**params)['StreamDescription']
            )
            results.pop('Shards')
        else:
            # Mocked return, so tests can exercise this path without AWS.
            results = {
                'HasMoreShards': True,
                'RetentionPeriodHours': 24,
                'StreamName': stream_name,
                'StreamARN': 'arn:aws:kinesis:east-side:123456789:stream/{0}'.format(stream_name),
                'StreamStatus': 'ACTIVE'
            }
        success = True
    except botocore.exceptions.ClientError as e:
        err_msg = str(e)

    return success, err_msg, results
```