EDA rulebook condition on webhook header keys

Hello EDA Experts,

I’m new to Ansible. I recently installed AAP 2.5 for learning purposes. I was trying to create a simple rulebook that is triggered by a webhook from JFrog to run a job template. I included a key/value pair (“os” = “linux”) in the header when sending the webhook, but I’m having trouble writing the condition in the rulebook.

- name: Listen for events on a webhook
  hosts: all
  # Define our source for events
  sources:
    - ansible.eda.webhook:
        host: 0.0.0.0
        port: 5000
  # Define the conditions we are looking for
  rules:
    - name: Demo rule
      condition: event.meta.headers.os == "linux"
      # Define the action we should take should the condition be met
      action:
        run_job_template:
          name: Ansible-Windows-Template
          organization: MyOrg

The condition in the code block above doesn’t work. The only conditions I can get to trigger the template are these:

condition: event.meta.headers is defined
or
condition: event.payload.event_type == 'deployed'

When I send the same webhook to webhook.site, I can see the key/value pair in the header. Unfortunately, JFrog only allows customizing the headers; the payload is defined by JFrog and we cannot change it.

Can someone shed some light on how to write a condition that matches a webhook header variable?
I wish there were a detailed document for reference, but I couldn’t find one.

Thanks

Looking at the webhook plugin source, I don’t see any mechanism there to expose headers to your conditions. But don’t take my word for it; have a look yourself.

If you figure it out, post about it here. I’m curious to know, too. Good luck.

I see the header data is passed to the function.
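
A quick way to inspect what actually arrives is a catch-all rule with the debug action, which dumps the full event to the activation output. Here is a minimal sketch reusing the same webhook source; the ruleset and rule names are just placeholders:

# names below are placeholders; host/port match the rulebook above
- name: Dump incoming webhook events
  hosts: all
  sources:
    - ansible.eda.webhook:
        host: 0.0.0.0
        port: 5000
  rules:
    - name: Show the whole event
      condition: event.meta.headers is defined
      action:
        debug: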

Another thing I found: the condition can only reach the top-level headers or payload keys. Anything nested inside headers or payload cannot be accessed. For example:

'payload': {'data': {'name': 'sample.txt',
                      'path': 'sample_dir/sample.txt',
                      'repo_key': 'sample_repo',
                      'sha256': 'sample_checksum',
                      'size': 0},
             'domain': 'artifact',
             'event_type': 'deployed',
             'jpd_origin': 'http://jfrog.example.com',
             'source': 'jfrog/admin',
             'subscription_key': 'test'}

In the payload above, I can access payload.xxx, but not payload.xxx.xxx.
For example, these conditions work:

condition: event.payload.event_type == 'deployed'
condition: event.payload.domain == 'artifact'
condition: event.payload.data is defined

but these will not work:

condition: event.payload.data.name == "sample.txt"
condition: event.payload.data.size == 0

I’m wondering why it’s acting like this. I wish someone from the Ansible team could explain. Thanks

Have you tried expressing it like this?

condition: event.payload.data["name"] == "sample.txt"
condition: event.payload.data["size"] == 0

Or even

condition: event.payload["data"]["name"] == "sample.txt"
condition: event.payload["data"]["size"] == 0

Grasping at straws here…

If the input is valid JSON, it works for me. I used yq to turn your example into valid JSON:

{
  "payload": {
    "data": {
      "name": "sample.txt",
      "path": "sample_dir/sample.txt",
      "repo_key": "sample_repo",
      "sha256": "sample_checksum",
      "size": 0
    },
    "domain": "artifact",
    "event_type": "deployed",
    "jpd_origin": "http://jfrog.example.com",
    "source": "jfrog/admin",
    "subscription_key": "test"
  }
}

And I used JMESPath to test it.

@utoddl It works!! Thank you!!

The JSON is sent by JFrog, so it is valid.
Thanks.

The example you posted above can’t be read by jq; it returns:

parse error: Invalid numeric literal at line 1, column 10

And jpterm returns:

Unable to load the input JSON: Expecting value: line 1 column 1 (char 0)

🤷

Sorry, that block is from the AAP debug screen; it isn’t real JSON.

I keep re-reading Navigate structured data, and the best I can come up with is that this is a bug, not the intended behavior. The very first line says

You can navigate strutured [sic] event, fact, var data objects using either dot notation or bracket notation:

then gives examples. But clearly this is not working as described. I’m guessing this is rooted in some ambiguity between accessing properties and non-property attributes, and that it’s related to the nearby admonition a few lines above that section:

Note A condition cannot contain Jinja style substitution when accessing variables passed in from the command line, we loose [sic] the data type information and the rule engine will not process the condition. Instead use the vars prefix to access the data passed in from the command line
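
As far as I can tell, that note means something like the following (a sketch; expected_domain stands for a variable passed in on the ansible-rulebook command line):

# works: plain vars prefix, no Jinja braces
condition: event.payload.domain == vars.expected_domain
# the rule engine will not process this Jinja-style form
condition: event.payload.domain == "{{ expected_domain }}"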

In fact, it was reading that baffling explanation that led me to suggest using “bracket notation” rather than “dot notation”. To me that note reads, “Until our internal data model matures, you may have to resort to tricks to access certain data.” That may sound like a dig at the current implementation, but I’m actually rather optimistic that improvements along those lines will obviate the need for such shenanigans. Ansible itself suffered from similar issues earlier on, but it’s now much better about not exposing users to the quirks of its implementation details.

It’s unfortunate you got caught trying this now. Someday – soon I hope – users may read this thread and be all confused because by then it’ll just work!
