PUT using the s3_object module doesn’t seem to work

Hello,

I have a working local Ceph S3 storage with a bucket named “prova” where I can GET and PUT files. However, I run into a problem when I try to PUT files with Ansible.

I would like to use the amazon.aws collection, so I wrote this playbook, test.yml:

---
- name: Test S3
  hosts: localhost
  gather_facts: false

  tasks:
    - name: Save to S3 bucket
      amazon.aws.s3_object:
        bucket: prova
        object: "provafile"
        src: "files/ca.crt"
        mode: put
        endpoint_url: https://mys3.example.com
        ceph: true
        access_key: OMISSIS
        secret_key: OMISSIS
        validate_certs: false
        debug_botocore_endpoint_logs: true

But the PUT fails. The source file exists:

# ll files/
total 8.0K
-rw-r--r-- 1 root root 1.2K Oct 17 08:29 ca.crt

# ansible-playbook test.yml -vvv
ansible-playbook [core 2.16.12]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Aug 23 2024, 00:00:00) [GCC 11.4.1 20231218 (Red Hat 11.4.1-3.0.1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
yaml declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
ini declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Skipping due to inventory source not existing or not being readable by the current user
toml declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: test.yml ************************************************************************************************************************************************************************************************************************************************
1 plays in test.yml

PLAY [Test S3] ****************************************************************************************************************************************************************************************************************************************************

TASK [Save to S3 bucket] ******************************************************************************************************************************************************************************************************************************************
task path: /etc/ansible/test.yml:7
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: root
<127.0.0.1> EXEC /bin/sh -c 'echo ~root && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720 `" && echo ansible-tmp-1729168642.1893451-268855-164044994429720="` echo /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720 `" ) && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'test -e files/ca.crt && sleep 0'
Using module file /usr/local/lib/python3.12/site-packages/ansible_collections/amazon/aws/plugins/modules/s3_object.py
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-268852ja_1q5q4/tmp13v9g9pb TO /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720/AnsiballZ_s3_object.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720/ /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720/AnsiballZ_s3_object.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720/AnsiballZ_s3_object.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1729168642.1893451-268855-164044994429720/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/boto3/s3/transfer.py", line 372, in upload_file
    future.result()
  File "/usr/local/lib/python3.12/site-packages/s3transfer/futures.py", line 103, in result
    return self._coordinator.result()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/s3transfer/futures.py", line 264, in result
    raise self._exception
  File "/usr/local/lib/python3.12/site-packages/s3transfer/tasks.py", line 135, in __call__
    return self._execute_main(kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/s3transfer/tasks.py", line 158, in _execute_main
    return_value = self._main(**kwargs)
                   ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/s3transfer/upload.py", line 762, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/usr/local/lib/python3.12/site-packages/botocore/client.py", line 569, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/botocore/client.py", line 1023, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidRequest) when calling the PutObject operation: Unknown

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 777, in upload_s3file
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/retries.py", line 105, in deciding_wrapper
    return retrying_wrapper(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/cloud.py", line 119, in _retry_wrapper
    return _retry_func(
           ^^^^^^^^^^^^
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/cloud.py", line 68, in _retry_func
    return func()
           ^^^^^^
  File "/usr/local/lib/python3.12/site-packages/boto3/s3/inject.py", line 145, in upload_file
    return transfer.upload_file(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/boto3/s3/transfer.py", line 378, in upload_file
    raise S3UploadFailedError(
boto3.exceptions.S3UploadFailedError: Failed to upload files/ca.crt to prova/provafile: An error occurred (InvalidRequest) when calling the PutObject operation: Unknown

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 1595, in main
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 1110, in s3_object_do_put
  File "/tmp/ansible_amazon.aws.s3_object_payload__6gt7tjv/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 786, in upload_s3file
S3ObjectFailure: Unable to complete PUT operation.
fatal: [localhost]: FAILED! => {
    "boto3_version": "1.35.42",
    "botocore_version": "1.35.42",
    "changed": false,
    "invocation": {
        "module_args": {
            "access_key": "OMISSIS",
            "aws_ca_bundle": null,
            "aws_config": null,
            "bucket": "prova",
            "ceph": true,
            "content": null,
            "content_base64": null,
            "copy_src": null,
            "debug_botocore_endpoint_logs": true,
            "dest": null,
            "dualstack": false,
            "encrypt": true,
            "encryption_kms_key_id": null,
            "encryption_mode": "AES256",
            "endpoint_url": "https://mys3.example.com",
            "expiry": 600,
            "headers": null,
            "ignore_nonexistent_bucket": false,
            "marker": "",
            "max_keys": 1000,
            "metadata": null,
            "mode": "put",
            "object": "provafile",
            "overwrite": "different",
            "permission": [
                "private"
            ],
            "prefix": "",
            "profile": null,
            "purge_tags": true,
            "region": null,
            "retries": 0,
            "secret_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "session_token": null,
            "sig_v4": true,
            "src": "files/ca.crt",
            "tags": null,
            "validate_bucket_name": true,
            "validate_certs": false,
            "version": null
        }
    },
    "msg": "Unable to complete PUT operation.: Failed to upload files/ca.crt to prova/provafile: An error occurred (InvalidRequest) when calling the PutObject operation: Unknown",
    "resource_actions": [
        "s3:GetBucketOwnershipControls",
        "s3:HeadObject",
        "s3:PutObject",
        "s3:HeadBucket"
    ]
}

PLAY RECAP ********************************************************************************************************************************************************************************************************************************************************
localhost                  : ok=0    changed=0    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0

My pip packages:

ansible                   9.11.0
ansible-compat            24.9.1
ansible-core              2.16.12
boto3                     1.35.42
botocore                  1.35.42

The amazon.aws collection is at version 7.6.1.

Log:

2024-10-17T14:37:22.690065+02:00 my-ansible ansible-amazon.aws.s3_object[268868]: Invoked with bucket=prova object=provafile src=files/ca.crt mode=put endpoint_url=https://mys3.example.com ceph=True access_key=OMISSIS secret_key=NOT_LOGGING_PARAMETER validate_certs=False debug_botocore_endpoint_logs=True encrypt=True encryption_mode=AES256 expiry=600 marker= max_keys=1000 sig_v4=True permission=['private'] overwrite=different prefix= retries=0 dualstack=False ignore_nonexistent_bucket=False purge_tags=True validate_bucket_name=True session_token=NOT_LOGGING_PARAMETER profile=None aws_ca_bundle=None aws_config=None region=None dest=None headers=None metadata=None version=None content=None content_base64=None encryption_kms_key_id=None tags=None copy_src=None

I can’t figure out how to solve this issue. Do you have any hints?
Thank you very much!
Kind regards,
Marco

I don’t see anything in your example that looks wrong. I’m not very familiar with S3 backed by Ceph, though. I do notice that the region is undefined; maybe try specifying that?

You could also check the AWS CloudTrail or bucket logs (if you have them) to see if they say anything more helpful.

Oops, I found the solution! My Ceph cluster doesn’t offer server-side encryption, while the module defaults to encrypt: true with encryption_mode AES256 (you can see both in the module_args above). So I added:

encrypt: false

The error in the exception was not very clear, so I’m leaving this here in case someone else runs into it.
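For reference, here is a sketch of what the working task looks like with that change (the endpoint and credentials are placeholders, as in the original playbook):

    - name: Save to S3 bucket
      amazon.aws.s3_object:
        bucket: prova
        object: "provafile"
        src: "files/ca.crt"
        mode: put
        endpoint_url: https://mys3.example.com
        ceph: true
        access_key: OMISSIS
        secret_key: OMISSIS
        validate_certs: false
        # This Ceph RGW has no server-side encryption, so disable
        # the module's default of encrypt: true (AES256)
        encrypt: false

With encrypt: false the module no longer sends the ServerSideEncryption header on PutObject, which is what the RGW was rejecting with InvalidRequest.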

Thank you
Marco