Ansible and SSH agent forwarding

So I’m running three CentOS 6.5 machines and came across something I don’t understand.

server 1: client machine
server 2: Ansible machine
server 3: any target machine controlled by Ansible

Servers 2 and 3 both have my public key, so when I SSH to them from server 1, everything works.
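For reference, that key distribution is the standard setup; roughly this, run from server 1 (the user name and hostnames are placeholders):

    # run on server 1; "user", "server2" and "server3" are placeholders
    ssh-copy-id user@server2
    ssh-copy-id user@server3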

I SSH from server 1 to server 2 with the -a flag (agent forwarding disabled). When I then run any playbook against server 3, it fails with permission denied (it needs my key), so this is expected.

When I SSH with the -A flag (agent forwarding enabled), it should work, and so it does. Also as expected.
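For clarity, the two cases look roughly like this (the user name, hostname, inventory and playbook names are placeholders):

    # case 1: agent forwarding disabled -> the playbook against server 3
    # fails with "permission denied"
    ssh -a user@server2
    ansible-playbook -i hosts site.yml

    # case 2: agent forwarding enabled -> the playbook succeeds
    ssh -A user@server2
    ansible-playbook -i hosts site.yml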

But now the tricky part:

Immediately after running the playbook over ssh -A (agent forwarding enabled), I disconnect from server 2 and reconnect with -a (agent forwarding disabled).
I run the playbook and it DOESN’T fail?
Yet when I try to SSH from server 2 to server 3 directly, I get permission denied (as expected, since server 2 doesn’t have my key).

So the question remains: what captures my key and leaves it usable on server 2? Is it Paramiko or is it Ansible? And moreover, why? Is this by design?
I reproduced this on Ubuntu 14.04 LTS, which suggests that Paramiko is not causing this behaviour but Ansible itself is.

I’m having a bit of difficulty following the above, but I did want to point out that Ansible is not doing anything to move or store your key.

I’m not sure if I can make it any clearer than this.

Server 1 has the private key; servers 2 and 3 have the corresponding public key.
If I connect to server 3 from server 1 through server 2, I need to use ssh -A, since the key challenge has to be relayed from server 3 back through server 2 to the agent on server 1.
This is done through SSH_AUTH_SOCK.

If I connect with ssh -a, I can reach server 2, but it can’t get me from 2 to 3, since server 2 has no key to answer the challenge with.
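A quick way to see that difference on server 2 is to check for a forwarded agent; nothing Ansible-specific here:

    # after "ssh -A": SSH_AUTH_SOCK points at the forwarded agent socket
    # and ssh-add lists the key held on server 1
    echo "$SSH_AUTH_SOCK"
    ssh-add -l

    # after "ssh -a": SSH_AUTH_SOCK is empty and ssh-add reports that it
    # cannot open a connection to your authentication agent
    echo "$SSH_AUTH_SOCK"
    ssh-add -l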

Ansible, however, still runs the playbook when I connect with -A first, run the playbook, disconnect, reconnect with -a, and run the playbook again.
It shouldn’t do this, since no private key has been sent to server 2 to answer the challenge with. So the question remains: why does Ansible’s SSH key check not fail? After the ssh -A session (playbook run, completed, logged out), once I’m back in with ssh -a, a plain SSH from server 2 to server 3 is refused with permission denied, yet the playbook still runs.

Hi

Check to see if you still have the ControlPersist sockets open; Ansible will reuse those, and the forwarding settings will be the ones used to create the socket, so if it was created with -A, forwarding will continue to work.
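Assuming the default control_path (under ~/.ansible/cp/), the leftover sockets can be inspected on server 2 like this; the socket file name below is a placeholder:

    # list Ansible's persisted control sockets
    ls -l ~/.ansible/cp/

    # ask a master connection whether it is still alive
    ssh -O check -S ~/.ansible/cp/<socket-file> server3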

So, although I have not specified ControlPersist in the playbook or in .ssh/config, if I reconnect to the machine quickly enough and run the playbook against server 3, the SSH connection is still open and will be reused by Ansible?

So what I probably want is a way to close that connection when I log out of the Ansible machine while no playbook is running.
Am I correct in assuming that disabling ControlPersist would break a playbook that Ansible is in the middle of running?
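If the aim is just to drop a leftover connection by hand, the master connection can be told to exit; the socket path here is again a placeholder:

    # cleanly stop the persisted master connection to server 3
    ssh -O exit -S ~/.ansible/cp/<socket-file> server3

    # or, more bluntly, remove all of Ansible's control sockets
    rm -f ~/.ansible/cp/*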

Ansible by default tries to use ControlPersist (you can turn this off in ansible.cfg) if you are using a new enough version of OpenSSH, as it speeds things up considerably.
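As a sketch, this is the kind of thing that goes in ansible.cfg; the exact default ssh_args differ between Ansible versions:

    # ansible.cfg
    [ssh_connection]
    # disable connection sharing entirely...
    ssh_args = -o ControlMaster=no
    # ...or keep it, but let idle sockets expire quickly
    # ssh_args = -o ControlMaster=auto -o ControlPersist=30s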