Hey,
I’m trying to SSH between two AWS instances to practice Ansible, but I can’t establish an SSH connection between them. Can someone please help me?
Are you able to SSH in normally, without Ansible, and what errors are you getting?
Please run the ssh command with -vvvv and post the output here.
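Something like this, with your own key file and the target’s address substituted in (the key name and address below are just placeholders):

    ssh -vvvv -i ~/.ssh/your-key.pem ec2-user@<target-ip>

The -vvvv output shows exactly where the connection fails (network, key exchange, or authentication).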
Since I enabled all SSH traffic I can ping any instance, but I can’t connect using the public/private key pair.
Did you put the public key on the host you’re trying to connect to? Try ssh-copy-id, since it copies the public key and sets the correct permissions. Did you also use ssh-add? What errors are you getting?
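For example (the user, host and key names here are placeholders - adjust to your setup):

    # copy your local public key into ~/.ssh/authorized_keys on the target;
    # note this only works if you can already log in some other way
    ssh-copy-id -i ~/.ssh/id_rsa.pub ec2-user@<target-ip>
    # load the matching private key into your agent for this session
    ssh-add ~/.ssh/id_rsa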
ssh-copy-id is not working. It gives an error.
What error are you getting? Do you have sufficient permissions on the other side?
This is a big question, with far too little information to resolve easily.
Here is a troubleshooting list for you (or anyone) having connectivity issues generally and ssh issues specifically.
1: Check the addresses. Make sure you are using the correct name or IP address for the target instance.
2: Check the addressing scheme at both ends. Private source requires private target (or a NAT gateway for the source); public source requires public target.
3: Check the security groups and ACLs - does the target security group allow inbound SSH? Does the source security group allow outbound SSH? Do the ACLs protecting the subnets allow SSH inbound to the target? Do they allow high ports outbound? If testing with ping, make sure the security groups and ACLs allow ICMP in and out. (A quick CLI check is sketched after this list.)
4: Check the routing at both ends. If source and target are in the same subnet in the same VPC, this will not be an issue, otherwise make sure that the routing is correct. Check the routing to the IGW, routing to the NAT Gateway, routing over any VPC peering links or Direct Connects - all that are appropriate to your situation.
5: Check that you have the correct SSH key for the target system. The private part needs to be in your ~/.ssh directory; the public part needs to be in ~/.ssh/authorized_keys on the target system. The latter part is done for you by AWS when launching new Linux instances. (An example of adding a key by hand is shown after this list.)
6: Check that you are using the correct username for the remote system. By default this will be ec2-user for Amazon Linux and ubuntu for Ubuntu Linux.
7: If you are logged into the source system as a user other than ubuntu or ec2-user, you will need to specify the private key with -i on the ssh command line (or set up a Host stanza in ~/.ssh/config - see the example after this list).
8: If the target instance is not a standard AWS AMI, check that it actually allows ssh through local firewalls, has sshd installed, uses the default usernames, has the ssh key installed in the usual location and so on.
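For steps 3 and 4, a quick way to inspect the rules and routes is the AWS CLI (the IDs below are placeholders; use your own security group and subnet IDs):

    # inbound rules of the target's security group - look for TCP port 22
    aws ec2 describe-security-groups --group-ids sg-0123456789abcdef0 \
        --query 'SecurityGroups[].IpPermissions'
    # route table associated with the source (or target) subnet
    aws ec2 describe-route-tables \
        --filters Name=association.subnet-id,Values=subnet-0123456789abcdef0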
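For step 5, if you need to add a key by hand, it looks something like this on the target (the key file name is a placeholder):

    # append the public key and make sure the permissions are strict enough for sshd
    cat your-key.pub >> ~/.ssh/authorized_keys
    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys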
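For step 7, either pass the key explicitly or set up a Host stanza (the host alias, address, user and key file below are placeholders):

    ssh -i ~/.ssh/your-key.pem ec2-user@<target-ip>

    # or in ~/.ssh/config:
    Host mytarget
        HostName <target-ip>
        User ec2-user
        IdentityFile ~/.ssh/your-key.pem

After that, a plain "ssh mytarget" picks up the right user and key.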
When asking for help, make clear exactly where the source and target systems are - both in AWS or not? Same or different subnets? Same or different VPCs? Public or private addressing? NAT gateways, Internet gateways, VPC peering, Direct Connects…?
There are a LOT of variables. Usually, access to a new instance is simple and Just Works. If it doesn’t, look at what changes you may have made to the standard setup.
Regards, K.