Unable to SSH to Public Server from Private GCP Instances

We are currently facing an issue while trying to establish an SSH connection to a public server from our GCP network.

SSH is configured to run on a non-standard port (not 22) on the target server. While connections work fine when initiated from a GCP instance with a public IP in the same VPC, attempts to connect from our private instances (which route traffic through a NAT gateway) are timing out.

Could you help us identify what might be causing this issue? We’re particularly interested in understanding whether any firewall, routing, or NAT configuration could be interfering with SSH over the custom port.

Hi @shubham_bito ,

Welcome to Google Cloud Community!

The most likely cause is a restrictive egress firewall rule or a NAT configuration that blocks the non-standard port for your private instances. The recommended fix is to add an egress firewall rule allowing TCP traffic on the custom SSH port from the private subnet to the public server’s IP.
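As a sketch, such a rule could be created with gcloud. The network name, port, and destination IP below are placeholders you would replace with your own values:

```shell
# Placeholders: "my-vpc", port 2222, and 203.0.113.10 are examples only.
gcloud compute firewall-rules create allow-egress-custom-ssh \
    --network=my-vpc \
    --direction=EGRESS \
    --action=ALLOW \
    --rules=tcp:2222 \
    --destination-ranges=203.0.113.10/32 \
    --priority=1000
```

Scoping the destination range to the single server’s IP keeps the rule as narrow as possible.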

Also look for egress deny rules that might override the new rule. If any exist with a lower priority number (higher precedence), adjust their scope or priority so they don’t block the custom port.
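You can inspect existing egress rules sorted by priority to spot any conflicting deny rule (again, “my-vpc” is a placeholder network name):

```shell
# List egress firewall rules on the network, highest precedence first.
gcloud compute firewall-rules list \
    --filter="network:my-vpc AND direction=EGRESS" \
    --sort-by=priority \
    --format="table(name,priority,denied[],allowed[],destinationRanges.list())"
```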

Implied rules - Every VPC network has an implied allow-egress rule and an implied deny-ingress rule at priority 65535. You can create rules that override them as long as yours have higher precedence (priority numbers lower than 65535). Because deny rules take precedence over allow rules of the same priority, an ingress allow rule with a priority of 65535 never takes effect.

Confirm the default route for the private subnet directs traffic through the NAT gateway; see Routes and firewall rules in the VPC documentation.
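A quick way to check this is to list the network’s routes and inspect the Cloud NAT configuration. The router, NAT, region, and network names below are placeholders:

```shell
# Show routes on the network, then describe the NAT config
# ("my-vpc", "my-nat", "my-router", "us-central1" are examples only).
gcloud compute routes list --filter="network:my-vpc"
gcloud compute routers nats describe my-nat \
    --router=my-router \
    --region=us-central1
```

In the NAT output, verify the private subnet is included in the NAT’s subnetwork scope; by default Cloud NAT does not restrict destination ports, so a missing subnet mapping is the more common culprit.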

Using VPC Flow Logs - Enabling flow logs on the private subnet will help identify where packets are being dropped so you can trace the traffic.
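Flow logs can be enabled on the subnet with a single command (subnet name and region below are placeholders):

```shell
# Turn on VPC Flow Logs for the private subnet
# ("private-subnet" and "us-central1" are examples only).
gcloud compute networks subnets update private-subnet \
    --region=us-central1 \
    --enable-flow-logs
```

Once enabled, you can filter the logs in Cloud Logging for traffic to the custom port and see whether the egress packets leave the subnet at all.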

For additional troubleshooting, refer to our SSH Troubleshooting Guide.

For more detailed insights, you may reach out to Google Cloud Support for assistance.

Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.