Connecting to external IP from GKE pod

Hi guys,
I have a GKE cluster that needs to send traffic to a private IP belonging to another network, TCP and UDP traffic. The GKE cluster is VPC native. I did a connectivity test from one of the nodes to that IP and it works however if I do ping or traceroute or telnet it does not work, neither on the nodes nor on the pods. The traffic to that IP is supposed to be routed through a VPN tunnel that is connected to the VPC containing the cluster, which would explain the connectivity test success but not explain the failure inside the nodes/pods. Also, I set up a masquerade agent and a network policy that allows all ingress and egress traffic to and from the cluster so I still do not understand why can’t I reach that IP.

Any help would be appreciated. I do not know what else I can do from here.

Have a nice day!

What do you mean by connectivity test?

The ones run by the Network Intelligence Center service in GCP

Hi @tmontenegro ,

Can you share a screenshot or the error messages from the ping, traceroute, or telnet tests you tried?

Also, if you don't mind, were you following any documentation or guides for this?

Hi, thanks for the response. Rather than errors, I got no connection at all, like when you ping a server that does not exist, and traceroute showed 64 hops, all of them timing out.

And yes, I followed documentation for that. I even disabled the firewall for a few minutes and still got no response.
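For reference, the checks I ran were along these lines. The IP and port below are placeholders, not the real peer address:

```shell
# Placeholders for the actual on-prem address and application port.
TARGET_IP=192.0.2.10
TARGET_PORT=5000

ping -c 3 "$TARGET_IP"       # ICMP echo: no replies at all
traceroute "$TARGET_IP"      # 64 hops, every one timing out

# TCP connect check (what telnet was doing), with a 5-second timeout:
timeout 5 bash -c "</dev/tcp/$TARGET_IP/$TARGET_PORT" \
  && echo "TCP connect OK" || echo "TCP connect failed"
```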

This is the documentation I followed:


Based on what you are saying, it seems the issue can be isolated from the firewall and the VPN.

Assuming the IP masquerade is configured correctly, you might want to check whether a firewall rule allows ingress from the on-premises subnet IP range into the VPC network used by the cluster.
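For instance, a rule along these lines would admit traffic from the on-premises range. The network name and CIDR are placeholders for your real values; note that allowing `icmp` is also what lets ping replies through:

```shell
# Sketch: allow ingress from the on-prem subnet into the cluster's VPC.
# "my-vpc" and 10.10.0.0/16 are placeholders for the real network and range.
gcloud compute firewall-rules create allow-from-onprem \
  --network=my-vpc \
  --direction=INGRESS \
  --source-ranges=10.10.0.0/16 \
  --allow=tcp,udp,icmp
```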

This might also need deeper checking in your project, so I would suggest contacting support and filing a ticket so they can verify whether the GKE cluster is properly exposed: the Service type, whether it is a private cluster, etc.