
When setting up a GKE internal load balancer, why can I access the IP address from a separate VM but not from within the cluster?

I have a Kubernetes deployment running in GKE behind an internal load balancer (ILB) service, which is assigned an IP address from the VPC subnetwork. When I spin up an individual Compute Engine VM in the same subnetwork, I can access the deployment using the ILB IP address, but I cannot reach the deployment from within the cluster itself or from another GKE cluster hitting the same IP address.

I am not sure what I am missing, or whether an ILB is the right tool for this use case. The end goal is communication between different GKE clusters on the same subnetwork.

It is strange that you can access it from a VM but not from within the cluster. The cluster, the VM, and the ILB must all be in the same region and subnet.

You can also find an example of how to create an internal load balancer for GKE here [1]. Compare the example configuration with your ILB configuration.
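As a rough sketch of what the docs in [1] describe, an internal LoadBalancer Service on GKE typically looks like the manifest below. The service name, selector label, and ports are placeholders for your own deployment, and the annotation form depends on the GKE version (newer clusters use networking.gke.io, older ones use cloud.google.com):

    # Minimal internal LoadBalancer Service for GKE (placeholder names/ports)
    apiVersion: v1
    kind: Service
    metadata:
      name: my-ilb-service               # hypothetical name
      annotations:
        # On GKE 1.17+; older clusters use
        # cloud.google.com/load-balancer-type: "Internal"
        networking.gke.io/load-balancer-type: "Internal"
    spec:
      type: LoadBalancer
      selector:
        app: my-app                      # must match your deployment's pod labels
      ports:
        - protocol: TCP
          port: 80                       # port exposed on the ILB IP
          targetPort: 8080               # container port of your deployment

Once the Service is created, it gets a private IP from the subnet, and clients in the same VPC region (VMs or other clusters) should be able to reach the deployment through that address.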

I tested this with a curl against the ILB, and it works from a VM instance, from inside the cluster, and from a different cluster in the same zone.
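For reference, a quick way to run the same check yourself (assuming the hypothetical service name above and an HTTP workload on port 80) is:

    # Get the private IP assigned to the ILB Service
    kubectl get service my-ilb-service \
      -o jsonpath='{.status.loadBalancer.ingress[0].ip}'

    # From a VM or a pod in the same VPC region, hit that IP
    curl http://10.128.0.5   # placeholder IP; substitute the value returned above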

[1] https://cloud.google.com/kubernetes-engine/docs/how-to/internal-load-balancing
