
GKE Pod Connect to external VM in same subnet

I have a pod that needs to connect to a database running on GCE VMs in the same subnet as the GKE nodes. I currently have a Kubernetes Service and Endpoints object that the pod resolves successfully, but traffic to 10.128.0.2 cannot be routed. I'm fairly sure this comes down to a GCP firewall rule or route, but I haven't had much luck.

subnet -> 10.128.0.0/9

cbr0 -> 10.8.15.0/20

eth0 -> 10.128.0.1

k8 services -> 10.11.224/14

Master Version: 1.9.7-gke.3

kind: Endpoints
apiVersion: v1
metadata:
  name: external-db   # object names must be lowercase DNS-1123
  namespace: default
subsets:
  - addresses:
      - ip: 10.128.0.2
    ports:
      - port: 7199
        name: interface
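For a manually managed Endpoints object like this to be used, there must be a Service of the same name with no selector; Kubernetes then routes traffic for that Service to the listed addresses instead of to pods. A minimal sketch (the name here is an assumption and must match whatever your Endpoints object is called) might look like:

```yaml
# Hypothetical headless-style Service paired with the Endpoints above.
# Omitting spec.selector tells Kubernetes to use the manually defined
# Endpoints rather than selecting pods.
kind: Service
apiVersion: v1
metadata:
  name: external-db   # must match the Endpoints object's name
  namespace: default
spec:
  ports:
    - port: 7199
      name: interface
```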


At this point in time, Service and Endpoints IPs are not routable from outside the cluster; pods, however, are, as explained in this article. As @cohenjo mentioned, you should connect directly from the pod.
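To verify from inside the pod whether the database VM is reachable at all (and so distinguish a routing/firewall problem from a Service misconfiguration), a small TCP probe can help. This is a generic sketch; the IP and port below are the ones from the question, not something this answer prescribes:

```python
import socket


def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unreachable routes.
        return False


if __name__ == "__main__":
    # 10.128.0.2:7199 is the database VM/port from the question.
    print("reachable" if can_connect("10.128.0.2", 7199) else "unreachable")
```

If this prints "unreachable" from a pod but the same probe works from a GKE node, the pod CIDR is what the firewall is blocking.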

Edit: I believe this issue is due to a firewall change in clusters running 1.9.x, as described in this article. You can follow the steps in the article to allow communication from the GKE cluster to all VM instances on the network, or attach the network tag assigned to the nodes to the VM instance you would like the pod to communicate with.
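The fix amounts to creating a firewall rule whose source range is the cluster's pod CIDR. A hedged sketch with gcloud follows; the cluster name, zone, network, and rule name are placeholders you would substitute, and tcp:7199 is the database port from the question:

```shell
# Placeholders -- substitute your own values.
CLUSTER_NAME=my-cluster
ZONE=us-central1-a
NETWORK=default

# Look up the cluster's pod CIDR (10.8.15.0/20-style range, not the subnet).
POD_CIDR=$(gcloud container clusters describe "$CLUSTER_NAME" \
    --zone "$ZONE" --format 'value(clusterIpv4Cidr)')

# Allow the database port from pods to VMs on the network.
# Add --target-tags to scope the rule to the DB VM only.
gcloud compute firewall-rules create allow-gke-pods-to-db \
    --network "$NETWORK" \
    --source-ranges "$POD_CIDR" \
    --allow tcp:7199
```

Scoping with --target-tags (using the DB VM's network tag) is the tighter of the two options the article describes.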
