
Problem: creating a Kubernetes cluster on AWS using kops

I'm setting up a new Kubernetes cluster on AWS using kops.

I have a domain name, domain.com, and a subdomain, subdomain.domain.com. In AWS Route 53, I created a hosted zone with the same name as the subdomain, subdomain.domain.com.

In the parent domain.com zone, I added an NS record for the subdomain pointing at each of the hosted zone's name servers.
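For reference, the same hosted-zone setup can be scripted with the AWS CLI; this is a minimal sketch assuming the placeholder domain names used throughout this post, where <ZONE_ID> stands for the zone ID the first command returns:

# Create the hosted zone for the subdomain (the caller reference just has to be unique)
aws route53 create-hosted-zone \
  --name subdomain.domain.com \
  --caller-reference subdomain-$(date +%s)

# List the name servers Route 53 assigned to the new zone; these are the
# values the NS records in the parent domain.com zone must point to
aws route53 get-hosted-zone --id <ZONE_ID> --query 'DelegationSet.NameServers'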

When I run kubectl get node:

Unable to connect to the server: dial tcp: lookup api.subdomain.domain.com on 8.8.4.4:53: no such host
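The error means the resolver kubectl ended up using (Google's 8.8.4.4) cannot find an A record for the API endpoint. A quick way to narrow this down, assuming the zone names above, is to query DNS directly:

# Ask the same public resolver whether the API record exists
dig @8.8.4.4 api.subdomain.domain.com A

# Ask one of the hosted zone's own name servers, bypassing the delegation
dig @ns-1365.awsdns-42.org api.subdomain.domain.com A

If the second query answers but the first does not, the delegation (or DNS propagation) is the problem; if neither answers, kops never created the record.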

For details:

When I execute this command:

kops edit cluster subdomain.domain.com --state=s3://block-state-b429

I get this:

metadata:
  creationTimestamp: "2019-09-17T22:46:45Z"
  name: subdomain.domain.com
spec:
  adminAccess:
  - 0.0.0.0/0
  channel: stable
  cloudProvider: aws
  configBase: s3://block-state-b429/subdomain.domain.com
  dnsZone: subdomain.domain.com
  etcdClusters:
  - etcdMembers:
    - name: eu-west-1a
      zone: eu-west-1a
    name: main
  - etcdMembers:
    - name: eu-west-1a
      zone: eu-west-1a
    name: events
  kubernetesVersion: v1.5.8
  masterPublicName: api.subdomain.domain.com
  networkCIDR: 172.20.0.0/16
  networking:
    kubenet: {}
  nonMasqueradeCIDR: 100.64.0.0/10
  zones:
  - cidr: 172.20.32.0/19
    name: eu-west-1a
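Note that masterPublicName is the record kops manages inside dnsZone, and it matches the host the kubectl error complains about. Once the cluster is up, you can confirm the record exists with the AWS CLI (the zone ID is a placeholder):

# api.subdomain.domain.com should appear among the zone's records
aws route53 list-resource-record-sets --hosted-zone-id <ZONE_ID> \
  --query "ResourceRecordSets[?starts_with(Name, 'api.')]"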

And when I execute this command:

cat /root/.kube/config

I get:

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: 
.... (certificate)
    server: https://api.subdomain.domain.com
  name: subdomain.domain.com
contexts:
- context:
    cluster: subdomain.domain.com
    user: subdomain.domain.com
  name: subdomain.domain.com
current-context: subdomain.domain.com
kind: Config
preferences: {}
users:
- name: subdomain.domain.com
  user:
    client-certificate-data: 
.... (certificate)
    password: **PASSWORD**
    username: **USER**
- name: subdomain.domain.com-basic-auth
  user:
    password: **PASSWORD**
    username: **USER**
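Since kubectl simply dials the server URL from this file, the failure can be reproduced outside kubectl; a minimal connectivity check, assuming the endpoint above:

# Resolve the API hostname the same way kubectl would
getent hosts api.subdomain.domain.com

# If it resolves, probe the API server directly
# (-k only skips TLS verification; this is just a reachability check)
curl -k https://api.subdomain.domain.com/healthz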

ROUTE 53:

I created a hosted zone for the subdomain subdomain.domain.com. Route 53 assigned these name servers:

NS
ns-1365.awsdns-42.org.
ns-322.awsdns-40.com.
ns-2043.awsdns-63.co.uk.
ns-909.awsdns-49.net.

For each of these name servers, I added a matching NS record for the subdomain in my domain.com zone:

NS:
subdomain ns-1365.awsdns-42.org
subdomain ns-322.awsdns-40.com
subdomain ns-2043.awsdns-63.co.uk
subdomain ns-909.awsdns-49.net
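Once the delegation is in place, it should surface through the public DNS tree; a quick check, assuming the names above:

# Walk the resolution chain from the root down to the subdomain
dig +trace subdomain.domain.com NS

# A plain query should return the four awsdns servers listed above
dig subdomain.domain.com NS +short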

and I created my cluster with:

kops create cluster \
  --name=subdomain.domain.com \
  --state=s3://block-state-b429 \
  --zones=eu-west-1a \
  --node-count=2 \
  --node-size=t2.micro \
  --master-size=t2.micro \
  --dns-zone=subdomain.domain.com
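For completeness, kops create cluster only writes the configuration to the state store; a sketch of the usual follow-up, assuming the same name and bucket:

# Apply the stored configuration and actually create the AWS resources
kops update cluster subdomain.domain.com --state=s3://block-state-b429 --yes

# Once DNS and the nodes have settled, verify the cluster
kops validate cluster --state=s3://block-state-b429
kubectl get nodes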

I just changed the Kubernetes version, deleted the existing cluster, and created a new one by following this guide: https://github.com/kubernetes/kops/blob/master/docs/aws.md

And it's OK!
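In command form, the fix amounts to tearing the cluster down and recreating it on a newer release; a sketch with the same names, where the exact Kubernetes version string is a placeholder:

# Tear down the broken cluster and its AWS resources
kops delete cluster --name=subdomain.domain.com --state=s3://block-state-b429 --yes

# Recreate it, pinning a current Kubernetes version instead of v1.5.8
kops create cluster \
  --name=subdomain.domain.com \
  --state=s3://block-state-b429 \
  --zones=eu-west-1a \
  --node-count=2 \
  --node-size=t2.micro \
  --master-size=t2.micro \
  --dns-zone=subdomain.domain.com \
  --kubernetes-version=1.15.3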
