
kubectl authentication to aws eks cluster

I have tried every solution I could find by googling:

   aws eks --region $AWS_REGION update-kubeconfig --name $EKS_CLUSTER
   aws-iam-authenticator token -i $EKS_CLUSTER
   aws eks get-token --cluster-name $EKS_CLUSTER
   kubectl edit -n kube-system configmap/aws-auth
   curl -o aws-auth-cm.yaml.tmpl https://amazon-eks.s3.us-west-2.amazonaws.com/cloudformation/2020-08-12/aws-auth-cm.yaml
   sed "s|<ARN of instance role (not instance profile)>|$EKS_CLUSTER_NODE_ROLE_ARN|g" aws-auth-cm.yaml.tmpl > aws-auth-cm.yaml && rm aws-auth-cm.yaml.tmpl
   kubectl apply -f aws-auth-cm.yaml
   kubectl config set-context $EKS_CLUSTER
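The templating step above had mangled quoting as pasted; a minimal offline sketch of what it does, using a fabricated stand-in template and an example role ARN (both are placeholders, not values from the original post):

```shell
#!/bin/sh
# Example node-role ARN (fabricated for illustration).
EKS_CLUSTER_NODE_ROLE_ARN="arn:aws:iam::123456789012:role/eks-node-role"

# Stand-in for the template fetched with curl above; the real file
# contains the full aws-auth ConfigMap with this placeholder inside.
cat > aws-auth-cm.yaml.tmpl <<'EOF'
      rolearn: <ARN of instance role (not instance profile)>
EOF

# Use '|' as the sed delimiter so the '/' characters inside the ARN
# do not terminate the substitution expression.
sed "s|<ARN of instance role (not instance profile)>|$EKS_CLUSTER_NODE_ROLE_ARN|g" \
  aws-auth-cm.yaml.tmpl > aws-auth-cm.yaml
rm aws-auth-cm.yaml.tmpl

cat aws-auth-cm.yaml
```

The `|` delimiter avoids having to escape every `/` in the ARN, which is where the original one-liner most likely went wrong.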

But I get the following error:

$ cat ./k8s/deployment.yaml.tmpl | sed "s/\$ZONE_ID/$ZONE_ID/g" | kubectl apply -f -

error: You must be logged in to the server (the server has asked for the client to provide credentials)
ERROR: Job failed: exit code 1

I would start by checking the AWS CLI version; if it is not a recent version, update it. Next, go over https://docs.aws.amazon.com/eks/latest/userguide/add-user-role.html and check whether the IAM role or user that kubectl authenticates as is mapped properly in the cluster's aws-auth ConfigMap. The "You must be logged in to the server" error typically means the token is being issued for an IAM principal that the cluster does not recognize.
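For reference, the identity reported by `aws sts get-caller-identity` must appear in the aws-auth ConfigMap for kubectl to be authorized. A sketch of what that ConfigMap typically looks like (all ARNs below are fabricated examples, not values from the original post):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    # Node instance role, so worker nodes can join the cluster.
    - rolearn: arn:aws:iam::123456789012:role/eks-node-role   # example ARN
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes
  mapUsers: |
    # IAM user used by the CI job running kubectl apply.
    - userarn: arn:aws:iam::123456789012:user/ci-deployer     # example ARN
      username: ci-deployer
      groups:
        - system:masters
```

Note that the IAM principal that created the cluster gets admin access implicitly and never appears in this ConfigMap; any other role or user (such as a CI deployer) must be added explicitly.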
