
Azure AKS: deploying to multiple k8s clusters simultaneously with Jenkins

Problem statement

I have created multiple pipelines in my Jenkins environment that deploy Kubernetes objects to multiple clusters. If I execute a single job at a time it works well, but the output may be unstable when multiple jobs are executed for different environments at the same time.

Basic steps for deploying to an AKS cluster

  • Log in to Azure
az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_CLIENT_SECRET -t $AZURE_TENANT_ID

  • Get the cluster credentials
az aks get-credentials --resource-group "$resourceGroup" --name "$clustername" --overwrite-existing

  • Apply the manifests with kubectl
kubectl apply -f myk8sfiles.yml
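
Put together, a minimal sketch of how these steps might run inside a single Jenkins sh step (the variable names follow the commands above; error handling is omitted):

#!/usr/bin/env bash
set -euo pipefail

# Authenticate with the service principal used by the Jenkins job
az login --service-principal -u "$AZURE_CLIENT_ID" -p "$AZURE_CLIENT_SECRET" -t "$AZURE_TENANT_ID"

# Fetch the target cluster's credentials (written to the default kubeconfig, ~/.kube/config)
az aks get-credentials --resource-group "$resourceGroup" --name "$clustername" --overwrite-existing

# Apply the manifests to whatever cluster the current context points at
kubectl apply -f myk8sfiles.yml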

When I execute a single pipeline job it works fine, but when I try to execute multiple pipeline jobs at the same time, I assume my az aks get-credentials and kubectl apply commands will produce unstable output.
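
For example, since az aks get-credentials writes to the shared ~/.kube/config by default and switches its current context, two concurrent jobs could interleave roughly like this (the cluster names below are purely illustrative):

# Job A, deploying to a hypothetical aks-dev cluster:
az aks get-credentials --resource-group "$rgDev" --name aks-dev --overwrite-existing
#   -> merges into ~/.kube/config and sets the current context to aks-dev

# Job B starts in parallel, deploying to a hypothetical aks-prod cluster:
az aks get-credentials --resource-group "$rgProd" --name aks-prod --overwrite-existing
#   -> same file; the current context is now aks-prod

# Job A continues, unaware the context changed underneath it:
kubectl apply -f myk8sfiles.yml    # applies to aks-prod instead of aks-dev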

How can I execute deployments to multiple AKS clusters in parallel?

Just save the credentials to a specific place on disk for each cluster and point kubectl at those cluster-specific credentials.
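
A minimal sketch of that approach, assuming the per-cluster kubeconfig is written into the Jenkins workspace (the file path is illustrative; the other variable names follow the question):

# Write this cluster's credentials to its own file instead of the shared ~/.kube/config
# ($WORKSPACE is the standard Jenkins workspace variable; the file name is illustrative)
KUBECONFIG_FILE="$WORKSPACE/kubeconfig-$clustername"

az aks get-credentials --resource-group "$resourceGroup" --name "$clustername" \
    --file "$KUBECONFIG_FILE" --overwrite-existing

# Point kubectl at that file only; jobs for other clusters use their own files
kubectl --kubeconfig "$KUBECONFIG_FILE" apply -f myk8sfiles.yml

Because each job reads and writes only its own kubeconfig, parallel deployments to different clusters no longer race on a shared current context. Exporting KUBECONFIG="$KUBECONFIG_FILE" for the rest of the step works as well.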

Further reading: https://kubernetes.io/docs/tasks/access-application-cluster/configure-access-multiple-clusters/
