This is our simplified .gitlab-ci.yml:
stages:
  - build
  - deploy

# fetch home repo that'll run the pipeline processes
before_script:
  - git clone git@git.org:home_repo/home_repo.git
  - cd home_repo

build:
  stage: build
  needs: []
  resource_group: ${CI_PROJECT_NAME}_build
  script:
    - ./pipeline.sh build
deploy_environment1:
  stage: deploy
  needs: [build]
  resource_group: deploy_environment1
  script:
    - ./pipeline.sh deploy env1

deploy_environment2:
  stage: deploy
  needs: [build]
  resource_group: deploy_environment2
  script:
    - ./pipeline.sh deploy env2
Adding 5-10 environments would blow up the size by a lot. How can we handle that?
Possible solution 1: Have a single job that builds a list of all available environments and deploys to each of them. However, we would then lose a clear pipeline view of how each individual deployment went.
deploy_environments:
  stage: deploy
  needs: [build]
  resource_group: deploy_environments
  script:
    - ./pipeline.sh deploy to_all
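For illustration, a minimal sketch of what the `deploy to_all` branch of pipeline.sh could look like; the hardcoded `ENVIRONMENTS` list and the commented-out deploy call are assumptions, since the real script and environment discovery are not shown:

```shell
# Hypothetical sketch of "pipeline.sh deploy to_all":
# iterate over a discovered environment list and deploy to each in turn.
ENVIRONMENTS="env1 env2"   # in practice this list would be fetched dynamically

for env in $ENVIRONMENTS; do
  echo "deploying to $env"
  # ./pipeline.sh deploy "$env"   # real per-environment deploy would go here
done
```

The downside noted above applies: all deployments share one job, so a failure in one environment is not visible as a separate red job in the pipeline view.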
Possible solution 2: Have before_script fetch a list of all environments and insert as many deploy_environment job definitions as needed. But we don't know how to do that, hence the question in the title.
Can I change the .gitlab-ci.yml file mid-execution?
No, not really. The closest thing to that is dynamic child pipelines, where the YAML for a child pipeline is generated by a job in the parent pipeline and then triggered. That would let you achieve the effect you want.
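A minimal sketch of the dynamic child pipeline approach, adapted to the jobs above. The job names, the hardcoded environment list, and the heredoc generation are illustrative assumptions; in practice the generator would fetch the environment list dynamically:

```yaml
# Generator job: writes one deploy job per environment into deploy-jobs.yml
generate-deploy-jobs:
  stage: build
  script:
    - |
      for env in env1 env2; do
        cat >> deploy-jobs.yml <<EOF
      deploy_${env}:
        resource_group: deploy_${env}
        script:
          - ./pipeline.sh deploy ${env}
      EOF
      done
  artifacts:
    paths:
      - deploy-jobs.yml

# Trigger job: runs the generated YAML as a child pipeline
run-deploys:
  stage: deploy
  needs: [generate-deploy-jobs, build]
  trigger:
    include:
      - artifact: deploy-jobs.yml
        job: generate-deploy-jobs
    strategy: depend
```

Each generated deploy job shows up as a separate job in the child pipeline, so you keep the per-environment pipeline view, and `strategy: depend` makes the parent pipeline reflect the child's status. Note that trigger jobs don't run scripts, so the global before_script does not apply there.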
Adding 5-10 environments would blow up the size by a lot. How can we handle that?
As an alternative to (or in combination with) dynamic child pipelines, you could use a parallel matrix for a concise definition.
Something like this:
deploy:
  stage: deploy
  needs: [build]
  script:
    - ./pipeline.sh deploy $ENV_NAME
  parallel:
    matrix:
      - ENV_NAME: [env1, env2]
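This expands into one job per matrix value (e.g. `deploy: [env1]`, `deploy: [env2]`), so each environment still gets its own entry in the pipeline view. If each environment also needs its own lock as in the original file, `resource_group` accepts variables in recent GitLab versions, so a per-environment resource group can be kept (a sketch under that assumption):

```yaml
deploy:
  stage: deploy
  needs: [build]
  resource_group: deploy_$ENV_NAME   # one lock per matrix environment
  script:
    - ./pipeline.sh deploy $ENV_NAME
  parallel:
    matrix:
      - ENV_NAME: [env1, env2]
```

The matrix itself is still static YAML, though, so a changing environment list would still need the dynamic child pipeline approach.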