
How to update submodules in a GitLab CI/CD pipeline

I have a project organized in the following tree:

|.
|..
|-- devops
|-- project1
|-- project2

In the devops folder, I have included the other two projects as submodules, since these two projects are developed independently by two different teams.

|.
|..
|-- project1@0deed0fa
|-- project2@0beef0fb
|-- .gitlab-ci.yml

I have set up the pipeline to deploy the projects. Whenever there are new commits on either of the projects, a trigger runs the devops project pipeline. As part of the devops jobs, I run git submodule commands to fetch and merge, then build. It works.
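
For reference, the devops job currently looks roughly like this (simplified; the job name and build command are placeholders):

    # devops/.gitlab-ci.yml -- current approach (simplified)
    build:
      stage: build
      script:
        # initialize the submodules at the commits recorded in this repo
        - git submodule update --init --recursive
        # then fetch and merge the latest upstream commits
        - git submodule update --remote --merge
        # build the combined tree (placeholder command)
        - make build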

The problem I have is that, over time, a lot of changes accumulate in the submodules. The changes since the last submodule commit recorded in the devops project are replayed every time there is a commit on any of the projects. Once a month, I manually update the submodule pointers in the devops project to the latest commits of the submodule projects. I could commit those changes from a devops pipeline job, but that commit would in turn trigger another devops pipeline. (I haven't tested it, but it seems obvious.)

Is there any way I can update the submodules to the latest commit as part of the devops pipeline?

Thanks.

Only Build Your Binaries Once

Using git submodules is not the best practice for implementing an integration pipeline. The seminal book Continuous Delivery states the following in the section Only Build Your Binaries Once (Chapter 5):

Many build systems use source code held in the version control system as the canonical source for many steps. The code will be compiled repeatedly in different contexts during the commit process, again at acceptance test time, [etc.] Every time you compile the code, you run the risk of introducing some difference.

Also, recompiling takes a lot of time, resulting in longer feedback cycles. The recommendation is:

You should only build your binaries once, during the commit stage of the build. These binaries should be stored in a filesystem somewhere [...] where it is easy to retrieve them for later stages of the pipeline.
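
In GitLab CI, for instance, the commit-stage job can store the binaries as job artifacts, and later stages retrieve them automatically. A minimal sketch (the job names, paths, and commands are illustrative):

    stages:
      - build
      - test

    build-binaries:
      stage: build
      script:
        - make build                  # compile once, here and only here
      artifacts:
        paths:
          - dist/                     # the binaries handed to later stages
        expire_in: 1 week

    acceptance-test:
      stage: test
      needs: ["build-binaries"]       # downloads the dist/ artifacts, no rebuild
      script:
        - ./run-acceptance-tests.sh dist/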

CI/CD Pipeline Flow

Following this paradigm, your workflow would look something like this:

  1. Developers working on feature branches in project1 and project2 will push a commit
  2. A project pipeline is triggered that builds the binaries, runs unit/component tests and packages the container
  3. If everything passes, the binaries or containers are deployed to a "development" repository
  4. The project pipelines will trigger your downstream devops pipeline, which will download the binaries/containers from the repositories
  5. The devops pipeline integrates pieces and runs end-to-end tests
  6. If E2E tests pass, the binaries/containers are deployed (aka promoted) to a release repository
  7. If you're doing Continuous Delivery, you would deploy to production here, but many teams prefer pushing a button in a manual job

Notice how the source code is only built once.
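
In GitLab CI, steps 2-4 map onto a multi-project pipeline trigger. A minimal sketch, assuming the projects live under a group named my-group and that the placeholder commands stand in for your real build and test steps:

    # project1/.gitlab-ci.yml -- build and package once, then trigger devops
    stages:
      - build
      - trigger

    build-and-package:
      stage: build
      script:
        - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
        - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
        - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

    trigger-devops:
      stage: trigger
      variables:
        PROJECT1_IMAGE_TAG: $CI_COMMIT_SHORT_SHA    # handed down to the devops pipeline
      trigger:
        project: my-group/devops                    # assumed project path
        branch: main

    # devops/.gitlab-ci.yml -- integrate the prebuilt containers, nothing is recompiled
    e2e-tests:
      stage: test
      script:
        - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
        - docker pull "$CI_REGISTRY/my-group/project1:$PROJECT1_IMAGE_TAG"
        - ./run-e2e-tests.sh                        # placeholder end-to-end test runner

Because the devops pipeline only pulls prebuilt images, there are no submodule pointers to keep in sync, and a commit in project2 never causes project1 to be rebuilt.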

Popular Binary Repositories

There are a number of popular binary repositories available. Most have a free tier and a paid pro version. Check their websites for more info.

  1. GitLab Job Artifacts (between stages or pipelines)
  2. GitLab Package Registry or GitLab Container Registry
  3. GitHub Package Registry or GitHub Container Registry
  4. JFrog Artifactory
  5. Nexus Repository
  6. Cloud provider container registries (AWS ECR, GCP CR, Azure CR)
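
With the GitLab Container Registry, for example, step 6's promotion is just retagging the exact image that passed the E2E tests under a release tag, not rebuilding it. A sketch continuing the devops pipeline above (the release tag scheme is an assumption):

    promote-to-release:
      stage: deploy
      script:
        - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
        # pull the image that passed E2E and republish it under a release tag
        - docker pull "$CI_REGISTRY/my-group/project1:$PROJECT1_IMAGE_TAG"
        - docker tag "$CI_REGISTRY/my-group/project1:$PROJECT1_IMAGE_TAG" "$CI_REGISTRY/my-group/project1:release"
        - docker push "$CI_REGISTRY/my-group/project1:release"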
