I have a working GitLab CI/CD setup: it takes my Conan recipe and my library repo, clones whichever git tag is hard-coded, builds the package, and pushes it to the GitLab package registry. Great!
What I am wondering is: how should I automate this so it looks at the git repo and builds ALL git tags, so that I can roll back and forth between Conan package versions more easily?
For reference, here is my conanfile.py:
from conans import ConanFile, CMake, tools

class TwsApiConan(ConanFile):
    name = "twsapi"
    version = "10.17.01"
    license = "IBKR"
    author = "someemail"
    url = "https://github.com/ibkr/tws-api/"
    description = "Built from a mirror of the actual TWS API files in Github"
    topics = ("tws", "interactive brokers")
    settings = "os", "compiler", "build_type", "arch"
    options = {"shared": [True, False]}
    default_options = {"shared": False}
    generators = "cmake"

    def source(self):
        self.run("git clone --depth 1 --branch 10.17.01 git@github.com:ibkr/tws-api.git")
        tools.replace_in_file("tws-api/CMakeLists.txt", " LANGUAGES CXX )",
                              ''' LANGUAGES CXX )
add_compile_options(-std=c++17)''')

    def build(self):
        cmake = CMake(self)
        cmake.configure(source_folder="tws-api")
        cmake.build()

    def package(self):
        self.copy("*.h", dst="include", src="tws-api/source/cppclient/client")
        self.copy("*hello.lib", dst="lib", keep_path=False)
        self.copy("*.dll", dst="bin", keep_path=False)
        self.copy("*.so", dst="lib", keep_path=False)
        self.copy("*.dylib", dst="lib", keep_path=False)
        self.copy("*.a", dst="lib", keep_path=False)

    def package_info(self):
        self.cpp_info.libs = ["twsapi"]
And the GitLab CI/CD pipeline so far:
variables:
  GITHUB_DEPLOY_KEY_BASE64: $GITHUB_DEPLOY_KEY_BASE64

stages: # List of stages for jobs, and their order of execution
  - build

build-job: # This job runs in the build stage, which runs first.
  stage: build
  image: registry.gitlab.com/jrgemcp-public/gitlab-cicd-docker/build-conan-docker:latest
  before_script:
    - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
    - eval $(ssh-agent -s)
    - MY_SECRET_DECODED="$(echo $GITHUB_DEPLOY_KEY_BASE64 | base64 -d)"
    - echo "$MY_SECRET_DECODED" | tr -d '\r' | ssh-add -
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - ssh-keyscan github.com >> ~/.ssh/known_hosts 2>/dev/null
    - chmod 644 ~/.ssh/known_hosts
  script:
    - conan profile new default --detect
    - conan profile update settings.compiler.libcxx=libstdc++11 default
    - conan remote add gitlab https://gitlab.com/api/v4/projects/${CI_PROJECT_ID}/packages/conan
    - conan user myusername -r gitlab -p ${CI_JOB_TOKEN}
    - conan create . mypackagename/prod
    - conan upload "*" --remote=gitlab --all --confirm
You could generate your config dynamically in a script. That is to say, you could script fetching all the tags/refs you want to build and create a YAML file containing one job per ref; each job checks out its ref and builds it.
Basic idea in bash:
# one generated job per tag
for tag in $(get-all-tags-to-build); do
  job_yaml="job ${tag}: {\"script\": \"make build ${tag}\"}"
  echo "$job_yaml" >> generated-config.yml
done
The idea is that make build is configured to check out the tag provided as the argument and run the build.
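As a more concrete (but hedged) sketch of that generator: the tag list can come from git ls-remote against the library repo, and each generated job can reuse the conan create command from your existing pipeline. Here GIT_TAG is an assumed environment variable that your conanfile.py would read instead of the hard-coded 10.17.01, and the fallback tag list is purely illustrative for when the remote is unreachable:

```shell
#!/bin/sh
# Sketch: enumerate tags on the library repo and emit one child-pipeline job per tag.
REPO_URL="${REPO_URL:-git@github.com:ibkr/tws-api.git}"

# List remote tags; fall back to a fixed example list if the remote is unreachable.
TAGS="$(git ls-remote --tags --refs "$REPO_URL" 2>/dev/null | sed 's|.*refs/tags/||')"
[ -n "$TAGS" ] || TAGS="10.17.01 10.19.01"   # illustrative fallback only

: > generated-config.yml
for tag in $TAGS; do
  cat >> generated-config.yml <<EOF
build-$tag:
  stage: build
  script:
    - GIT_TAG=$tag conan create . mypackagename/prod
EOF
done

cat generated-config.yml
```

Each emitted job is a complete GitLab CI job definition, so the child pipeline created from this file builds every tag independently.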
Using that generated config as an artifact causes the created child pipeline to contain a job for every ref returned by the get-all-tags-to-build script (which you implement).
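To wire this up, GitLab's dynamic child pipeline mechanism needs two jobs in the parent pipeline: one that produces generated-config.yml as an artifact, and a trigger job that includes it. A minimal sketch, assuming the generator script is named generate-jobs.sh (the job names here are illustrative, not from the original setup):

```yaml
generate-config:
  stage: build
  script:
    - ./generate-jobs.sh              # emits generated-config.yml (your tag-enumeration script)
  artifacts:
    paths:
      - generated-config.yml

trigger-builds:
  stage: build
  needs: [generate-config]
  trigger:
    include:
      - artifact: generated-config.yml
        job: generate-config
    strategy: depend                  # parent waits for the child pipeline's result
```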