Qt Azure Pipeline Integration

From Qt Wiki

Setting up a Debian VM in Azure to work with the pipeline jobs

When you want to use a Microsoft Azure Pipeline for continuous integration of a Qt project on a Linux server, you'll need to create a self-managed Linux VM and integrate it with Qt and the Azure Pipelines system. This is because the Microsoft-hosted build agents do not natively support Qt.

This article describes the several moving pieces, using Debian as an example, and assumes that you have the proper Azure accounts:

  • Azure: azure.microsoft.com (used to create the VM)
  • Azure DevOps: dev.azure.com/YourOrganization

You will need to do the following:

  • Prep your Qt project file (.pro)
  • Create a Debian VM
  • Set up an agent pool and add the VM
  • Set up the pipeline job

Preface & definitions

Once you have created the Linux VM and installed the agent software, the VM will poll the pipeline queue for pending jobs. When it finds a job, it downloads the source code to a location you have configured. At that point, the agent changes to the root directory of your repo and runs the scripts you have defined for the pipeline job.

This guide assumes that the code is in an Azure Repos repository. However, an Azure pipeline can equally read from GitHub.

In our repo, we have two directories directly below the root of the git repo:

  • ./builds
  • ./ProjectSource

The Qt project files on our development workstations are configured to build into the builds/release/projectName/librarySubDir directories. We don't care about checking in the compiled files, so we have a .gitignore in the "builds" directory containing "**/*" to ignore everything below it. All of the libraries are in a subdirs project with the source in ProjectSource, and they all build with a similar path. When we want to link one library against another, we use relative paths. This ensures that building on a different server works, since there are no hard-coded paths in the Qt project files. This is an important point that will be explained further later.
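As a concrete sketch, the layout looks like this (the project and library names are illustrative, taken from the examples later in this article):

```
.
├── azure-pipelines.yml
├── builds/
│   └── .gitignore           # contains the single pattern: **/*
└── ProjectSource/
    ├── project.pro          # top-level subdirs project
    ├── devicelibrary-abstract/
    └── devicelibrary-impl/
```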

The script for the pipeline is defined in a file called "azure-pipelines.yml" which is stored in the repo. The basic script looks like the following:

 mkdir -p builds/release/ProjectName
 cd builds/release/ProjectName
 qmake -makefile -o ./Makefile ../../../ProjectSource/project.pro
 make -f Makefile
 

Basically it does the following:

  • Make sure the build directories are pre-created
  • Run qmake in "-makefile" mode to create a Makefile
  • Run make -f Makefile

Prep the Qt project file(s) (.pro)

The ".pro" project file is the central configuration for the Azure pipeline. It needs to be fully configured so that qmake can generate the Makefile correctly. To help with this step, the build script listed above will likely work on your development workstation! So, you can take the pipeline script and run it in the root of your repo on your development workstation to pre-test it and make sure it works.

On Your Development Workstation

cd RootOfYourRepo

Run the commands of the script you will use for the pipeline and ensure that the script works completely on your development workstation. Ensure that all paths are relative as the paths will be very different on the build server. If it won't run on your development workstation, it's sure to fail on the build server. You can work out most of the bugs on your workstation.

Note: the paths on the build server will NOT match what you have on your workstation. So, be sure to use relative paths in your project! You can use variables such as $$OUT_PWD (directory where the build writes the compiled files) or $$PWD (path where the source files are located).

We use a "subdirs" project for one of our applications. So, to include the library of one subdir in the application in another subdir, we used entries in the .pro file like: LIBS += -L$$OUT_PWD/../devicelibrary-abstract/ -ldevicelibrary-abstract

We also had to ensure dependencies were defined in the main .pro file. For example: devicelibrary-impl.depends = devicelibrary-abstract

These dependency definitions are important to make sure the Makefiles are set up correctly when qmake runs.
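Putting those pieces together, a minimal sketch of the two .pro files might look like the following (the project and library names are illustrative):

```
# ProjectSource/project.pro -- top-level subdirs project
TEMPLATE = subdirs
SUBDIRS += devicelibrary-abstract \
           devicelibrary-impl
# Build the abstract library before the implementation that links against it
devicelibrary-impl.depends = devicelibrary-abstract

# ProjectSource/devicelibrary-impl/devicelibrary-impl.pro -- the linking subdir
TEMPLATE = lib
# $$OUT_PWD is this subdir's build directory; the relative path avoids
# hard-coding anything server-specific
LIBS += -L$$OUT_PWD/../devicelibrary-abstract/ -ldevicelibrary-abstract
# $$PWD is this subdir's source directory
INCLUDEPATH += $$PWD/../devicelibrary-abstract
```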

Fix any errors you find with "make -f Makefile" and check those changes back into the repo. You now have the foundation set up; you will likely need to make a few tweaks once you run it on the build server, but the basics of the Qt project files are in place.

Create the Debian VM

We have chosen Debian as the target for our application, so the build server needs to run the same version to ensure that the builds are compatible. I will not give much detail on creating the VM, as it's all basic Azure VM creation that is well covered in the Azure documentation. Be sure to give it a public IP so you can connect to it via ssh.

You will need to create a user that will be used for the builds. It can't be root. So, pick a username that makes sense. Add your public key to that user's .ssh/authorized_keys file to make access easier.
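For example (the username buildagent is illustrative; substitute your own public key file):

```
$ sudo adduser --disabled-password --gecos "" buildagent
$ sudo mkdir -p /home/buildagent/.ssh
$ sudo sh -c 'cat your_public_key.pub >> /home/buildagent/.ssh/authorized_keys'
$ sudo chown -R buildagent:buildagent /home/buildagent/.ssh
$ sudo chmod 700 /home/buildagent/.ssh
$ sudo chmod 600 /home/buildagent/.ssh/authorized_keys
```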

Install the build dependencies

You will need to install whatever third-party libraries you use (e.g., SDKs). We have a script called "aptPackages.sh" that we use to maintain what is needed for a development workstation, build server and target production server. It can be added to the VM's post-creation build script. Be sure to install the appropriate version of Qt (e.g., apt-get install qt5-default). You may need to reboot the VM after installing Qt. This can happen while you are learning how to get it set up and the server gets "confused." Once you have the process down, you won't need to reboot the VM.
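A minimal aptPackages.sh might look like this (the package list beyond Qt is illustrative; adjust it to your project's actual dependencies):

```
#!/bin/sh
# aptPackages.sh -- shared package list for workstation, build server and target
set -e
apt-get update
apt-get install -y \
    build-essential \
    qt5-default \
    qtbase5-dev
```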

At this point, you should be able to connect to the VM via ssh. The server is nearly ready to work as a build server. But, you will need to install the agent pool software. This software can only be retrieved after the agent pool is set up which is described in the next section.

Set up the agent pool

In order for the deployment pipeline to know which server it can use to run the build, there has to be a linkage between the VM and the pipeline. This is done with "Agent Pools" in Azure DevOps. In the azure-pipelines.yml file associated with the pipeline job, there is a setting for the agent pool. You can create an agent pool, add the VM to the pool, then specify that pool in the YAML file, and voilà, the pipeline job will know which server(s) it can use for builds.

To manage the agent pools, log in to the Azure DevOps site:

 https://dev.azure.com/YourOrganization

In the bottom-left of the screen, click on "Organization Settings." Under "Pipelines" click on "Agent pools." From here you can create, edit & delete agent pools. Create an Agent Pool for your build server(s). To add the VM to the agent pool, you'll install the software on the VM and configure it to use the agent pool you just created.

Install the agent software on the VM

At this point, you have all of the configuration for the Azure pipeline set up. You need to install the agent software on the VM. Navigate to the agent pool you just created; you'll find a link and script that you need to run on the build VM. Connect via ssh to your VM, then follow the instructions for installing the agent software. The install consists of creating a directory, exploding a tarball, then running ./config.sh; ./svc.sh install; ./svc.sh start. In the configuration step, you'll specify the agent pool to use. The svc.sh install command sets it up as a service, and svc.sh start starts it running. You will likely need to enable the service to run at startup. You can do that with the normal systemd command (systemctl enable vstsUnitName). You can find the name that it created with:

 systemctl list-units | grep vsts
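The full sequence looks roughly like this (the directory name and tarball version are illustrative; use the exact download URL and configuration values shown on your Agent pools page):

```
$ mkdir ~/myagent && cd ~/myagent
$ tar zxvf ~/vsts-agent-linux-x64-2.x.y.tar.gz
$ ./config.sh            # prompts for server URL, PAT and agent pool name
$ sudo ./svc.sh install  # registers a systemd service
$ sudo ./svc.sh start    # starts the agent
$ systemctl list-units | grep vsts
$ sudo systemctl enable NameFoundInLastStep
```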

Once you have the agent software installed and running, look at the agent pool and see if your server is listed as a known server in the pool. If so, you have a server that is ready to run your builds.

Set up the pipeline job

At this point, you are ready to create a build pipeline and build Qt on the newly created VM. Log in to the Azure DevOps site:

 https://dev.azure.com/YourOrganization

Click on "Pipelines", then click on "+ New" to create a new pipeline. Follow the prompts and enter the information as needed. You will end up with an azure-pipelines.yml file that will be checked into the root of your repo.

Be sure to edit the pipeline in the visual designer and select the appropriate pool that you just created. Even if you put the pool in the YAML file, you have to "authorize" it in the designer.

This is a sample file:


 # C/C++ with GCC
 # Build your C/C++ project with GCC using make.
 # Add steps that publish test results, save build artifacts, deploy, and more:
 # https://docs.microsoft.com/azure/devops/pipelines/apps/c-cpp/gcc
 # Trigger on changes to the develop branch
 # Use servers in the "Debian server" agent pool
 # Two script steps: build, then run the unit tests.
 trigger:
 - develop
 
 pool:
   name: 'YourAgentPoolName'
 
 steps:
 - script: |
     mkdir -p builds/release/ProjectName
     cd builds/release/ProjectName
     qmake -makefile -o ./Makefile ../../../ProjectSource/Project.pro
     make -f Makefile
   displayName: 'make'
 - script: |
     cd builds/release/ProjectName
     ./library-test/library-test
     if [ $? -ne 0 ]; then
       echo Unit test failure in library-test
       exit 1
     fi
   displayName: 'Unit tests'


NOTE: I added an additional step in the script here to illustrate how to run unit tests and cause the pipeline job to succeed or fail based on the test's exit code.
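The exit-code pattern that step relies on can be tried standalone in a shell; here run_test is a hypothetical stand-in for your actual test binary (e.g., ./library-test/library-test):

```shell
#!/bin/sh
# Hypothetical stand-in for the real test binary; it exits 0 on success.
run_test() {
    true
}

run_test
# $? holds the exit status of the last command; a non-zero status makes
# the pipeline step (and therefore the job) fail
if [ $? -ne 0 ]; then
    echo "Unit test failure in library-test"
    exit 1
fi
echo "all unit tests passed"
```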

Troubleshooting the Azure Pipeline

Pipeline Setup

If the pipeline job never kicks off, it may be a setup issue with the pipeline.

Assume you created a VM for the pipeline named "MyBuildServerABC." You need to check to be sure everything is set up correctly.

  • Check the VM itself
    • Make sure the VM is running and you can log into it
    • Make sure that the agent software is running
    • The script suggests installing in ~/myagent. Be sure that you ran ~/myagent/svc.sh install && ~/myagent/svc.sh start
    • The script uses systemd to create a service that starts with vsts.... You can enable the service to run at startup.
      • systemctl list-units | grep vsts
    • You can check the logs for the process with journalctl -u NameFoundInLastStep
  • Make sure the azure pipeline is set up correctly.
    • Log into https://dev.azure.com/YourOrganization
    • At the root / organization level, click on "Organization Settings" in the bottom-left of the screen.
    • Click on Deployment pools and check that your deployment pool has at least 1 server running.
      • You can click on the pool name and see the names of the targets.
      • You should see the name of your VM (MyBuildServerABC)
  • Navigate into your project. This is inside of the organization.
    • Click on Pipelines
    • Click on Deployment Groups
      • You should see your deployment group with "1 Online"
      • Click on your deployment group to see the name of your build server (MyBuildServerABC)
    • Navigate into the project settings (bottom-left of the screen)
      • Note, the menu item at the bottom-left of the screen changes from "Organization Settings" to "Project Settings" when you navigate into a project. To get back to the organization level, click on "Azure DevOps" in the upper-left of the screen.
    • Click on Agent pools.
    • Click on the agent pool you set up.
      • You should see your VM listed (MyBuildServerABC)

If the pipeline job kicks off but you get an error like "Could not find a pool with name {PoolName}. The pool does not exist or has not been authorized.", you need to edit the pipeline in the visual designer ("Edit in the visual designer") and be sure the pool you want to use is selected. See also: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=vsts#troubleshooting-authorization-for-a-yaml-pipeline.

Pipeline script fails

If the pipeline job runs but the script fails, you can debug it on the VM. First, check the build logs for the failed job in the Azure DevOps portal. Once you are comfortable that the source code is being pulled properly, you can debug the job on the VM itself. The default checkout location is under _work in the home directory of the build user, usually with some path below that like ~/_work/1/s for the root of the repo. You can ssh into the VM, navigate to the root of the repo (e.g., ~/_work/1/s) and run the script commands to simulate what is happening with the build job. With that, you can debug errors in the script.
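For example, replaying the build step by hand (the username, IP and project name are illustrative; the agent work directory follows the defaults mentioned above):

```
$ ssh buildagent@your-vm-public-ip
$ cd ~/_work/1/s                     # root of the checked-out repo
$ mkdir -p builds/release/ProjectName
$ cd builds/release/ProjectName
$ qmake -makefile -o ./Makefile ../../../ProjectSource/project.pro
$ make -f Makefile
```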