C#, .NET and Azure

Using Azure DevOps to deploy Azure functions from Github


This post serves as a getting started guide for all my Azure Functions based projects. It complements the documentation of all my Azure Functions by giving newcomers the basics of:

After this tutorial you should understand the basics and be able to deploy one of my Azure Functions projects from GitHub successfully.

Azure DevOps

Azure DevOps allows you to host code, track work items, automate CI & CD and much more.

In this tutorial we will focus on the CI/CD part of it, named Azure Pipelines.

If you have not yet created an account, then sign up for free at dev.azure.com. It’s quick and straightforward.

You do not have to use your GitHub account to sign up, as you can create the GitHub connection later as well (I recommend you use the same account that you are also using for your Azure subscription).

More details can be found here.

Creating the project

Unlike GitHub (where everything is stored directly in your personal account), Azure DevOps uses the concept of projects. Each project is its own container for everything from work items, repositories, builds, releases & artifacts (think: GitHub organizations, except that in Azure DevOps you can have as many projects as you want).

For now, create one project; whether you want to use just one for all your code or multiple to separate certain projects is up to you (I use one for all my open source code and then multiple separate projects for private code).

create project

I recommend you make the project private for starters (you can always switch between private/public later).

If you pick private (like I did for multiple other projects that I didn’t open source) then:

- You get one free Microsoft-hosted parallel job with 1,800 build minutes per month.

It’s possible to purchase more minutes but I have found that the free build minutes are quite sufficient (you can run builds for 60 minutes every day).

If you pick public (like I did at dev.azure.com/marcstan/Opensource) then:

- You get ten free Microsoft-hosted parallel jobs with unlimited build minutes.

I made my builds open source (alongside clones of my repos and the wiki) so people can have a look at them alongside my source code on GitHub.

Note: If you make your project public and then build code from a private GitHub repository (or expose secrets in the builds/artifacts) you end up leaking sensitive information!

Connecting GitHub

Now that you have an empty project, it’s time to connect GitHub.

This allows you to run automated builds based on changes inside the GitHub repositories (e.g. trigger a build on every commit).
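In a YAML pipeline such a trigger is just a short config fragment; a minimal sketch (the branch name is an assumption):

```yaml
# Run the pipeline on every commit pushed to master (branch name is illustrative)
trigger:
  branches:
    include:
      - master
```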

Alternatively if you want to host your code directly on Azure DevOps, you can create a new repository directly in the Azure Repos section (or push to the existing repository).

If you use Azure Repos, then you can skip this section and start with this tutorial instead before coming back. Whether you want your source to be on GitHub or Azure Repos is up to you (I personally use GitHub as the source but then automatically mirror my repositories as described in my previous post).

In the bottom left corner of your project, click on settings to connect GitHub:

project settings

Be sure that it says “project settings” and not “organization settings” (you must navigate into the Azure DevOps project you created to see “project settings”).

Once in the project settings you should be able to find “Service Connections” as part of the pipelines section (if it’s missing you are probably in the “organization settings”).

With these connections we can connect third party services to Azure DevOps (like Github in this case). We’ll also use it later to connect to Azure.

You can either use the outdated access token flow or directly authorize Azure Pipelines as a GitHub app on your GitHub account (recommended; you can revoke the permission from your GitHub account at any time):

github service connection

After you have set up the connection we can move on to create our first pipeline.

Azure Pipelines

Now that the code is in a repository and it’s connected, it’s time to automate build & deployment.

Azure Pipelines allows you to build, test and deploy applications, and we’ll spend the rest of the tutorial covering its setup.

Note that there are currently two ways pipelines can be used:

- Classic pipelines, configured through the visual designer in the web UI (separate Builds & Releases)
- YAML pipelines, defined in a YAML file that lives alongside your code

We’ll focus only on YAML pipelines.

As of this writing they are still in preview and must be enabled manually as a preview feature (top right corner). You might as well enable the preview feature “for this organization” in the dropdown so everyone gets the same experience:

preview multistage

If enabled, you will see these entries in the left menu (note Pipelines & Environments):


(Without the preview feature enabled, it will only say Builds & Releases instead).

For the rest of the tutorial we’ll use my Let’s Encrypt Azure project as an example, as it has a YAML pipeline defined in code that builds, tests and deploys the Azure function.

You can already see that YAML pipelines make it very easy not only to automate the CI/CD pipeline but also to document it for other users!

Creating the first YAML pipeline

Click on pipelines in the left menu and then on “new pipeline”.

You will be asked to pick the source repository, so select yours (whether you chose GitHub or Azure Repos in the steps before).

The wizard will suggest a few predefined YAML templates to you based on the code in the repository.

These are in fact great starters if you created a new project but for now we are going to select “Existing Azure Pipelines YAML file” (at the very bottom) since a YAML file is already defined in the repository:

create pipeline

(The dropdown should even suggest the existing YAML file to you).

You will then get a chance to edit the pipeline before having it run for the first time.

In the case of my pipelines they all use:

  vmImage: 'windows-latest'

windows-latest is a Microsoft-hosted agent, meaning it uses the free build minutes from Azure DevOps and you don’t have to create a VM. The agent is also reset every time and outputs build logs to the release, making it easy to analyze failures.
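Put in context, the hosted agent is selected by the pool section at the top of the pipeline. A minimal sketch (the build steps shown are illustrative, not the exact pipeline from the repository):

```yaml
pool:
  vmImage: 'windows-latest'   # Microsoft-hosted agent, fresh VM on every run

steps:
  - script: dotnet build --configuration Release
    displayName: 'Build'
  - script: dotnet test --configuration Release
    displayName: 'Run tests'
```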

At the top you should see a set of variables defined:


The resource group will be created in Azure and all resources will be deployed into it. In my projects I use the convention of naming all resources the same as the resource group (e.g. resource group “lets-encrypt-azure” -> the Azure function is also named “lets-encrypt-azure”).

That way I don’t have to specify & remember distinct names for the various resources.

Since many resources must be globally unique (e.g. every Azure function receives a URL <function name>.azurewebsites.net) you must change the resource group name before you can run the pipeline (or else it will fail due to a name conflict).

I recommend you pick something unique like <your username>-letsencrypt. Note that depending on the resources deployed there are limits on the names that can be used (e.g. storage account names must be at most 24 characters, lowercase and contain no dashes, which is why my ARM templates all transform the resource group name like this when used as a storage account name: [toLower(replace(resourceGroup().name, '-', ''))]).

That way I can use a resource group name like “Lets-Encrypt-Azure” but then have a storage account “letsencryptazure” without having to specify two names.
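The same transformation can be sketched in plain shell to see what the ARM expression produces (illustrative only; the real conversion happens inside the ARM template):

```shell
# Mimic [toLower(replace(resourceGroup().name, '-', ''))]:
# lowercase the resource group name and strip the dashes
rg_name="Lets-Encrypt-Azure"
storage_name=$(printf '%s' "$rg_name" | tr '[:upper:]' '[:lower:]' | tr -d '-')
echo "$storage_name"   # → letsencryptazure
```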

In the case of my pipelines there is also a scheduled trigger section (in addition to the master trigger) because I like to periodically check whether my pipelines are still working.
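Such a scheduled trigger is a schedules section in the YAML file; a sketch of what a weekly schedule can look like (the cron expression and branch are assumptions, not necessarily the exact values in my repositories):

```yaml
schedules:
  - cron: '0 0 * * 0'          # every Sunday at 00:00 UTC (illustrative)
    displayName: 'Weekly validation run'
    branches:
      include:
        - master
    always: true               # run even if there are no new commits
```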

If you don’t want the deployment to run weekly you can remove that in the editor now:

create pipeline

After you click Save & run, the pipeline will fail, complaining about a missing Azure connection “Opensource Deployments”:

auth issue

That’s because we haven’t yet set up the service connection that the pipeline is using.

Head back to the “service connections” dialog and create a new connection of type “Azure Resource Manager” (obviously you need an Azure account for that; if you don’t have one yet, create one here for free).

I named mine “Opensource Deployments”; you can pick whichever name you like (just keep in mind that you then need to update the YAML pipeline with this new name later).

The connection dialog will try connecting to your Azure subscription (pick the correct one) and allows you to easily set up a connection (by creating the necessary resources in the background).

If you don’t care about the details, just select the correct subscription and click create. This will authorize Azure DevOps to perform all actions in your subscription. You can then skip the Azure section below and continue with the section Running the pipeline.

Azure connection in detail

If you want to be in full control of the Azure <-> Azure DevOps connection (and want to know what’s going on under the hood) then follow the steps in this section.

First of all, click on the “use the full version of the service connection dialog” link at the bottom of the connection dialog in Azure DevOps.

This will expand the dialog and allow you to enter all the details.

Under the hood Azure DevOps will use a service principal. For the purpose of this tutorial you can think of a service principal as a technical account that resides inside your Azure AD tenant (each subscription is tied to a specific tenant).

This technical account can be authorized to perform certain actions and its credentials can then be used in automation environments (such as Azure DevOps).

In our case we want the service principal to become a Contributor on one (or multiple) resource groups. As a Contributor the service principal is authorized to create, update & delete Azure resources.

These permissions are required to create the infrastructure and deploy the code to it.

To create a service principal, head to the Azure portal and navigate to the Azure Active Directory blade where you can create an app registration:

App registration

Give it a name and click create.

Inside the overview you can see the Application (client) ID & Directory (tenant) ID. We’ll need both in the service connection dialog.

We’ll also need the password, so click on Certificates & secrets inside the app registration and create a new password.

You can choose to rotate it yearly (by letting it expire), but then you must remember a year from now that the credentials expired and need to be updated. (You can also delete a password at any time and set a new one.)

Copy the password as you will not get to see it again in the future.

The final step we have to perform in Azure is to specify what permissions this service principal will have (currently it can’t do anything).

To do so we will configure RBAC rules.

You can choose between:

- granting the principal Contributor permissions on the entire subscription
- restricting the Contributor permissions to specific resource groups

The first is more convenient but also comes at a greater risk: if your credentials are compromised, the principal can be used to edit/access/delete all your Azure resources.

For my open source deployments I manually create the resource groups and assign the permissions on them. In case of a compromise of the credentials, the deployment principal can only edit/delete/modify my open source resources and has no access to other resources I have deployed in Azure.

Decide which you want to do:

Full permissions

Click on the Subscriptions icon on the left (or search for it at the top). On your subscription click on Access control (IAM), add a new role assignment of type Contributor and search for the service principal by name.

After adding it, the principal has full permissions in your Azure subscription to perform the necessary steps.
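The same assignment can also be done with the Azure CLI instead of the portal; a sketch with placeholder IDs (requires az login first):

```
# All IDs below are placeholders; look them up in the portal or via "az account show"
az role assignment create \
  --assignee "<application-client-id>" \
  --role "Contributor" \
  --scope "/subscriptions/<subscription-id>"
```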

Restrict permissions to specific resource groups

Before setting up the permissions on each resource group we must first allow the principal to read resource groups.

To do so follow the steps from the Full permissions section above and add the principal as a Reader on the subscription level.

Then head to each resource group that you want the principal to deploy resources into (or create a new one if you don’t have one yet) and in the Access control (IAM) section of each resource group add the principal as a Contributor.

It can now create/update/delete resources in the resource groups that you allowed it into, but not in others.
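With the Azure CLI the two scoped assignments look like this (placeholder IDs and names; a sketch of the portal steps above, not verified against your subscription):

```
# Reader on the whole subscription (so the principal can enumerate resource groups)
az role assignment create --assignee "<application-client-id>" --role "Reader" \
  --scope "/subscriptions/<subscription-id>"

# Contributor only on the resource group(s) it may deploy into
az role assignment create --assignee "<application-client-id>" --role "Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>"
```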

Filling in connection details

With the details from Azure you can now fill out the service connection dialog in Azure DevOps.

Select the correct subscription, fill in the tenant ID and enter the application (client) ID in the Service principal client ID field.

Likewise the password you created on the service principal now belongs in the Service principal key field.

You can click on Verify connection and it should verify successfully, as you authorized the service principal to be at least Reader on the subscription.

Next head back to the YAML pipeline.

Running the pipeline

Now that the connection between Azure & Azure DevOps is set up, it’s time to run the pipeline again.

If you picked a different connection name, you must replace all instances of azureSubscription: 'Opensource Deployments' in the YAML file with your name.

Now trigger the pipeline manually; with the required connections created, the pipeline should run successfully.

First it will compile the Azure function, then it will run an ARM template deployment to create the resources in Azure and finally it will deploy the function code to the instance (you can click on each step to view detailed log outputs).
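Those three stages roughly correspond to steps like the following (the task names are real Azure Pipelines tasks, but the inputs shown are placeholders, not the exact pipeline from the repository):

```yaml
steps:
  - script: dotnet publish --configuration Release --output $(Build.ArtifactStagingDirectory)
    displayName: 'Build the Azure function'

  - task: AzureResourceManagerTemplateDeployment@3    # create/update the infrastructure
    inputs:
      azureResourceManagerConnection: 'Opensource Deployments'
      resourceGroupName: '$(ResourceGroupName)'       # placeholder variable name
      csmFile: 'deploy/azuredeploy.json'              # placeholder path

  - task: AzureFunctionApp@1                          # deploy the code to the function
    inputs:
      azureSubscription: 'Opensource Deployments'
      appName: '$(ResourceGroupName)'
      package: '$(Build.ArtifactStagingDirectory)/**/*.zip'
```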

If you look in Azure after the pipeline finished you will see all the resources deployed in the resource group.

All my projects also deploy an Application Insights instance which ingests telemetry from the app.

In case of the Let’s Encrypt function it stores the logs (and errors) of every run, making it easy to check previous runs (note that Application Insights has a delay of up to 5 minutes for telemetry ingestion; if you just ran the function you might have to wait a bit for the logs to show up).

Whenever you now make and push any changes to the repository the pipeline will automatically trigger, run the tests and deploy the changes to the function app.


I hope this tutorial helps people to get started with deploying resources to Azure by using Azure DevOps YAML pipelines.

For more detailed documentation on the Let’s Encrypt function (and others) check the documentation of my individual github repositories.

If you want to learn more about the individual concepts I encourage you to check out the official documentation as it covers everything in great detail: