Deploying serverless GCP Cloud Function via Bitbucket Pipelines

[Image: Google Cloud Function and Bitbucket]

Having a CI/CD pipeline set up can save software engineers tons of time by keeping deployment steps consistent and reducing potential errors through the automation of repeatable steps. At our company, Interviewer.AI, we use Bitbucket as our version control repository hosting service and deploy various serverless applications and functions to GCP Cloud Function and AWS Lambda. This post explains the steps to configure Bitbucket Pipelines as a CI/CD tool to deploy function code to GCP Cloud Function.

[Image: CI/CD workflow]

As a recap of the workflow: when a software engineer commits and pushes changes to a Bitbucket repository, the CI/CD process should kick off and complete once the function code has been successfully deployed to GCP Cloud Function.

Prerequisites

  • A GCP account and project (with permission to create a new service account)
  • A Bitbucket account (with permission to create/manage repositories)
  • Application/function code (in the example below we will be using Python)

Steps

With that in mind, here’s the outline of the steps I have taken:

  1. Create a repository in Bitbucket and enable Pipelines
  2. Create a new service account in GCP following the principle of least privilege
  3. Update repository variable settings
  4. Update repository with function code and bitbucket-pipelines.yml
  5. Commit changes and push to Bitbucket to see the pipelines happening in action

1. Create a repository in Bitbucket and enable Pipelines

After you have created a new empty repository, head to the Settings tab, find the Pipelines section, and check the Enable Pipelines checkbox.

[Image: Enabling Bitbucket Pipelines]

2. Create a new service account in GCP following the principle of least privilege

Create a new service account in GCP that has the Cloud Functions Developer and Service Account User roles. This allows us to deploy to Cloud Function and to act as the service account that runs the function (by default, Cloud Function uses the App Engine default service account).
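If you prefer the CLI over the console, the service account and role bindings can be created with gcloud roughly as follows (the account name bitbucket-deployer and YOUR_PROJECT_ID are placeholders, not names from this post):

# Create the deployer service account (name is illustrative)
gcloud iam service-accounts create bitbucket-deployer --display-name "Bitbucket Pipelines deployer"

# Grant the two roles on the project
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member "serviceAccount:bitbucket-deployer@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
    --role "roles/cloudfunctions.developer"
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member "serviceAccount:bitbucket-deployer@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
    --role "roles/iam.serviceAccountUser"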

[Image: Service Account Permissions]

Once created, download the Service Account credentials key in JSON format.
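The key can also be created from the CLI, for example (again, the account name is just a placeholder):

gcloud iam service-accounts keys create ~/Downloads/keyfile.json \
    --iam-account bitbucket-deployer@YOUR_PROJECT_ID.iam.gserviceaccount.com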

3. Update repository variable settings

There are two ways we can use the newly downloaded credentials in our pipelines:

  1. We can include the JSON key file in the repository and reference it when the pipeline is running. This is not recommended, as the key file should be treated as confidential information and shouldn't be checked into a version control system.
  2. We can use Bitbucket Pipelines repository variables to pass the information down when the pipeline is running. We can also secure a variable (so its value is encrypted and masked in logs) by checking the Secured checkbox beside it.

[Image: Bitbucket Pipelines repository variables]

One thing to note: since the value of a repository variable is expected to be a string, for the DEV_GCP_KEYFILE variable we have to encode the content of the key file with base64. On a Mac, that can be done via either of the commands below:

# Option 1: Read in the file and base64 encode it
more ~/Downloads/keyfile.json | base64
# Option 2: Supply the filename directly to base64 command
base64 ~/Downloads/keyfile.json

Once a variable is added, we can access its value in the bitbucket-pipelines.yml file by prefixing its name with a $ sign, e.g. $DEV_GCP_KEYFILE.
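For example, a pipeline script could decode the variable back into a key file before using it (the /tmp path is purely illustrative; the pipeline's Linux container ships GNU base64, which decodes with -d):

echo $DEV_GCP_KEYFILE | base64 -d > /tmp/gcp-keyfile.json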

4. Update repository with function code and bitbucket-pipelines.yml

My git repository is structured as follows:

[Image: Folder / files structure]
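Roughly, a minimal layout could look like the tree below (the tests folder and requirements.txt are assumptions; only app/main.py and bitbucket-pipelines.yml are referenced explicitly in this post):

.
├── app
│   └── main.py
├── tests
│   └── test_main.py
├── requirements.txt
└── bitbucket-pipelines.yml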

The function entry point is written in app/main.py. Below is a basic template generated by GCP Cloud Function when you create a function via the console.

Cloud Function Python template (main.py)
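For reference, the default HTTP-triggered Python template from the console looks roughly like this:

def hello_world(request):
    """Responds to any HTTP request.

    Args:
        request (flask.Request): HTTP request object.
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`.
    """
    request_json = request.get_json()
    if request.args and 'message' in request.args:
        return request.args.get('message')
    elif request_json and 'message' in request_json:
        return request_json['message']
    else:
        return 'Hello World!'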

The CI/CD workflow for Bitbucket is specified in the file bitbucket-pipelines.yml.

Sample bitbucket-pipelines.yml
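The original embedded file is not reproduced here, so below is a minimal sketch of what such a configuration could look like. The step names, the lintingtesting commands, the function name my-function, the entry point hello_world, the google/cloud-sdk deploy image, and the PROD_GCP_KEYFILE / *_PROJECT_ID / *_REGION variables are assumptions; the line numbers in the walkthrough that follows refer to the author's original file rather than this sketch.

image: python:3.7

definitions:
  steps:
    - step: &lintingtesting
        name: Linting and testing
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pylint app          # linting (assumed tooling)
          - pytest --cov=app    # unit + coverage testing (assumed tooling)

pipelines:
  default:
    - step: *lintingtesting
  branches:
    development:
      - step: *lintingtesting
      - step:
          name: Deploy to dev
          image: google/cloud-sdk:latest
          script:
            - echo $DEV_GCP_KEYFILE | base64 -d > /tmp/gcp-keyfile.json
            - gcloud auth activate-service-account --key-file /tmp/gcp-keyfile.json
            - gcloud functions deploy my-function --entry-point hello_world --source app --runtime python37 --trigger-http --project $DEV_GCP_PROJECT_ID --region $DEV_GCP_REGION
    master:
      - step: *lintingtesting
      - step:
          name: Deploy to prod
          image: google/cloud-sdk:latest
          script:
            - echo $PROD_GCP_KEYFILE | base64 -d > /tmp/gcp-keyfile.json
            - gcloud auth activate-service-account --key-file /tmp/gcp-keyfile.json
            - gcloud functions deploy my-function --entry-point hello_world --source app --runtime python37 --trigger-http --project $PROD_GCP_PROJECT_ID --region $PROD_GCP_REGION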

Lines 1–4: Use python:3.7 as the base image and run the pipeline in a Docker container.

Lines 6–13: Define a reusable step. In this case, we know that we want to run linting and coverage testing for every branch in the repository, so we create this lintingtesting step to be referenced in the subsequent pipeline configuration.

Lines 15–17: Steps to be run by default (i.e. for every branch).

Lines 19–29: Steps to be run when we push to the development branch. In this case, on top of the lintingtesting step, we:

  1. Get the encoded string $DEV_GCP_KEYFILE from the repository variable, decode it, and output it to a temp JSON file
  2. Use the temp JSON file to authenticate against GCP via the gcloud auth activate-service-account command
  3. Deploy the function to GCP via the gcloud functions deploy command. We supply the function name, entry point, source, project, region, and trigger type here. Feel free to change any of these Cloud Function flags according to your requirements. Note that we get the project ID and region from repository variables as well, on line 29.

Lines 31–40: Similar to the previous one, but for production (the master branch).

5. Lastly, commit changes and push to Bitbucket to see the pipeline in action

[Image: Successful pipeline]
[Image: Function gets deployed to GCP]
[Image: Test the function on the GCP console]

Conclusion

By leveraging Pipelines offered by Bitbucket, we can create a CI/CD workflow that deploys to GCP Cloud Function. The steps listed in the bitbucket-pipelines.yml file automate the CI process (linting, testing), ensure consistency, and minimize errors that could be introduced by manual deployment.

The sample repository that comes with the basic template, including the function code, linting, unit and coverage testing, as well as the bitbucket-pipelines.yml file, can be found here.

Co-founder of Interviewer.AI. GCP Professional Cloud Architect. Passionate about architecting and implementing cloud-native solutions to make humans efficient!
