
How to set up Gitlab Pages from a folder

Deploying a site to Gitlab Pages when using a monorepo

Table of Contents
  1. Getting Gitlab CI set up
    1. Creating the first job - Testing the build
    2. Creating the second job - Deploying to pages
  2. Tie it all together

This article will help you set up Gitlab Pages from a specific folder in your repository. Let's assume that your documentation lives inside your project repository and is built using Sphinx.

To deploy your documentation site with Pages, you need to run a CI job that builds the site; Gitlab will take care of the rest. You don't need to worry about domain names or instance configuration unless you are running a dedicated Gitlab instance of your own.

When your documentation is deployed, the URL will look similar to this: https://<username or organisation>.gitlab.io/<repo name>.
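
For example, if a hypothetical user janedoe keeps the documentation in a repository called my-project, the site would be served at https://janedoe.gitlab.io/my-project.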

Getting Gitlab CI set up

Pages will be deployed once the CI job finishes running. The good thing is that if you already have a CI flow set up, all you need to do is add another job to deploy the pages. If you don't have one, you might be interested in reading about the Gitlab Pages CI templates.

Create a .gitlab-ci.yml file at the root of your project. We will create two jobs: one to test that the documentation builds on every merge request, and one to deploy your documentation from a branch called production.

Since the first job runs on every pipeline, you will know if something is broken before you deploy a new version of your documentation.

Creating the first job - Testing the build

We are now ready to start working on our CI configuration. Let's assume that your documentation lives in a folder called documentation inside the root directory of your project and that you have a requirements.txt file at the root with some dependencies, for example:

text
sphinx
sphinx-copybutton
sphinxawesome-theme
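
With those assumptions, the repository layout would look roughly like this (conf.py and index.rst are just stand-ins for whatever source files your Sphinx project actually contains):

text
.
├── .gitlab-ci.yml
├── requirements.txt
└── documentation/
    ├── conf.py
    └── index.rst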

In the .gitlab-ci.yml file, let's install the requirements and run the build command to see if everything builds correctly.

yaml
image: python:3.8-alpine

stages:
  - checks
  - deploy

test-docs:
  stage: checks
  script:
    - pip install -r requirements.txt
    - cd documentation
    # Build into ../preview so the output ends up at the project root,
    # which is where the artifacts paths below are resolved from.
    - sphinx-build -b html . ../preview
  artifacts:
    expose_as: "docs-preview"
    paths:
      - preview/

The script portion of the job installs the dependencies and then builds your docs into a preview folder at the root of the project.

The artifacts portion allows Gitlab to expose the built docs so you can see a preview of your documentation. Gitlab will add a collapsed section to the pipeline status once the CI has finished, so you can browse the build output.
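
As an optional tweak (not part of the original job), you can keep preview artifacts from piling up by letting them expire with artifacts:expire_in; this is the same test-docs job with one extra line:

yaml
test-docs:
  stage: checks
  script:
    - pip install -r requirements.txt
    - cd documentation
    - sphinx-build -b html . ../preview
  artifacts:
    expose_as: "docs-preview"
    expire_in: 1 week   # the preview is deleted automatically after a week
    paths:
      - preview/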

[Screenshot: Gitlab pipeline]

Creating the second job - Deploying to pages

The second job will be similar to our test-docs one, with some minor changes: we will change the stage to deploy, add a rule so the job only runs on commits to the production branch, and call the job pages.

yaml
pages:
  stage: deploy
  rules:
    # Only run when the commit lands on the production branch.
    - if: $CI_COMMIT_BRANCH == "production"
  script:
    - pip install -r requirements.txt
    - cd documentation
    # Pages expects the site in a public folder at the project root.
    - sphinx-build -b html . ../public
  artifacts:
    expose_as: "docs-deploy-preview"
    paths:
      - public/

Let's take a brief moment to talk about what Gitlab expects, so it knows where to find your files. The job that deploys to Gitlab Pages has to be named pages, and the built site has to end up in a folder called public at the root of the project, declared as an artifact.
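
Stripped down to just those conventions, a minimal Pages setup on its own (ignoring the Sphinx build) would look something like this; the mkdir and echo lines are placeholders for whatever actually builds your site:

yaml
pages:                    # the job has to be called pages
  script:
    - mkdir public        # placeholder: build your site into public/ here
    - echo "hello" > public/index.html
  artifacts:
    paths:
      - public/           # Gitlab Pages serves whatever ends up in public/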

Depending on your project, you might not want to run the deploy job every time you commit to the production branch. You can change how the job starts by adding when: manual to the rule. This means you will have to click the play button to start the job manually.

yaml
pages:
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == "production"
      when: manual   # wait for someone to start the job from the pipeline
  script:
    - pip install -r requirements.txt
    - cd documentation
    - sphinx-build -b html . ../public
  artifacts:
    expose_as: "docs-deploy-preview"
    paths:
      - public/

Tie it all together

That's all there is to it. I hope this article was helpful and saved you some time. When I was trying to implement this, I found the Gitlab documentation confusing: it seems to focus more on setting up Pages when you are not using a shared Gitlab instance and need to set up everything yourself.

Our finished .gitlab-ci.yml file looks like this:

yaml
image: python:3.8-alpine

stages:
  - checks
  - deploy

test-docs:
  stage: checks
  script:
    - pip install -r requirements.txt
    - cd documentation
    - sphinx-build -b html . ../preview
  artifacts:
    expose_as: "docs-preview"
    paths:
      - preview/

pages:
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == "production"
      when: manual
  script:
    - pip install -r requirements.txt
    - cd documentation
    - sphinx-build -b html . ../public
  artifacts:
    expose_as: "docs-deploy-preview"
    paths:
      - public/
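
Once the pages job has run successfully on the production branch, the published URL should also show up in your project under Settings > Pages.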
