SuperNimbus Knowledge Base

UE5 Project Pipeline


In this section of the tutorial series, we will be setting up Unreal Engine on the base Windows image we created in the previous tutorial.

This tutorial is part of a series of tutorials on building UE5 projects using Jenkins and AWS CodePipeline.
You can see Part 4 of this tutorial below.

Before we begin, it is important to understand the reasons for running a build pipeline and the benefits it can provide.

The project we are going to set up build jobs for is a basic UE5 project.

It is a multiplayer game that needs both a client and a dedicated server to be built.

The project’s server is hosted on AWS GameLift.

At the moment we are only building Windows clients and servers (though you could also build for Linux).

The project files are hosted on a Perforce server.

We will be providing documentation for Github and Bitbucket pipelines in the future.

Let’s imagine that this is a project that is worked on during weekdays by a team of developers.

There might be 5-10 changes that are pushed to the project’s Main source control Stream (or equivalent branch if you are using another repository service).

It is important for us to know that the project still builds, cooks and packages correctly after each commit. To achieve this, we will set up a build job that runs after every commit made to Main.

Our incremental builds keep build files on the remote agent's disk, which results in quicker builds. The packaged client and server will be uploaded to an S3 bucket for storage and delivery to testers or developers.
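As a sketch of how an upload script might name its objects, the helper below composes S3 keys under a directory prefix so that prefix-based lifecycle rules can match them. The "incremental-builds/" prefix, job name, and key layout here are illustrative, not our exact script; the upload itself would be a boto3 `upload_file` call.

```python
from datetime import datetime, timezone

def build_artifact_key(prefix, job_name, build_number, artifact):
    """Compose an S3 object key, e.g.
    incremental-builds/2024-01-01/client-42/Client.zip,
    so prefix-based lifecycle rules can match the uploads."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    return f"{prefix}/{stamp}/{job_name}-{build_number}/{artifact}"

# The upload itself would be a boto3 call, e.g.:
#   boto3.client("s3").upload_file(local_zip, bucket_name, key)
key = build_artifact_key("incremental-builds", "client", 42, "Client.zip")
```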

Lastly, we also want to make sure that a “Clean Nightly” build can run without issues. This Clean build will always download files and build from scratch. It will therefore take longer to complete. With this build, we will also upload our server to GameLift and check that the GameLift fleet is activated successfully. If successfully activated, our Nightly alias will switch its target and point to the latest fleet. This means the servers will be ready to test in the morning.
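The poll-and-switch step above can be sketched in Python. The helper below is a generic polling loop; in a real script the `get_status` callable would wrap boto3's `gamelift.describe_fleet_attributes` and return the fleet's `Status` field, and the alias switch would use `gamelift.update_alias`. Timeouts and names are illustrative.

```python
import time

def wait_for_fleet_active(get_status, fleet_id, timeout_s=1800, poll_s=30):
    """Poll a fleet until it is ACTIVE; fail fast on ERROR.

    In a real script, get_status(fleet_id) would wrap
    gamelift.describe_fleet_attributes(FleetIds=[fleet_id]).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() <= deadline:
        status = get_status(fleet_id)
        if status == "ACTIVE":
            # On success the nightly job would repoint the alias, e.g.:
            # gamelift.update_alias(AliasId=alias_id, RoutingStrategy={
            #     "Type": "SIMPLE", "FleetId": fleet_id})
            return True
        if status == "ERROR":
            raise RuntimeError(f"Fleet {fleet_id} failed to activate")
        time.sleep(poll_s)
    raise TimeoutError(f"Fleet {fleet_id} did not activate in {timeout_s}s")
```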

We have written our build scripts in Python which makes them easier to work on and will mean they are platform-agnostic. We have scripts for building the project, uploading the packaged zips, launching GameLift fleets and polling GameLift fleets for any activation errors.
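To illustrate what the build script does, here is a minimal sketch of assembling a `BuildCookRun` command line for Unreal's RunUAT. The paths and the exact flag selection are examples under our own assumptions, not the project's actual script.

```python
import os

def build_cook_run_args(engine_root, uproject, archive_dir, server=False):
    """Assemble a RunUAT BuildCookRun invocation for Win64.

    Paths and the chosen flags are illustrative; tune them to your project.
    """
    uat = os.path.join(engine_root, "Engine", "Build", "BatchFiles", "RunUAT.bat")
    args = [
        uat, "BuildCookRun",
        f"-project={uproject}",
        "-platform=Win64",
        "-build", "-cook", "-stage", "-pak",
        "-archive", f"-archivedirectory={archive_dir}",
    ]
    if server:
        args += ["-server", "-serverplatform=Win64", "-noclient"]
    return args  # a real script would pass this to subprocess.run(..., check=True)
```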

Additional Resources

Perforce Server

To store both our project and our build scripts, we use a Perforce server.

The setup and hosting of a Perforce server are beyond the scope of this tutorial. However, the folks at Perforce provide excellent resources to get you up and running with a Perforce server. You can find a tutorial on how to set up the server on AWS here, and a tutorial on how to configure your Perforce server to host Unreal Engine projects here.

If your project is hosted on a different type of version control, you can still make use of most of the following information. Jenkins has built-in support for most version control software but we will be focusing on Perforce for this tutorial.

AWS S3 Bucket – Our Build Destination

We store our build zips in an S3 object storage bucket.

S3 is great for storing build artifacts. It makes it easy to control access and optimize storage costs using object lifecycle configurations.

Let’s create a new bucket step-by-step and configure it.


We have two separate Stream Depots to configure that we will be using for our builds.


This is our project stream. It holds our UE5 project and some additional files related to GameLift.

We will be syncing from the Main branch for both our incremental and nightly builds.
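As a sketch, a Python build script could assemble the Perforce sync for the agent like this. The workspace name and depot path are placeholders; `-f` forces a full re-sync, which is what a clean nightly build would want.

```python
def p4_sync_args(workspace, depot_path="//MyProject/Main/...", clean=False):
    """Build a p4 sync command for the Jenkins agent.

    Workspace and depot path are placeholders; -f forces a full re-sync.
    """
    cmd = ["p4", "-c", workspace, "sync"]
    if clean:
        cmd.append("-f")
    cmd.append(depot_path)
    return cmd  # run with subprocess.run(cmd, check=True) in a real script
```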


These are our build scripts.

Creating The Bucket

1. In the AWS portal, head to the S3 service, select “Buckets” in the sidebar and click on “Create bucket”.

2. Name your new bucket and choose the region you wish to deploy it in.

3. Decide on the bucket’s object ownership. In our case we want all objects in the bucket to be owned by our organisation.

4. Make sure that the bucket is completely private.

5. We will leave versioning disabled.

6. Provide some tags if required.

7. To keep setup simple, leave the server-side encryption settings at their default values. You can then click “Create bucket” at the bottom of the wizard.
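If you prefer to script the bucket creation rather than click through the wizard, the choices above roughly map to the following boto3 parameters. The bucket name and region are examples, and the dicts are shown without making any AWS calls.

```python
def bucket_create_params(name, region):
    """Parameters for s3_client.create_bucket mirroring the wizard choices:
    chosen region and organisation-owned objects (BucketOwnerEnforced)."""
    return {
        "Bucket": name,
        "CreateBucketConfiguration": {"LocationConstraint": region},
        "ObjectOwnership": "BucketOwnerEnforced",
    }

# "Completely private" corresponds to blocking all public access:
PUBLIC_ACCESS_BLOCK = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": True,
    "RestrictPublicBuckets": True,
}
# Applied with s3_client.put_public_access_block(
#     Bucket=name, PublicAccessBlockConfiguration=PUBLIC_ACCESS_BLOCK)
```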

S3 – Creating A Lifecycle Rule

We will utilise S3 Lifecycle rules with both our incremental and nightly build artefacts.

1. Head to the S3 service, select “Buckets” in the sidebar and click on the bucket we just created.

2. Head to the Management tab and click on “Create lifecycle rule”.

3. Name your rule.

4. Our rules use prefixes. The “nightly-builds/” prefix will apply the rule to all objects whose key matches.

5. Since our bucket is not versioned we only have to worry about the current versions of the objects.

6. Set up your object class transitions and expirations.

7. You can then click “Create Rule”.

Nightly Builds Lifecycle

Our nightly builds live in the “nightly-builds/” directory in our storage bucket. These builds will be uploaded in the Standard storage class.

After 30 days, we know that the builds are not likely to be used again. We can therefore move them into a cheaper storage class (Standard-IA).

Lastly, builds that are older than 80 days will likely never be needed again. We therefore expire them, which marks them for deletion.

If we do want to keep certain builds for longer, we will manually move them to a permanent location.

Incremental Builds Lifecycle

Our incremental builds will have a very simple lifecycle. After 10 days, our builds will expire.

There will be several incremental builds per day, so it is important to keep only recent builds to save on excessive storage costs.
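The two lifecycle rules described above can be expressed as a single boto3 lifecycle configuration. Note that "nightly-builds/" comes from the setup above, while "incremental-builds/" is an assumed prefix for the incremental uploads; adjust both to match your bucket layout.

```python
# Lifecycle configuration for s3_client.put_bucket_lifecycle_configuration.
# "incremental-builds/" is an assumed prefix for the incremental uploads.
LIFECYCLE_RULES = {
    "Rules": [
        {
            "ID": "nightly-builds",
            "Filter": {"Prefix": "nightly-builds/"},
            "Status": "Enabled",
            # After 30 days, move to the cheaper Standard-IA class...
            "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
            # ...and mark for deletion after 80 days.
            "Expiration": {"Days": 80},
        },
        {
            "ID": "incremental-builds",
            "Filter": {"Prefix": "incremental-builds/"},
            "Status": "Enabled",
            # Incremental builds simply expire after 10 days.
            "Expiration": {"Days": 10},
        },
    ]
}
# Applied with: s3_client.put_bucket_lifecycle_configuration(
#     Bucket="...", LifecycleConfiguration=LIFECYCLE_RULES)
```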

Configuring SES

To send email notifications from our Jenkins pipelines, we need to configure both SES and Jenkins.

The goal is to use the Jenkins native email plugin in combination with SES.

First we need to set up SES in the AWS web dashboard. This means creating SMTP credentials that we can use in Jenkins and verifying various email addresses that we plan on using.

Creating SES Credentials

1. Navigate to the SES service and select SMTP settings in the sidebar. Then click on Create SMTP.

2. Name your new credential. The policy statement should look the same as the one below.

3. Add your metadata tags to the credential and click on “Create user” in the bottom right.

4. You will now be shown your newly created SMTP credentials. Make sure to save these, since you will not be able to retrieve them again.

We will only need the SMTP user name and password. The IAM user name will not be needed in Jenkins.

You can download the credentials in a .csv file if you want to save them somewhere permanently.
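If you want to sanity-check the saved SMTP credentials outside of Jenkins, a minimal standard-library Python sketch looks like this. The region in the endpoint, the addresses, and the credential values are all placeholders; Jenkins itself will use these same settings through its email plugin.

```python
import smtplib
from email.message import EmailMessage

# Placeholders: region in the endpoint, addresses, and credentials.
SES_HOST = "email-smtp.eu-west-1.amazonaws.com"
SES_PORT = 587

def build_notification(sender, recipient, subject, body):
    """Assemble a plain-text build notification email."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def send_via_ses(msg, smtp_username, smtp_password):
    """Send a message over STARTTLS using the saved SMTP credentials."""
    with smtplib.SMTP(SES_HOST, SES_PORT) as smtp:
        smtp.starttls()
        smtp.login(smtp_username, smtp_password)
        smtp.send_message(msg)
```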

Verifying Email Addresses
