Streamlined Blogging: How to Use Hugo, GitHub Actions, and S3 for Hosting - Part 2
In the previous post, we covered the essential steps of configuring an AWS S3 bucket to host our Hugo-based blog. For a detailed walkthrough, you can revisit the post here. Now, let’s delve into the intricacies of GitHub and GitHub Actions for seamless blog management.
GitHub & GitHub Actions
Repository setup
To get the most out of Hugo themes, I’ve set up three separate sites: a main website, a resume page, and a blog. Each site operates as an independent Hugo instance with its own theme. To streamline development and deployment, I’ve configured three distinct GitHub Actions workflows, each triggered by a push event on a specific path.
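Assuming that setup, the repository layout looks roughly like the following (the folder and workflow file names here are illustrative, not the actual ones):

```text
.
├── website/            # main site, its own Hugo instance and theme
│   └── config.toml
├── resume/             # resume page, its own Hugo instance and theme
│   └── config.toml
├── blog/               # blog, the example used in this post
│   └── config.toml
└── .github/
    └── workflows/
        ├── deploy-website.yml   # triggered by pushes under website/
        ├── deploy-resume.yml    # triggered by pushes under resume/
        └── deploy-blog.yml      # triggered by pushes under blog/
```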
GitHub Actions
Prerequisites
AWS
- If you are creating a dedicated user for the access key, ensure that the new user is assigned at least the following IAM policy:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3BucketManipulation",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:ListMultipartUploadParts",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::<bucket name>/*"
    },
    {
      "Sid": "AllowS3BucketListing",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": "arn:aws:s3:::<bucket name>"
    },
    {
      "Sid": "CFInvalidation",
      "Effect": "Allow",
      "Action": "cloudfront:CreateInvalidation",
      "Resource": "arn:aws:cloudfront::<AWS account ID>:distribution/<CF distribution ID>"
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN, not to object ARNs, which is why it lives in its own statement.
- Generate a new access key. Be sure to copy or download the generated secret immediately, as it is shown only once.
- In the target S3 bucket, ensure that ACLs (Access Control Lists) are enabled, and grant public read access. For more information, refer to this Stack Overflow post.
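If you would rather keep ACLs disabled (the current S3 default), a bucket policy achieves the same public-read result; this is an alternative I'm sketching, not the setup the steps above describe, and it requires dropping `--acl public-read` from the deploy step later on:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket name>/*"
    }
  ]
}
```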
GitHub
In the GitHub repository’s secrets section, include the following keys:
- AWS_S3_BUCKET: The name of the S3 bucket (only the name, not the ARN).
- AWS_ACCESS_KEY_ID: The Access Key ID generated earlier.
- AWS_SECRET_ACCESS_KEY: The Secret Access Key generated earlier.
Jobs
To keep the pipeline streamlined (direct deployment to the PROD environment), we will work with two main jobs: building the Hugo website, and deploying the contents of the `public` folder to the designated S3 bucket.
Hugo build
Be sure to update the `run` step to change into the target folder. If the `.toml` file resides in the root of the repository, omit the first line of the run command.
```yaml
- name: Build Hugo site
  run: |
    cd ./blog
    hugo --minify
```
Deploy to AWS S3
Be sure to set the `SOURCE_DIR` environment variable to the path of the `public` output folder produced by the `hugo --minify` command.
```yaml
env:
  AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  SOURCE_DIR: "blog/public"
```
Put it all together
With the preceding adjustments in place, the final GitHub Actions file should resemble the following:
```yaml
name: Deploy Blog Site to S3

on:
  push:
    branches: ["main"]
    paths:
      - 'blog/**' # only build when changes under the blog folder are pushed to main
  pull_request:
    branches: ["main"]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'

      - name: Build Hugo site
        run: |
          cd ./blog
          hugo --minify

      - name: Deploy to S3
        uses: jakejarvis/s3-sync-action@v0.5.0
        with:
          args: --acl public-read --follow-symlinks --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          SOURCE_DIR: "blog/public"
```
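The IAM policy above grants `cloudfront:CreateInvalidation`, but the workflow never exercises it. A minimal sketch of the missing step, assuming the distribution ID is stored in a new repository secret (named `CF_DISTRIBUTION_ID` here for illustration):

```yaml
- name: Invalidate CloudFront cache
  run: |
    aws cloudfront create-invalidation \
      --distribution-id ${{ secrets.CF_DISTRIBUTION_ID }} \
      --paths "/*"
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: us-east-1 # CloudFront is global; the CLI still wants a region set
```

Invalidating `/*` counts as a single invalidation path, which keeps it within the CloudFront free tier for typical usage.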
Final thoughts
This workflow is far from optimal; there are several areas that could be improved, including:
- Enhancing the flow to initially deploy to a staging environment for validation purposes.
- When deploying to S3, the current approach effectively deletes stale files and re-uploads everything. A better strategy would be to synchronize files, copying only the differences.
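As one possible sketch of that sync-only idea, the deploy step could call the AWS CLI directly instead of the third-party action. The flags here are assumptions to verify against your setup; in particular, `--size-only` skips files whose size is unchanged, which matters because Hugo rewrites timestamps on every build:

```yaml
- name: Sync public folder to S3
  run: |
    aws s3 sync ./blog/public s3://${{ secrets.AWS_S3_BUCKET }} \
      --delete --size-only
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: us-east-1 # assumed region; replace with the bucket's region
```

The trade-off: `--size-only` can miss edits that leave a file's size unchanged, so it is a bandwidth optimization rather than a guaranteed-correct diff.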