Context
This blog is built with AstroJS and, like any statically generated website, it needs to be deployed to a server after a build step.
For other websites I made in the past, I used Netlify to deploy them. It’s a great service: it works very well by just connecting your GitHub repository and it has a generous free tier. But, for this blog, I wanted to try something different.
Since I bought a domain from OVH, I already have an included space to host my website and I wanted to use it. By default, the hosting offers the old-but-gold FTP protocol to upload files. And, trust me, it’s enough for deploying a static website.
The question: should I perform the build step locally and then upload the files to the server or should I use something automated?
Keep It Simple, Stupid
The answer is always the same: keep it simple!
Even if the build step is not that complex, I don’t want to remember to do it every time I publish a new post. The manual process is also error-prone: it’s easy to forget to upload some files. Better yet, I want to allow mobile publishing/editing, so I need a way to automate the build and deploy process.
What is the simplest way to automate a process on a GitHub repository? GitHub Actions!
I don’t want to reinvent the wheel, so I put together some pieces to:
- build the website
- optimize the build
- upload the files to the server
Build and optimizations
In the package.json I have the following script:
{
  ...,
  "scripts": {
    ...
    "build": "astro check && astro build && jampack ./dist",
    ...
  },
  ...
}
- astro check runs diagnostics on the project (e.g. on .astro files) and exits with a non-zero code if there are errors.
- astro build builds the website. It doesn’t do much: it just generates the files in the dist folder.
- jampack ./dist optimizes the build. It minifies the HTML, CSS and JS files, and it also optimizes images to properly serve them with the srcset attribute. This is really powerful since you just need to take care of the original image and the tool will generate all the needed sizes.
Does jampack work? Yes! See my last build result:
File type | Compressed files | Original size | Compressed size | Gain |
---|---|---|---|---|
.webp | 4/4 | 73.39 KB | 65.90 KB | -7.49 KB |
.js | 3/4 | 181.37 KB | 181.35 KB | -28.00 B |
.css | 1/1 | 40.56 KB | 38.04 KB | -2.52 KB |
.html | 18/18 | 346.11 KB | 333.45 KB | -12.66 KB |
.png | 5/5 | 154.22 KB | 87.21 KB | -67.01 KB |
.jpg | 1/1 | 145.29 KB | 131.35 KB | -13.94 KB |
.txt | 0/1 | 329.00 B | 329.00 B | |
.xml | 0/3 | 3.11 KB | 3.11 KB | |
.ttf | 0/2 | 207.63 KB | 207.63 KB | |
Total | 32/39 | 1.13 MB | 1.02 MB | -103.64 KB |
Have you ever thought about what would happen if all the websites in the world were optimized like this? It would be a huge gain in terms of bandwidth and speed!
FTP upload
The last step is really easy with the FTP Deploy Action: it requires only a few parameters to work:
name: Deploy Website
...
jobs:
  build-and-deploy:
    ...
    steps:
      ...
      - name: Deploy
        uses: SamKirkland/FTP-Deploy-Action@v4.3.5
        with:
          server: ${{ secrets.FTP_SERVER }}
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          server-dir: /www/
          local-dir: ./dist/
The server, username and password are stored in the GitHub repository secrets. The server-dir is the remote folder where the website will be uploaded, and the local-dir is the local folder to upload.
Bonus: it maintains an internal state to upload only the changed files. This is really useful to avoid uploading all the files every time, and it also speeds up the deploy process.
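The action also supports a few optional parameters beyond the ones above. I haven’t needed them here and I’m going from its README rather than from experience, but a dry run plus an exclude list looks handy to sanity-check the very first deploy:

      - name: Deploy
        uses: SamKirkland/FTP-Deploy-Action@v4.3.5
        with:
          server: ${{ secrets.FTP_SERVER }}
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          server-dir: /www/
          local-dir: ./dist/
          # only log what would be uploaded or deleted, without changing the server
          dry-run: true
          # glob patterns for files that should never be uploaded
          exclude: |
            **/.DS_Store

Once the logged output looks right, remove dry-run and the next push performs the real upload.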
Conclusion
The steps above are really simple and can be easily adapted to other static website generators. I haven’t reinvented the wheel, I just put together some pieces to automate the build and deploy process. And it works!
If you are curious, here is the whole workflow file:
name: Deploy Website
on:
  push:
    branches:
      - master
jobs:
  build-and-deploy:
    name: Build & Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version-file: .nvmrc
      - name: Install dependencies
        run: npm ci
      - name: Build
        run: npm run build
      - name: Deploy
        uses: SamKirkland/FTP-Deploy-Action@v4.3.5
        with:
          server: ${{ secrets.FTP_SERVER }}
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          server-dir: /www/
          local-dir: ./dist/
Stats
My last 6 builds took between 45s and 1m 8s to complete; in the most recent one I had:
- 5s of NodeJS setup
- 11s of dependencies installation
- 20s of build time
- 8s of FTP deploy time
As you can see, the build time is the most time-consuming step, but it’s not that bad. The FTP deploy is really fast, since it uploads only the changed files.
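If those 11 seconds of dependency installation ever start to bother me, actions/setup-node can also cache npm’s download cache between runs; a small tweak to the setup step (not something I’ve applied or measured here) would look like this:

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version-file: .nvmrc
          # reuse npm's download cache across workflow runs
          cache: npm

npm ci still runs, but most packages would come from the cache instead of the registry, so it should shave a few seconds off.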
I hope you found this article useful, let me know if you have any questions or suggestions!