Up and Running

June 20, 2020
homelab
CI/CD

Hello, World!

At long last I’m finally using my personal domain osullivan.tech for something more than just subdomains for the various services I run on my home servers. In this post I’ll talk about what I did to get this site set up and deployed, and what I plan on doing with it in the future.

The Why #

When I originally purchased this domain I had intended to use it to self-host a standard personal site with some basic info about me and my work. Recently at work I’ve been spending a lot more time writing documentation. Writing is not something I’m particularly good at, so here we are, practicing that skill.

I’ve also been putting more work into my Dungeons & Dragons (D&D) games, and have ideas for adventures or cool monster concepts that don’t quite fit my current campaign. I wanted a way to share those ideas and improve my writing and design skills by getting feedback on them.

The What #

I decided to keep things pretty simple for this site. To that end I used the static site generator Hugo. I had encountered it before, but recently rediscovered it and decided now was the time to get my site going. I’m using the book theme for this site. It has pretty basic blog support, but I don’t need anything fancy there. It can also build a menu dynamically based on how I structure a particular folder, which I think will be really nice once I get some D&D content going.
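
To give a rough idea of how that works: the side menu follows the content tree. A quick sketch of how I expect to lay things out (the D&D section is hypothetical for now):

content/
  docs/                  # the book theme turns the folders and pages in here into the side menu
    _index.md
    hello-world.md
    dnd/                 # future D&D content would slot in as its own menu section
  posts/                 # blog posts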

I’ve been using Caddy as a reverse proxy for my homelab for a while now (I’ll eventually do a post detailing it). It’s super simple and has automatic HTTPS integration with Let’s Encrypt for free SSL certs. Caddy has actually been configured to serve a site at osullivan.tech since I put it in place; it just had no content.
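
For context, the site definition involved is tiny. A minimal sketch of what that looks like in a Caddy v2 Caddyfile (the web root path here is illustrative, not my actual config):

osullivan.tech {
    # Caddy obtains and renews the Let's Encrypt certificate automatically
    root * /var/www/osullivan.tech
    file_server
}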

My Gitlab instance has been the most consistently used service I self-host. That isn’t saying much; I’ve gone through a few droughts in working on projects, with ideas that don’t pan out or that I just don’t have time for.

The How #

The setup was pretty simple; it only took me a few hours to get everything deployed to an accessible site. I started by looking for a Hugo theme I liked. After I found one I followed the Hugo Quick Start guide. My personal desktop is Manjaro Linux so I had to adjust the install instructions: it was just a sudo pacman -S hugo away, rather than brew install hugo. I ran through the guide, but also added a quick ‘Hello, World!’ page under /docs/ to test out the book theme.

This is where I ran into my first problem: making the side menu look the way I wanted it to. I wanted to match the example site, just with things in slightly different places. Following the documentation on the menu didn’t quite give me the results I wanted. I didn’t want to manually maintain the menu; I preferred a more automated approach. Looking at their example site config gave me some hints, but still wasn’t quite what I wanted. After some time reading the Hugo docs and learning about Hugo Page Bundles, I decided to look at the example site content setup. I found the example blog _index.md file and discovered that I can configure the menu from the front matter of a post! That was exactly what I needed. I created an empty post and pushed my changes.
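
For reference, the front matter involved looks something like this. This is a sketch based on what I remember of the theme’s example site; the title, menu name, and weight are illustrative:

---
title: "Posts"
# register this section in the theme's menu straight from front matter
menu:
  after:
    weight: 5
---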

At this point I set my sights on deploying my site. I decided I wanted two versions of the site deployed: a preview site that includes drafts, and the main site. For the preview site I used Gitlab Pages, and for the main site I used my existing Caddy server. Hugo has a really easy guide on getting set up with Gitlab Pages, which I used to get a basic .gitlab-ci.yml going. I made a few slight changes to their setup for my use case. I had decided on two long-lived branches for the site: the default master branch would be what was deployed to osullivan.tech, while preview would be what was deployed to Gitlab Pages. So I changed the only branch filter to preview and changed the script to hugo -D. Here is my first draft of the .gitlab-ci.yml:

image: monachus/hugo

variables:
  GIT_SUBMODULE_STRATEGY: recursive

pages: 
  script:
  - hugo -D
  artifacts:
    paths:
    - public
  only:
  - preview

I pushed this up and had a site deployed a few minutes later. Unfortunately this is where I ran into my second problem. While all my HTML was loading, none of my CSS, JS, or links were. I opened up my network tab and saw that I was getting 404s for all of them. I ran a quick build locally to confirm it output everything correctly, and it did. I checked the build artifact directly and it had everything. I was a little stumped at this point, so I went back to the network log to see if I could gather some more details. After a few minutes I noticed that the requests for static assets were hitting the base site, not the site it was actually hosted on. For those that don’t know, Gitlab Pages defaults to hosting sites at [username].[pages-url]/[project-name]/, and all my requests for assets were going to [username].[pages-url]/. That was it; I’d found my problem.

I knew from my earlier reading of the docs that the hugo command had a way to pass in a baseURL via the CLI. I also vaguely recalled some Pages-related environment variables that Gitlab passed into builds. A quick search got me to Gitlab’s list of predefined variables, and a quick ctrl+f for ‘pages’ gave me two options: CI_PAGES_DOMAIN and CI_PAGES_URL. CI_PAGES_URL sounded like what I wanted, so I added it to the build. After a few short minutes, and a few early reloads, I had a working site! Here is my final working .gitlab-ci.yml:

image: monachus/hugo

variables:
  GIT_SUBMODULE_STRATEGY: recursive

pages: 
  script:
  - hugo -D --baseURL $CI_PAGES_URL
  artifacts:
    paths:
    - public
  only:
  - preview

After getting my site on Pages I needed to get it onto my Caddy server. I figured the simplest way to do this would be via the Deployment with Rsync guide from Hugo. Using that guide as a base, I just needed to make it work as part of my CI/CD workflow within Gitlab, and set up access for the runner to deploy to my Caddy server. For access I generated an SSH keypair:

$ ssh-keygen -t ed25519 -f ./ed25519

This gave me the private key in ./ed25519 and the public key in ./ed25519.pub. I spent a little while trying to adapt one of my Ansible playbooks to deploy the key, but my understanding of Ansible is pretty limited, so I decided to deploy it manually. This isn’t a task I’ll repeat super often, so not having it in Ansible is OK. I ended up just SSHing into my server and adding the public key to the ~/.ssh/authorized_keys file for the caddy user I have set up to run the webserver. From that user’s home directory:

$ mkdir .ssh
$ chmod 0700 .ssh
$ touch .ssh/authorized_keys
$ chmod 0644 .ssh/authorized_keys
$ echo "[GENERATED PUBLIC KEY]" > .ssh/authorized_keys

Now that server access was set up, I moved on to setting up .gitlab-ci.yml to deploy on commit. Using the Shell Script section as a base, this was pretty straightforward. The guide declared a couple of variables for the deploy destination, so I created some custom CI/CD variables in Gitlab for those, and I added the SSH private key as a file-type variable. For testing I left these variables unprotected so I could test my changes directly on my branch. Here are my initial additions to my .gitlab-ci.yml:

deploy: 
  script:
  - hugo
  - rsync -avz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i $DEPLOY_PRIVATE_KEY" --delete public/ $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH
  artifacts:
    paths:
    - public
  only:
  - add-caddy-deploy

The first test of this was unsuccessful. I got a pretty clear error message: /bin/bash: line 104: rsync: command not found. After a quick lookup of what flavor of Linux my image: was based on (Debian), I installed rsync with these modifications:

deploy: 
  script:
  - apt-get -qq update
  - apt-get install -yqq rsync
  - hugo
  - rsync -avz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i $DEPLOY_PRIVATE_KEY" --delete public/ $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH

I initially didn’t have the apt-get -qq update step and the install failed; I should have included it from the start, but I was being lazy and paid for it. After getting rsync fully installed I ran into another error.

rsync: Failed to exec ssh: No such file or directory (2)
rsync error: error in IPC code (code 14) at pipe.c(85) [sender=3.1.3]
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: error in IPC code (code 14) at io.c(235) [sender=3.1.3]

Turns out openssh-client also wasn’t installed, so I added that.

  - apt-get install -yqq rsync openssh-client

Then I ran into yet another error.

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@         WARNING: UNPROTECTED PRIVATE KEY FILE!          @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
Permissions 0666 for '/builds/nicosullivan/osullivan.tech.tmp/DEPLOY_PRIVATE_KEY' are too open.
It is required that your private key files are NOT accessible by others.
This private key will be ignored.
Load key "/builds/nicosullivan/osullivan.tech.tmp/DEPLOY_PRIVATE_KEY": bad permissions
Permission denied, please try again.
Permission denied, please try again.

This was pretty straightforward, but I was cautious about modifying the permissions on a file given to me by Gitlab. I tried to find a way to tell ssh to ignore the error, but that didn’t seem possible, so I modified the permissions on the key file as part of the setup:

  script:
  - apt-get -qq update
  - apt-get install -yqq rsync openssh-client
  - chmod 0600 $DEPLOY_PRIVATE_KEY
  - hugo
  - rsync -avz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i $DEPLOY_PRIVATE_KEY" --delete public/ $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH

And that was it. I finally deployed my site to my webserver.

I made a few more changes to my .gitlab-ci.yml before merging so that only the master branch is deployed. This is the deploy stage in its final form:

deploy: 
  script:
  - apt-get -qq update
  - apt-get install -yqq rsync openssh-client
  - chmod 0600 $DEPLOY_PRIVATE_KEY # ssh doesn't like open permissions on private key files
  - hugo
  - rsync -avz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i $DEPLOY_PRIVATE_KEY" --delete public/ $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH
  artifacts:
    paths:
    - public
  only:
  - master

With the tech side of things finally done, I started on the content: I wrote this post and added some other basic pages to the site.

The Future #

There are a few things I’ll likely clean up in the future. I might drop the two branches in favor of an automated Pages deploy with a manual server deploy. It depends on whether I want to keep some things around as a ‘preview’ for a while; that’s most likely to happen with D&D-related content, if I ever get around to playtesting any of it. I might also experiment with my own Hugo shortcodes for some fancy D&D monster templates.
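
As a rough idea of where that could go: a shortcode is just a small template under layouts/shortcodes/. A hypothetical monster stat-block shortcode (the file name, fields, and numbers are all made up for illustration) might look like:

<!-- layouts/shortcodes/monster.html -->
<div class="monster">
  <h4>{{ .Get "name" }}</h4>
  <p>AC {{ .Get "ac" }}, HP {{ .Get "hp" }}, Speed {{ .Get "speed" }}</p>
  {{ .Inner | markdownify }}
</div>

which could then be dropped into a post like so:

{{< monster name="Goblin Trickster" ac="14" hp="21" speed="30 ft." >}}
**Nimble Escape.** Can take the Disengage or Hide action as a bonus action.
{{< /monster >}}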

Eventually I’ll likely migrate off of a self-hosted solution and push into the cloud. Mostly for cost (old servers are kinda power hungry), but also for the experience of working with some cloud tech.

Thanks for reading this far! Hopefully I’ll dedicate some time to creating more content about my lab and the other projects I have going on.
