Why I chose Hugo to regain control of my content
I’ve had an online presence since the first day I could browse the Internet, many years ago. Over the years, I’ve accumulated a lot of content for three fundamental reasons: I love writing immensely, I love documenting my work and my thoughts, and I love sharing knowledge. These three motivations naturally turned me into someone who creates and publishes content.
Recently, I made an important decision: abandon third-party platforms like Medium, redevelop my personal website, and host everything I publish there myself. This transition to Hugo wasn’t trivial, and I want to share the reasons that drove me to it and what I learned along the way.
My technical evolution over the years
My publishing journey has followed the web’s evolution. Initially, I naturally used WordPress, the globally recognized solution I relied on heavily for personal projects in the past. WordPress remains an excellent solution, but it no longer matches my idea of a modern publishing platform.
Then, as my publishing frequency decreased, I migrated to basic static HTML that required no maintenance. It was simple and efficient, but limited. Finally, preferring to focus on writing rather than technology, I chose to rely on platforms like Medium to publish my content.
Why I needed to take back control
Two main reasons pushed me to abandon third-party platforms. The first is the overall degradation of content quality on these platforms. Medium, which initially held real promise for content creators, gradually lost its added value in my eyes.
The second reason, more personal, is my desire to repatriate my content and regain control over it. I realized that my articles, my thoughts, all this writing work was scattered across platforms I didn’t control. This dependency no longer suited me.
I wanted to regain total control over my publications: their presentation, organization, and sustainability. Entrusting your content to third-party platforms means accepting however they choose to evolve, from algorithm changes to policy modifications. I needed to return to something that truly belonged to me.
My discovery of Hugo and what seduced me
I didn’t want to set up something heavy like WordPress. Somewhat by chance, I came across Hugo, which I had regularly seen mentioned here and there in my reading but had never really looked into. This time, I decided to invest some time in understanding how this publishing tool worked.
I admit I was seduced quite quickly. Hugo isn’t a CMS per se; it’s a tool developed in Go that generates a static website. By static, I mean that at build time, Hugo produces the entire site tree, including all necessary assets (images, CSS, JavaScript), as a completely static HTML version.
It’s very clever because, beyond the technical aspect, it produces sites that are extremely lightweight to load. And you keep a great deal of control over customization and layout, since you can still rely on JavaScript to make certain parts dynamic.
In practice, it’s surprisingly simple:
hugo

This simple command generates the entire static site in the public/ folder, ready to be deployed on any web server.
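To give an idea of what Hugo builds from, here is the conventional layout of a Hugo project; this is only an indicative sketch, with my-site as a placeholder name:

my-site/
├── archetypes/    # templates used when creating new content files
├── content/       # the articles, written in Markdown
├── data/          # structured data in YAML, JSON, or TOML
├── layouts/       # HTML templates
├── static/        # images, CSS, and JavaScript copied as-is
├── themes/        # installed themes
└── hugo.toml      # site configuration (config.toml on older versions)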
Hugo’s other strength is its ability to handle a wide variety of templates and content types, all in formats that are very familiar to technical people: articles are written in Markdown, data files are in YAML… Everything lives in very simple text files that are easy to understand, maintain, evolve, back up, or rework when necessary.
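To illustrate, an article is simply a Markdown file with a block of YAML front matter at the top; the values below are made up for the example, but the structure is standard Hugo:

---
title: "Why I chose Hugo"
date: 2024-01-15
draft: false
tags: ["hugo", "blog"]
---

The body of the article follows, in plain Markdown.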
User experience: not for everyone
Let’s be honest: the experience of editing and adding new articles remains far less straightforward for everyday users than a mainstream CMS like WordPress. Here, you create Markdown files directly in the file system, and at build time Hugo processes them and converts them to HTML.
It’s true that for someone who publishes a lot of articles, this setup is certainly less suitable than a classic CMS. You need to be comfortable with the command line and file management, and take a more technical approach to publishing.
But for my usage, publishing at most one article a week on average, it’s quite manageable. With a few commands, you create the new file, edit it, and deploy it.
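Concretely, my routine for a new article boils down to something like the following; the file path is only an example, and the final push is what triggers the deployment described below:

hugo new posts/my-new-article.md   # create the file with its front matter pre-filled
hugo server -D                     # preview locally, drafts included
git add . && git commit -m "New article" && git push   # let the CI/CD pipeline deploy it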
My technical implementation
I took the opportunity to repatriate all the content from my different blogs: the one on sports and the one I published with Delphine about our expatriation in Pune (India). Now, all my content is grouped under a single website, with priority given to my technical articles, though the rest is also available.
Everything is hosted in a Git repository with a CI/CD pipeline that handles automatic deployment to my server quite simply. My branch management is fairly basic, but it allows direct deployment to my server according to the target environment.
I implemented a differential deployment system, meaning I don’t systematically push all the content to my server, only the new content that gets added. A directory swap then puts the new version into production without any downtime.
Here’s the GitHub Actions workflow I use to automate this process:
name: Deploy Hugo via FTP (delta + zero downtime)

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      FTP_HOST: ${{ secrets.SFTP_HOST }}
      FTP_PORT: ${{ secrets.SFTP_PORT || 21 }}
      FTP_USER: ${{ secrets.SFTP_USER }}
      FTP_PASSWORD: ${{ secrets.SFTP_PASSWORD }}

    steps:
      - uses: actions/checkout@v4

      - uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
          extended: true

      - name: Build
        run: hugo

      # 🔹 DELTA PUSH into a persistent slot (www_a/)
      - name: Upload (delta) to slot www_a via FTP
        uses: SamKirkland/FTP-Deploy-Action@v4.3.5
        with:
          server: ${{ env.FTP_HOST }}
          username: ${{ env.FTP_USER }}
          password: ${{ env.FTP_PASSWORD }}
          port: ${{ env.FTP_PORT }}
          protocol: ftp
          local-dir: public/
          server-dir: www_a/ # <-- persistent slot
          exclude: |
            **/.git*
            **/.DS_Store
        # By default the action keeps a state file .ftp-deploy-sync-state.json
        # in the target directory => differential uploads

      - name: Install lftp
        run: sudo apt-get update && sudo apt-get install -y lftp

      # 🔹 ATOMIC SWAP: www -> www_b, www_a -> www, www_b -> www_a
      - name: Atomic swap (slot → live)
        run: |
          lftp -u "$FTP_USER","$FTP_PASSWORD" -p "$FTP_PORT" "$FTP_HOST" -e '
            set ftp:ssl-allow true;
            set ftp:ssl-force false;
            set ftp:passive-mode true;
            set net:max-retries 2;

            # Do not fail if www does not exist yet (first deployment)
            set cmd:fail-exit false;
            mv www www_b;

            # Critical step: going live
            set cmd:fail-exit true;
            mv www_a www;

            # Keep the previous production in www_a for the next delta
            set cmd:fail-exit false;
            mv www_b www_a;

            bye
          '

This architecture gives me great flexibility: I can write my articles in my favorite editor, version them with Git, and have a completely automated publishing process while keeping total control over the infrastructure.
What I take away from this today
This transition to Hugo has been a real success for me. I’ve rediscovered the pleasure of publishing without the constraints of third-party platforms, while having a robust and efficient technical solution. Being able to centralize all my content under a single digital identity was important, and it’s now done.
Hugo’s learning curve isn’t negligible, but it’s well worth it if you have some appetite for the technical side. The speed of the generated sites, the flexibility of the templates, and the simplicity of maintenance once everything is in place largely make up for the initial investment.
My recommendations are simple: if you publish occasionally, like to keep control over your content, and aren’t afraid to get your hands dirty with code, Hugo is an excellent solution. However, if you’re looking for absolute simplicity and publish very regularly, a classic CMS will remain more suitable.
For my part, this solution perfectly matches my philosophy: regaining control over my data, running infrastructure I fully understand, and being able to evolve my platform according to my specific needs. I absolutely don’t regret abandoning Medium for this more artisanal but more personal approach.