Jul 4, 2022
As a software engineer with a healthy bit of job-induced paranoia, I love the idea of escaping the claws of big corporate via self-hosting. In recent years, a plethora of amazing free and open source projects have emerged, which allow you to replace everything from Dropbox and iTunes to Android Home. However, I don’t want to manage yet another server in the cloud; I already do that for work. Managed container services like Google Cloud Run or AWS Fargate are nice, but they cost money, persistence and configuration are not always trivial, and you end up handing off a lot of control again.
So, why not host my services on a machine which
- I have to update and maintain anyway,
- already has good and reliable backups set up,
- is available (to me) even if the internet or power goes down,
- does not cost me additional money to run?
That would be my personal laptop.
After all, there is no place like 127.0.0.1, or so the saying goes.
Of course, there are some limitations:
- Not reachable from outside my home, e.g. when my phone is not on my home WiFi
- … that’s it I think.
Caveat: I am not hosting any “critical” services like contact sync, email, or calendar on my laptop. For those, I still “trust” the big G.
What I currently run on my Laptop:
- Photoprism for my pictures
- Navidrome serving my old iTunes music collection
- Shiori for my bookmarks
- Syncthing to sync files between my phone and my Laptop
You might note that, 10 years ago, the first three of those would have been normal “desktop apps”; today, however, many people use cloud services for them. Syncthing did replace Dropbox for me, though.
I use Syncthing to sync files, photos, and notes between my laptop and phone. The official Syncthing for Android app simply syncs my DCIM directory from my phone to my laptop, giving me an entirely self-hosted, synchronized “camera roll”. I also use Joplin Notes on my phone and laptop with local folder storage, which is then synced via Syncthing as well.
I am a firm believer in Docker Compose for production deployments.
For simple localhost hosting, it is enough to mount a few volumes from the local FS for persistence and to map the service ports.
Where a database is needed, SQLite is used.
Warning: some additional caveats apply regarding firewalls and network ports; see https://github.com/chaifeng/ufw-docker. Make sure that the self-hosted apps are not reachable from outside the machine (LAN).
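The simplest mitigation is to bind published ports to 127.0.0.1 directly in the compose file, since Docker’s port publishing manipulates iptables itself and can bypass ufw rules. A minimal sketch — the service name and ports here are placeholders, not from my actual setup:

```yaml
services:
  app:
    ports:
      # reachable only from this machine; a bare "8080:8080" would also
      # listen on the LAN interface, sidestepping the host firewall
      - "127.0.0.1:8080:8080"
```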
For example, this is my docker-compose.yml for Shiori:
```yaml
# Setup:
#
#   mkdir -p ~/.shiori/data   # must be owned by 1000:1000
#   docker-compose run --rm shiori migrate
#   docker-compose up -d
#
# Then, add account in settings (http://localhost:2344/#setting):
# default login: shiori:gopher
version: '2.4'
services:
  shiori:
    image: ghcr.io/go-shiori/shiori:latest
    restart: unless-stopped
    user: 1000:1000
    ports:
      - 2344:8080
    volumes:
      # persist all data at ~/.shiori/data
      - "~/.shiori/data:/shiori/"
```
And this is the one for Photoprism:

```yaml
version: '3.5'
services:
  photoprism:
    image: photoprism/photoprism:latest
    restart: unless-stopped
    user: 1000:1000
    security_opt:
      - seccomp:unconfined
      - apparmor:unconfined
    ports:
      - "127.0.0.1:2342:2342"
    environment:
      PHOTOPRISM_HTTP_PORT: 2342
      PHOTOPRISM_ADMIN_PASSWORD: "..."
      PHOTOPRISM_PUBLIC: "false"
      PHOTOPRISM_READONLY: "true"
      PHOTOPRISM_DATABASE_DRIVER: "sqlite"
      PHOTOPRISM_SITE_URL: "http://localhost:2342/"
    volumes:
      - "~/Pictures/fotos:/photoprism/originals:ro"
      - "~/.photoprism:/photoprism/storage"
```
To update the apps, a simple `docker-compose pull; docker-compose up -d` is enough.
I back up my laptop via Borg anyway, so there is no need for a separate backup.
The only thing to consider there is to run `sqlite3 file.db '.backup file-backup.db'` before the backup, because it is not safe to simply copy SQLite files that are in use.
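As a sketch, the snapshot step looks like this — the database and table names are invented for the demo; the `.backup` invocation is the actual point:

```shell
#!/bin/sh
# Demonstrate snapshotting an SQLite database with .backup instead of cp.
set -eu
cd "$(mktemp -d)"

# Simulate an application's live database
sqlite3 app.db "CREATE TABLE bookmarks(url TEXT); INSERT INTO bookmarks VALUES ('https://example.com');"

# .backup uses SQLite's online backup API and yields a consistent copy,
# even while the application still has the file open
sqlite3 app.db ".backup 'app-backup.db'"

# The snapshot is a complete, standalone database
sqlite3 app-backup.db "SELECT count(*) FROM bookmarks;"   # prints: 1
```

Running a small script like this as a pre-backup hook means Borg always archives a consistent snapshot rather than a possibly mid-write database file.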