Publishing apt and yum/dnf repos on GitHub Pages

27 Feb 2026

TL;DR

GitHub Pages is a practical way to host low-volume repos for apt and yum/dnf. The relevant metadata can be generated using GitHub Actions, and the process can be triggered by a release from the source repo.

Background

In my last post I wrote about creating .deb and .rpm packages (for our Dart binaries), but most people would rather not install those things manually with low-level tools. They’d prefer to use their package manager, which can then take care of subsequent updates.

Those package managers rely on metadata that’s published alongside the packages themselves, so that metadata needs to be generated.

More fundamentally this stuff needs to be hosted somewhere. It’s ‘just’ a website, but websites need servers, and those servers need connectivity. GitHub Pages provides a free way to do the hosting, so long as you’re comfortable with its limits – no more than 1GB of files and 100GB of bandwidth/month.

Creating metadata

Once there’s a new release in place I need to download the .deb and .rpm packages and create the metadata for their respective repos.

The download piece is common across both types, using the gh command line tool to fetch files from the latest release.
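A minimal sketch of that download step, assuming the gh CLI is authenticated and using a hypothetical repo slug:

```shell
# Hypothetical source repo slug – substitute your own.
REPO="example-org/noports"

# Fetch the .deb and .rpm assets from the latest release.
mkdir -p packages
gh release download --repo "$REPO" \
  --pattern '*.deb' --pattern '*.rpm' \
  --dir packages
```

The `--pattern` flags mean the same step works for both repo types, which is what makes it easy to share across the two workflows.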

Apt

My update-repo workflow for apt then uses the apt-ftparchive tool to generate metadata and gpg to provide signatures.
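The core of that could look something like the sketch below. The pool/dists layout and `$SIGNING_KEY_ID` are illustrative assumptions, not the exact workflow:

```shell
# pool/ holds the downloaded .deb files; dists/ is the apt layout.
mkdir -p dists/stable/main/binary-amd64

# Scan the pool and write the Packages index (plus a gzipped copy).
apt-ftparchive packages pool > dists/stable/main/binary-amd64/Packages
gzip -k dists/stable/main/binary-amd64/Packages

# Generate the Release file containing checksums of the indexes...
apt-ftparchive release dists/stable > dists/stable/Release

# ...then sign it: detached Release.gpg plus inline InRelease.
gpg --default-key "$SIGNING_KEY_ID" -abs \
  -o dists/stable/Release.gpg dists/stable/Release
gpg --default-key "$SIGNING_KEY_ID" --clearsign \
  -o dists/stable/InRelease dists/stable/Release
```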

Rpm

Yum/dnf repos are ostensibly simpler, though that barely shows in the corresponding update-repo workflow[1]. The main difference is that it uses createrepo_c to generate the metadata.
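The yum/dnf equivalent really is a couple of commands. A sketch, again with an assumed layout and signing key:

```shell
# packages/ holds the downloaded .rpm files; createrepo_c writes
# the repodata/ directory (repomd.xml plus the package indexes).
createrepo_c packages/

# Sign the metadata so clients can enable repo_gpgcheck=1.
gpg --default-key "$SIGNING_KEY_ID" --detach-sign --armor \
  packages/repodata/repomd.xml
```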

Triggering the rebuilds

Our normal process for a NoPorts release is to first do a pre-release, which creates all the binaries and packages. Once we’re happy with that, the release can be promoted to ‘latest’, and at that point we want to trigger updates to the apt and rpm repos.

That’s taken care of by an update-packages workflow that listens for the release being promoted, then fires off a bunch of repository_dispatch messages that start the update-repo workflows.
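The dispatch side of that can be done with the gh CLI against the repository_dispatch API. A sketch with hypothetical repo slugs and event name:

```shell
# Hypothetical slugs for the repos that host the package sites.
for TARGET in example-org/apt-repo example-org/rpm-repo; do
  # POST a repository_dispatch event; each target repo's
  # update-repo workflow listens for this event_type.
  gh api "repos/${TARGET}/dispatches" \
    -f event_type=update-repo
done
```

On the receiving side, each update-repo workflow just declares `on: repository_dispatch` with a matching event type.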

Brewey bonus

We also rebuild our homebrew-tap at the same time. Homebrew is much easier to deal with from a hosting perspective as it doesn’t involve slinging around huge packages full of binaries. It’s a very GitHub friendly approach that puts metadata in place pointing to the archives from the GitHub release – so no worries about storage or bandwidth limits.

Busting past the limits

A typical .deb/.rpm for NoPorts is around 50MB, and we support 4 architectures, making each release around 200MB. So we can fit around 5 releases within the 1GB limit before we need a housekeeping job to start clearing out old versions. Not great, not terrible.

That also means that we get around 2000 downloads per month before a repo hits its bandwidth limit. I’d consider that a quality problem (more people using NoPorts). But it’s good to have a plan…
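The back-of-envelope numbers above are easy to check:

```shell
# Rough capacity check against the GitHub Pages limits.
PKG_MB=50                                 # typical .deb/.rpm size
ARCHES=4                                  # architectures per release
RELEASE_MB=$((PKG_MB * ARCHES))           # ~200MB per release
RELEASES_PER_GB=$((1000 / RELEASE_MB))    # ~5 releases in 1GB
DOWNLOADS_PER_MONTH=$((100000 / PKG_MB))  # 100GB / 50MB ~ 2000
echo "$RELEASE_MB $RELEASES_PER_GB $DOWNLOADS_PER_MONTH"
```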

Package URL

The GitHub pages sites are configured to use apt.noports.com and rpm.noports.com using a custom CNAME. So if we need to move hosting elsewhere we can just point the DNS to the new server/service.
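For GitHub Pages the custom domain boils down to a CNAME file in the site repo plus a DNS record. A sketch, with a hypothetical GitHub org in the DNS target:

```shell
# The Pages site repo carries the custom domain in a CNAME file.
echo "apt.noports.com" > CNAME

# ...and DNS points the hostname at GitHub Pages, e.g. (hypothetical org):
# apt.noports.com.  CNAME  example-org.github.io.
```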

Cheap hosting

Free is everybody’s favourite price, which is why I like GitHub Pages (for this and many other things). But I also appreciate that free gets exploited and abused, which is why GitHub has to impose limits.

For a previous project that involved lots of people downloading large binaries I used cheap Virtual Private Servers (VPS) of the kind that show up on LowEndBox. It’s possible to get servers with more space than we’ll ever use, and TB of monthly bandwidth, for a few $/m. But… VPS providers can be flaky, and there’s an admin overhead in running those servers.

My likely upgrade route today would be AWS CloudFront. Corey Quinn recently posted The Complete Guide to CloudFront’s Flat-Rate Pricing, and $15/m for 50TB of data transfer is a bargain; and that’s about 1M package downloads.

Conclusion

It’s pretty straightforward to automate package metadata using GitHub Actions around the various package management tools, and GitHub Pages provides free and easy hosting for a low volume site.

Pages won’t be enough for higher volume, but at least the investment in generating metadata etc. isn’t wasted, as that can be carried over to a hosting environment that offers more storage and bandwidth.

Note

[1] After taking an afternoon to put together the apt automation the rpm derivative took a few minutes, and worked first time :)
