Announcing a new collaborative cluster: Pacman cache

Hey guys,

I’m excited to announce a new collaborative cluster which holds all pacman packages, for caching and decentralized distribution.

The cluster is currently updated from a Tier 1 Arch Linux mirror. Feel free to join!

Best regards



That’s awesome. I know most of the discussion around IPFS is about a decentralized web, but I find the current state of software package distribution even more concerning. Virtually all of the software that runs the world is distributed by a handful of players. A decentralized web isn’t going to do much good without decentralized access to the software that runs it.

I’d also like to see machine learning datasets and models distributed over IPFS. I’m tired of downloading huge models from some Google Drive/Dropbox/etc. link that could go away at any moment, when the file is probably sitting on the hard drive of the person next to me.

Linux software is already pretty well available, since there are many generous providers who set up mirrors serving rsync and HTTP.

Just look at what’s available for Arch Linux:

So my intention is not to fix an availability issue, but to help accelerate the transition from sharing updates via static servers to spreading updates from the developer’s machine to the cluster: packages get accessed via their CID instead of their filename, everyone helps to share the files, and they stay available longer and more reliably.

If we transitioned from static compressed databases to directory listings in IPFS as ‘update lists’, we could also save a lot of traffic, since IPFS can deliver delta updates: only the blocks that actually changed need to be fetched.
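To make that concrete, here’s a minimal Python sketch of why content addressing gives you delta updates for free. The tiny chunk size and the package-list contents are made up for illustration (real IPFS blocks are on the order of 256 KiB), but the idea is the same: a client only has to fetch the blocks whose hashes it doesn’t already have.

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks just for this demo; IPFS uses much larger blocks


def chunk_hashes(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and hash each one, the way
    IPFS splits files into content-addressed blocks."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]


def blocks_to_fetch(local: bytes, remote: bytes) -> int:
    """Count how many remote blocks are NOT already present locally;
    that's all a client would actually have to download."""
    have = set(chunk_hashes(local))
    return sum(1 for h in chunk_hashes(remote) if h not in have)


old_db = b"pkg-a 1.0\npkg-b 2.0\n"          # hypothetical old package list
new_db = old_db + b"pkg-c 1.0\n"            # one package appended
# blocks_to_fetch(old_db, new_db) is small, because most blocks are unchanged
```

With a static compressed database, every sync re-downloads the whole file; here only the tail blocks that changed would travel over the network.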

A background process for pacman could refresh the package list constantly without using much traffic for it, avoiding any delay for ‘syncing’.

A background process could also start pinning new updates as they appear, and once they are fully available, a GUI program could show the user a popup that new updates are ready to install.
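As a rough illustration (the package names and CIDs are hypothetical), the core of such a background pinner is just diffing two snapshots of the package index and pinning whatever is new or changed:

```python
def updates_to_pin(old_index: dict[str, str],
                   new_index: dict[str, str]) -> dict[str, str]:
    """Compare two snapshots of the package list (name -> CID) and
    return the entries a background pinner should start fetching.
    A real daemon would then run `ipfs pin add <cid>` for each one
    and notify the GUI once every pin has completed."""
    return {
        name: cid
        for name, cid in new_index.items()
        if old_index.get(name) != cid
    }


# Hypothetical example: one package was updated, one is new.
old = {"pacman": "QmAaa", "linux": "QmBbb"}
new = {"pacman": "QmAaa", "linux": "QmCcc", "vim": "QmDdd"}
# updates_to_pin(old, new) -> {"linux": "QmCcc", "vim": "QmDdd"}
```

Because entries are keyed by CID, an unchanged package is trivially skipped, which is what keeps the constant refresh cheap.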

I also like the idea of local community networks, and I have a history of building them. Software like IPFS can help reduce internet traffic where only low bandwidth is available, by sharing updates and software with other computers on the local network.

In the long term I’d also like to add Arch-related material to the cluster, for example an AUR cache: scripts parsing the AUR packages could ask IPFS whether a file with a given hash is already stored. If it is, the file is fetched via IPFS instead of from the link provided in the package.
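A minimal sketch of that lookup, assuming a hypothetical index mapping checksums to CIDs and a local gateway at the default port (both names and the gateway address are assumptions, not an existing tool):

```python
def resolve_source(sha256: str, upstream_url: str,
                   cache_index: dict[str, str],
                   gateway: str = "http://127.0.0.1:8080") -> str:
    """Prefer the IPFS cache when the file's checksum is already
    indexed; otherwise fall back to the upstream link from the
    package's build script."""
    cid = cache_index.get(sha256)
    if cid is not None:
        return f"{gateway}/ipfs/{cid}"
    return upstream_url


# Hypothetical usage:
index = {"ab12...": "QmSomeCid"}  # checksum -> CID, filled by the cluster
resolve_source("ab12...", "https://example.com/foo.tar.gz", index)
# -> served from the local gateway instead of the upstream server
```

The nice part is that the checksum is already in the package metadata, so the cache needs no trust: a wrong file simply fails verification.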

This would help a lot against outages of software developers’ servers, which has been quite an issue for me with AUR packages over the last few years.

My server would just watch the packages for updates and download any new file that appears, then check its SHA sum. If it matches, the file would be added to the cluster with the checksum as its filename, making it available for download from the cache.

Old files would simply be dropped from the cluster after a while, to make room for new software.

Other Arch-related material could be unofficial repos: if someone has trouble paying for the bandwidth needed to offer a custom repo to the Arch community, we could add the files to the cluster automatically, offering a fast cache.
