Building a GeoCities-like system on IPFS

Hey everyone,

I'm pretty new to the community, so forgive me if I sound naive. I've recently started exploring IPFS and what's possible with it. While working on my first PoC I found myself really missing GeoCities, so as a learning exercise I want to build a GeoCities on IPFS. I have a couple of questions that I thought the community might be able to help me with.

  • I am going to host my IPFS daemon on my laptop (or a Raspberry Pi) and build something running on Node.js that calls the daemon's API. This machine has limited storage, and I suspect that if this really takes off I am going to run out of space. Will IPFS keep working and redistributing files as I add more and more to my node without pinning?
  • I will use IPNS and figure out something to give everybody consistent URLs. I've noticed that publishing IPNS entries takes some time. If people keep updating their files, would it be a good idea to publish to IPNS on every directory update, or should I have a periodic job do it? (A rough sketch of the add-and-publish flow is below this list.)
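Here is roughly what I have in mind for the add-and-publish flow. A minimal sketch, assuming the kubo-rpc-client package (the successor of ipfs-http-client) talking to a local daemon on the default API port; publishSite and the paths are made up for the example:

```js
import { create, globSource } from 'kubo-rpc-client'

// Talk to the local daemon's HTTP API.
const ipfs = create({ url: 'http://127.0.0.1:5001' })

async function publishSite(dir) {
  // Add the site directory recursively; the last entry yielded is the
  // wrapping directory, whose CID is the new root of the site.
  let rootCid
  for await (const entry of ipfs.addAll(globSource(dir, '**/*'), { wrapWithDirectory: true })) {
    rootCid = entry.cid
  }
  // Publish the new root under this node's IPNS key. This is the slow part.
  const res = await ipfs.name.publish(`/ipfs/${rootCid}`, { lifetime: '24h' })
  console.log(`published ${res.value} as /ipns/${res.name}`)
}

await publishSite('./sites/alice')
```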

I want to start simple and evolve it into something more complex, and I'd love to share the code on my GitHub as I build it. I'm happy to hear suggestions and open to ideas. I know there may already be projects doing something similar, but I might pivot into something different, like a JSFiddle or JS Bin, once I get the basics right.


Hello! First off, really nice proof of concept.

As for your questions: per the documentation, IPFS nodes have a StorageGCWatermark value that determines at what percentage of StorageMax the garbage collector runs and cleans up unpinned files. They also have a GCPeriod setting that tells the garbage collector how frequently to run, which defaults to one hour. If a file is unpinned when that happens and no one else has it, it will probably become unavailable. Note that automatic garbage collection is turned off by default, so without enabling it your node will probably just run out of disk space.
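For reference, these knobs live in the Datastore section of the node's config file (~/.ipfs/config). The values below are the defaults:

```json
"Datastore": {
  "StorageMax": "10GB",
  "StorageGCWatermark": 90,
  "GCPeriod": "1h"
}
```

And to have the garbage collector actually run on that schedule, you have to start the daemon with it enabled:

```sh
ipfs daemon --enable-gc
```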

As for updating IPNS entries, it would probably be better to either have users explicitly commit their changes when they're done editing, or, like you suggested, have a job that updates the entries automatically.
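If you go the periodic route, something like this rough sketch keeps the slow publish off the critical path. markDirty and the 10-minute interval are made-up names/values, and publishSite is the hypothetical add-and-publish helper from the sketch above:

```js
// Rough sketch: batch edits and republish IPNS on a timer instead of per save.
import { publishSite } from './publish.js'  // hypothetical module with the earlier helper

let dirty = false

// Call this from your app whenever a user saves a file.
export function markDirty() {
  dirty = true
}

setInterval(async () => {
  if (!dirty) return
  dirty = false
  try {
    await publishSite('./sites')  // re-add the tree and publish the new root
  } catch (err) {
    dirty = true  // leave the flag set so the next tick retries
    console.error('IPNS publish failed:', err)
  }
}, 10 * 60 * 1000)  // every 10 minutes
```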

Hope this helps.


Thanks @Kharaa! Is there a way I can ask for a file to have N copies across IPFS? I know ipfs-cluster allows replication, but is there a way to ensure we have at least a few seeders for the current version of a file?

On IPNS, I am actually converging towards periodic updates, because in my experiments an IPNS publish takes forever!

Without using an IPFS cluster there is unfortunately no guarantee that content will be pinned a certain number of times. You more or less have to rely on your users to pin their own sites, which hopefully they're tech-literate enough to do, but many probably won't. Content is cached by nodes that download it, which keeps it alive in the short term, but if all the copies get axed by the garbage collector no one will be left to seed it.

As for solutions: you could set up an IPFS cluster, though again the main problem is having to rely on users to join and pin the content. You could also spread the content across multiple Raspberry Pis this way. (A sketch of the relevant cluster settings is below.)
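If you do try ipfs-cluster, the minimum number of copies is controlled by the replication factors in its service.json. An excerpt, with example values (the default of -1 means pin on every peer):

```json
"cluster": {
  "replication_factor_min": 2,
  "replication_factor_max": 3
}
```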

Or you could use one of the various IPFS pinning services, but those cost money if you are pinning a significant amount of content.

If anyone else can think of a method or has a suggestion feel free to add it.


I recommend having each 'GeoCities' page exist separately from the others. They would all be linked from your own page, which would serve as a decentralized repository of links to these pages plus a subscription feed. It could also host a search engine and a trending + featured tab for finding new pages.
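A minimal sketch of what that central index could look like, assuming one small JSON document published under the portal's own IPNS key. All names, fields, and keys here are made up for illustration:

```js
import { create } from 'kubo-rpc-client'

const ipfs = create({ url: 'http://127.0.0.1:5001' })

// Hypothetical shape of the portal index: each user site lives under its
// own IPNS name, and the portal publishes a feed of pointers to them.
const index = {
  updated: new Date().toISOString(),
  featured: ['alice'],
  sites: [
    { id: 'alice', title: "Alice's homepage", ipns: '/ipns/<alice-key>' },
    { id: 'bob', title: "Bob's page", ipns: '/ipns/<bob-key>' }
  ]
}

// Add the index and publish it under the portal's key, so the portal's
// /ipns/ address always resolves to the latest list of sites.
const { cid } = await ipfs.add(JSON.stringify(index, null, 2))
await ipfs.name.publish(`/ipfs/${cid}`)
```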