The Index - Metadata search system for IPFS hashes

Some of you may be familiar with The Foundation, an IPFS-hosted app I wrote a while ago that let you write descriptions of and classify IPFS hashes, upload those entries, and have other users search for them, so that we could build some sort of organized meta-directory of interesting files. Its main problem was that it relied on a central server to store the information. Once I read about pubsub, I realized it could serve the same purpose without relying on any outside servers, so I rewrote the app.

What can you do with The Index? You can search file names and descriptions for the stuff you want, and you can also add entries for files that aren’t already in the index. Syncing is automatic, so as soon as you add a file, all other online users will be able to find it as well.

Before you start using it, you have to do some configuration on your IPFS daemon. Run the following commands to set the CORS headers that allow the app to communicate with your daemon’s API.

ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin "[\"*\"]"
ipfs config --json API.HTTPHeaders.Access-Control-Allow-Credentials "[\"true\"]"
ipfs config --json API.HTTPHeaders.Access-Control-Allow-Methods "[\"PUT\", \"POST\", \"GET\"]"
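If you want to confirm the headers took effect, you can run a quick check from the browser’s developer console before loading the app. This is just a sketch, assuming the API is listening on the default port 5001 (newer go-ipfs versions expect POST for API calls):

// Quick CORS check: if the headers above are set, this request succeeds
// and prints your peer ID; otherwise the browser blocks it with a CORS error.
fetch('http://127.0.0.1:5001/api/v0/id', { method: 'POST' })
  .then(res => res.json())
  .then(info => console.log('connected to daemon', info.ID))
  .catch(err => console.error('CORS or connection problem', err))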

Then start up your daemon with the following command to enable pubsub:

ipfs daemon --enable-pubsub-experiment
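To make sure pubsub is actually enabled before opening the app, you can try a throwaway topic through the daemon’s API. A minimal sketch, assuming a recent ipfs-http-client (the app itself may use an older client library):

// Pubsub smoke test against the local daemon. These calls fail if the daemon
// was started without --enable-pubsub-experiment. The topic name is arbitrary.
import { create } from 'ipfs-http-client'

const ipfs = create({ url: 'http://127.0.0.1:5001/api/v0' })

async function pubsubCheck(): Promise<void> {
  const topic = 'the-index-smoke-test'
  await ipfs.pubsub.subscribe(topic, msg => {
    console.log('received:', new TextDecoder().decode(msg.data))
  })
  await ipfs.pubsub.publish(topic, new TextEncoder().encode('hello'))
}

pubsubCheck().catch(console.error)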

Now navigate to either of the following addresses. The IPNS address may point to newer versions in the future, while the IPFS address will always point to this exact version.

http://localhost:8080/ipns/QmPZHE9MgGVvoshoFoHeigkZeHPMKdbDJsCQKtEJtjNqoZ
http://localhost:8080/ipfs/QmXny7UjYEiFXskWr5Un6p5DMZPU87yzdmC3VEQcCx9xBC

You should be able to do a search and find the IPFS Readme as an initial file in The Index.

How does it work?

I made the simplest CRDT possible: one that only allows adds. When the app starts up, it asks all other active apps for data over the pubsub connection. It then gathers their responses (which are IPFS hashes of their own databases), fetches each of those database files, and adds any entries it doesn’t already have to its own database. At the end, it publishes its own database again so that everyone else can get the most up-to-date version. This doesn’t rely on a central server, only on other users being online at the same time, so it’s generally a good idea to leave your daemon running and the app open in the background so you can receive up-to-date index files and share yours with others.
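In other words, the database is a grow-only set of entries, merged by union, and the only thing exchanged over pubsub is the IPFS hash of each peer’s latest snapshot. Below is a rough sketch of that flow, not the actual code from the repo; the topic name, the message shapes, and the use of ipfs-http-client are all assumptions:

// Sketch of The Index's sync scheme: a grow-only set of entries, merged by
// union, exchanged over pubsub as IPFS hashes of database snapshots.
import { create } from 'ipfs-http-client'   // assumption: recent ipfs-http-client

interface Entry {
  hash: string          // the IPFS hash being described
  name: string
  description: string
}

// The database is keyed by the described hash; entries are only ever added.
type Database = Record<string, Entry>

const TOPIC = 'the-index-sync'   // hypothetical topic name
const ipfs = create({ url: 'http://127.0.0.1:5001/api/v0' })
let db: Database = {}

// Union merge: adds are the only operation, so merging never conflicts.
function merge(local: Database, remote: Database): Database {
  return { ...remote, ...local }
}

function concat(chunks: Uint8Array[]): Uint8Array {
  const out = new Uint8Array(chunks.reduce((n, c) => n + c.length, 0))
  let offset = 0
  for (const c of chunks) { out.set(c, offset); offset += c.length }
  return out
}

// Publish the current database to IPFS and announce its hash on pubsub.
async function announce(): Promise<void> {
  const { cid } = await ipfs.add(JSON.stringify(db))
  const msg = JSON.stringify({ type: 'db', cid: cid.toString() })
  await ipfs.pubsub.publish(TOPIC, new TextEncoder().encode(msg))
}

// Fetch a peer's database snapshot by hash and merge it into our own.
async function mergeFrom(cidStr: string): Promise<void> {
  const chunks: Uint8Array[] = []
  for await (const chunk of ipfs.cat(cidStr)) chunks.push(chunk)
  const remote: Database = JSON.parse(new TextDecoder().decode(concat(chunks)))
  db = merge(db, remote)
}

// On startup: listen for announcements, then ask everyone for their data.
async function start(): Promise<void> {
  await ipfs.pubsub.subscribe(TOPIC, async msg => {
    try {
      const m = JSON.parse(new TextDecoder().decode(msg.data))
      if (m.type === 'db') await mergeFrom(m.cid)   // merge a peer's database
      if (m.type === 'request') await announce()    // answer requests for data
    } catch (err) {
      console.error('bad sync message', err)
    }
  })
  await ipfs.pubsub.publish(TOPIC,
    new TextEncoder().encode(JSON.stringify({ type: 'request' })))
  setTimeout(announce, 10_000)   // republish the merged database after a while
}

start().catch(console.error)

Because adding is the only operation, the merge is just a set union, so any two peers that exchange hashes converge on the same database no matter what order the messages arrive in.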


I’m new to the community, so please excuse me if this is off topic or not of particular interest.

Have you thought about storing this type of info in an Ethereum smart contract? I’m coming from the Ethereum world, and over there, there is a huge amount of talk about solving certain problems (such as data storage) using IPFS. Is there similar interest in (or use of) Ethereum to solve problems from the IPFS perspective?

Do you have a public GitHub repo, or any source code access for this project? Looks pretty damn amazing!


Also see this other discussion about CRDTs

I added some links to reference info there:

and this


I’ve got a GitHub repo here: https://github.com/cakenggt/ipfs-foundation-frontend. Even though it is called ‘frontend’, it is actually the entire codebase.
