Best Way to Write Big Files That Are Continuously Updated?

I’m writing an open-source social network with a Python wrapper that calls the API. It’s basically a Flask server.

Conceptually, status updates will be stored in a JSON list, dumped into a JSON file, and then that file will be hosted on IPFS.

What’s the best way to make modifications to said file? My current approach is storing it on the hard drive and then uploading it to the network every time there’s a status update.

My main concern is that I’m needlessly writing and rewriting data. Is there a way to write the file contents directly to IPFS? Passing the data through the API in the URL is impractical, since URLs are limited in length. Could they maybe offer POST requests for writing files via IPFS?

Being new to IPFS, do you guys have a better approach?

I’m such an idiot! I’ve been using GET requests for all my URL calls and didn’t realize that POST in the cURL examples actually meant to use the POST method.
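
For reference, the kubo HTTP API accepts the file content as a multipart POST body, so the JSON can be added straight from memory without an intermediate file or an oversized URL. A minimal sketch of what I mean, assuming a local node on the default RPC address 127.0.0.1:5001:

```python
import json

import requests

API = "http://127.0.0.1:5001/api/v0"  # default kubo RPC address (assumed)

def add_statuses(statuses):
    """Serialize the status list in memory and add it to IPFS via a POST request."""
    payload = json.dumps(statuses).encode("utf-8")
    # /api/v0/add takes the content as multipart form data in the request body,
    # so there is no URL-length limit and no temporary file on disk.
    resp = requests.post(f"{API}/add", files={"file": ("statuses.json", payload)})
    resp.raise_for_status()
    return resp.json()["Hash"]  # CID of the newly added file

print(add_statuses([{"user": "alice", "text": "hello world"}]))
```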

I think I would use an MFS folder on each user’s node and just add a file for each status update. The filenames could be ISO-8601 UTC timestamps.
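
Roughly like this, as a sketch against the kubo HTTP API; the 127.0.0.1:5001 address and the /statuses MFS path are just assumptions:

```python
import json
from datetime import datetime, timezone

import requests

API = "http://127.0.0.1:5001/api/v0"  # assumed local node RPC address

def write_status(text, mfs_dir="/statuses"):
    """Write one status update as its own file in the node's MFS folder."""
    # ISO-8601 UTC timestamp as the filename, so entries sort chronologically.
    name = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") + ".json"
    body = json.dumps({"text": text}).encode("utf-8")
    resp = requests.post(
        f"{API}/files/write",
        params={"arg": f"{mfs_dir}/{name}", "create": "true", "parents": "true"},
        files={"file": (name, body)},
    )
    resp.raise_for_status()
    return f"{mfs_dir}/{name}"
```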

Each user could publish their folder via IPNS, and users who “follow” them can check the IPNS record for updates.
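
On the follower side, checking for updates could look something like this (again assuming the default RPC address; the helper name is made up):

```python
import requests

API = "http://127.0.0.1:5001/api/v0"  # assumed local node RPC address

def list_updates(ipns_name):
    """Resolve a followed user's IPNS name and list the files in their folder."""
    resolved = requests.post(f"{API}/name/resolve", params={"arg": ipns_name})
    resolved.raise_for_status()
    folder = resolved.json()["Path"]  # e.g. /ipfs/<current folder CID>
    listing = requests.post(f"{API}/ls", params={"arg": folder})
    listing.raise_for_status()
    # Each link name is a timestamped status file; unseen names are new updates.
    return [link["Name"] for link in listing.json()["Objects"][0]["Links"]]
```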

If you want the system to scale and load better, you could split the folder into subfolders, to avoid having to load an extremely large folder after a while, since that’s a costly operation in IPFS.
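
Since the filenames are timestamps already, one possible split (purely an assumption about how you might shard the paths) is year/month subfolders:

```python
from datetime import datetime, timezone

def status_path(mfs_root="/statuses"):
    """Build a date-sharded MFS path, e.g. /statuses/2024/05/2024-05-17T12:00:00Z.json."""
    now = datetime.now(timezone.utc)
    name = now.strftime("%Y-%m-%dT%H:%M:%SZ") + ".json"
    # Year/month subfolders keep each directory small as the history grows.
    return f"{mfs_root}/{now:%Y}/{now:%m}/{name}"
```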

You probably want directory sharding deactivated, since you won’t run out of folder space and sharding adds some lookup time.

The folder could also hold basic stuff, like the profile and the profile picture.

Your website/app could be loaded via IPNS and then read the local “follow” list. When the local user posts a status update, a new file is created locally and the new directory CID is pushed to the user’s IPNS record.
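
The publishing step could then be a sketch like this, assuming the same local node and the default “self” IPNS key:

```python
import requests

API = "http://127.0.0.1:5001/api/v0"  # assumed local node RPC address

def publish_folder(mfs_dir="/statuses", key="self"):
    """Look up the current CID of the MFS folder and publish it to IPNS."""
    stat = requests.post(f"{API}/files/stat", params={"arg": mfs_dir})
    stat.raise_for_status()
    cid = stat.json()["Hash"]  # CID of the folder as it stands right now
    pub = requests.post(
        f"{API}/name/publish",
        params={"arg": f"/ipfs/{cid}", "key": key},
    )
    pub.raise_for_status()
    return pub.json()["Name"], pub.json()["Value"]
```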
