Need help organizing approach for IPFS I/O

I have a pretty simple goal, but I’m having trouble figuring out what to do here.

I have a simple python app, here:

that uses JSON files. I want the user to be able to download a file from a directory I’ve created using Pinata:

and then, at the end of the session, upload the files back to IPFS.

Things I’ve tried so far:

I can open a JSON file in Python like this:

```python
import json
import urllib.request

url = "..."  # the gateway URL dropped out of my post

with urllib.request.urlopen(url) as response:
    data = json.loads(response.read())
```

But I can’t figure out how to get a list of the files in a directory and then isolate individual URLs so the user can pick one to load. The Pinata-hosted URLs don’t seem to be readable by Python either.

So that’s my input problem.
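One thing I’ve been experimenting with for the listing, assuming I run a local IPFS daemon alongside the app (the `/api/v0/ls` endpoint belongs to the daemon’s HTTP API, not to Pinata; the CID and filenames below are placeholders):

```python
import json
import urllib.request

API = "http://127.0.0.1:5001"                  # assumption: a local IPFS daemon is running
GATEWAY = "https://gateway.pinata.cloud/ipfs"  # any public gateway works for fetching

def parse_ls(payload):
    """Pull the entry names out of an /api/v0/ls response."""
    return [link["Name"] for link in payload["Objects"][0]["Links"]]

def list_dir(cid):
    """List the files inside an IPFS directory via the daemon's HTTP API (POST required)."""
    req = urllib.request.Request(f"{API}/api/v0/ls?arg={cid}", method="POST")
    with urllib.request.urlopen(req) as resp:
        return parse_ls(json.load(resp))

def file_url(cid, name):
    """Build a gateway URL for the file the user picked from the listing."""
    return f"{GATEWAY}/{cid}/{name}"
```

With that, the user could pick a name from `list_dir(cid)` and the app would fetch `file_url(cid, name)`.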

For output, my Python script is successfully sending POST requests to the Pinata API. The problem is that the upload doesn’t add to my original directory, and I can’t find anything in the Pinata documentation that explains how to do that.
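My current guess is that you can’t append to an existing IPFS directory at all, since content is immutable: “adding a file” really means pinning a new directory that gets a new root CID. So I’ve been trying to re-pin the whole folder through `pinFileToIPFS`; something like this (the JWT and directory name are placeholders, and I’m not certain this multipart layout is exactly what Pinata expects):

```python
import requests  # third-party: pip install requests

PIN_URL = "https://api.pinata.cloud/pinning/pinFileToIPFS"

def build_parts(dirname, files):
    """Prefix each filename with the directory name so the files are pinned as one folder."""
    return [("file", (f"{dirname}/{name}", data)) for name, data in sorted(files.items())]

def pin_directory(jwt, dirname, files):
    """Re-pin the whole directory; returns the NEW root CID (the old CID is unchanged)."""
    resp = requests.post(
        PIN_URL,
        headers={"Authorization": f"Bearer {jwt}"},  # Pinata JWT (placeholder)
        files=build_parts(dirname, files),
    )
    resp.raise_for_status()
    return resp.json()["IpfsHash"]
```

If that’s right, the app would have to keep track of the latest root CID after every session, since the old one keeps pointing at the old contents.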

I’m also a little confused about how my users will be able to find and browse the files.
Thanks for your help and patience,

I think you have to wrap your files and directory to keep the metadata, using the Files API:


Never used it myself, though.
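From the HTTP API docs, using the Files API (MFS) over a local daemon would look roughly like this (untested, and the `/decks` path is made up):

```python
import json
import urllib.parse
import urllib.request

API = "http://127.0.0.1:5001"  # assumption: a local IPFS daemon

def mfs_url(cmd, **params):
    """Build a Files API (MFS) call like /api/v0/files/cp or /api/v0/files/stat."""
    return f"{API}/api/v0/files/{cmd}?" + urllib.parse.urlencode(params, doseq=True)

def mfs(cmd, **params):
    """POST an MFS command to the daemon and decode the JSON reply, if any."""
    req = urllib.request.Request(mfs_url(cmd, **params), method="POST")
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
    return json.loads(body) if body else None

# Usage sketch:
# mfs("cp", arg=["/ipfs/<CID>", "/decks"])        # import the pinned directory into MFS
# mfs("write", arg="/decks/new.json", create="true")  # plus the file body in the request
# info = mfs("stat", arg="/decks", hash="true")   # info["Hash"] is the NEW root CID
```

The point of MFS here is that you edit `/decks` like a mutable folder and then read the new root CID off `files stat` afterwards.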

Hi, if you use JSON files, why not use IPLD?

I ended up solving this by using Pinata, which saves some metadata, and also by saving the metadata in a JSON file locally. I also save all the JSONs into a master metadata file on IPFS, so that if my app or servers ever go down, they can revert to their most recent state through that master metadata file. Check out my flashcard app that uses IPFS!
I’d appreciate some feedback.
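The master metadata file is just a local JSON mapping each deck file to its latest CID (plus older CIDs for recovery), roughly like this — all the names here are made up for illustration:

```python
import json
from pathlib import Path

MANIFEST = Path("manifest.json")  # hypothetical local name for the master metadata file

def load_manifest():
    """Read the filename -> CID map, or start fresh if it doesn't exist yet."""
    if MANIFEST.exists():
        return json.loads(MANIFEST.read_text())
    return {}

def record_pin(manifest, filename, cid):
    """Remember the latest CID for a file; older CIDs are kept for recovery."""
    entry = manifest.setdefault(filename, {"latest": None, "history": []})
    if entry["latest"]:
        entry["history"].append(entry["latest"])
    entry["latest"] = cid
    return manifest

def save_manifest(manifest):
    MANIFEST.write_text(json.dumps(manifest, indent=2))
```

After each session the manifest itself gets pinned too, so the app can rebuild its state from the most recent copy on IPFS.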

I read that but couldn’t find a way to make it work for me. I ended up using Pinata, and it seems to be working.
Check out my flashcard app that uses IPFS!