I am coding an application that can potentially have many items in a queue to be added to IPFS. I would like to create threads to add them, as some files could be large. The main reason to use threads for this is to ensure a good UX for the app.
I currently don't use a Python 3 IPFS module to interface with IPFS; I use the command-line interface. I have found over the last few years that those libraries lag too far behind the currently available release of Kubo.
I'm open to other suggestions on that, but whatever method I use (direct API on port 5001 or a command-line wrapper for it), can I run multiple simultaneous `ipfs add` commands?
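For context, this is roughly what I have in mind: a small thread pool draining the queue, with each worker shelling out to the CLI. This is only a sketch; the file names are placeholders and `ipfs add -q` (which prints just the resulting CID) is one way to do it, not necessarily the best.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def add_command(path):
    # Build the CLI invocation; -q prints only the resulting CID.
    return ["ipfs", "add", "-q", path]

def ipfs_add(path):
    # Each worker shells out to the Kubo CLI. With the daemon running,
    # the CLI forwards the add over RPC rather than opening the repo itself.
    result = subprocess.run(add_command(path), capture_output=True, text=True)
    result.check_returncode()
    return result.stdout.strip()  # the CID of the added file

def add_all(paths, workers=2):
    # Threads are appropriate here: the work is I/O-bound subprocess calls.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(ipfs_add, paths))

# usage (with a Kubo daemon running):
#   cids = add_all(["big_file_1.bin", "big_file_2.bin"])
```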
I don't see why not. What's happening in the background is that you have a Kubo node running; when you run `ipfs add ...`, it connects to that Kubo node and sends it the data you requested over the RPC. It'll handle multiple of these requests without issue, though I'm unsure what happens specifically. I see two options:
- It'll spawn a couple of goroutines and process both sets of data at the same time
- It'll queue / batch the adds

Either way, your use case sounds fine. Don't be afraid to experiment.
Thank you for that info, but unfortunately the command-line interface doesn't appear to be multithreaded.
```
returncode=1, stdout='', stderr='Error: lock /home/owner/.ipfs/repo.lock: someone else has the lock'
```
The files are already pinned, so adding them again goes as fast as possible. I can add them successfully on the command line manually, but when I try to add them in a multithreaded way, with 2 threads each adding 2 files sequentially, I consistently get the 'lock' error.
I was truly hoping to avoid that effort by hearing from someone who knew for certain, but it was no big deal to use the API with curl.
The good news is that bypassing the IPFS command line (using the API with curl, writing to port 5001 directly) appears to work without a problem. I still need to test more files of various sizes, and files that aren't already added.
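In case it helps anyone else, here's roughly what the working approach looks like. It's just a sketch: the file names are placeholders, I'm shelling out to curl from worker threads, and I'm assuming the daemon's RPC API on the default port. Kubo's `/api/v0/add` endpoint takes a POST with the file as multipart form data.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

API = "http://127.0.0.1:5001/api/v0/add"

def curl_add_command(path):
    # Kubo's RPC API expects a POST with the file sent as multipart form data.
    return ["curl", "-s", "-X", "POST", "-F", f"file=@{path}", API]

def api_add(path):
    result = subprocess.run(curl_add_command(path), capture_output=True, text=True)
    result.check_returncode()
    # One JSON line per file, e.g. {"Name":"...","Hash":"...","Size":"..."}
    return result.stdout

def add_all(paths, workers=2):
    # No repo lock involved: every request goes straight to the daemon's RPC port.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(api_add, paths))

# usage (with a Kubo daemon running):
#   for line in add_all(["big_file_1.bin", "big_file_2.bin"]):
#       print(line)
```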