I deployed IPFS on a server and used `http://localhost:5001/api/v0/files/cp?arg= &arg=` to build a directory for the relevant files, and now I have a question. Is there a Kubo version or mature solution that supports backing up the directory and its files to another server, so they are not lost if this server is damaged? Thank you
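For reference, the kind of call described above looks roughly like this (the CID and MFS path are placeholders, not the actual values used):

```sh
# Hypothetical sketch of the MFS copy call: link an existing CID
# into the node's MFS tree under a chosen path
curl -X POST "http://localhost:5001/api/v0/files/cp?arg=/ipfs/<CID>&arg=/my-directory/file.bin"
```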
No version of Kubo will help here, because the `/api/v0/files/cp` endpoint does not store the files on your node in the first place. Instead, it creates a lazy link that is downloaded on demand when you try to view the files.
`/api/v0/pin/add` is what you seem to want right now.
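A minimal sketch of pinning, assuming the root CID of your directory is known (`<CID>` is a placeholder); a recursive pin actually fetches the blocks and keeps them on your node:

```sh
# Pin recursively so the directory and all blocks under it are fetched and kept locally
curl -X POST "http://localhost:5001/api/v0/pin/add?arg=<CID>&recursive=true"

# Equivalent on the CLI
ipfs pin add --recursive <CID>
```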
More generally, reliability-wise flatfs is fine (AFAIK), although there are rare issues with GC and pinning.
Using shared flatfs or S3 pools while never running GC is what some (most?) big Kubo clusters do.
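For what it's worth, a sketch of what "never running GC" means operationally with stock Kubo, where GC only runs if you opt in:

```sh
# Start the daemon without periodic GC (i.e. do not pass --enable-gc)
ipfs daemon

# And avoid triggering GC manually:
# ipfs repo gc   # <- the command such clusters never run
```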
But a single node can still lose data if its disks are lost, and I'm personally not comfortable with backups that live only in hot systems.
What I do personally is export `.car` files with `ipfs dag export` (these can be used as cold storage of the IPFS blocks) and back them up on archival storage like LTO tapes, Backblaze, AWS Glacier, …