How do you use IPFS before production to test that your images upload correctly, without polluting the IPFS network?

Hey,

I am going to try to explain it clearly. I am currently working on an NFT project, with a collection of 2,000+ images.

I am still in the development phase and am currently creating the back-end, which will receive requests from the front-end, create a random NFT, push it onto IPFS, take the returned hash and metadata JSON, and then store it in the smart contract (to put it simply).
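Roughly, the back-end step looks like this (a simplified sketch, assuming the ipfs-http-client package and a local node; the names here are placeholders, not my actual code):

```ts
import { create } from 'ipfs-http-client'

// Simplified sketch: add the image, then metadata JSON that references it.
// The metadata CID is what eventually gets stored in the smart contract.
const ipfs = create({ url: 'http://127.0.0.1:5001/api/v0' })

async function publishNft(imageBytes: Uint8Array, name: string) {
  const image = await ipfs.add(imageBytes)

  const metadata = {
    name,
    image: `ipfs://${image.cid.toString()}`,
  }
  const meta = await ipfs.add(JSON.stringify(metadata))

  return { imageCid: image.cid.toString(), metadataCid: meta.cid.toString() }
}
```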

I want to test that the whole flow works before we launch, but I am not sure how to do that without actually uploading files to IPFS. Is there a specific IPFS domain or network where everyone can push their images for testing before switching to the ‘main’ network?

Should I just push all these files to the IPFS network and not record them? It seems like that would pollute the network with extra files that aren't necessary.

I am really not sure what to do, as I also want to pin the files in order to ensure that they’ll be returned correctly and displayed on OpenSea.

Please let me know if I am clear, and thank you for your time.

I hope to get some insight.

You don't upload files to the network; you add files to your own node.


As hsn10 mentioned, there is no upload happening; when you use IPFS, you are just making those files available on a p2p network.

That said, I’d suggest checking out https://nft.storage/ if you’re working with NFTs.
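With their JavaScript client, uploading the image, building the metadata JSON, and pinning both is a single call. A rough sketch (assuming the nft.storage npm package; the API token and file path are placeholders):

```ts
import { readFile } from 'node:fs/promises'
import { NFTStorage, File } from 'nft.storage'

// Placeholder token: set NFT_STORAGE_TOKEN in your environment.
const client = new NFTStorage({ token: process.env.NFT_STORAGE_TOKEN! })

async function storeNft(path: string, name: string) {
  const bytes = await readFile(path)

  // store() uploads the image, writes metadata JSON referencing it,
  // and pins both on nft.storage's infrastructure.
  const metadata = await client.store({
    name,
    description: 'Test NFT for the pre-launch flow',
    image: new File([bytes], 'nft.png', { type: 'image/png' }),
  })

  // metadata.url is an ipfs:// URI for the metadata JSON, i.e. the tokenURI.
  return metadata.url
}
```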


You could run a separate, private IPFS network (nodes that share a swarm key are isolated from the public network), or just an IPFS test daemon on an isolated network. There is documentation about private networks in the IPFS docs.
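If you run a test daemon like that, you can also exercise the pinning side of the flow against it before ever touching the public network. A rough sketch with the ipfs-http-client package (the API URL is just a placeholder for wherever your test daemon listens):

```ts
import { create } from 'ipfs-http-client'

// Talk to the local/isolated test daemon, not the public network.
const testNode = create({ url: 'http://127.0.0.1:5001/api/v0' })

async function pinOnTestNode(cid: string) {
  // Pin the CID on the test daemon, then list pins to confirm it stuck.
  await testNode.pin.add(cid)

  for await (const pin of testNode.pin.ls({ paths: [cid] })) {
    console.log('pinned:', pin.cid.toString(), pin.type)
  }
}
```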

I publish websites generated with Gatsby to IPFS. I make content available on beta.example.com before rolling it out to example.com and www.example.com, but that is just a matter of configuring the entrypoint.

Of course, you could have your IPFS client check a configured URL to know which server it must use (sketched below).

The IPFS test daemon could be local to the test browser, for example by creating static DNS host entries in the test browsing environment.
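A minimal sketch of that endpoint selection, assuming an IPFS_API_URL environment variable (the variable name is just an example) that differs between the test and production environments:

```ts
import { create } from 'ipfs-http-client'

// IPFS_API_URL is set per environment, e.g. the isolated test daemon in CI
// and the real endpoint in production; fall back to a local node otherwise.
const apiUrl = process.env.IPFS_API_URL ?? 'http://127.0.0.1:5001/api/v0'

export const ipfs = create({ url: apiUrl })
```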

You should be able to put together a full test environment with a few of these tricks.

Also, if you use GitLab pipelines, you could spawn a whole test environment on every run, with a private IPFS node, a headless browser, your server, an IPFS client… and add some automated testing that checks it all.
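For example, one of those automated checks could add a file through the pipeline's private IPFS node and read it back. A sketch (the IPFS_API_URL variable and the node started as a CI service are assumptions):

```ts
import { strict as assert } from 'node:assert'
import { create } from 'ipfs-http-client'

// IPFS_API_URL would point at the private node started for this CI run.
const ipfs = create({ url: process.env.IPFS_API_URL ?? 'http://127.0.0.1:5001/api/v0' })

async function roundTripTest() {
  const payload = new TextEncoder().encode('hello from the test pipeline')
  const { cid } = await ipfs.add(payload)

  // Fetch the content back through the same node and compare byte-for-byte.
  const chunks: Uint8Array[] = []
  for await (const chunk of ipfs.cat(cid)) chunks.push(chunk)

  assert.deepEqual(Buffer.concat(chunks), Buffer.from(payload))
  console.log('round trip OK:', cid.toString())
}

roundTripTest().catch((err) => {
  console.error(err)
  process.exit(1)
})
```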

You might also be interested in Filebase, which is powered by Sia, Storj, and Skynet.

“Filebase offers a 5GB free tier to all users, with no expiration or trials. If you need more storage, simply upgrade to our pay-as-you-go pricing model. A subscription costs only $5.99 per month and includes 1 TB of storage and 1 TB of transfer.”