Scaling of IPFS for storage of 10 million+ small files


We are looking to deploy an NFT project which would rely on IPFS for storage. However, we are looking to get more information on how scalable this is, as we have a little over 10 million PNG files (approx. 5 KB each).

Mainly, we are wondering if it is possible to upload and pin these files in a scalable manner.

Is this currently possible with IPFS? Are there any solutions to make this process easier? Can webhooks be incorporated on a project of this many files to update metadata as needed?

Thank you for the help!

We’d love to help at Filebase - Give things a try and let us know if you want to chat!

Our S3-compatible API, multipart upload support, and more should make this very simple for you to integrate.
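For files this small, the main scaling lever is concurrency rather than multipart upload (which pays off for large objects). Here is a minimal sketch of a parallel upload loop against an S3-compatible endpoint; the bucket name and the body of `upload_one` are placeholders — a real implementation would call something like boto3's `upload_file` with your Filebase credentials:

```python
# Sketch: uploading many small files in parallel to an S3-compatible
# endpoint. ENDPOINT is Filebase's documented S3 endpoint; BUCKET and
# upload_one's body are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

ENDPOINT = "https://s3.filebase.com"  # Filebase S3-compatible endpoint
BUCKET = "my-nft-bucket"              # hypothetical bucket name

def upload_one(path: Path) -> str:
    # Placeholder: a real version would do, e.g.,
    #   s3.upload_file(str(path), BUCKET, path.name)
    # using a boto3 client configured with endpoint_url=ENDPOINT.
    return path.name

def upload_all(paths, workers=32):
    # Many concurrent requests amortize per-request latency, which
    # dominates when each object is only ~5 KB.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(upload_one, paths))
```

With 10M files you would also want retries and rate limiting, but the fan-out structure stays the same.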


IPFS should be able to handle this scale; it’s more a question of the specific infrastructure to which you deploy IPFS or specific pinning service you decide to use.

I recommend checking out NFT.Storage, which is a pinning service that uses a highly scalable implementation of IPFS.

To get a sense of NFT.Storage's scale, take a look at the numbers on their stats page.

The quickstart guide covers a number of their tools for uploading: NFT.Storage Docs

What are you trying to do with webhooks exactly? Tracking the progress of the upload process?

It may make sense to write a custom upload script that tracks the progress of the whole process in a local file or an SQLite DB.
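A minimal sketch of that idea, using the stdlib `sqlite3` module: record each file as pending, mark it done with its CID once pinned, and on restart only iterate the files still pending. The table and column names here are illustrative, not from any particular tool.

```python
# Sketch: resumable upload tracking in SQLite. Each file gets a row;
# status moves from 'pending' to 'done' once its CID is recorded.
import sqlite3

def init_db(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS uploads (
        file   TEXT PRIMARY KEY,
        cid    TEXT,
        status TEXT NOT NULL DEFAULT 'pending')""")
    return db

def enqueue(db, files):
    # Idempotent: re-running the script won't duplicate rows.
    db.executemany("INSERT OR IGNORE INTO uploads (file) VALUES (?)",
                   [(f,) for f in files])
    db.commit()

def mark_done(db, file, cid):
    db.execute("UPDATE uploads SET cid = ?, status = 'done' WHERE file = ?",
               (cid, file))
    db.commit()

def pending(db):
    # Files still to upload -- iterate these when the job restarts.
    return [r[0] for r in
            db.execute("SELECT file FROM uploads WHERE status = 'pending'")]
```

Using a file-backed DB instead of `:memory:` makes the job survive interruptions, which matters over a 10M-file run.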

Maybe you can share more information on what you mean by updating metadata?