There are projects like Sia, Filecoin, and Storj for storing data in a decentralized cloud. I can use Ceramic, for example, to create a network of data on IPFS. But what if I wanted to save content on Sia (which AFAIK is fast enough to retrieve files in real time), then make it available on IPFS, and when I receive a request for that file, deliver it while I'm still downloading it from Sia, so that I never actually have to store that data on my computer? If that is possible, I think it would make it easier for anyone to use any blockchain to store their data while keeping everything in a single network, still allowing normal pins and reducing costs for everyone. But is it possible?
I don’t actually run a node, it’s just a hypothesis.
OP advertises on IPFS that they have some content H, even though they don’t have a local copy of it.
OP sees a request for content H.
OP begins downloading H from Sia, and starts streaming it to the requester before the download is finished.
Basically, OP is describing IPFS nodes that don't hold the data in local storage, but know they can fetch it on demand because it's available on another platform; the node serves as a sort of bridge.
The implementation would be more complicated than described here because IPFS doesn't actually serve whole files under the hood; it serves blocks. But in theory, I don't see why something like this couldn't be possible.
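The bridge idea above can be sketched roughly like this. This is a minimal illustration, not real IPFS code: the names (`BridgeProvider`, `fake_sia_fetch`) and the single-stream-per-CID model are invented for clarity, whereas a real node would exchange fixed-size blocks over Bitswap.

```python
# Hypothetical sketch of a "bridge" provider: it advertises CIDs it can
# serve, but fetches the bytes from a remote backend (e.g. Sia) on demand
# and streams them to the requester without persisting them locally.
from typing import Callable, Iterator

Fetcher = Callable[[str], Iterator[bytes]]  # cid -> stream of chunks

class BridgeProvider:
    def __init__(self, fetch_from_backend: Fetcher):
        self.fetch = fetch_from_backend
        self.advertised: set[str] = set()

    def advertise(self, cid: str) -> None:
        # Announce to the network that we can provide this CID,
        # even though we hold no local copy of it.
        self.advertised.add(cid)

    def serve(self, cid: str) -> Iterator[bytes]:
        if cid not in self.advertised:
            raise KeyError(f"not advertised: {cid}")
        # Stream chunks to the requester as they arrive from the
        # backend -- no full local download, no local storage.
        for chunk in self.fetch(cid):
            yield chunk

# Fake backend standing in for a Sia download:
def fake_sia_fetch(cid: str) -> Iterator[bytes]:
    for part in (b"hello ", b"world"):
        yield part

node = BridgeProvider(fake_sia_fetch)
node.advertise("bafy-example")
data = b"".join(node.serve("bafy-example"))
```

The key point the sketch captures is that `advertise` and `serve` are decoupled from storage: the node only promises it *can* produce the bytes, and produces them lazily.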
It would be kind of cool if you could register an arbitrary, deterministic function rather than just a URL. A functionstore? Then you could trade off storage against CPU: cid => imageResize(400, 600, some_other_cid). That way you could GC the result and just recompute it when you need it, kind of like memoizing with IPFS. The URLStore functionality would be a subset, something like cid => UrlGet(some_url).
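A toy version of that functionstore idea might look like the following. Everything here is hypothetical (`FunctionStore`, `url_get` are invented names; the real URLStore is a go-ipfs feature, not this API): the point is just that a CID maps to a deterministic recipe, so the bytes can be garbage-collected and recomputed on demand.

```python
# Hypothetical "functionstore" sketch: instead of pinning bytes, register a
# deterministic recipe for recomputing them. Results can be GC'd and rebuilt
# on demand -- memoizing with IPFS, as described above.
from typing import Callable

class FunctionStore:
    def __init__(self) -> None:
        self.recipes: dict[str, Callable[[], bytes]] = {}
        self.cache: dict[str, bytes] = {}

    def register(self, cid: str, recipe: Callable[[], bytes]) -> None:
        self.recipes[cid] = recipe  # cid => deterministic function

    def get(self, cid: str) -> bytes:
        if cid not in self.cache:                  # GC'd or never computed
            self.cache[cid] = self.recipes[cid]()  # recompute on demand
        return self.cache[cid]

    def gc(self) -> None:
        self.cache.clear()  # drop all results; the recipes remain

# URLStore-style entries are just a special case: cid => UrlGet(some_url)
def url_get(url: str) -> bytes:
    return f"fetched:{url}".encode()  # stand-in for a real HTTP fetch

store = FunctionStore()
store.register("cid-A", lambda: url_get("https://example.com/img"))
first = store.get("cid-A")
store.gc()                  # result discarded...
second = store.get("cid-A")  # ...and transparently recomputed
```

For this to be safe the registered function really does have to be deterministic, otherwise the "same" CID could yield different bytes after a GC cycle.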